Raspbian Package Auto-Building

Build log for consul (1.5.2+dfsg1-6) on armhf

consul 1.5.2+dfsg1-6 armhf → 2019-12-06 06:44:35

sbuild (Debian sbuild) 0.71.0 (24 Aug 2016) on bm-wb-01

+==============================================================================+
| consul 1.5.2+dfsg1-6 (armhf)                 Fri, 06 Dec 2019 05:09:18 +0000 |
+==============================================================================+

Package: consul
Version: 1.5.2+dfsg1-6
Source Version: 1.5.2+dfsg1-6
Distribution: bullseye-staging
Machine Architecture: armhf
Host Architecture: armhf
Build Architecture: armhf
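
The header above carries everything needed to attempt the same build locally. A minimal sketch, assuming sbuild is already set up with a bullseye-staging armhf chroot (chroot and mirror configuration are not taken from this log):

# Rebuild the same source version for the same distribution/architecture.
# The .dsc filename is inferred from Package/Version above and may differ locally.
sbuild --dist=bullseye-staging --arch=armhf consul_1.5.2+dfsg1-6.dsc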

I: NOTICE: Log filtering will replace 'var/run/schroot/mount/bullseye-staging-armhf-sbuild-8476812f-5823-4527-9126-cc5f34c49e67' with '<<CHROOT>>'

+------------------------------------------------------------------------------+
| Update chroot                                                                |
+------------------------------------------------------------------------------+

Get:1 http://172.17.0.1/private bullseye-staging InRelease [11.3 kB]
Get:2 http://172.17.0.1/private bullseye-staging/main Sources [11.5 MB]
Get:3 http://172.17.0.1/private bullseye-staging/main armhf Packages [12.8 MB]
Fetched 24.3 MB in 28s (875 kB/s)
Reading package lists...

+------------------------------------------------------------------------------+
| Fetch source files                                                           |
+------------------------------------------------------------------------------+


Check APT
---------

Checking available source versions...

Download source files with APT
------------------------------

Reading package lists...
NOTICE: 'consul' packaging is maintained in the 'Git' version control system at:
https://salsa.debian.org/go-team/packages/consul.git
Please use:
git clone https://salsa.debian.org/go-team/packages/consul.git
to retrieve the latest (possibly unreleased) updates to the package.
Need to get 5409 kB of source archives.
Get:1 http://172.17.0.1/private bullseye-staging/main consul 1.5.2+dfsg1-6 (dsc) [5436 B]
Get:2 http://172.17.0.1/private bullseye-staging/main consul 1.5.2+dfsg1-6 (tar) [5383 kB]
Get:3 http://172.17.0.1/private bullseye-staging/main consul 1.5.2+dfsg1-6 (diff) [21.0 kB]
Fetched 5409 kB in 1s (6785 kB/s)
Download complete and in download only mode
I: NOTICE: Log filtering will replace 'build/consul-Ct56EX/consul-1.5.2+dfsg1' with '<<PKGBUILDDIR>>'
I: NOTICE: Log filtering will replace 'build/consul-Ct56EX' with '<<BUILDDIR>>'
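
The NOTICE above points at the packaging Git repository; the exact source version fetched here can also be pulled with APT alone. A minimal sketch, assuming a matching deb-src line for bullseye-staging is configured:

# Downloads the same .dsc, orig tarball and Debian diff as logged above.
apt-get source consul=1.5.2+dfsg1-6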

+------------------------------------------------------------------------------+
| Install build-essential                                                      |
+------------------------------------------------------------------------------+


Setup apt archive
-----------------

Merged Build-Depends: build-essential, fakeroot
Filtered Build-Depends: build-essential, fakeroot
dpkg-deb: building package 'sbuild-build-depends-core-dummy' in '/<<BUILDDIR>>/resolver-mFPm5i/apt_archive/sbuild-build-depends-core-dummy.deb'.
dpkg-scanpackages: warning: Packages in archive but missing from override file:
dpkg-scanpackages: warning:   sbuild-build-depends-core-dummy
dpkg-scanpackages: info: Wrote 1 entries to output Packages file.
gpg: keybox '/<<BUILDDIR>>/resolver-mFPm5i/gpg/pubring.kbx' created
gpg: /<<BUILDDIR>>/resolver-mFPm5i/gpg/trustdb.gpg: trustdb created
gpg: key 35506D9A48F77B2E: public key "Sbuild Signer (Sbuild Build Dependency Archive Key) <buildd-tools-devel@lists.alioth.debian.org>" imported
gpg: Total number processed: 1
gpg:               imported: 1
gpg: key 35506D9A48F77B2E: "Sbuild Signer (Sbuild Build Dependency Archive Key) <buildd-tools-devel@lists.alioth.debian.org>" not changed
gpg: key 35506D9A48F77B2E: secret key imported
gpg: Total number processed: 1
gpg:              unchanged: 1
gpg:       secret keys read: 1
gpg:   secret keys imported: 1
gpg: using "Sbuild Signer" as default secret key for signing
Ign:1 copy:/<<BUILDDIR>>/resolver-mFPm5i/apt_archive ./ InRelease
Get:2 copy:/<<BUILDDIR>>/resolver-mFPm5i/apt_archive ./ Release [957 B]
Get:3 copy:/<<BUILDDIR>>/resolver-mFPm5i/apt_archive ./ Release.gpg [370 B]
Ign:3 copy:/<<BUILDDIR>>/resolver-mFPm5i/apt_archive ./ Release.gpg
Get:4 copy:/<<BUILDDIR>>/resolver-mFPm5i/apt_archive ./ Sources [349 B]
Get:5 copy:/<<BUILDDIR>>/resolver-mFPm5i/apt_archive ./ Packages [432 B]
Fetched 2108 B in 1s (2397 B/s)
Reading package lists...
W: copy:///<<BUILDDIR>>/resolver-mFPm5i/apt_archive/./Release.gpg: The key(s) in the keyring /etc/apt/trusted.gpg.d/sbuild-build-depends-archive.gpg are ignored as the file is not readable by user '_apt' executing apt-key.
W: GPG error: copy:/<<BUILDDIR>>/resolver-mFPm5i/apt_archive ./ Release: The following signatures couldn't be verified because the public key is not available: NO_PUBKEY 35506D9A48F77B2E
Reading package lists...

Install core build dependencies (apt-based resolver)
----------------------------------------------------

Installing build dependencies
Reading package lists...
Building dependency tree...
Reading state information...
The following packages were automatically installed and are no longer required:
  libpam-cap netbase
Use 'apt autoremove' to remove them.
The following NEW packages will be installed:
  sbuild-build-depends-core-dummy
0 upgraded, 1 newly installed, 0 to remove and 34 not upgraded.
Need to get 852 B of archives.
After this operation, 0 B of additional disk space will be used.
Get:1 copy:/<<BUILDDIR>>/resolver-mFPm5i/apt_archive ./ sbuild-build-depends-core-dummy 0.invalid.0 [852 B]
debconf: delaying package configuration, since apt-utils is not installed
Fetched 852 B in 0s (0 B/s)
Selecting previously unselected package sbuild-build-depends-core-dummy.
(Reading database ... 12227 files and directories currently installed.)
Preparing to unpack .../sbuild-build-depends-core-dummy_0.invalid.0_armhf.deb ...
Unpacking sbuild-build-depends-core-dummy (0.invalid.0) ...
Setting up sbuild-build-depends-core-dummy (0.invalid.0) ...
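
sbuild satisfies the core build dependencies by generating, signing and installing a local dummy package, as logged above. Outside sbuild, mk-build-deps from devscripts uses the same dummy-package approach; a sketch, assuming an unpacked source tree:

# Build a dummy package depending on debian/control's Build-Depends
# and install it (and the dependencies it pulls in) via apt.
mk-build-deps --install --root-cmd sudo --remove debian/control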

+------------------------------------------------------------------------------+
| Check architectures                                                          |
+------------------------------------------------------------------------------+

Arch check ok (armhf included in any all)
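
The check above compares the build architecture against the Architecture fields declared by the source package. Both sides can be inspected by hand; a sketch, run from the unpacked source tree:

# Architecture dpkg builds for on this machine (armhf in this log)
dpkg-architecture -qDEB_HOST_ARCH
# Architectures declared in debian/control (here effectively "any" and "all")
grep '^Architecture:' debian/control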

+------------------------------------------------------------------------------+
| Install package build dependencies                                           |
+------------------------------------------------------------------------------+


Setup apt archive
-----------------

Merged Build-Depends: debhelper (>= 11~), bash-completion, dh-golang (>= 1.42~), golang-any (>= 2:1.13~), golang-github-asaskevich-govalidator-dev, golang-github-armon-circbuf-dev, golang-github-armon-go-metrics-dev (>= 0.0~git20171117~), golang-github-armon-go-radix-dev, golang-github-azure-go-autorest-dev (>= 10.15.5~), golang-github-bgentry-speakeasy-dev, golang-github-circonus-labs-circonus-gometrics-dev (>= 2.3.1~), golang-github-circonus-labs-circonusllhist-dev, golang-github-datadog-datadog-go-dev, golang-github-davecgh-go-spew-dev, golang-github-denverdino-aliyungo-dev, golang-github-digitalocean-godo-dev, golang-github-docker-go-connections-dev, golang-github-elazarl-go-bindata-assetfs-dev (>= 0.0~git20151224~), golang-github-ghodss-yaml-dev, golang-github-gogo-googleapis-dev, golang-github-gogo-protobuf-dev (>= 1.2.1~), golang-github-golang-snappy-dev, golang-github-googleapis-gnostic-dev, golang-github-google-gofuzz-dev, golang-github-gophercloud-gophercloud-dev, golang-github-gregjones-httpcache-dev, golang-github-hashicorp-go-checkpoint-dev, golang-github-hashicorp-go-cleanhttp-dev (>= 0.5.1~), golang-github-hashicorp-go-discover-dev, golang-github-hashicorp-go-hclog-dev (>= 0.9.2~), golang-github-hashicorp-go-immutable-radix-dev (>= 1.1.0~), golang-github-hashicorp-golang-lru-dev (>= 0.0~git20160207~), golang-github-hashicorp-go-memdb-dev (>= 0.0~git20180224~), golang-github-hashicorp-go-msgpack-dev (>= 0.5.5~), golang-github-hashicorp-go-multierror-dev, golang-github-hashicorp-go-raftchunking-dev, golang-github-hashicorp-go-reap-dev, golang-github-hashicorp-go-retryablehttp-dev, golang-github-hashicorp-go-rootcerts-dev, golang-github-hashicorp-go-sockaddr-dev, golang-github-hashicorp-go-syslog-dev, golang-github-hashicorp-go-uuid-dev, golang-github-hashicorp-go-version-dev, golang-github-hashicorp-hcl-dev, golang-github-hashicorp-hil-dev (>= 0.0~git20160711~), golang-github-hashicorp-logutils-dev, golang-github-hashicorp-memberlist-dev (>= 0.1.5~), golang-github-hashicorp-net-rpc-msgpackrpc-dev, golang-github-hashicorp-raft-boltdb-dev, golang-github-hashicorp-raft-dev (>= 1.1.1~), golang-github-hashicorp-scada-client-dev, golang-github-hashicorp-serf-dev (>= 0.8.4~), golang-github-hashicorp-yamux-dev (>= 0.0~git20151129~), golang-github-inconshreveable-muxado-dev, golang-github-imdario-mergo-dev, golang-github-jefferai-jsonx-dev, golang-github-json-iterator-go-dev, golang-github-kr-text-dev, golang-github-mattn-go-isatty-dev, golang-github-miekg-dns-dev, golang-github-mitchellh-cli-dev (>= 1.0.0~), golang-github-mitchellh-go-testing-interface-dev, golang-github-mitchellh-copystructure-dev, golang-github-mitchellh-hashstructure-dev, golang-github-mitchellh-mapstructure-dev, golang-github-mitchellh-reflectwalk-dev, golang-github-nytimes-gziphandler-dev, golang-github-packethost-packngo-dev, golang-github-pascaldekloe-goe-dev, golang-github-peterbourgon-diskv-dev, golang-github-pmezard-go-difflib-dev, golang-github-ryanuber-columnize-dev, golang-github-ryanuber-go-glob-dev, golang-github-shirou-gopsutil-dev, golang-github-spf13-pflag-dev, golang-golang-x-sys-dev (>= 0.0~git20161012~), golang-gopkg-inf.v0-dev, golang-gopkg-square-go-jose.v2-dev, mockery, golang-github-sap-go-hdb-dev
Filtered Build-Depends: debhelper (>= 11~), bash-completion, dh-golang (>= 1.42~), golang-any (>= 2:1.13~), golang-github-asaskevich-govalidator-dev, golang-github-armon-circbuf-dev, golang-github-armon-go-metrics-dev (>= 0.0~git20171117~), golang-github-armon-go-radix-dev, golang-github-azure-go-autorest-dev (>= 10.15.5~), golang-github-bgentry-speakeasy-dev, golang-github-circonus-labs-circonus-gometrics-dev (>= 2.3.1~), golang-github-circonus-labs-circonusllhist-dev, golang-github-datadog-datadog-go-dev, golang-github-davecgh-go-spew-dev, golang-github-denverdino-aliyungo-dev, golang-github-digitalocean-godo-dev, golang-github-docker-go-connections-dev, golang-github-elazarl-go-bindata-assetfs-dev (>= 0.0~git20151224~), golang-github-ghodss-yaml-dev, golang-github-gogo-googleapis-dev, golang-github-gogo-protobuf-dev (>= 1.2.1~), golang-github-golang-snappy-dev, golang-github-googleapis-gnostic-dev, golang-github-google-gofuzz-dev, golang-github-gophercloud-gophercloud-dev, golang-github-gregjones-httpcache-dev, golang-github-hashicorp-go-checkpoint-dev, golang-github-hashicorp-go-cleanhttp-dev (>= 0.5.1~), golang-github-hashicorp-go-discover-dev, golang-github-hashicorp-go-hclog-dev (>= 0.9.2~), golang-github-hashicorp-go-immutable-radix-dev (>= 1.1.0~), golang-github-hashicorp-golang-lru-dev (>= 0.0~git20160207~), golang-github-hashicorp-go-memdb-dev (>= 0.0~git20180224~), golang-github-hashicorp-go-msgpack-dev (>= 0.5.5~), golang-github-hashicorp-go-multierror-dev, golang-github-hashicorp-go-raftchunking-dev, golang-github-hashicorp-go-reap-dev, golang-github-hashicorp-go-retryablehttp-dev, golang-github-hashicorp-go-rootcerts-dev, golang-github-hashicorp-go-sockaddr-dev, golang-github-hashicorp-go-syslog-dev, golang-github-hashicorp-go-uuid-dev, golang-github-hashicorp-go-version-dev, golang-github-hashicorp-hcl-dev, golang-github-hashicorp-hil-dev (>= 0.0~git20160711~), golang-github-hashicorp-logutils-dev, golang-github-hashicorp-memberlist-dev (>= 0.1.5~), golang-github-hashicorp-net-rpc-msgpackrpc-dev, golang-github-hashicorp-raft-boltdb-dev, golang-github-hashicorp-raft-dev (>= 1.1.1~), golang-github-hashicorp-scada-client-dev, golang-github-hashicorp-serf-dev (>= 0.8.4~), golang-github-hashicorp-yamux-dev (>= 0.0~git20151129~), golang-github-inconshreveable-muxado-dev, golang-github-imdario-mergo-dev, golang-github-jefferai-jsonx-dev, golang-github-json-iterator-go-dev, golang-github-kr-text-dev, golang-github-mattn-go-isatty-dev, golang-github-miekg-dns-dev, golang-github-mitchellh-cli-dev (>= 1.0.0~), golang-github-mitchellh-go-testing-interface-dev, golang-github-mitchellh-copystructure-dev, golang-github-mitchellh-hashstructure-dev, golang-github-mitchellh-mapstructure-dev, golang-github-mitchellh-reflectwalk-dev, golang-github-nytimes-gziphandler-dev, golang-github-packethost-packngo-dev, golang-github-pascaldekloe-goe-dev, golang-github-peterbourgon-diskv-dev, golang-github-pmezard-go-difflib-dev, golang-github-ryanuber-columnize-dev, golang-github-ryanuber-go-glob-dev, golang-github-shirou-gopsutil-dev, golang-github-spf13-pflag-dev, golang-golang-x-sys-dev (>= 0.0~git20161012~), golang-gopkg-inf.v0-dev, golang-gopkg-square-go-jose.v2-dev, mockery, golang-github-sap-go-hdb-dev
dpkg-deb: building package 'sbuild-build-depends-consul-dummy' in '/<<BUILDDIR>>/resolver-mFPm5i/apt_archive/sbuild-build-depends-consul-dummy.deb'.
dpkg-scanpackages: warning: Packages in archive but missing from override file:
dpkg-scanpackages: warning:   sbuild-build-depends-consul-dummy sbuild-build-depends-core-dummy
dpkg-scanpackages: info: Wrote 2 entries to output Packages file.
gpg: using "Sbuild Signer" as default secret key for signing
Ign:1 copy:/<<BUILDDIR>>/resolver-mFPm5i/apt_archive ./ InRelease
Get:2 copy:/<<BUILDDIR>>/resolver-mFPm5i/apt_archive ./ Release [969 B]
Get:3 copy:/<<BUILDDIR>>/resolver-mFPm5i/apt_archive ./ Release.gpg [370 B]
Ign:3 copy:/<<BUILDDIR>>/resolver-mFPm5i/apt_archive ./ Release.gpg
Get:4 copy:/<<BUILDDIR>>/resolver-mFPm5i/apt_archive ./ Sources [1369 B]
Get:5 copy:/<<BUILDDIR>>/resolver-mFPm5i/apt_archive ./ Packages [1457 B]
Fetched 4165 B in 1s (5187 B/s)
Reading package lists...
W: copy:///<<BUILDDIR>>/resolver-mFPm5i/apt_archive/./Release.gpg: The key(s) in the keyring /etc/apt/trusted.gpg.d/sbuild-build-depends-archive.gpg are ignored as the file is not readable by user '_apt' executing apt-key.
W: GPG error: copy:/<<BUILDDIR>>/resolver-mFPm5i/apt_archive ./ Release: The following signatures couldn't be verified because the public key is not available: NO_PUBKEY 35506D9A48F77B2E
Reading package lists...

Install consul build dependencies (apt-based resolver)
------------------------------------------------------

Installing build dependencies
Reading package lists...
Building dependency tree...
Reading state information...
The following packages were automatically installed and are no longer required:
  libpam-cap netbase
Use 'apt autoremove' to remove them.
The following additional packages will be installed:
  autoconf automake autopoint autotools-dev bash-completion bsdmainutils
  ca-certificates debhelper dh-autoreconf dh-golang dh-strip-nondeterminism
  dwz file gettext gettext-base gogoprotobuf golang-1.13-go golang-1.13-src
  golang-any golang-dbus-dev golang-ginkgo-dev
  golang-github-alecthomas-units-dev golang-github-armon-circbuf-dev
  golang-github-armon-go-metrics-dev golang-github-armon-go-radix-dev
  golang-github-asaskevich-govalidator-dev golang-github-aws-aws-sdk-go-dev
  golang-github-azure-go-autorest-dev golang-github-beorn7-perks-dev
  golang-github-bgentry-speakeasy-dev golang-github-boltdb-bolt-dev
  golang-github-bradfitz-gomemcache-dev golang-github-cespare-xxhash-dev
  golang-github-circonus-labs-circonus-gometrics-dev
  golang-github-circonus-labs-circonusllhist-dev
  golang-github-coreos-go-systemd-dev golang-github-coreos-pkg-dev
  golang-github-cyphar-filepath-securejoin-dev
  golang-github-datadog-datadog-go-dev golang-github-davecgh-go-spew-dev
  golang-github-denverdino-aliyungo-dev golang-github-dgrijalva-jwt-go-dev
  golang-github-dgrijalva-jwt-go-v3-dev golang-github-digitalocean-godo-dev
  golang-github-dimchansky-utfbom-dev golang-github-docker-go-connections-dev
  golang-github-docker-go-units-dev golang-github-docopt-docopt-go-dev
  golang-github-elazarl-go-bindata-assetfs-dev golang-github-fatih-color-dev
  golang-github-garyburd-redigo-dev golang-github-ghodss-yaml-dev
  golang-github-go-ini-ini-dev golang-github-go-kit-kit-dev
  golang-github-go-logfmt-logfmt-dev golang-github-go-stack-stack-dev
  golang-github-go-test-deep-dev golang-github-gogo-googleapis-dev
  golang-github-gogo-protobuf-dev golang-github-golang-mock-dev
  golang-github-golang-snappy-dev golang-github-google-btree-dev
  golang-github-google-go-cmp-dev golang-github-google-go-querystring-dev
  golang-github-google-gofuzz-dev golang-github-googleapis-gnostic-dev
  golang-github-gophercloud-gophercloud-dev
  golang-github-gregjones-httpcache-dev golang-github-hashicorp-errwrap-dev
  golang-github-hashicorp-go-checkpoint-dev
  golang-github-hashicorp-go-cleanhttp-dev
  golang-github-hashicorp-go-discover-dev golang-github-hashicorp-go-hclog-dev
  golang-github-hashicorp-go-immutable-radix-dev
  golang-github-hashicorp-go-memdb-dev golang-github-hashicorp-go-msgpack-dev
  golang-github-hashicorp-go-multierror-dev
  golang-github-hashicorp-go-raftchunking-dev
  golang-github-hashicorp-go-reap-dev
  golang-github-hashicorp-go-retryablehttp-dev
  golang-github-hashicorp-go-rootcerts-dev
  golang-github-hashicorp-go-sockaddr-dev
  golang-github-hashicorp-go-syslog-dev golang-github-hashicorp-go-uuid-dev
  golang-github-hashicorp-go-version-dev
  golang-github-hashicorp-golang-lru-dev golang-github-hashicorp-hcl-dev
  golang-github-hashicorp-hil-dev golang-github-hashicorp-logutils-dev
  golang-github-hashicorp-mdns-dev golang-github-hashicorp-memberlist-dev
  golang-github-hashicorp-net-rpc-msgpackrpc-dev
  golang-github-hashicorp-raft-boltdb-dev golang-github-hashicorp-raft-dev
  golang-github-hashicorp-scada-client-dev golang-github-hashicorp-serf-dev
  golang-github-hashicorp-yamux-dev golang-github-imdario-mergo-dev
  golang-github-inconshreveable-muxado-dev golang-github-jeffail-gabs-dev
  golang-github-jefferai-jsonx-dev golang-github-jmespath-go-jmespath-dev
  golang-github-jpillora-backoff-dev golang-github-json-iterator-go-dev
  golang-github-julienschmidt-httprouter-dev golang-github-kr-pretty-dev
  golang-github-kr-pty-dev golang-github-kr-text-dev
  golang-github-mattn-go-colorable-dev golang-github-mattn-go-isatty-dev
  golang-github-miekg-dns-dev golang-github-mitchellh-cli-dev
  golang-github-mitchellh-copystructure-dev
  golang-github-mitchellh-go-homedir-dev
  golang-github-mitchellh-go-testing-interface-dev
  golang-github-mitchellh-hashstructure-dev
  golang-github-mitchellh-mapstructure-dev
  golang-github-mitchellh-reflectwalk-dev
  golang-github-modern-go-concurrent-dev golang-github-modern-go-reflect2-dev
  golang-github-mwitkow-go-conntrack-dev golang-github-nytimes-gziphandler-dev
  golang-github-opencontainers-runc-dev
  golang-github-opencontainers-selinux-dev
  golang-github-opencontainers-specs-dev
  golang-github-opentracing-opentracing-go-dev
  golang-github-packethost-packngo-dev golang-github-pascaldekloe-goe-dev
  golang-github-peterbourgon-diskv-dev golang-github-pkg-errors-dev
  golang-github-pmezard-go-difflib-dev golang-github-posener-complete-dev
  golang-github-prometheus-client-golang-dev
  golang-github-prometheus-client-model-dev
  golang-github-prometheus-common-dev golang-github-ryanuber-columnize-dev
  golang-github-ryanuber-go-glob-dev golang-github-sap-go-hdb-dev
  golang-github-seccomp-libseccomp-golang-dev
  golang-github-shirou-gopsutil-dev golang-github-sirupsen-logrus-dev
  golang-github-spf13-pflag-dev golang-github-stretchr-objx-dev
  golang-github-stretchr-testify-dev golang-github-syndtr-goleveldb-dev
  golang-github-tent-http-link-go-dev golang-github-tv42-httpunix-dev
  golang-github-ugorji-go-codec-dev golang-github-ugorji-go-msgpack-dev
  golang-github-urfave-cli-dev golang-github-vishvananda-netlink-dev
  golang-github-vishvananda-netns-dev golang-github-vmware-govmomi-dev
  golang-github-xeipuuv-gojsonpointer-dev
  golang-github-xeipuuv-gojsonreference-dev
  golang-github-xeipuuv-gojsonschema-dev golang-glog-dev golang-go
  golang-go.opencensus-dev golang-gocapability-dev golang-gogoprotobuf-dev
  golang-golang-x-crypto-dev golang-golang-x-net-dev
  golang-golang-x-oauth2-dev golang-golang-x-oauth2-google-dev
  golang-golang-x-sync-dev golang-golang-x-sys-dev golang-golang-x-text-dev
  golang-golang-x-time-dev golang-golang-x-tools golang-golang-x-tools-dev
  golang-golang-x-xerrors-dev golang-gomega-dev golang-google-api-dev
  golang-google-cloud-compute-metadata-dev golang-google-genproto-dev
  golang-google-grpc-dev golang-gopkg-alecthomas-kingpin.v2-dev
  golang-gopkg-check.v1-dev golang-gopkg-inf.v0-dev golang-gopkg-mgo.v2-dev
  golang-gopkg-square-go-jose.v2-dev golang-gopkg-tomb.v2-dev
  golang-gopkg-vmihailenco-msgpack.v2-dev golang-gopkg-yaml.v2-dev
  golang-goprotobuf-dev golang-procfs-dev golang-protobuf-extensions-dev
  golang-src groff-base intltool-debian iproute2 libarchive-zip-perl libbsd0
  libcroco3 libdebhelper-perl libelf1 libfile-stripnondeterminism-perl
  libglib2.0-0 libicu63 libjs-jquery libjs-jquery-ui libmagic-mgc libmagic1
  libmnl0 libncurses6 libpipeline1 libprocps7 libprotobuf-dev
  libprotobuf-lite17 libprotobuf17 libprotoc17 libsasl2-dev libseccomp-dev
  libseccomp2 libsigsegv2 libssl1.1 libsub-override-perl libsystemd-dev
  libsystemd0 libtinfo5 libtool libuchardet0 libxml2 libxtables12 lsof m4
  man-db mockery openssl pkg-config po-debconf procps protobuf-compiler
  sensible-utils zlib1g-dev
Suggested packages:
  autoconf-archive gnu-standards autoconf-doc wamerican | wordlist whois
  vacation dh-make gettext-doc libasprintf-dev libgettextpo-dev bzr | brz git
  mercurial subversion mockgen golang-google-appengine-dev groff iproute2-doc
  libjs-jquery-ui-docs seccomp libtool-doc gfortran | fortran95-compiler
  gcj-jdk m4-doc apparmor less www-browser libmail-box-perl
Recommended packages:
  curl | wget | lynx golang-doc libatm1 libarchive-cpio-perl libglib2.0-data
  shared-mime-info xdg-user-dirs javascript-common libgpm2 libltdl-dev
  libmail-sendmail-perl psmisc
The following NEW packages will be installed:
  autoconf automake autopoint autotools-dev bash-completion bsdmainutils
  ca-certificates debhelper dh-autoreconf dh-golang dh-strip-nondeterminism
  dwz file gettext gettext-base gogoprotobuf golang-1.13-go golang-1.13-src
  golang-any golang-dbus-dev golang-ginkgo-dev
  golang-github-alecthomas-units-dev golang-github-armon-circbuf-dev
  golang-github-armon-go-metrics-dev golang-github-armon-go-radix-dev
  golang-github-asaskevich-govalidator-dev golang-github-aws-aws-sdk-go-dev
  golang-github-azure-go-autorest-dev golang-github-beorn7-perks-dev
  golang-github-bgentry-speakeasy-dev golang-github-boltdb-bolt-dev
  golang-github-bradfitz-gomemcache-dev golang-github-cespare-xxhash-dev
  golang-github-circonus-labs-circonus-gometrics-dev
  golang-github-circonus-labs-circonusllhist-dev
  golang-github-coreos-go-systemd-dev golang-github-coreos-pkg-dev
  golang-github-cyphar-filepath-securejoin-dev
  golang-github-datadog-datadog-go-dev golang-github-davecgh-go-spew-dev
  golang-github-denverdino-aliyungo-dev golang-github-dgrijalva-jwt-go-dev
  golang-github-dgrijalva-jwt-go-v3-dev golang-github-digitalocean-godo-dev
  golang-github-dimchansky-utfbom-dev golang-github-docker-go-connections-dev
  golang-github-docker-go-units-dev golang-github-docopt-docopt-go-dev
  golang-github-elazarl-go-bindata-assetfs-dev golang-github-fatih-color-dev
  golang-github-garyburd-redigo-dev golang-github-ghodss-yaml-dev
  golang-github-go-ini-ini-dev golang-github-go-kit-kit-dev
  golang-github-go-logfmt-logfmt-dev golang-github-go-stack-stack-dev
  golang-github-go-test-deep-dev golang-github-gogo-googleapis-dev
  golang-github-gogo-protobuf-dev golang-github-golang-mock-dev
  golang-github-golang-snappy-dev golang-github-google-btree-dev
  golang-github-google-go-cmp-dev golang-github-google-go-querystring-dev
  golang-github-google-gofuzz-dev golang-github-googleapis-gnostic-dev
  golang-github-gophercloud-gophercloud-dev
  golang-github-gregjones-httpcache-dev golang-github-hashicorp-errwrap-dev
  golang-github-hashicorp-go-checkpoint-dev
  golang-github-hashicorp-go-cleanhttp-dev
  golang-github-hashicorp-go-discover-dev golang-github-hashicorp-go-hclog-dev
  golang-github-hashicorp-go-immutable-radix-dev
  golang-github-hashicorp-go-memdb-dev golang-github-hashicorp-go-msgpack-dev
  golang-github-hashicorp-go-multierror-dev
  golang-github-hashicorp-go-raftchunking-dev
  golang-github-hashicorp-go-reap-dev
  golang-github-hashicorp-go-retryablehttp-dev
  golang-github-hashicorp-go-rootcerts-dev
  golang-github-hashicorp-go-sockaddr-dev
  golang-github-hashicorp-go-syslog-dev golang-github-hashicorp-go-uuid-dev
  golang-github-hashicorp-go-version-dev
  golang-github-hashicorp-golang-lru-dev golang-github-hashicorp-hcl-dev
  golang-github-hashicorp-hil-dev golang-github-hashicorp-logutils-dev
  golang-github-hashicorp-mdns-dev golang-github-hashicorp-memberlist-dev
  golang-github-hashicorp-net-rpc-msgpackrpc-dev
  golang-github-hashicorp-raft-boltdb-dev golang-github-hashicorp-raft-dev
  golang-github-hashicorp-scada-client-dev golang-github-hashicorp-serf-dev
  golang-github-hashicorp-yamux-dev golang-github-imdario-mergo-dev
  golang-github-inconshreveable-muxado-dev golang-github-jeffail-gabs-dev
  golang-github-jefferai-jsonx-dev golang-github-jmespath-go-jmespath-dev
  golang-github-jpillora-backoff-dev golang-github-json-iterator-go-dev
  golang-github-julienschmidt-httprouter-dev golang-github-kr-pretty-dev
  golang-github-kr-pty-dev golang-github-kr-text-dev
  golang-github-mattn-go-colorable-dev golang-github-mattn-go-isatty-dev
  golang-github-miekg-dns-dev golang-github-mitchellh-cli-dev
  golang-github-mitchellh-copystructure-dev
  golang-github-mitchellh-go-homedir-dev
  golang-github-mitchellh-go-testing-interface-dev
  golang-github-mitchellh-hashstructure-dev
  golang-github-mitchellh-mapstructure-dev
  golang-github-mitchellh-reflectwalk-dev
  golang-github-modern-go-concurrent-dev golang-github-modern-go-reflect2-dev
  golang-github-mwitkow-go-conntrack-dev golang-github-nytimes-gziphandler-dev
  golang-github-opencontainers-runc-dev
  golang-github-opencontainers-selinux-dev
  golang-github-opencontainers-specs-dev
  golang-github-opentracing-opentracing-go-dev
  golang-github-packethost-packngo-dev golang-github-pascaldekloe-goe-dev
  golang-github-peterbourgon-diskv-dev golang-github-pkg-errors-dev
  golang-github-pmezard-go-difflib-dev golang-github-posener-complete-dev
  golang-github-prometheus-client-golang-dev
  golang-github-prometheus-client-model-dev
  golang-github-prometheus-common-dev golang-github-ryanuber-columnize-dev
  golang-github-ryanuber-go-glob-dev golang-github-sap-go-hdb-dev
  golang-github-seccomp-libseccomp-golang-dev
  golang-github-shirou-gopsutil-dev golang-github-sirupsen-logrus-dev
  golang-github-spf13-pflag-dev golang-github-stretchr-objx-dev
  golang-github-stretchr-testify-dev golang-github-syndtr-goleveldb-dev
  golang-github-tent-http-link-go-dev golang-github-tv42-httpunix-dev
  golang-github-ugorji-go-codec-dev golang-github-ugorji-go-msgpack-dev
  golang-github-urfave-cli-dev golang-github-vishvananda-netlink-dev
  golang-github-vishvananda-netns-dev golang-github-vmware-govmomi-dev
  golang-github-xeipuuv-gojsonpointer-dev
  golang-github-xeipuuv-gojsonreference-dev
  golang-github-xeipuuv-gojsonschema-dev golang-glog-dev golang-go
  golang-go.opencensus-dev golang-gocapability-dev golang-gogoprotobuf-dev
  golang-golang-x-crypto-dev golang-golang-x-net-dev
  golang-golang-x-oauth2-dev golang-golang-x-oauth2-google-dev
  golang-golang-x-sync-dev golang-golang-x-sys-dev golang-golang-x-text-dev
  golang-golang-x-time-dev golang-golang-x-tools golang-golang-x-tools-dev
  golang-golang-x-xerrors-dev golang-gomega-dev golang-google-api-dev
  golang-google-cloud-compute-metadata-dev golang-google-genproto-dev
  golang-google-grpc-dev golang-gopkg-alecthomas-kingpin.v2-dev
  golang-gopkg-check.v1-dev golang-gopkg-inf.v0-dev golang-gopkg-mgo.v2-dev
  golang-gopkg-square-go-jose.v2-dev golang-gopkg-tomb.v2-dev
  golang-gopkg-vmihailenco-msgpack.v2-dev golang-gopkg-yaml.v2-dev
  golang-goprotobuf-dev golang-procfs-dev golang-protobuf-extensions-dev
  golang-src groff-base intltool-debian iproute2 libarchive-zip-perl libbsd0
  libcroco3 libdebhelper-perl libelf1 libfile-stripnondeterminism-perl
  libglib2.0-0 libicu63 libjs-jquery libjs-jquery-ui libmagic-mgc libmagic1
  libmnl0 libncurses6 libpipeline1 libprocps7 libprotobuf-dev
  libprotobuf-lite17 libprotobuf17 libprotoc17 libsasl2-dev libseccomp-dev
  libsigsegv2 libssl1.1 libsub-override-perl libsystemd-dev libtinfo5 libtool
  libuchardet0 libxml2 libxtables12 lsof m4 man-db mockery openssl pkg-config
  po-debconf procps protobuf-compiler sbuild-build-depends-consul-dummy
  sensible-utils zlib1g-dev
The following packages will be upgraded:
  libseccomp2 libsystemd0
2 upgraded, 235 newly installed, 0 to remove and 32 not upgraded.
Need to get 158 MB of archives.
After this operation, 985 MB of additional disk space will be used.
Get:1 copy:/<<BUILDDIR>>/resolver-mFPm5i/apt_archive ./ sbuild-build-depends-consul-dummy 0.invalid.0 [1704 B]
Get:2 http://172.17.0.1/private bullseye-staging/main armhf libsystemd0 armhf 243-8+rpi1 [310 kB]
Get:3 http://172.17.0.1/private bullseye-staging/main armhf libbsd0 armhf 0.10.0-1 [112 kB]
Get:4 http://172.17.0.1/private bullseye-staging/main armhf libtinfo5 armhf 6.1+20191019-1 [316 kB]
Get:5 http://172.17.0.1/private bullseye-staging/main armhf bsdmainutils armhf 11.1.2 [182 kB]
Get:6 http://172.17.0.1/private bullseye-staging/main armhf libuchardet0 armhf 0.0.6-3 [62.2 kB]
Get:7 http://172.17.0.1/private bullseye-staging/main armhf groff-base armhf 1.22.4-3 [782 kB]
Get:8 http://172.17.0.1/private bullseye-staging/main armhf libpipeline1 armhf 1.5.1-2 [26.6 kB]
Get:9 http://172.17.0.1/private bullseye-staging/main armhf libseccomp2 armhf 2.4.2-2+rpi1 [36.1 kB]
Get:10 http://172.17.0.1/private bullseye-staging/main armhf man-db armhf 2.9.0-1 [1261 kB]
Get:11 http://172.17.0.1/private bullseye-staging/main armhf golang-github-davecgh-go-spew-dev all 1.1.1-2 [29.7 kB]
Get:12 http://172.17.0.1/private bullseye-staging/main armhf golang-github-pmezard-go-difflib-dev all 1.0.0-2 [12.0 kB]
Get:13 http://172.17.0.1/private bullseye-staging/main armhf golang-github-stretchr-objx-dev all 0.1.1+git20180825.ef50b0d-1 [23.4 kB]
Get:14 http://172.17.0.1/private bullseye-staging/main armhf golang-github-kr-pty-dev all 1.1.6-1 [10.6 kB]
Get:15 http://172.17.0.1/private bullseye-staging/main armhf golang-github-kr-text-dev all 0.1.0-1 [10.8 kB]
Get:16 http://172.17.0.1/private bullseye-staging/main armhf golang-github-kr-pretty-dev all 0.1.0-1 [10.2 kB]
Get:17 http://172.17.0.1/private bullseye-staging/main armhf golang-gopkg-check.v1-dev all 0.0+git20180628.788fd78-1 [31.6 kB]
Get:18 http://172.17.0.1/private bullseye-staging/main armhf golang-gopkg-yaml.v2-dev all 2.2.2-1 [58.9 kB]
Get:19 http://172.17.0.1/private bullseye-staging/main armhf golang-github-stretchr-testify-dev all 1.4.0+ds-1 [53.5 kB]
Get:20 http://172.17.0.1/private bullseye-staging/main armhf golang-golang-x-sys-dev all 0.0~git20190726.fc99dfb-1 [395 kB]
Get:21 http://172.17.0.1/private bullseye-staging/main armhf golang-github-sirupsen-logrus-dev all 1.4.2-1 [41.2 kB]
Get:22 http://172.17.0.1/private bullseye-staging/main armhf libelf1 armhf 0.176-1.1 [158 kB]
Get:23 http://172.17.0.1/private bullseye-staging/main armhf libmnl0 armhf 1.0.4-2 [11.3 kB]
Get:24 http://172.17.0.1/private bullseye-staging/main armhf libxtables12 armhf 1.8.3-2 [77.3 kB]
Get:25 http://172.17.0.1/private bullseye-staging/main armhf iproute2 armhf 5.4.0-1 [762 kB]
Get:26 http://172.17.0.1/private bullseye-staging/main armhf libncurses6 armhf 6.1+20191019-1 [79.5 kB]
Get:27 http://172.17.0.1/private bullseye-staging/main armhf libprocps7 armhf 2:3.3.15-2 [58.9 kB]
Get:28 http://172.17.0.1/private bullseye-staging/main armhf procps armhf 2:3.3.15-2 [235 kB]
Get:29 http://172.17.0.1/private bullseye-staging/main armhf sensible-utils all 0.0.12+nmu1 [16.0 kB]
Get:30 http://172.17.0.1/private bullseye-staging/main armhf bash-completion all 1:2.8-6 [208 kB]
Get:31 http://172.17.0.1/private bullseye-staging/main armhf libmagic-mgc armhf 1:5.37-6 [253 kB]
Get:32 http://172.17.0.1/private bullseye-staging/main armhf libmagic1 armhf 1:5.37-6 [111 kB]
Get:33 http://172.17.0.1/private bullseye-staging/main armhf file armhf 1:5.37-6 [66.2 kB]
Get:34 http://172.17.0.1/private bullseye-staging/main armhf gettext-base armhf 0.19.8.1-10 [117 kB]
Get:35 http://172.17.0.1/private bullseye-staging/main armhf lsof armhf 4.93.2+dfsg-1 [307 kB]
Get:36 http://172.17.0.1/private bullseye-staging/main armhf libsigsegv2 armhf 2.12-2 [32.3 kB]
Get:37 http://172.17.0.1/private bullseye-staging/main armhf m4 armhf 1.4.18-4 [185 kB]
Get:38 http://172.17.0.1/private bullseye-staging/main armhf autoconf all 2.69-11 [341 kB]
Get:39 http://172.17.0.1/private bullseye-staging/main armhf autotools-dev all 20180224.1 [77.0 kB]
Get:40 http://172.17.0.1/private bullseye-staging/main armhf automake all 1:1.16.1-4 [771 kB]
Get:41 http://172.17.0.1/private bullseye-staging/main armhf autopoint all 0.19.8.1-10 [435 kB]
Get:42 http://172.17.0.1/private bullseye-staging/main armhf libssl1.1 armhf 1.1.1d-2 [1268 kB]
Get:43 http://172.17.0.1/private bullseye-staging/main armhf openssl armhf 1.1.1d-2 [806 kB]
Get:44 http://172.17.0.1/private bullseye-staging/main armhf ca-certificates all 20190110 [157 kB]
Get:45 http://172.17.0.1/private bullseye-staging/main armhf libtool all 2.4.6-11 [547 kB]
Get:46 http://172.17.0.1/private bullseye-staging/main armhf dh-autoreconf all 19 [16.9 kB]
Get:47 http://172.17.0.1/private bullseye-staging/main armhf libdebhelper-perl all 12.7.1 [173 kB]
Get:48 http://172.17.0.1/private bullseye-staging/main armhf libarchive-zip-perl all 1.67-1 [104 kB]
Get:49 http://172.17.0.1/private bullseye-staging/main armhf libsub-override-perl all 0.09-2 [10.2 kB]
Get:50 http://172.17.0.1/private bullseye-staging/main armhf libfile-stripnondeterminism-perl all 1.6.3-1 [23.6 kB]
Get:51 http://172.17.0.1/private bullseye-staging/main armhf dh-strip-nondeterminism all 1.6.3-1 [14.6 kB]
Get:52 http://172.17.0.1/private bullseye-staging/main armhf dwz armhf 0.13-4 [140 kB]
Get:53 http://172.17.0.1/private bullseye-staging/main armhf libglib2.0-0 armhf 2.62.3-2 [1137 kB]
Get:54 http://172.17.0.1/private bullseye-staging/main armhf libicu63 armhf 63.2-2 [7974 kB]
Get:55 http://172.17.0.1/private bullseye-staging/main armhf libxml2 armhf 2.9.4+dfsg1-8 [593 kB]
Get:56 http://172.17.0.1/private bullseye-staging/main armhf libcroco3 armhf 0.6.13-1 [133 kB]
Get:57 http://172.17.0.1/private bullseye-staging/main armhf gettext armhf 0.19.8.1-10 [1219 kB]
Get:58 http://172.17.0.1/private bullseye-staging/main armhf intltool-debian all 0.35.0+20060710.5 [26.8 kB]
Get:59 http://172.17.0.1/private bullseye-staging/main armhf po-debconf all 1.0.21 [248 kB]
Get:60 http://172.17.0.1/private bullseye-staging/main armhf debhelper all 12.7.1 [997 kB]
Get:61 http://172.17.0.1/private bullseye-staging/main armhf dh-golang all 1.43 [22.4 kB]
Get:62 http://172.17.0.1/private bullseye-staging/main armhf golang-github-gogo-protobuf-dev all 1.2.1+git20190611.dadb6258-1 [863 kB]
Get:63 http://172.17.0.1/private bullseye-staging/main armhf libprotobuf17 armhf 3.6.1.3-2+rpi1 [665 kB]
Get:64 http://172.17.0.1/private bullseye-staging/main armhf libprotoc17 armhf 3.6.1.3-2+rpi1 [546 kB]
Get:65 http://172.17.0.1/private bullseye-staging/main armhf protobuf-compiler armhf 3.6.1.3-2+rpi1 [64.5 kB]
Get:66 http://172.17.0.1/private bullseye-staging/main armhf gogoprotobuf armhf 1.2.1+git20190611.dadb6258-1 [5285 kB]
Get:67 http://172.17.0.1/private bullseye-staging/main armhf golang-1.13-src armhf 1.13.4-1+rpi1 [12.7 MB]
Get:68 http://172.17.0.1/private bullseye-staging/main armhf golang-1.13-go armhf 1.13.4-1+rpi1 [43.5 MB]
Get:69 http://172.17.0.1/private bullseye-staging/main armhf golang-src armhf 2:1.13~1+b11 [4892 B]
Get:70 http://172.17.0.1/private bullseye-staging/main armhf golang-go armhf 2:1.13~1+b11 [23.9 kB]
Get:71 http://172.17.0.1/private bullseye-staging/main armhf golang-any armhf 2:1.13~1+b11 [5012 B]
Get:72 http://172.17.0.1/private bullseye-staging/main armhf golang-dbus-dev all 5.0.2-1 [54.7 kB]
Get:73 http://172.17.0.1/private bullseye-staging/main armhf golang-github-alecthomas-units-dev all 0.0~git20151022.0.2efee85-4 [5816 B]
Get:74 http://172.17.0.1/private bullseye-staging/main armhf golang-github-armon-circbuf-dev all 0.0~git20150827.0.bbbad09-2 [3952 B]
Get:75 http://172.17.0.1/private bullseye-staging/main armhf golang-github-pkg-errors-dev all 0.8.1-1 [11.2 kB]
Get:76 http://172.17.0.1/private bullseye-staging/main armhf golang-github-circonus-labs-circonusllhist-dev all 0.0~git20160526.0.d724266-2 [6974 B]
Get:77 http://172.17.0.1/private bullseye-staging/main armhf golang-github-hashicorp-go-cleanhttp-dev all 0.5.1-1 [10.4 kB]
Get:78 http://172.17.0.1/private bullseye-staging/main armhf golang-github-mattn-go-isatty-dev all 0.0.8-2 [5864 B]
Get:79 http://172.17.0.1/private bullseye-staging/main armhf golang-github-mattn-go-colorable-dev all 0.0.9-3 [7960 B]
Get:80 http://172.17.0.1/private bullseye-staging/main armhf golang-github-fatih-color-dev all 1.5.0-1 [11.1 kB]
Get:81 http://172.17.0.1/private bullseye-staging/main armhf golang-github-hashicorp-go-hclog-dev all 0.10.0-1 [17.7 kB]
Get:82 http://172.17.0.1/private bullseye-staging/main armhf golang-github-hashicorp-go-retryablehttp-dev all 0.6.4-1 [17.3 kB]
Get:83 http://172.17.0.1/private bullseye-staging/main armhf golang-github-tv42-httpunix-dev all 0.0~git20150427.b75d861-2 [3744 B]
Get:84 http://172.17.0.1/private bullseye-staging/main armhf golang-github-circonus-labs-circonus-gometrics-dev all 2.3.1-2 [64.4 kB]
Get:85 http://172.17.0.1/private bullseye-staging/main armhf golang-github-datadog-datadog-go-dev all 2.1.0-2 [14.7 kB]
Get:86 http://172.17.0.1/private bullseye-staging/main armhf golang-github-hashicorp-go-uuid-dev all 1.0.1-1 [8476 B]
Get:87 http://172.17.0.1/private bullseye-staging/main armhf golang-github-hashicorp-golang-lru-dev all 0.5.0-1 [14.0 kB]
Get:88 http://172.17.0.1/private bullseye-staging/main armhf golang-github-hashicorp-go-immutable-radix-dev all 1.1.0-1 [22.8 kB]
Get:89 http://172.17.0.1/private bullseye-staging/main armhf golang-github-pascaldekloe-goe-dev all 0.1.0-2 [21.7 kB]
Get:90 http://172.17.0.1/private bullseye-staging/main armhf golang-github-beorn7-perks-dev all 0.0~git20160804.0.4c0e845-1 [11.6 kB]
Get:91 http://172.17.0.1/private bullseye-staging/main armhf golang-github-cespare-xxhash-dev all 2.1.0-1 [8696 B]
Get:92 http://172.17.0.1/private bullseye-staging/main armhf golang-github-google-gofuzz-dev all 0.0~git20170612.24818f7-1 [9108 B]
Get:93 http://172.17.0.1/private bullseye-staging/main armhf golang-github-modern-go-concurrent-dev all 1.0.3-1 [4520 B]
Get:94 http://172.17.0.1/private bullseye-staging/main armhf golang-github-modern-go-reflect2-dev all 1.0.0-1 [14.4 kB]
Get:95 http://172.17.0.1/private bullseye-staging/main armhf golang-github-json-iterator-go-dev all 1.1.4-1 [62.6 kB]
Get:96 http://172.17.0.1/private bullseye-staging/main armhf zlib1g-dev armhf 1:1.2.11.dfsg-1 [206 kB]
Get:97 http://172.17.0.1/private bullseye-staging/main armhf libprotobuf-lite17 armhf 3.6.1.3-2+rpi1 [147 kB]
Get:98 http://172.17.0.1/private bullseye-staging/main armhf libprotobuf-dev armhf 3.6.1.3-2+rpi1 [1001 kB]
Get:99 http://172.17.0.1/private bullseye-staging/main armhf golang-goprotobuf-dev armhf 1.3.2-2 [1369 kB]
Get:100 http://172.17.0.1/private bullseye-staging/main armhf golang-github-prometheus-client-model-dev all 0.0.2+git20171117.99fa1f4-1 [19.3 kB]
Get:101 http://172.17.0.1/private bullseye-staging/main armhf golang-github-dgrijalva-jwt-go-v3-dev all 3.2.0-2 [32.4 kB]
Get:102 http://172.17.0.1/private bullseye-staging/main armhf golang-github-go-logfmt-logfmt-dev all 0.3.0-1 [12.5 kB]
Get:103 http://172.17.0.1/private bullseye-staging/main armhf golang-github-go-stack-stack-dev all 1.5.2-2 [6956 B]
Get:104 http://172.17.0.1/private bullseye-staging/main armhf golang-golang-x-sync-dev all 0.0~git20190423.1122301-1 [17.1 kB]
Get:105 http://172.17.0.1/private bullseye-staging/main armhf golang-golang-x-xerrors-dev all 0.0~git20190717.a985d34-1 [12.8 kB]
Get:106 http://172.17.0.1/private bullseye-staging/main armhf golang-golang-x-tools-dev all 1:0.0~git20191118.07fc4c7+ds-1 [1396 kB]
Get:107 http://172.17.0.1/private bullseye-staging/main armhf golang-golang-x-text-dev all 0.3.2-1 [3689 kB]
Get:108 http://172.17.0.1/private bullseye-staging/main armhf golang-golang-x-net-dev all 1:0.0+git20191112.2180aed+dfsg-1 [637 kB]
Get:109 http://172.17.0.1/private bullseye-staging/main armhf golang-github-opentracing-opentracing-go-dev all 1.0.2-1 [21.8 kB]
Get:110 http://172.17.0.1/private bullseye-staging/main armhf golang-golang-x-time-dev all 0.0~git20161028.0.f51c127-2 [9396 B]
Get:111 http://172.17.0.1/private bullseye-staging/main armhf golang-github-golang-mock-dev all 1.3.1-2 [35.1 kB]
Get:112 http://172.17.0.1/private bullseye-staging/main armhf golang-github-google-go-cmp-dev all 0.3.1-1 [65.2 kB]
Get:113 http://172.17.0.1/private bullseye-staging/main armhf golang-glog-dev all 0.0~git20160126.23def4e-3 [17.3 kB]
Get:114 http://172.17.0.1/private bullseye-staging/main armhf golang-golang-x-oauth2-dev all 0.0~git20190604.0f29369-2 [31.9 kB]
Get:115 http://172.17.0.1/private bullseye-staging/main armhf golang-google-cloud-compute-metadata-dev all 0.43.0-1 [31.1 kB]
Get:116 http://172.17.0.1/private bullseye-staging/main armhf golang-golang-x-oauth2-google-dev all 0.0~git20190604.0f29369-2 [13.2 kB]
Get:117 http://172.17.0.1/private bullseye-staging/main armhf golang-google-genproto-dev all 0.0~git20190801.fa694d8-2 [2897 kB]
Get:118 http://172.17.0.1/private bullseye-staging/main armhf golang-google-grpc-dev all 1.22.1-1 [493 kB]
Get:119 http://172.17.0.1/private bullseye-staging/main armhf golang-github-go-kit-kit-dev all 0.6.0-2 [103 kB]
Get:120 http://172.17.0.1/private bullseye-staging/main armhf golang-github-julienschmidt-httprouter-dev all 1.1-5 [16.0 kB]
Get:121 http://172.17.0.1/private bullseye-staging/main armhf golang-github-jpillora-backoff-dev all 1.0.0-1 [3580 B]
Get:122 http://172.17.0.1/private bullseye-staging/main armhf golang-github-mwitkow-go-conntrack-dev all 0.0~git20190716.2f06839-1 [14.4 kB]
Get:123 http://172.17.0.1/private bullseye-staging/main armhf golang-gopkg-alecthomas-kingpin.v2-dev all 2.2.6-1 [42.2 kB]
Get:124 http://172.17.0.1/private bullseye-staging/main armhf golang-protobuf-extensions-dev all 1.0.1-1 [29.6 kB]
Get:125 http://172.17.0.1/private bullseye-staging/main armhf golang-github-prometheus-common-dev all 0.7.0-1 [83.8 kB]
Get:126 http://172.17.0.1/private bullseye-staging/main armhf golang-procfs-dev all 0.0.3-1 [78.0 kB]
Get:127 http://172.17.0.1/private bullseye-staging/main armhf golang-github-prometheus-client-golang-dev all 1.2.1-3 [106 kB]
Get:128 http://172.17.0.1/private bullseye-staging/main armhf golang-github-armon-go-metrics-dev all 0.0~git20190430.ec5e00d-1 [25.9 kB]
Get:129 http://172.17.0.1/private bullseye-staging/main armhf golang-github-armon-go-radix-dev all 1.0.0-1 [7420 B]
Get:130 http://172.17.0.1/private bullseye-staging/main armhf golang-github-asaskevich-govalidator-dev all 9+git20180720.0.f9ffefc3-1 [41.2 kB]
Get:131 http://172.17.0.1/private bullseye-staging/main armhf golang-github-go-ini-ini-dev all 1.32.0-2 [32.7 kB]
Get:132 http://172.17.0.1/private bullseye-staging/main armhf golang-github-jmespath-go-jmespath-dev all 0.2.2-3 [18.7 kB]
Get:133 http://172.17.0.1/private bullseye-staging/main armhf golang-github-aws-aws-sdk-go-dev all 1.21.6+dfsg-2 [4969 kB]
Get:134 http://172.17.0.1/private bullseye-staging/main armhf golang-github-dgrijalva-jwt-go-dev all 3.2.0-1 [32.5 kB]
Get:135 http://172.17.0.1/private bullseye-staging/main armhf golang-github-dimchansky-utfbom-dev all 0.0~git20170328.6c6132f-1 [4712 B]
Get:136 http://172.17.0.1/private bullseye-staging/main armhf golang-github-mitchellh-go-homedir-dev all 1.1.0-1 [5168 B]
Get:137 http://172.17.0.1/private bullseye-staging/main armhf golang-golang-x-crypto-dev all 1:0.0~git20190701.4def268-2 [1505 kB]
Get:138 http://172.17.0.1/private bullseye-staging/main armhf golang-github-azure-go-autorest-dev all 10.15.5-1 [99.2 kB]
Get:139 http://172.17.0.1/private bullseye-staging/main armhf golang-github-bgentry-speakeasy-dev all 0.1.0-1 [5110 B]
Get:140 http://172.17.0.1/private bullseye-staging/main armhf golang-github-boltdb-bolt-dev all 1.3.1-6 [60.6 kB]
Get:141 http://172.17.0.1/private bullseye-staging/main armhf golang-github-bradfitz-gomemcache-dev all 0.0~git20141109-3 [10.3 kB]
Get:142 http://172.17.0.1/private bullseye-staging/main armhf golang-github-coreos-pkg-dev all 4-2 [25.1 kB]
Get:143 http://172.17.0.1/private bullseye-staging/main armhf libsystemd-dev armhf 243-8+rpi1 [331 kB]
Get:144 http://172.17.0.1/private bullseye-staging/main armhf pkg-config armhf 0.29-6 [59.8 kB]
Get:145 http://172.17.0.1/private bullseye-staging/main armhf golang-github-coreos-go-systemd-dev all 20-1 [50.7 kB]
Get:146 http://172.17.0.1/private bullseye-staging/main armhf golang-github-cyphar-filepath-securejoin-dev all 0.2.2-1 [7196 B]
Get:147 http://172.17.0.1/private bullseye-staging/main armhf golang-github-google-go-querystring-dev all 1.0.0-1 [7456 B]
Get:148 http://172.17.0.1/private bullseye-staging/main armhf golang-github-tent-http-link-go-dev all 0.0~git20130702.0.ac974c6-6 [5016 B]
Get:149 http://172.17.0.1/private bullseye-staging/main armhf golang-github-digitalocean-godo-dev all 1.1.0-1 [42.6 kB]
Get:150 http://172.17.0.1/private bullseye-staging/main armhf golang-github-docker-go-units-dev all 0.4.0-1 [7536 B]
Get:151 http://172.17.0.1/private bullseye-staging/main armhf golang-github-opencontainers-selinux-dev all 1.3.0-2 [13.3 kB]
Get:152 http://172.17.0.1/private bullseye-staging/main armhf golang-github-xeipuuv-gojsonpointer-dev all 0.0~git20151027.0.e0fe6f6-2 [4620 B]
Get:153 http://172.17.0.1/private bullseye-staging/main armhf golang-github-xeipuuv-gojsonreference-dev all 0.0~git20150808.0.e02fc20-2 [4592 B]
Get:154 http://172.17.0.1/private bullseye-staging/main armhf golang-github-xeipuuv-gojsonschema-dev all 0.0~git20170210.0.6b67b3f-2 [25.3 kB]
Get:155 http://172.17.0.1/private bullseye-staging/main armhf golang-github-opencontainers-specs-dev all 1.0.1+git20190408.a1b50f6-1 [27.7 kB]
Get:156 http://172.17.0.1/private bullseye-staging/main armhf libseccomp-dev armhf 2.4.2-2+rpi1 [69.5 kB]
Get:157 http://172.17.0.1/private bullseye-staging/main armhf golang-github-seccomp-libseccomp-golang-dev all 0.9.1-1 [16.1 kB]
Get:158 http://172.17.0.1/private bullseye-staging/main armhf golang-github-urfave-cli-dev all 1.20.0-1 [51.0 kB]
Get:159 http://172.17.0.1/private bullseye-staging/main armhf golang-github-vishvananda-netns-dev all 0.0~git20170707.0.86bef33-1 [5646 B]
Get:160 http://172.17.0.1/private bullseye-staging/main armhf golang-github-vishvananda-netlink-dev all 1.0.0+git20181030.023a6da-1 [106 kB]
Get:161 http://172.17.0.1/private bullseye-staging/main armhf golang-gocapability-dev all 0.0+git20180916.d983527-1 [11.8 kB]
Get:162 http://172.17.0.1/private bullseye-staging/main armhf golang-github-opencontainers-runc-dev all 1.0.0~rc9+dfsg1-1+rpi1 [178 kB]
Get:163 http://172.17.0.1/private bullseye-staging/main armhf golang-github-docker-go-connections-dev all 0.4.0-1 [26.3 kB]
Get:164 http://172.17.0.1/private bullseye-staging/main armhf golang-github-elazarl-go-bindata-assetfs-dev all 1.0.0-1 [5460 B]
Get:165 http://172.17.0.1/private bullseye-staging/main armhf golang-github-garyburd-redigo-dev all 0.0~git20150901.0.d8dbe4d-2 [28.0 kB]
Get:166 http://172.17.0.1/private bullseye-staging/main armhf golang-github-ghodss-yaml-dev all 1.0.0-1 [12.9 kB]
Get:167 http://172.17.0.1/private bullseye-staging/main armhf golang-github-go-test-deep-dev all 1.0.3-1 [9876 B]
Get:168 http://172.17.0.1/private bullseye-staging/main armhf golang-gogoprotobuf-dev all 1.2.1+git20190611.dadb6258-1 [5340 B]
Get:169 http://172.17.0.1/private bullseye-staging/main armhf golang-github-gogo-googleapis-dev all 1.2.0-1 [30.4 kB]
Get:170 http://172.17.0.1/private bullseye-staging/main armhf golang-github-golang-snappy-dev all 0.0+git20160529.d9eb7a3-3 [51.2 kB]
Get:171 http://172.17.0.1/private bullseye-staging/main armhf golang-github-google-btree-dev all 1.0.0-1 [13.2 kB]
Get:172 http://172.17.0.1/private bullseye-staging/main armhf golang-github-docopt-docopt-go-dev all 0.6.2+git20160216.0.784ddc5-1 [9434 B]
Get:173 http://172.17.0.1/private bullseye-staging/main armhf golang-github-googleapis-gnostic-dev all 0.2.0-1 [74.4 kB]
Get:174 http://172.17.0.1/private bullseye-staging/main armhf golang-github-peterbourgon-diskv-dev all 3.0.0-1 [18.8 kB]
Get:175 http://172.17.0.1/private bullseye-staging/main armhf golang-gomega-dev all 1.0+git20160910.d59fa0a-1 [63.7 kB]
Get:176 http://172.17.0.1/private bullseye-staging/main armhf golang-ginkgo-dev armhf 1.2.0+git20161006.acfa16a-1 [1535 kB]
Get:177 http://172.17.0.1/private bullseye-staging/main armhf golang-github-syndtr-goleveldb-dev all 0.0~git20170725.0.b89cc31-2 [116 kB]
Get:178 http://172.17.0.1/private bullseye-staging/main armhf golang-github-gregjones-httpcache-dev all 0.0~git20180305.9cad4c3-1 [13.6 kB]
Get:179 http://172.17.0.1/private bullseye-staging/main armhf golang-github-hashicorp-errwrap-dev all 1.0.0-1 [10.3 kB]
Get:180 http://172.17.0.1/private bullseye-staging/main armhf golang-github-hashicorp-go-checkpoint-dev all 0.0~git20171009.1545e56-2 [8184 B]
Get:181 http://172.17.0.1/private bullseye-staging/main armhf golang-github-denverdino-aliyungo-dev all 0.0~git20180921.13fa8aa-2 [125 kB]
Get:182 http://172.17.0.1/private bullseye-staging/main armhf golang-github-gophercloud-gophercloud-dev all 0.6.0-1 [570 kB]
Get:183 http://172.17.0.1/private bullseye-staging/main armhf golang-github-hashicorp-go-multierror-dev all 1.0.0-1 [10.6 kB]
Get:184 http://172.17.0.1/private bullseye-staging/main armhf golang-github-miekg-dns-dev all 1.0.4+ds-1 [126 kB]
Get:185 http://172.17.0.1/private bullseye-staging/main armhf golang-github-hashicorp-mdns-dev all 1.0.1-1 [11.9 kB]
Get:186 http://172.17.0.1/private bullseye-staging/main armhf golang-github-packethost-packngo-dev all 0.2.0-2 [40.7 kB]
Get:187 http://172.17.0.1/private bullseye-staging/main armhf golang-github-vmware-govmomi-dev all 0.15.0-1 [10.2 MB]
Get:188 http://172.17.0.1/private bullseye-staging/main armhf golang-go.opencensus-dev all 0.22.0-1 [120 kB]
Get:189 http://172.17.0.1/private bullseye-staging/main armhf golang-google-api-dev all 0.7.0-2 [2971 kB]
Get:190 http://172.17.0.1/private bullseye-staging/main armhf golang-github-hashicorp-go-discover-dev all 0.0+git20190905.34a6505-2 [26.7 kB]
Get:191 http://172.17.0.1/private bullseye-staging/main armhf golang-github-hashicorp-go-memdb-dev all 0.0~git20180224.1289e7ff-1 [27.1 kB]
Get:192 http://172.17.0.1/private bullseye-staging/main armhf golang-github-ugorji-go-msgpack-dev all 0.0~git20130605.792643-5 [20.7 kB]
Get:193 http://172.17.0.1/private bullseye-staging/main armhf golang-github-ugorji-go-codec-dev all 1.1.7-1 [201 kB]
Get:194 http://172.17.0.1/private bullseye-staging/main armhf golang-gopkg-vmihailenco-msgpack.v2-dev all 3.3.3-1 [24.4 kB]
Get:195 http://172.17.0.1/private bullseye-staging/main armhf golang-gopkg-tomb.v2-dev all 0.0~git20161208.d5d1b58-3 [6840 B]
Get:196 http://172.17.0.1/private bullseye-staging/main armhf libsasl2-dev armhf 2.1.27+dfsg-1+b1 [255 kB]
Get:197 http://172.17.0.1/private bullseye-staging/main armhf golang-gopkg-mgo.v2-dev all 2016.08.01-6 [316 kB]
Get:198 http://172.17.0.1/private bullseye-staging/main armhf golang-github-hashicorp-go-msgpack-dev all 0.5.5-1 [43.3 kB]
Get:199 http://172.17.0.1/private bullseye-staging/main armhf golang-github-hashicorp-raft-dev all 1.1.1-2 [88.5 kB]
Get:200 http://172.17.0.1/private bullseye-staging/main armhf libjs-jquery all 3.3.1~dfsg-3 [332 kB]
Get:201 http://172.17.0.1/private bullseye-staging/main armhf libjs-jquery-ui all 1.12.1+dfsg-5 [232 kB]
Get:202 http://172.17.0.1/private bullseye-staging/main armhf golang-golang-x-tools armhf 1:0.0~git20191118.07fc4c7+ds-1 [28.9 MB]
Get:203 http://172.17.0.1/private bullseye-staging/main armhf golang-github-mitchellh-reflectwalk-dev all 0.0~git20170726.63d60e9-4 [7868 B]
Get:204 http://172.17.0.1/private bullseye-staging/main armhf golang-github-mitchellh-copystructure-dev all 0.0~git20161013.0.5af94ae-2 [8704 B]
Get:205 http://172.17.0.1/private bullseye-staging/main armhf golang-github-hashicorp-go-raftchunking-dev all 0.6.2-2 [12.3 kB]
Get:206 http://172.17.0.1/private bullseye-staging/main armhf golang-github-hashicorp-go-reap-dev all 0.0~git20160113.0.2d85522-3 [9334 B]
Get:207 http://172.17.0.1/private bullseye-staging/main armhf golang-github-hashicorp-go-sockaddr-dev all 0.0~git20170627.41949a1+ds-2 [62.7 kB]
Get:208 http://172.17.0.1/private bullseye-staging/main armhf golang-github-hashicorp-go-version-dev all 1.2.0-1 [13.8 kB]
Get:209 http://172.17.0.1/private bullseye-staging/main armhf golang-github-hashicorp-hcl-dev all 1.0.0-1 [58.5 kB]
Get:210 http://172.17.0.1/private bullseye-staging/main armhf golang-github-mitchellh-mapstructure-dev all 1.1.2-1 [21.1 kB]
Get:211 http://172.17.0.1/private bullseye-staging/main armhf golang-github-hashicorp-hil-dev all 0.0~git20160711.1e86c6b-1 [32.6 kB]
Get:212 http://172.17.0.1/private bullseye-staging/main armhf golang-github-hashicorp-memberlist-dev all 0.1.5-2 [74.8 kB]
Get:213 http://172.17.0.1/private bullseye-staging/main armhf golang-github-hashicorp-raft-boltdb-dev all 0.0~git20171010.6e5ba93-3 [11.1 kB]
Get:214 http://172.17.0.1/private bullseye-staging/main armhf golang-github-hashicorp-net-rpc-msgpackrpc-dev all 0.0~git20151116.0.a14192a-1 [4168 B]
Get:215 http://172.17.0.1/private bullseye-staging/main armhf golang-github-hashicorp-yamux-dev all 0.0+git20190923.df201c7-1 [22.0 kB]
Get:216 http://172.17.0.1/private bullseye-staging/main armhf golang-github-hashicorp-scada-client-dev all 0.0~git20160601.0.6e89678-2 [19.3 kB]
Get:217 http://172.17.0.1/private bullseye-staging/main armhf golang-github-hashicorp-go-syslog-dev all 0.0~git20150218.0.42a2b57-1 [5336 B]
Get:218 http://172.17.0.1/private bullseye-staging/main armhf golang-github-hashicorp-logutils-dev all 0.0~git20150609.0.0dc08b1-1 [8150 B]
Get:219 http://172.17.0.1/private bullseye-staging/main armhf golang-github-posener-complete-dev all 1.1+git20180108.57878c9-3 [17.6 kB]
Get:220 http://172.17.0.1/private bullseye-staging/main armhf golang-github-mitchellh-cli-dev all 1.0.0-1 [23.8 kB]
Get:221 http://172.17.0.1/private bullseye-staging/main armhf golang-github-ryanuber-columnize-dev all 2.1.1-1 [6600 B]
Get:222 http://172.17.0.1/private bullseye-staging/main armhf golang-github-hashicorp-serf-dev all 0.8.5~ds1-1 [127 kB]
Get:223 http://172.17.0.1/private bullseye-staging/main armhf golang-github-imdario-mergo-dev all 0.3.5-1 [16.4 kB]
Get:224 http://172.17.0.1/private bullseye-staging/main armhf golang-github-inconshreveable-muxado-dev all 0.0~git20140312.0.f693c7e-2 [26.5 kB]
Get:225 http://172.17.0.1/private bullseye-staging/main armhf golang-github-jeffail-gabs-dev all 2.1.0-2 [16.6 kB]
Get:226 http://172.17.0.1/private bullseye-staging/main armhf golang-github-jefferai-jsonx-dev all 1.0.1-2 [4552 B]
Get:227 http://172.17.0.1/private bullseye-staging/main armhf golang-github-mitchellh-go-testing-interface-dev all 1.0.0-1 [4268 B]
Get:228 http://172.17.0.1/private bullseye-staging/main armhf golang-github-mitchellh-hashstructure-dev all 1.0.0-1 [7400 B]
Get:229 http://172.17.0.1/private bullseye-staging/main armhf golang-github-nytimes-gziphandler-dev all 1.1.1-1 [39.9 kB]
Get:230 http://172.17.0.1/private bullseye-staging/main armhf golang-github-ryanuber-go-glob-dev all 1.0.0-2 [4588 B]
Get:231 http://172.17.0.1/private bullseye-staging/main armhf golang-github-sap-go-hdb-dev all 0.14.1-2 [61.9 kB]
Get:232 http://172.17.0.1/private bullseye-staging/main armhf golang-github-shirou-gopsutil-dev all 2.18.06-1 [89.3 kB]
Get:233 http://172.17.0.1/private bullseye-staging/main armhf golang-github-spf13-pflag-dev all 1.0.3-1 [38.0 kB]
Get:234 http://172.17.0.1/private bullseye-staging/main armhf golang-gopkg-inf.v0-dev all 0.9.0-3 [14.0 kB]
Get:235 http://172.17.0.1/private bullseye-staging/main armhf golang-gopkg-square-go-jose.v2-dev all 2.3.1-1 [260 kB]
Get:236 http://172.17.0.1/private bullseye-staging/main armhf mockery armhf 0.0~git20181123.e78b021-2 [1598 kB]
Get:237 http://172.17.0.1/private bullseye-staging/main armhf golang-github-hashicorp-go-rootcerts-dev all 0.0~git20160503.0.6bb64b3-1 [7336 B]
debconf: delaying package configuration, since apt-utils is not installed
Fetched 158 MB in 17s (9216 kB/s)
(Reading database ... 12227 files and directories currently installed.)
Preparing to unpack .../libsystemd0_243-8+rpi1_armhf.deb ...
Unpacking libsystemd0:armhf (243-8+rpi1) over (242-7+rpi1) ...
Setting up libsystemd0:armhf (243-8+rpi1) ...
Selecting previously unselected package libbsd0:armhf.
(Reading database ... 12227 files and directories currently installed.)
Preparing to unpack .../0-libbsd0_0.10.0-1_armhf.deb ...
Unpacking libbsd0:armhf (0.10.0-1) ...
Selecting previously unselected package libtinfo5:armhf.
Preparing to unpack .../1-libtinfo5_6.1+20191019-1_armhf.deb ...
Unpacking libtinfo5:armhf (6.1+20191019-1) ...
Selecting previously unselected package bsdmainutils.
Preparing to unpack .../2-bsdmainutils_11.1.2_armhf.deb ...
Unpacking bsdmainutils (11.1.2) ...
Selecting previously unselected package libuchardet0:armhf.
Preparing to unpack .../3-libuchardet0_0.0.6-3_armhf.deb ...
Unpacking libuchardet0:armhf (0.0.6-3) ...
Selecting previously unselected package groff-base.
Preparing to unpack .../4-groff-base_1.22.4-3_armhf.deb ...
Unpacking groff-base (1.22.4-3) ...
Selecting previously unselected package libpipeline1:armhf.
Preparing to unpack .../5-libpipeline1_1.5.1-2_armhf.deb ...
Unpacking libpipeline1:armhf (1.5.1-2) ...
Preparing to unpack .../6-libseccomp2_2.4.2-2+rpi1_armhf.deb ...
Unpacking libseccomp2:armhf (2.4.2-2+rpi1) over (2.4.1-2+rpi1) ...
Setting up libseccomp2:armhf (2.4.2-2+rpi1) ...
Selecting previously unselected package man-db.
(Reading database ... 12557 files and directories currently installed.)
Preparing to unpack .../000-man-db_2.9.0-1_armhf.deb ...
Unpacking man-db (2.9.0-1) ...
Selecting previously unselected package golang-github-davecgh-go-spew-dev.
Preparing to unpack .../001-golang-github-davecgh-go-spew-dev_1.1.1-2_all.deb ...
Unpacking golang-github-davecgh-go-spew-dev (1.1.1-2) ...
Selecting previously unselected package golang-github-pmezard-go-difflib-dev.
Preparing to unpack .../002-golang-github-pmezard-go-difflib-dev_1.0.0-2_all.deb ...
Unpacking golang-github-pmezard-go-difflib-dev (1.0.0-2) ...
Selecting previously unselected package golang-github-stretchr-objx-dev.
Preparing to unpack .../003-golang-github-stretchr-objx-dev_0.1.1+git20180825.ef50b0d-1_all.deb ...
Unpacking golang-github-stretchr-objx-dev (0.1.1+git20180825.ef50b0d-1) ...
Selecting previously unselected package golang-github-kr-pty-dev.
Preparing to unpack .../004-golang-github-kr-pty-dev_1.1.6-1_all.deb ...
Unpacking golang-github-kr-pty-dev (1.1.6-1) ...
Selecting previously unselected package golang-github-kr-text-dev.
Preparing to unpack .../005-golang-github-kr-text-dev_0.1.0-1_all.deb ...
Unpacking golang-github-kr-text-dev (0.1.0-1) ...
Selecting previously unselected package golang-github-kr-pretty-dev.
Preparing to unpack .../006-golang-github-kr-pretty-dev_0.1.0-1_all.deb ...
Unpacking golang-github-kr-pretty-dev (0.1.0-1) ...
Selecting previously unselected package golang-gopkg-check.v1-dev.
Preparing to unpack .../007-golang-gopkg-check.v1-dev_0.0+git20180628.788fd78-1_all.deb ...
Unpacking golang-gopkg-check.v1-dev (0.0+git20180628.788fd78-1) ...
Selecting previously unselected package golang-gopkg-yaml.v2-dev.
Preparing to unpack .../008-golang-gopkg-yaml.v2-dev_2.2.2-1_all.deb ...
Unpacking golang-gopkg-yaml.v2-dev (2.2.2-1) ...
Selecting previously unselected package golang-github-stretchr-testify-dev.
Preparing to unpack .../009-golang-github-stretchr-testify-dev_1.4.0+ds-1_all.deb ...
Unpacking golang-github-stretchr-testify-dev (1.4.0+ds-1) ...
Selecting previously unselected package golang-golang-x-sys-dev.
Preparing to unpack .../010-golang-golang-x-sys-dev_0.0~git20190726.fc99dfb-1_all.deb ...
Unpacking golang-golang-x-sys-dev (0.0~git20190726.fc99dfb-1) ...
Selecting previously unselected package golang-github-sirupsen-logrus-dev.
Preparing to unpack .../011-golang-github-sirupsen-logrus-dev_1.4.2-1_all.deb ...
Unpacking golang-github-sirupsen-logrus-dev (1.4.2-1) ...
Selecting previously unselected package libelf1:armhf.
Preparing to unpack .../012-libelf1_0.176-1.1_armhf.deb ...
Unpacking libelf1:armhf (0.176-1.1) ...
Selecting previously unselected package libmnl0:armhf.
Preparing to unpack .../013-libmnl0_1.0.4-2_armhf.deb ...
Unpacking libmnl0:armhf (1.0.4-2) ...
Selecting previously unselected package libxtables12:armhf.
Preparing to unpack .../014-libxtables12_1.8.3-2_armhf.deb ...
Unpacking libxtables12:armhf (1.8.3-2) ...
Selecting previously unselected package iproute2.
Preparing to unpack .../015-iproute2_5.4.0-1_armhf.deb ...
Unpacking iproute2 (5.4.0-1) ...
Selecting previously unselected package libncurses6:armhf.
Preparing to unpack .../016-libncurses6_6.1+20191019-1_armhf.deb ...
Unpacking libncurses6:armhf (6.1+20191019-1) ...
Selecting previously unselected package libprocps7:armhf.
Preparing to unpack .../017-libprocps7_2%3a3.3.15-2_armhf.deb ...
Unpacking libprocps7:armhf (2:3.3.15-2) ...
Selecting previously unselected package procps.
Preparing to unpack .../018-procps_2%3a3.3.15-2_armhf.deb ...
Unpacking procps (2:3.3.15-2) ...
Selecting previously unselected package sensible-utils.
Preparing to unpack .../019-sensible-utils_0.0.12+nmu1_all.deb ...
Unpacking sensible-utils (0.0.12+nmu1) ...
Selecting previously unselected package bash-completion.
Preparing to unpack .../020-bash-completion_1%3a2.8-6_all.deb ...
Unpacking bash-completion (1:2.8-6) ...
Selecting previously unselected package libmagic-mgc.
Preparing to unpack .../021-libmagic-mgc_1%3a5.37-6_armhf.deb ...
Unpacking libmagic-mgc (1:5.37-6) ...
Selecting previously unselected package libmagic1:armhf.
Preparing to unpack .../022-libmagic1_1%3a5.37-6_armhf.deb ...
Unpacking libmagic1:armhf (1:5.37-6) ...
Selecting previously unselected package file.
Preparing to unpack .../023-file_1%3a5.37-6_armhf.deb ...
Unpacking file (1:5.37-6) ...
Selecting previously unselected package gettext-base.
Preparing to unpack .../024-gettext-base_0.19.8.1-10_armhf.deb ...
Unpacking gettext-base (0.19.8.1-10) ...
Selecting previously unselected package lsof.
Preparing to unpack .../025-lsof_4.93.2+dfsg-1_armhf.deb ...
Unpacking lsof (4.93.2+dfsg-1) ...
Selecting previously unselected package libsigsegv2:armhf.
Preparing to unpack .../026-libsigsegv2_2.12-2_armhf.deb ...
Unpacking libsigsegv2:armhf (2.12-2) ...
Selecting previously unselected package m4.
Preparing to unpack .../027-m4_1.4.18-4_armhf.deb ...
Unpacking m4 (1.4.18-4) ...
Selecting previously unselected package autoconf.
Preparing to unpack .../028-autoconf_2.69-11_all.deb ...
Unpacking autoconf (2.69-11) ...
Selecting previously unselected package autotools-dev.
Preparing to unpack .../029-autotools-dev_20180224.1_all.deb ...
Unpacking autotools-dev (20180224.1) ...
Selecting previously unselected package automake.
Preparing to unpack .../030-automake_1%3a1.16.1-4_all.deb ...
Unpacking automake (1:1.16.1-4) ...
Selecting previously unselected package autopoint.
Preparing to unpack .../031-autopoint_0.19.8.1-10_all.deb ...
Unpacking autopoint (0.19.8.1-10) ...
Selecting previously unselected package libssl1.1:armhf.
Preparing to unpack .../032-libssl1.1_1.1.1d-2_armhf.deb ...
Unpacking libssl1.1:armhf (1.1.1d-2) ...
Selecting previously unselected package openssl.
Preparing to unpack .../033-openssl_1.1.1d-2_armhf.deb ...
Unpacking openssl (1.1.1d-2) ...
Selecting previously unselected package ca-certificates.
Preparing to unpack .../034-ca-certificates_20190110_all.deb ...
Unpacking ca-certificates (20190110) ...
Selecting previously unselected package libtool.
Preparing to unpack .../035-libtool_2.4.6-11_all.deb ...
Unpacking libtool (2.4.6-11) ...
Selecting previously unselected package dh-autoreconf.
Preparing to unpack .../036-dh-autoreconf_19_all.deb ...
Unpacking dh-autoreconf (19) ...
Selecting previously unselected package libdebhelper-perl.
Preparing to unpack .../037-libdebhelper-perl_12.7.1_all.deb ...
Unpacking libdebhelper-perl (12.7.1) ...
Selecting previously unselected package libarchive-zip-perl.
Preparing to unpack .../038-libarchive-zip-perl_1.67-1_all.deb ...
Unpacking libarchive-zip-perl (1.67-1) ...
Selecting previously unselected package libsub-override-perl.
Preparing to unpack .../039-libsub-override-perl_0.09-2_all.deb ...
Unpacking libsub-override-perl (0.09-2) ...
Selecting previously unselected package libfile-stripnondeterminism-perl.
Preparing to unpack .../040-libfile-stripnondeterminism-perl_1.6.3-1_all.deb ...
Unpacking libfile-stripnondeterminism-perl (1.6.3-1) ...
Selecting previously unselected package dh-strip-nondeterminism.
Preparing to unpack .../041-dh-strip-nondeterminism_1.6.3-1_all.deb ...
Unpacking dh-strip-nondeterminism (1.6.3-1) ...
Selecting previously unselected package dwz.
Preparing to unpack .../042-dwz_0.13-4_armhf.deb ...
Unpacking dwz (0.13-4) ...
Selecting previously unselected package libglib2.0-0:armhf.
Preparing to unpack .../043-libglib2.0-0_2.62.3-2_armhf.deb ...
Unpacking libglib2.0-0:armhf (2.62.3-2) ...
Selecting previously unselected package libicu63:armhf.
Preparing to unpack .../044-libicu63_63.2-2_armhf.deb ...
Unpacking libicu63:armhf (63.2-2) ...
Selecting previously unselected package libxml2:armhf.
Preparing to unpack .../045-libxml2_2.9.4+dfsg1-8_armhf.deb ...
Unpacking libxml2:armhf (2.9.4+dfsg1-8) ...
Selecting previously unselected package libcroco3:armhf.
Preparing to unpack .../046-libcroco3_0.6.13-1_armhf.deb ...
Unpacking libcroco3:armhf (0.6.13-1) ...
Selecting previously unselected package gettext.
Preparing to unpack .../047-gettext_0.19.8.1-10_armhf.deb ...
Unpacking gettext (0.19.8.1-10) ...
Selecting previously unselected package intltool-debian.
Preparing to unpack .../048-intltool-debian_0.35.0+20060710.5_all.deb ...
Unpacking intltool-debian (0.35.0+20060710.5) ...
Selecting previously unselected package po-debconf.
Preparing to unpack .../049-po-debconf_1.0.21_all.deb ...
Unpacking po-debconf (1.0.21) ...
Selecting previously unselected package debhelper.
Preparing to unpack .../050-debhelper_12.7.1_all.deb ...
Unpacking debhelper (12.7.1) ...
Selecting previously unselected package dh-golang.
Preparing to unpack .../051-dh-golang_1.43_all.deb ...
Unpacking dh-golang (1.43) ...
Selecting previously unselected package golang-github-gogo-protobuf-dev.
Preparing to unpack .../052-golang-github-gogo-protobuf-dev_1.2.1+git20190611.dadb6258-1_all.deb ...
Unpacking golang-github-gogo-protobuf-dev (1.2.1+git20190611.dadb6258-1) ...
Selecting previously unselected package libprotobuf17:armhf.
Preparing to unpack .../053-libprotobuf17_3.6.1.3-2+rpi1_armhf.deb ...
Unpacking libprotobuf17:armhf (3.6.1.3-2+rpi1) ...
Selecting previously unselected package libprotoc17:armhf.
Preparing to unpack .../054-libprotoc17_3.6.1.3-2+rpi1_armhf.deb ...
Unpacking libprotoc17:armhf (3.6.1.3-2+rpi1) ...
Selecting previously unselected package protobuf-compiler.
Preparing to unpack .../055-protobuf-compiler_3.6.1.3-2+rpi1_armhf.deb ...
Unpacking protobuf-compiler (3.6.1.3-2+rpi1) ...
Selecting previously unselected package gogoprotobuf.
Preparing to unpack .../056-gogoprotobuf_1.2.1+git20190611.dadb6258-1_armhf.deb ...
Unpacking gogoprotobuf (1.2.1+git20190611.dadb6258-1) ...
Selecting previously unselected package golang-1.13-src.
Preparing to unpack .../057-golang-1.13-src_1.13.4-1+rpi1_armhf.deb ...
Unpacking golang-1.13-src (1.13.4-1+rpi1) ...
Selecting previously unselected package golang-1.13-go.
Preparing to unpack .../058-golang-1.13-go_1.13.4-1+rpi1_armhf.deb ...
Unpacking golang-1.13-go (1.13.4-1+rpi1) ...
Selecting previously unselected package golang-src.
Preparing to unpack .../059-golang-src_2%3a1.13~1+b11_armhf.deb ...
Unpacking golang-src (2:1.13~1+b11) ...
Selecting previously unselected package golang-go.
Preparing to unpack .../060-golang-go_2%3a1.13~1+b11_armhf.deb ...
Unpacking golang-go (2:1.13~1+b11) ...
Selecting previously unselected package golang-any.
Preparing to unpack .../061-golang-any_2%3a1.13~1+b11_armhf.deb ...
Unpacking golang-any (2:1.13~1+b11) ...
Selecting previously unselected package golang-dbus-dev.
Preparing to unpack .../062-golang-dbus-dev_5.0.2-1_all.deb ...
Unpacking golang-dbus-dev (5.0.2-1) ...
Selecting previously unselected package golang-github-alecthomas-units-dev.
Preparing to unpack .../063-golang-github-alecthomas-units-dev_0.0~git20151022.0.2efee85-4_all.deb ...
Unpacking golang-github-alecthomas-units-dev (0.0~git20151022.0.2efee85-4) ...
Selecting previously unselected package golang-github-armon-circbuf-dev.
Preparing to unpack .../064-golang-github-armon-circbuf-dev_0.0~git20150827.0.bbbad09-2_all.deb ...
Unpacking golang-github-armon-circbuf-dev (0.0~git20150827.0.bbbad09-2) ...
Selecting previously unselected package golang-github-pkg-errors-dev.
Preparing to unpack .../065-golang-github-pkg-errors-dev_0.8.1-1_all.deb ...
Unpacking golang-github-pkg-errors-dev (0.8.1-1) ...
Selecting previously unselected package golang-github-circonus-labs-circonusllhist-dev.
Preparing to unpack .../066-golang-github-circonus-labs-circonusllhist-dev_0.0~git20160526.0.d724266-2_all.deb ...
Unpacking golang-github-circonus-labs-circonusllhist-dev (0.0~git20160526.0.d724266-2) ...
Selecting previously unselected package golang-github-hashicorp-go-cleanhttp-dev.
Preparing to unpack .../067-golang-github-hashicorp-go-cleanhttp-dev_0.5.1-1_all.deb ...
Unpacking golang-github-hashicorp-go-cleanhttp-dev (0.5.1-1) ...
Selecting previously unselected package golang-github-mattn-go-isatty-dev.
Preparing to unpack .../068-golang-github-mattn-go-isatty-dev_0.0.8-2_all.deb ...
Unpacking golang-github-mattn-go-isatty-dev (0.0.8-2) ...
Selecting previously unselected package golang-github-mattn-go-colorable-dev.
Preparing to unpack .../069-golang-github-mattn-go-colorable-dev_0.0.9-3_all.deb ...
Unpacking golang-github-mattn-go-colorable-dev (0.0.9-3) ...
Selecting previously unselected package golang-github-fatih-color-dev.
Preparing to unpack .../070-golang-github-fatih-color-dev_1.5.0-1_all.deb ...
Unpacking golang-github-fatih-color-dev (1.5.0-1) ...
Selecting previously unselected package golang-github-hashicorp-go-hclog-dev.
Preparing to unpack .../071-golang-github-hashicorp-go-hclog-dev_0.10.0-1_all.deb ...
Unpacking golang-github-hashicorp-go-hclog-dev (0.10.0-1) ...
Selecting previously unselected package golang-github-hashicorp-go-retryablehttp-dev.
Preparing to unpack .../072-golang-github-hashicorp-go-retryablehttp-dev_0.6.4-1_all.deb ...
Unpacking golang-github-hashicorp-go-retryablehttp-dev (0.6.4-1) ...
Selecting previously unselected package golang-github-tv42-httpunix-dev.
Preparing to unpack .../073-golang-github-tv42-httpunix-dev_0.0~git20150427.b75d861-2_all.deb ...
Unpacking golang-github-tv42-httpunix-dev (0.0~git20150427.b75d861-2) ...
Selecting previously unselected package golang-github-circonus-labs-circonus-gometrics-dev.
Preparing to unpack .../074-golang-github-circonus-labs-circonus-gometrics-dev_2.3.1-2_all.deb ...
Unpacking golang-github-circonus-labs-circonus-gometrics-dev (2.3.1-2) ...
Selecting previously unselected package golang-github-datadog-datadog-go-dev.
Preparing to unpack .../075-golang-github-datadog-datadog-go-dev_2.1.0-2_all.deb ...
Unpacking golang-github-datadog-datadog-go-dev (2.1.0-2) ...
Selecting previously unselected package golang-github-hashicorp-go-uuid-dev.
Preparing to unpack .../076-golang-github-hashicorp-go-uuid-dev_1.0.1-1_all.deb ...
Unpacking golang-github-hashicorp-go-uuid-dev (1.0.1-1) ...
Selecting previously unselected package golang-github-hashicorp-golang-lru-dev.
Preparing to unpack .../077-golang-github-hashicorp-golang-lru-dev_0.5.0-1_all.deb ...
Unpacking golang-github-hashicorp-golang-lru-dev (0.5.0-1) ...
Selecting previously unselected package golang-github-hashicorp-go-immutable-radix-dev.
Preparing to unpack .../078-golang-github-hashicorp-go-immutable-radix-dev_1.1.0-1_all.deb ...
Unpacking golang-github-hashicorp-go-immutable-radix-dev (1.1.0-1) ...
Selecting previously unselected package golang-github-pascaldekloe-goe-dev.
Preparing to unpack .../079-golang-github-pascaldekloe-goe-dev_0.1.0-2_all.deb ...
Unpacking golang-github-pascaldekloe-goe-dev (0.1.0-2) ...
Selecting previously unselected package golang-github-beorn7-perks-dev.
Preparing to unpack .../080-golang-github-beorn7-perks-dev_0.0~git20160804.0.4c0e845-1_all.deb ...
Unpacking golang-github-beorn7-perks-dev (0.0~git20160804.0.4c0e845-1) ...
Selecting previously unselected package golang-github-cespare-xxhash-dev.
Preparing to unpack .../081-golang-github-cespare-xxhash-dev_2.1.0-1_all.deb ...
Unpacking golang-github-cespare-xxhash-dev (2.1.0-1) ...
Selecting previously unselected package golang-github-google-gofuzz-dev.
Preparing to unpack .../082-golang-github-google-gofuzz-dev_0.0~git20170612.24818f7-1_all.deb ...
Unpacking golang-github-google-gofuzz-dev (0.0~git20170612.24818f7-1) ...
Selecting previously unselected package golang-github-modern-go-concurrent-dev.
Preparing to unpack .../083-golang-github-modern-go-concurrent-dev_1.0.3-1_all.deb ...
Unpacking golang-github-modern-go-concurrent-dev (1.0.3-1) ...
Selecting previously unselected package golang-github-modern-go-reflect2-dev.
Preparing to unpack .../084-golang-github-modern-go-reflect2-dev_1.0.0-1_all.deb ...
Unpacking golang-github-modern-go-reflect2-dev (1.0.0-1) ...
Selecting previously unselected package golang-github-json-iterator-go-dev.
Preparing to unpack .../085-golang-github-json-iterator-go-dev_1.1.4-1_all.deb ...
Unpacking golang-github-json-iterator-go-dev (1.1.4-1) ...
Selecting previously unselected package zlib1g-dev:armhf.
Preparing to unpack .../086-zlib1g-dev_1%3a1.2.11.dfsg-1_armhf.deb ...
Unpacking zlib1g-dev:armhf (1:1.2.11.dfsg-1) ...
Selecting previously unselected package libprotobuf-lite17:armhf.
Preparing to unpack .../087-libprotobuf-lite17_3.6.1.3-2+rpi1_armhf.deb ...
Unpacking libprotobuf-lite17:armhf (3.6.1.3-2+rpi1) ...
Selecting previously unselected package libprotobuf-dev:armhf.
Preparing to unpack .../088-libprotobuf-dev_3.6.1.3-2+rpi1_armhf.deb ...
Unpacking libprotobuf-dev:armhf (3.6.1.3-2+rpi1) ...
Selecting previously unselected package golang-goprotobuf-dev.
Preparing to unpack .../089-golang-goprotobuf-dev_1.3.2-2_armhf.deb ...
Unpacking golang-goprotobuf-dev (1.3.2-2) ...
Selecting previously unselected package golang-github-prometheus-client-model-dev.
Preparing to unpack .../090-golang-github-prometheus-client-model-dev_0.0.2+git20171117.99fa1f4-1_all.deb ...
Unpacking golang-github-prometheus-client-model-dev (0.0.2+git20171117.99fa1f4-1) ...
Selecting previously unselected package golang-github-dgrijalva-jwt-go-v3-dev.
Preparing to unpack .../091-golang-github-dgrijalva-jwt-go-v3-dev_3.2.0-2_all.deb ...
Unpacking golang-github-dgrijalva-jwt-go-v3-dev (3.2.0-2) ...
Selecting previously unselected package golang-github-go-logfmt-logfmt-dev.
Preparing to unpack .../092-golang-github-go-logfmt-logfmt-dev_0.3.0-1_all.deb ...
Unpacking golang-github-go-logfmt-logfmt-dev (0.3.0-1) ...
Selecting previously unselected package golang-github-go-stack-stack-dev.
Preparing to unpack .../093-golang-github-go-stack-stack-dev_1.5.2-2_all.deb ...
Unpacking golang-github-go-stack-stack-dev (1.5.2-2) ...
Selecting previously unselected package golang-golang-x-sync-dev.
Preparing to unpack .../094-golang-golang-x-sync-dev_0.0~git20190423.1122301-1_all.deb ...
Unpacking golang-golang-x-sync-dev (0.0~git20190423.1122301-1) ...
Selecting previously unselected package golang-golang-x-xerrors-dev.
Preparing to unpack .../095-golang-golang-x-xerrors-dev_0.0~git20190717.a985d34-1_all.deb ...
Unpacking golang-golang-x-xerrors-dev (0.0~git20190717.a985d34-1) ...
Selecting previously unselected package golang-golang-x-tools-dev.
Preparing to unpack .../096-golang-golang-x-tools-dev_1%3a0.0~git20191118.07fc4c7+ds-1_all.deb ...
Unpacking golang-golang-x-tools-dev (1:0.0~git20191118.07fc4c7+ds-1) ...
Selecting previously unselected package golang-golang-x-text-dev.
Preparing to unpack .../097-golang-golang-x-text-dev_0.3.2-1_all.deb ...
Unpacking golang-golang-x-text-dev (0.3.2-1) ...
Selecting previously unselected package golang-golang-x-net-dev.
Preparing to unpack .../098-golang-golang-x-net-dev_1%3a0.0+git20191112.2180aed+dfsg-1_all.deb ...
Unpacking golang-golang-x-net-dev (1:0.0+git20191112.2180aed+dfsg-1) ...
Selecting previously unselected package golang-github-opentracing-opentracing-go-dev.
Preparing to unpack .../099-golang-github-opentracing-opentracing-go-dev_1.0.2-1_all.deb ...
Unpacking golang-github-opentracing-opentracing-go-dev (1.0.2-1) ...
Selecting previously unselected package golang-golang-x-time-dev.
Preparing to unpack .../100-golang-golang-x-time-dev_0.0~git20161028.0.f51c127-2_all.deb ...
Unpacking golang-golang-x-time-dev (0.0~git20161028.0.f51c127-2) ...
Selecting previously unselected package golang-github-golang-mock-dev.
Preparing to unpack .../101-golang-github-golang-mock-dev_1.3.1-2_all.deb ...
Unpacking golang-github-golang-mock-dev (1.3.1-2) ...
Selecting previously unselected package golang-github-google-go-cmp-dev.
Preparing to unpack .../102-golang-github-google-go-cmp-dev_0.3.1-1_all.deb ...
Unpacking golang-github-google-go-cmp-dev (0.3.1-1) ...
Selecting previously unselected package golang-glog-dev.
Preparing to unpack .../103-golang-glog-dev_0.0~git20160126.23def4e-3_all.deb ...
Unpacking golang-glog-dev (0.0~git20160126.23def4e-3) ...
Selecting previously unselected package golang-golang-x-oauth2-dev.
Preparing to unpack .../104-golang-golang-x-oauth2-dev_0.0~git20190604.0f29369-2_all.deb ...
Unpacking golang-golang-x-oauth2-dev (0.0~git20190604.0f29369-2) ...
Selecting previously unselected package golang-google-cloud-compute-metadata-dev.
Preparing to unpack .../105-golang-google-cloud-compute-metadata-dev_0.43.0-1_all.deb ...
Unpacking golang-google-cloud-compute-metadata-dev (0.43.0-1) ...
Selecting previously unselected package golang-golang-x-oauth2-google-dev.
Preparing to unpack .../106-golang-golang-x-oauth2-google-dev_0.0~git20190604.0f29369-2_all.deb ...
Unpacking golang-golang-x-oauth2-google-dev (0.0~git20190604.0f29369-2) ...
Selecting previously unselected package golang-google-genproto-dev.
Preparing to unpack .../107-golang-google-genproto-dev_0.0~git20190801.fa694d8-2_all.deb ...
Unpacking golang-google-genproto-dev (0.0~git20190801.fa694d8-2) ...
Selecting previously unselected package golang-google-grpc-dev.
Preparing to unpack .../108-golang-google-grpc-dev_1.22.1-1_all.deb ...
Unpacking golang-google-grpc-dev (1.22.1-1) ...
Selecting previously unselected package golang-github-go-kit-kit-dev.
Preparing to unpack .../109-golang-github-go-kit-kit-dev_0.6.0-2_all.deb ...
Unpacking golang-github-go-kit-kit-dev (0.6.0-2) ...
Selecting previously unselected package golang-github-julienschmidt-httprouter-dev.
Preparing to unpack .../110-golang-github-julienschmidt-httprouter-dev_1.1-5_all.deb ...
Unpacking golang-github-julienschmidt-httprouter-dev (1.1-5) ...
Selecting previously unselected package golang-github-jpillora-backoff-dev.
Preparing to unpack .../111-golang-github-jpillora-backoff-dev_1.0.0-1_all.deb ...
Unpacking golang-github-jpillora-backoff-dev (1.0.0-1) ...
Selecting previously unselected package golang-github-mwitkow-go-conntrack-dev.
Preparing to unpack .../112-golang-github-mwitkow-go-conntrack-dev_0.0~git20190716.2f06839-1_all.deb ...
Unpacking golang-github-mwitkow-go-conntrack-dev (0.0~git20190716.2f06839-1) ...
Selecting previously unselected package golang-gopkg-alecthomas-kingpin.v2-dev.
Preparing to unpack .../113-golang-gopkg-alecthomas-kingpin.v2-dev_2.2.6-1_all.deb ...
Unpacking golang-gopkg-alecthomas-kingpin.v2-dev (2.2.6-1) ...
Selecting previously unselected package golang-protobuf-extensions-dev.
Preparing to unpack .../114-golang-protobuf-extensions-dev_1.0.1-1_all.deb ...
Unpacking golang-protobuf-extensions-dev (1.0.1-1) ...
Selecting previously unselected package golang-github-prometheus-common-dev.
Preparing to unpack .../115-golang-github-prometheus-common-dev_0.7.0-1_all.deb ...
Unpacking golang-github-prometheus-common-dev (0.7.0-1) ...
Selecting previously unselected package golang-procfs-dev.
Preparing to unpack .../116-golang-procfs-dev_0.0.3-1_all.deb ...
Unpacking golang-procfs-dev (0.0.3-1) ...
Selecting previously unselected package golang-github-prometheus-client-golang-dev.
Preparing to unpack .../117-golang-github-prometheus-client-golang-dev_1.2.1-3_all.deb ...
Unpacking golang-github-prometheus-client-golang-dev (1.2.1-3) ...
Selecting previously unselected package golang-github-armon-go-metrics-dev.
Preparing to unpack .../118-golang-github-armon-go-metrics-dev_0.0~git20190430.ec5e00d-1_all.deb ...
Unpacking golang-github-armon-go-metrics-dev (0.0~git20190430.ec5e00d-1) ...
Selecting previously unselected package golang-github-armon-go-radix-dev.
Preparing to unpack .../119-golang-github-armon-go-radix-dev_1.0.0-1_all.deb ...
Unpacking golang-github-armon-go-radix-dev (1.0.0-1) ...
Selecting previously unselected package golang-github-asaskevich-govalidator-dev.
Preparing to unpack .../120-golang-github-asaskevich-govalidator-dev_9+git20180720.0.f9ffefc3-1_all.deb ...
Unpacking golang-github-asaskevich-govalidator-dev (9+git20180720.0.f9ffefc3-1) ...
Selecting previously unselected package golang-github-go-ini-ini-dev.
Preparing to unpack .../121-golang-github-go-ini-ini-dev_1.32.0-2_all.deb ...
Unpacking golang-github-go-ini-ini-dev (1.32.0-2) ...
Selecting previously unselected package golang-github-jmespath-go-jmespath-dev.
Preparing to unpack .../122-golang-github-jmespath-go-jmespath-dev_0.2.2-3_all.deb ...
Unpacking golang-github-jmespath-go-jmespath-dev (0.2.2-3) ...
Selecting previously unselected package golang-github-aws-aws-sdk-go-dev.
Preparing to unpack .../123-golang-github-aws-aws-sdk-go-dev_1.21.6+dfsg-2_all.deb ...
Unpacking golang-github-aws-aws-sdk-go-dev (1.21.6+dfsg-2) ...
Selecting previously unselected package golang-github-dgrijalva-jwt-go-dev.
Preparing to unpack .../124-golang-github-dgrijalva-jwt-go-dev_3.2.0-1_all.deb ...
Unpacking golang-github-dgrijalva-jwt-go-dev (3.2.0-1) ...
Selecting previously unselected package golang-github-dimchansky-utfbom-dev.
Preparing to unpack .../125-golang-github-dimchansky-utfbom-dev_0.0~git20170328.6c6132f-1_all.deb ...
Unpacking golang-github-dimchansky-utfbom-dev (0.0~git20170328.6c6132f-1) ...
Selecting previously unselected package golang-github-mitchellh-go-homedir-dev.
Preparing to unpack .../126-golang-github-mitchellh-go-homedir-dev_1.1.0-1_all.deb ...
Unpacking golang-github-mitchellh-go-homedir-dev (1.1.0-1) ...
Selecting previously unselected package golang-golang-x-crypto-dev.
Preparing to unpack .../127-golang-golang-x-crypto-dev_1%3a0.0~git20190701.4def268-2_all.deb ...
Unpacking golang-golang-x-crypto-dev (1:0.0~git20190701.4def268-2) ...
Selecting previously unselected package golang-github-azure-go-autorest-dev.
Preparing to unpack .../128-golang-github-azure-go-autorest-dev_10.15.5-1_all.deb ...
Unpacking golang-github-azure-go-autorest-dev (10.15.5-1) ...
Selecting previously unselected package golang-github-bgentry-speakeasy-dev.
Preparing to unpack .../129-golang-github-bgentry-speakeasy-dev_0.1.0-1_all.deb ...
Unpacking golang-github-bgentry-speakeasy-dev (0.1.0-1) ...
Selecting previously unselected package golang-github-boltdb-bolt-dev.
Preparing to unpack .../130-golang-github-boltdb-bolt-dev_1.3.1-6_all.deb ...
Unpacking golang-github-boltdb-bolt-dev (1.3.1-6) ...
Selecting previously unselected package golang-github-bradfitz-gomemcache-dev.
Preparing to unpack .../131-golang-github-bradfitz-gomemcache-dev_0.0~git20141109-3_all.deb ...
Unpacking golang-github-bradfitz-gomemcache-dev (0.0~git20141109-3) ...
Selecting previously unselected package golang-github-coreos-pkg-dev.
Preparing to unpack .../132-golang-github-coreos-pkg-dev_4-2_all.deb ...
Unpacking golang-github-coreos-pkg-dev (4-2) ...
Selecting previously unselected package libsystemd-dev:armhf.
Preparing to unpack .../133-libsystemd-dev_243-8+rpi1_armhf.deb ...
Unpacking libsystemd-dev:armhf (243-8+rpi1) ...
Selecting previously unselected package pkg-config.
Preparing to unpack .../134-pkg-config_0.29-6_armhf.deb ...
Unpacking pkg-config (0.29-6) ...
Selecting previously unselected package golang-github-coreos-go-systemd-dev.
Preparing to unpack .../135-golang-github-coreos-go-systemd-dev_20-1_all.deb ...
Unpacking golang-github-coreos-go-systemd-dev (20-1) ...
Selecting previously unselected package golang-github-cyphar-filepath-securejoin-dev.
Preparing to unpack .../136-golang-github-cyphar-filepath-securejoin-dev_0.2.2-1_all.deb ...
Unpacking golang-github-cyphar-filepath-securejoin-dev (0.2.2-1) ...
Selecting previously unselected package golang-github-google-go-querystring-dev.
Preparing to unpack .../137-golang-github-google-go-querystring-dev_1.0.0-1_all.deb ...
Unpacking golang-github-google-go-querystring-dev (1.0.0-1) ...
Selecting previously unselected package golang-github-tent-http-link-go-dev.
Preparing to unpack .../138-golang-github-tent-http-link-go-dev_0.0~git20130702.0.ac974c6-6_all.deb ...
Unpacking golang-github-tent-http-link-go-dev (0.0~git20130702.0.ac974c6-6) ...
Selecting previously unselected package golang-github-digitalocean-godo-dev.
Preparing to unpack .../139-golang-github-digitalocean-godo-dev_1.1.0-1_all.deb ...
Unpacking golang-github-digitalocean-godo-dev (1.1.0-1) ...
Selecting previously unselected package golang-github-docker-go-units-dev.
Preparing to unpack .../140-golang-github-docker-go-units-dev_0.4.0-1_all.deb ...
Unpacking golang-github-docker-go-units-dev (0.4.0-1) ...
Selecting previously unselected package golang-github-opencontainers-selinux-dev.
Preparing to unpack .../141-golang-github-opencontainers-selinux-dev_1.3.0-2_all.deb ...
Unpacking golang-github-opencontainers-selinux-dev (1.3.0-2) ...
Selecting previously unselected package golang-github-xeipuuv-gojsonpointer-dev.
Preparing to unpack .../142-golang-github-xeipuuv-gojsonpointer-dev_0.0~git20151027.0.e0fe6f6-2_all.deb ...
Unpacking golang-github-xeipuuv-gojsonpointer-dev (0.0~git20151027.0.e0fe6f6-2) ...
Selecting previously unselected package golang-github-xeipuuv-gojsonreference-dev.
Preparing to unpack .../143-golang-github-xeipuuv-gojsonreference-dev_0.0~git20150808.0.e02fc20-2_all.deb ...
Unpacking golang-github-xeipuuv-gojsonreference-dev (0.0~git20150808.0.e02fc20-2) ...
Selecting previously unselected package golang-github-xeipuuv-gojsonschema-dev.
Preparing to unpack .../144-golang-github-xeipuuv-gojsonschema-dev_0.0~git20170210.0.6b67b3f-2_all.deb ...
Unpacking golang-github-xeipuuv-gojsonschema-dev (0.0~git20170210.0.6b67b3f-2) ...
Selecting previously unselected package golang-github-opencontainers-specs-dev.
Preparing to unpack .../145-golang-github-opencontainers-specs-dev_1.0.1+git20190408.a1b50f6-1_all.deb ...
Unpacking golang-github-opencontainers-specs-dev (1.0.1+git20190408.a1b50f6-1) ...
Selecting previously unselected package libseccomp-dev:armhf.
Preparing to unpack .../146-libseccomp-dev_2.4.2-2+rpi1_armhf.deb ...
Unpacking libseccomp-dev:armhf (2.4.2-2+rpi1) ...
Selecting previously unselected package golang-github-seccomp-libseccomp-golang-dev.
Preparing to unpack .../147-golang-github-seccomp-libseccomp-golang-dev_0.9.1-1_all.deb ...
Unpacking golang-github-seccomp-libseccomp-golang-dev (0.9.1-1) ...
Selecting previously unselected package golang-github-urfave-cli-dev.
Preparing to unpack .../148-golang-github-urfave-cli-dev_1.20.0-1_all.deb ...
Unpacking golang-github-urfave-cli-dev (1.20.0-1) ...
Selecting previously unselected package golang-github-vishvananda-netns-dev.
Preparing to unpack .../149-golang-github-vishvananda-netns-dev_0.0~git20170707.0.86bef33-1_all.deb ...
Unpacking golang-github-vishvananda-netns-dev (0.0~git20170707.0.86bef33-1) ...
Selecting previously unselected package golang-github-vishvananda-netlink-dev.
Preparing to unpack .../150-golang-github-vishvananda-netlink-dev_1.0.0+git20181030.023a6da-1_all.deb ...
Unpacking golang-github-vishvananda-netlink-dev (1.0.0+git20181030.023a6da-1) ...
Selecting previously unselected package golang-gocapability-dev.
Preparing to unpack .../151-golang-gocapability-dev_0.0+git20180916.d983527-1_all.deb ...
Unpacking golang-gocapability-dev (0.0+git20180916.d983527-1) ...
Selecting previously unselected package golang-github-opencontainers-runc-dev.
Preparing to unpack .../152-golang-github-opencontainers-runc-dev_1.0.0~rc9+dfsg1-1+rpi1_all.deb ...
Unpacking golang-github-opencontainers-runc-dev (1.0.0~rc9+dfsg1-1+rpi1) ...
Selecting previously unselected package golang-github-docker-go-connections-dev.
Preparing to unpack .../153-golang-github-docker-go-connections-dev_0.4.0-1_all.deb ...
Unpacking golang-github-docker-go-connections-dev (0.4.0-1) ...
Selecting previously unselected package golang-github-elazarl-go-bindata-assetfs-dev.
Preparing to unpack .../154-golang-github-elazarl-go-bindata-assetfs-dev_1.0.0-1_all.deb ...
Unpacking golang-github-elazarl-go-bindata-assetfs-dev (1.0.0-1) ...
Selecting previously unselected package golang-github-garyburd-redigo-dev.
Preparing to unpack .../155-golang-github-garyburd-redigo-dev_0.0~git20150901.0.d8dbe4d-2_all.deb ...
Unpacking golang-github-garyburd-redigo-dev (0.0~git20150901.0.d8dbe4d-2) ...
Selecting previously unselected package golang-github-ghodss-yaml-dev.
Preparing to unpack .../156-golang-github-ghodss-yaml-dev_1.0.0-1_all.deb ...
Unpacking golang-github-ghodss-yaml-dev (1.0.0-1) ...
Selecting previously unselected package golang-github-go-test-deep-dev.
Preparing to unpack .../157-golang-github-go-test-deep-dev_1.0.3-1_all.deb ...
Unpacking golang-github-go-test-deep-dev (1.0.3-1) ...
Selecting previously unselected package golang-gogoprotobuf-dev.
Preparing to unpack .../158-golang-gogoprotobuf-dev_1.2.1+git20190611.dadb6258-1_all.deb ...
Unpacking golang-gogoprotobuf-dev (1.2.1+git20190611.dadb6258-1) ...
Selecting previously unselected package golang-github-gogo-googleapis-dev.
Preparing to unpack .../159-golang-github-gogo-googleapis-dev_1.2.0-1_all.deb ...
Unpacking golang-github-gogo-googleapis-dev (1.2.0-1) ...
Selecting previously unselected package golang-github-golang-snappy-dev.
Preparing to unpack .../160-golang-github-golang-snappy-dev_0.0+git20160529.d9eb7a3-3_all.deb ...
Unpacking golang-github-golang-snappy-dev (0.0+git20160529.d9eb7a3-3) ...
Selecting previously unselected package golang-github-google-btree-dev.
Preparing to unpack .../161-golang-github-google-btree-dev_1.0.0-1_all.deb ...
Unpacking golang-github-google-btree-dev (1.0.0-1) ...
Selecting previously unselected package golang-github-docopt-docopt-go-dev.
Preparing to unpack .../162-golang-github-docopt-docopt-go-dev_0.6.2+git20160216.0.784ddc5-1_all.deb ...
Unpacking golang-github-docopt-docopt-go-dev (0.6.2+git20160216.0.784ddc5-1) ...
Selecting previously unselected package golang-github-googleapis-gnostic-dev.
Preparing to unpack .../163-golang-github-googleapis-gnostic-dev_0.2.0-1_all.deb ...
Unpacking golang-github-googleapis-gnostic-dev (0.2.0-1) ...
Selecting previously unselected package golang-github-peterbourgon-diskv-dev.
Preparing to unpack .../164-golang-github-peterbourgon-diskv-dev_3.0.0-1_all.deb ...
Unpacking golang-github-peterbourgon-diskv-dev (3.0.0-1) ...
Selecting previously unselected package golang-gomega-dev.
Preparing to unpack .../165-golang-gomega-dev_1.0+git20160910.d59fa0a-1_all.deb ...
Unpacking golang-gomega-dev (1.0+git20160910.d59fa0a-1) ...
Selecting previously unselected package golang-ginkgo-dev.
Preparing to unpack .../166-golang-ginkgo-dev_1.2.0+git20161006.acfa16a-1_armhf.deb ...
Unpacking golang-ginkgo-dev (1.2.0+git20161006.acfa16a-1) ...
Selecting previously unselected package golang-github-syndtr-goleveldb-dev.
Preparing to unpack .../167-golang-github-syndtr-goleveldb-dev_0.0~git20170725.0.b89cc31-2_all.deb ...
Unpacking golang-github-syndtr-goleveldb-dev (0.0~git20170725.0.b89cc31-2) ...
Selecting previously unselected package golang-github-gregjones-httpcache-dev.
Preparing to unpack .../168-golang-github-gregjones-httpcache-dev_0.0~git20180305.9cad4c3-1_all.deb ...
Unpacking golang-github-gregjones-httpcache-dev (0.0~git20180305.9cad4c3-1) ...
Selecting previously unselected package golang-github-hashicorp-errwrap-dev.
Preparing to unpack .../169-golang-github-hashicorp-errwrap-dev_1.0.0-1_all.deb ...
Unpacking golang-github-hashicorp-errwrap-dev (1.0.0-1) ...
Selecting previously unselected package golang-github-hashicorp-go-checkpoint-dev.
Preparing to unpack .../170-golang-github-hashicorp-go-checkpoint-dev_0.0~git20171009.1545e56-2_all.deb ...
Unpacking golang-github-hashicorp-go-checkpoint-dev (0.0~git20171009.1545e56-2) ...
Selecting previously unselected package golang-github-denverdino-aliyungo-dev.
Preparing to unpack .../171-golang-github-denverdino-aliyungo-dev_0.0~git20180921.13fa8aa-2_all.deb ...
Unpacking golang-github-denverdino-aliyungo-dev (0.0~git20180921.13fa8aa-2) ...
Selecting previously unselected package golang-github-gophercloud-gophercloud-dev.
Preparing to unpack .../172-golang-github-gophercloud-gophercloud-dev_0.6.0-1_all.deb ...
Unpacking golang-github-gophercloud-gophercloud-dev (0.6.0-1) ...
Selecting previously unselected package golang-github-hashicorp-go-multierror-dev.
Preparing to unpack .../173-golang-github-hashicorp-go-multierror-dev_1.0.0-1_all.deb ...
Unpacking golang-github-hashicorp-go-multierror-dev (1.0.0-1) ...
Selecting previously unselected package golang-github-miekg-dns-dev.
Preparing to unpack .../174-golang-github-miekg-dns-dev_1.0.4+ds-1_all.deb ...
Unpacking golang-github-miekg-dns-dev (1.0.4+ds-1) ...
Selecting previously unselected package golang-github-hashicorp-mdns-dev.
Preparing to unpack .../175-golang-github-hashicorp-mdns-dev_1.0.1-1_all.deb ...
Unpacking golang-github-hashicorp-mdns-dev (1.0.1-1) ...
Selecting previously unselected package golang-github-packethost-packngo-dev.
Preparing to unpack .../176-golang-github-packethost-packngo-dev_0.2.0-2_all.deb ...
Unpacking golang-github-packethost-packngo-dev (0.2.0-2) ...
Selecting previously unselected package golang-github-vmware-govmomi-dev.
Preparing to unpack .../177-golang-github-vmware-govmomi-dev_0.15.0-1_all.deb ...
Unpacking golang-github-vmware-govmomi-dev (0.15.0-1) ...
Selecting previously unselected package golang-go.opencensus-dev.
Preparing to unpack .../178-golang-go.opencensus-dev_0.22.0-1_all.deb ...
Unpacking golang-go.opencensus-dev (0.22.0-1) ...
Selecting previously unselected package golang-google-api-dev.
Preparing to unpack .../179-golang-google-api-dev_0.7.0-2_all.deb ...
Unpacking golang-google-api-dev (0.7.0-2) ...
Selecting previously unselected package golang-github-hashicorp-go-discover-dev.
Preparing to unpack .../180-golang-github-hashicorp-go-discover-dev_0.0+git20190905.34a6505-2_all.deb ...
Unpacking golang-github-hashicorp-go-discover-dev (0.0+git20190905.34a6505-2) ...
Selecting previously unselected package golang-github-hashicorp-go-memdb-dev.
Preparing to unpack .../181-golang-github-hashicorp-go-memdb-dev_0.0~git20180224.1289e7ff-1_all.deb ...
Unpacking golang-github-hashicorp-go-memdb-dev (0.0~git20180224.1289e7ff-1) ...
Selecting previously unselected package golang-github-ugorji-go-msgpack-dev.
Preparing to unpack .../182-golang-github-ugorji-go-msgpack-dev_0.0~git20130605.792643-5_all.deb ...
Unpacking golang-github-ugorji-go-msgpack-dev (0.0~git20130605.792643-5) ...
Selecting previously unselected package golang-github-ugorji-go-codec-dev.
Preparing to unpack .../183-golang-github-ugorji-go-codec-dev_1.1.7-1_all.deb ...
Unpacking golang-github-ugorji-go-codec-dev (1.1.7-1) ...
Selecting previously unselected package golang-gopkg-vmihailenco-msgpack.v2-dev.
Preparing to unpack .../184-golang-gopkg-vmihailenco-msgpack.v2-dev_3.3.3-1_all.deb ...
Unpacking golang-gopkg-vmihailenco-msgpack.v2-dev (3.3.3-1) ...
Selecting previously unselected package golang-gopkg-tomb.v2-dev.
Preparing to unpack .../185-golang-gopkg-tomb.v2-dev_0.0~git20161208.d5d1b58-3_all.deb ...
Unpacking golang-gopkg-tomb.v2-dev (0.0~git20161208.d5d1b58-3) ...
Selecting previously unselected package libsasl2-dev.
Preparing to unpack .../186-libsasl2-dev_2.1.27+dfsg-1+b1_armhf.deb ...
Unpacking libsasl2-dev (2.1.27+dfsg-1+b1) ...
Selecting previously unselected package golang-gopkg-mgo.v2-dev.
Preparing to unpack .../187-golang-gopkg-mgo.v2-dev_2016.08.01-6_all.deb ...
Unpacking golang-gopkg-mgo.v2-dev (2016.08.01-6) ...
Selecting previously unselected package golang-github-hashicorp-go-msgpack-dev.
Preparing to unpack .../188-golang-github-hashicorp-go-msgpack-dev_0.5.5-1_all.deb ...
Unpacking golang-github-hashicorp-go-msgpack-dev (0.5.5-1) ...
Selecting previously unselected package golang-github-hashicorp-raft-dev.
Preparing to unpack .../189-golang-github-hashicorp-raft-dev_1.1.1-2_all.deb ...
Unpacking golang-github-hashicorp-raft-dev (1.1.1-2) ...
Selecting previously unselected package libjs-jquery.
Preparing to unpack .../190-libjs-jquery_3.3.1~dfsg-3_all.deb ...
Unpacking libjs-jquery (3.3.1~dfsg-3) ...
Selecting previously unselected package libjs-jquery-ui.
Preparing to unpack .../191-libjs-jquery-ui_1.12.1+dfsg-5_all.deb ...
Unpacking libjs-jquery-ui (1.12.1+dfsg-5) ...
Selecting previously unselected package golang-golang-x-tools.
Preparing to unpack .../192-golang-golang-x-tools_1%3a0.0~git20191118.07fc4c7+ds-1_armhf.deb ...
Unpacking golang-golang-x-tools (1:0.0~git20191118.07fc4c7+ds-1) ...
Selecting previously unselected package golang-github-mitchellh-reflectwalk-dev.
Preparing to unpack .../193-golang-github-mitchellh-reflectwalk-dev_0.0~git20170726.63d60e9-4_all.deb ...
Unpacking golang-github-mitchellh-reflectwalk-dev (0.0~git20170726.63d60e9-4) ...
Selecting previously unselected package golang-github-mitchellh-copystructure-dev.
Preparing to unpack .../194-golang-github-mitchellh-copystructure-dev_0.0~git20161013.0.5af94ae-2_all.deb ...
Unpacking golang-github-mitchellh-copystructure-dev (0.0~git20161013.0.5af94ae-2) ...
Selecting previously unselected package golang-github-hashicorp-go-raftchunking-dev.
Preparing to unpack .../195-golang-github-hashicorp-go-raftchunking-dev_0.6.2-2_all.deb ...
Unpacking golang-github-hashicorp-go-raftchunking-dev (0.6.2-2) ...
Selecting previously unselected package golang-github-hashicorp-go-reap-dev.
Preparing to unpack .../196-golang-github-hashicorp-go-reap-dev_0.0~git20160113.0.2d85522-3_all.deb ...
Unpacking golang-github-hashicorp-go-reap-dev (0.0~git20160113.0.2d85522-3) ...
Selecting previously unselected package golang-github-hashicorp-go-sockaddr-dev.
Preparing to unpack .../197-golang-github-hashicorp-go-sockaddr-dev_0.0~git20170627.41949a1+ds-2_all.deb ...
Unpacking golang-github-hashicorp-go-sockaddr-dev (0.0~git20170627.41949a1+ds-2) ...
Selecting previously unselected package golang-github-hashicorp-go-version-dev.
Preparing to unpack .../198-golang-github-hashicorp-go-version-dev_1.2.0-1_all.deb ...
Unpacking golang-github-hashicorp-go-version-dev (1.2.0-1) ...
Selecting previously unselected package golang-github-hashicorp-hcl-dev.
Preparing to unpack .../199-golang-github-hashicorp-hcl-dev_1.0.0-1_all.deb ...
Unpacking golang-github-hashicorp-hcl-dev (1.0.0-1) ...
Selecting previously unselected package golang-github-mitchellh-mapstructure-dev.
Preparing to unpack .../200-golang-github-mitchellh-mapstructure-dev_1.1.2-1_all.deb ...
Unpacking golang-github-mitchellh-mapstructure-dev (1.1.2-1) ...
Selecting previously unselected package golang-github-hashicorp-hil-dev.
Preparing to unpack .../201-golang-github-hashicorp-hil-dev_0.0~git20160711.1e86c6b-1_all.deb ...
Unpacking golang-github-hashicorp-hil-dev (0.0~git20160711.1e86c6b-1) ...
Selecting previously unselected package golang-github-hashicorp-memberlist-dev.
Preparing to unpack .../202-golang-github-hashicorp-memberlist-dev_0.1.5-2_all.deb ...
Unpacking golang-github-hashicorp-memberlist-dev (0.1.5-2) ...
Selecting previously unselected package golang-github-hashicorp-raft-boltdb-dev.
Preparing to unpack .../203-golang-github-hashicorp-raft-boltdb-dev_0.0~git20171010.6e5ba93-3_all.deb ...
Unpacking golang-github-hashicorp-raft-boltdb-dev (0.0~git20171010.6e5ba93-3) ...
Selecting previously unselected package golang-github-hashicorp-net-rpc-msgpackrpc-dev.
Preparing to unpack .../204-golang-github-hashicorp-net-rpc-msgpackrpc-dev_0.0~git20151116.0.a14192a-1_all.deb ...
Unpacking golang-github-hashicorp-net-rpc-msgpackrpc-dev (0.0~git20151116.0.a14192a-1) ...
Selecting previously unselected package golang-github-hashicorp-yamux-dev.
Preparing to unpack .../205-golang-github-hashicorp-yamux-dev_0.0+git20190923.df201c7-1_all.deb ...
Unpacking golang-github-hashicorp-yamux-dev (0.0+git20190923.df201c7-1) ...
Selecting previously unselected package golang-github-hashicorp-scada-client-dev.
Preparing to unpack .../206-golang-github-hashicorp-scada-client-dev_0.0~git20160601.0.6e89678-2_all.deb ...
Unpacking golang-github-hashicorp-scada-client-dev (0.0~git20160601.0.6e89678-2) ...
Selecting previously unselected package golang-github-hashicorp-go-syslog-dev.
Preparing to unpack .../207-golang-github-hashicorp-go-syslog-dev_0.0~git20150218.0.42a2b57-1_all.deb ...
Unpacking golang-github-hashicorp-go-syslog-dev (0.0~git20150218.0.42a2b57-1) ...
Selecting previously unselected package golang-github-hashicorp-logutils-dev.
Preparing to unpack .../208-golang-github-hashicorp-logutils-dev_0.0~git20150609.0.0dc08b1-1_all.deb ...
Unpacking golang-github-hashicorp-logutils-dev (0.0~git20150609.0.0dc08b1-1) ...
Selecting previously unselected package golang-github-posener-complete-dev.
Preparing to unpack .../209-golang-github-posener-complete-dev_1.1+git20180108.57878c9-3_all.deb ...
Unpacking golang-github-posener-complete-dev (1.1+git20180108.57878c9-3) ...
Selecting previously unselected package golang-github-mitchellh-cli-dev.
Preparing to unpack .../210-golang-github-mitchellh-cli-dev_1.0.0-1_all.deb ...
Unpacking golang-github-mitchellh-cli-dev (1.0.0-1) ...
Selecting previously unselected package golang-github-ryanuber-columnize-dev.
Preparing to unpack .../211-golang-github-ryanuber-columnize-dev_2.1.1-1_all.deb ...
Unpacking golang-github-ryanuber-columnize-dev (2.1.1-1) ...
Selecting previously unselected package golang-github-hashicorp-serf-dev.
Preparing to unpack .../212-golang-github-hashicorp-serf-dev_0.8.5~ds1-1_all.deb ...
Unpacking golang-github-hashicorp-serf-dev (0.8.5~ds1-1) ...
Selecting previously unselected package golang-github-imdario-mergo-dev.
Preparing to unpack .../213-golang-github-imdario-mergo-dev_0.3.5-1_all.deb ...
Unpacking golang-github-imdario-mergo-dev (0.3.5-1) ...
Selecting previously unselected package golang-github-inconshreveable-muxado-dev.
Preparing to unpack .../214-golang-github-inconshreveable-muxado-dev_0.0~git20140312.0.f693c7e-2_all.deb ...
Unpacking golang-github-inconshreveable-muxado-dev (0.0~git20140312.0.f693c7e-2) ...
Selecting previously unselected package golang-github-jeffail-gabs-dev.
Preparing to unpack .../215-golang-github-jeffail-gabs-dev_2.1.0-2_all.deb ...
Unpacking golang-github-jeffail-gabs-dev (2.1.0-2) ...
Selecting previously unselected package golang-github-jefferai-jsonx-dev.
Preparing to unpack .../216-golang-github-jefferai-jsonx-dev_1.0.1-2_all.deb ...
Unpacking golang-github-jefferai-jsonx-dev (1.0.1-2) ...
Selecting previously unselected package golang-github-mitchellh-go-testing-interface-dev.
Preparing to unpack .../217-golang-github-mitchellh-go-testing-interface-dev_1.0.0-1_all.deb ...
Unpacking golang-github-mitchellh-go-testing-interface-dev (1.0.0-1) ...
Selecting previously unselected package golang-github-mitchellh-hashstructure-dev.
Preparing to unpack .../218-golang-github-mitchellh-hashstructure-dev_1.0.0-1_all.deb ...
Unpacking golang-github-mitchellh-hashstructure-dev (1.0.0-1) ...
Selecting previously unselected package golang-github-nytimes-gziphandler-dev.
Preparing to unpack .../219-golang-github-nytimes-gziphandler-dev_1.1.1-1_all.deb ...
Unpacking golang-github-nytimes-gziphandler-dev (1.1.1-1) ...
Selecting previously unselected package golang-github-ryanuber-go-glob-dev.
Preparing to unpack .../220-golang-github-ryanuber-go-glob-dev_1.0.0-2_all.deb ...
Unpacking golang-github-ryanuber-go-glob-dev (1.0.0-2) ...
Selecting previously unselected package golang-github-sap-go-hdb-dev.
Preparing to unpack .../221-golang-github-sap-go-hdb-dev_0.14.1-2_all.deb ...
Unpacking golang-github-sap-go-hdb-dev (0.14.1-2) ...
Selecting previously unselected package golang-github-shirou-gopsutil-dev.
Preparing to unpack .../222-golang-github-shirou-gopsutil-dev_2.18.06-1_all.deb ...
Unpacking golang-github-shirou-gopsutil-dev (2.18.06-1) ...
Selecting previously unselected package golang-github-spf13-pflag-dev.
Preparing to unpack .../223-golang-github-spf13-pflag-dev_1.0.3-1_all.deb ...
Unpacking golang-github-spf13-pflag-dev (1.0.3-1) ...
Selecting previously unselected package golang-gopkg-inf.v0-dev.
Preparing to unpack .../224-golang-gopkg-inf.v0-dev_0.9.0-3_all.deb ...
Unpacking golang-gopkg-inf.v0-dev (0.9.0-3) ...
Selecting previously unselected package golang-gopkg-square-go-jose.v2-dev.
Preparing to unpack .../225-golang-gopkg-square-go-jose.v2-dev_2.3.1-1_all.deb ...
Unpacking golang-gopkg-square-go-jose.v2-dev (2.3.1-1) ...
Selecting previously unselected package mockery.
Preparing to unpack .../226-mockery_0.0~git20181123.e78b021-2_armhf.deb ...
Unpacking mockery (0.0~git20181123.e78b021-2) ...
Selecting previously unselected package golang-github-hashicorp-go-rootcerts-dev.
Preparing to unpack .../227-golang-github-hashicorp-go-rootcerts-dev_0.0~git20160503.0.6bb64b3-1_all.deb ...
Unpacking golang-github-hashicorp-go-rootcerts-dev (0.0~git20160503.0.6bb64b3-1) ...
Selecting previously unselected package sbuild-build-depends-consul-dummy.
Preparing to unpack .../228-sbuild-build-depends-consul-dummy_0.invalid.0_armhf.deb ...
Unpacking sbuild-build-depends-consul-dummy (0.invalid.0) ...
Setting up golang-github-xeipuuv-gojsonpointer-dev (0.0~git20151027.0.e0fe6f6-2) ...
Setting up golang-github-dimchansky-utfbom-dev (0.0~git20170328.6c6132f-1) ...
Setting up golang-github-dgrijalva-jwt-go-v3-dev (3.2.0-2) ...
Setting up libpipeline1:armhf (1.5.1-2) ...
Setting up golang-github-google-go-cmp-dev (0.3.1-1) ...
Setting up golang-github-ryanuber-go-glob-dev (1.0.0-2) ...
Setting up golang-github-go-ini-ini-dev (1.32.0-2) ...
Setting up golang-github-hashicorp-go-uuid-dev (1.0.1-1) ...
Setting up golang-1.13-src (1.13.4-1+rpi1) ...
Setting up libseccomp-dev:armhf (2.4.2-2+rpi1) ...
Setting up golang-github-mitchellh-go-homedir-dev (1.1.0-1) ...
Setting up golang-github-google-go-querystring-dev (1.0.0-1) ...
Setting up golang-github-mitchellh-mapstructure-dev (1.1.2-1) ...
Setting up golang-dbus-dev (5.0.2-1) ...
Setting up golang-github-gogo-protobuf-dev (1.2.1+git20190611.dadb6258-1) ...
Setting up golang-github-golang-mock-dev (1.3.1-2) ...
Setting up golang-github-stretchr-objx-dev (0.1.1+git20180825.ef50b0d-1) ...
Setting up golang-github-mitchellh-hashstructure-dev (1.0.0-1) ...
Setting up libmagic-mgc (1:5.37-6) ...
Setting up golang-github-pkg-errors-dev (0.8.1-1) ...
Setting up golang-github-hashicorp-golang-lru-dev (0.5.0-1) ...
Setting up golang-github-google-gofuzz-dev (0.0~git20170612.24818f7-1) ...
Setting up golang-github-inconshreveable-muxado-dev (0.0~git20140312.0.f693c7e-2) ...
Setting up libarchive-zip-perl (1.67-1) ...
Setting up libglib2.0-0:armhf (2.62.3-2) ...
No schema files found: doing nothing.
Setting up libprotobuf-lite17:armhf (3.6.1.3-2+rpi1) ...
Setting up libssl1.1:armhf (1.1.1d-2) ...
Setting up golang-github-ryanuber-columnize-dev (2.1.1-1) ...
Setting up libprocps7:armhf (2:3.3.15-2) ...
Setting up libdebhelper-perl (12.7.1) ...
Setting up golang-golang-x-sys-dev (0.0~git20190726.fc99dfb-1) ...
Setting up golang-github-tent-http-link-go-dev (0.0~git20130702.0.ac974c6-6) ...
Setting up libmagic1:armhf (1:5.37-6) ...
Setting up golang-github-hashicorp-go-syslog-dev (0.0~git20150218.0.42a2b57-1) ...
Setting up golang-github-golang-snappy-dev (0.0+git20160529.d9eb7a3-3) ...
Setting up golang-github-pmezard-go-difflib-dev (1.0.0-2) ...
Setting up golang-github-modern-go-concurrent-dev (1.0.3-1) ...
Setting up gettext-base (0.19.8.1-10) ...
Setting up golang-github-circonus-labs-circonusllhist-dev (0.0~git20160526.0.d724266-2) ...
Setting up golang-github-bradfitz-gomemcache-dev (0.0~git20141109-3) ...
Setting up mockery (0.0~git20181123.e78b021-2) ...
Setting up golang-github-mitchellh-go-testing-interface-dev (1.0.0-1) ...
Setting up file (1:5.37-6) ...
Setting up golang-github-seccomp-libseccomp-golang-dev (0.9.1-1) ...
Setting up golang-github-asaskevich-govalidator-dev (9+git20180720.0.f9ffefc3-1) ...
Setting up golang-github-google-btree-dev (1.0.0-1) ...
Setting up golang-github-go-stack-stack-dev (1.5.2-2) ...
Setting up golang-github-beorn7-perks-dev (0.0~git20160804.0.4c0e845-1) ...
Setting up libicu63:armhf (63.2-2) ...
Setting up golang-github-hashicorp-go-cleanhttp-dev (0.5.1-1) ...
Setting up golang-github-hashicorp-errwrap-dev (1.0.0-1) ...
Setting up golang-github-cespare-xxhash-dev (2.1.0-1) ...
Setting up golang-github-spf13-pflag-dev (1.0.3-1) ...
Setting up golang-gopkg-tomb.v2-dev (0.0~git20161208.d5d1b58-3) ...
Setting up golang-github-bgentry-speakeasy-dev (0.1.0-1) ...
Setting up golang-github-jpillora-backoff-dev (1.0.0-1) ...
Setting up golang-github-davecgh-go-spew-dev (1.1.1-2) ...
Setting up autotools-dev (20180224.1) ...
Setting up libsasl2-dev (2.1.27+dfsg-1+b1) ...
Setting up golang-github-pascaldekloe-goe-dev (0.1.0-2) ...
Setting up golang-github-go-logfmt-logfmt-dev (0.3.0-1) ...
Setting up golang-github-ugorji-go-msgpack-dev (0.0~git20130605.792643-5) ...
Setting up golang-github-go-test-deep-dev (1.0.3-1) ...
Setting up bash-completion (1:2.8-6) ...
Setting up golang-github-hashicorp-go-immutable-radix-dev (1.1.0-1) ...
Setting up golang-github-boltdb-bolt-dev (1.3.1-6) ...
Setting up libncurses6:armhf (6.1+20191019-1) ...
Setting up libsigsegv2:armhf (2.12-2) ...
Setting up golang-github-xeipuuv-gojsonreference-dev (0.0~git20150808.0.e02fc20-2) ...
Setting up libmnl0:armhf (1.0.4-2) ...
Setting up golang-golang-x-sync-dev (0.0~git20190423.1122301-1) ...
Setting up autopoint (0.19.8.1-10) ...
Setting up golang-github-kr-pty-dev (1.1.6-1) ...
Setting up golang-github-opencontainers-selinux-dev (1.3.0-2) ...
Setting up pkg-config (0.29-6) ...
Setting up golang-github-hashicorp-hcl-dev (1.0.0-1) ...
Setting up golang-github-vishvananda-netns-dev (0.0~git20170707.0.86bef33-1) ...
Setting up golang-1.13-go (1.13.4-1+rpi1) ...
Setting up libxtables12:armhf (1.8.3-2) ...
Setting up golang-gocapability-dev (0.0+git20180916.d983527-1) ...
Setting up golang-glog-dev (0.0~git20160126.23def4e-3) ...
Setting up golang-github-julienschmidt-httprouter-dev (1.1-5) ...
Setting up golang-github-hashicorp-go-multierror-dev (1.0.0-1) ...
Setting up lsof (4.93.2+dfsg-1) ...
Setting up zlib1g-dev:armhf (1:1.2.11.dfsg-1) ...
Setting up golang-github-tv42-httpunix-dev (0.0~git20150427.b75d861-2) ...
Setting up golang-github-hashicorp-go-version-dev (1.2.0-1) ...
Setting up golang-gopkg-inf.v0-dev (0.9.0-3) ...
Setting up sensible-utils (0.0.12+nmu1) ...
Setting up libuchardet0:armhf (0.0.6-3) ...
Setting up golang-github-vishvananda-netlink-dev (1.0.0+git20181030.023a6da-1) ...
Setting up procps (2:3.3.15-2) ...
update-alternatives: using /usr/bin/w.procps to provide /usr/bin/w (w) in auto mode
Setting up golang-github-cyphar-filepath-securejoin-dev (0.2.2-1) ...
Setting up golang-github-modern-go-reflect2-dev (1.0.0-1) ...
Setting up libsub-override-perl (0.09-2) ...
Setting up golang-github-dgrijalva-jwt-go-dev (3.2.0-1) ...
Setting up golang-github-armon-go-radix-dev (1.0.0-1) ...
Setting up libprotobuf17:armhf (3.6.1.3-2+rpi1) ...
Setting up golang-github-datadog-datadog-go-dev (2.1.0-2) ...
Setting up libjs-jquery (3.3.1~dfsg-3) ...
Setting up golang-golang-x-xerrors-dev (0.0~git20190717.a985d34-1) ...
Setting up golang-procfs-dev (0.0.3-1) ...
Setting up golang-src (2:1.13~1+b11) ...
Setting up openssl (1.1.1d-2) ...
Setting up libbsd0:armhf (0.10.0-1) ...
Setting up libtinfo5:armhf (6.1+20191019-1) ...
Setting up libelf1:armhf (0.176-1.1) ...
Setting up golang-github-armon-circbuf-dev (0.0~git20150827.0.bbbad09-2) ...
Setting up golang-github-jeffail-gabs-dev (2.1.0-2) ...
Setting up libxml2:armhf (2.9.4+dfsg1-8) ...
Setting up golang-github-jefferai-jsonx-dev (1.0.1-2) ...
Setting up libsystemd-dev:armhf (243-8+rpi1) ...
Setting up golang-github-hashicorp-yamux-dev (0.0+git20190923.df201c7-1) ...
Setting up golang-github-hashicorp-go-rootcerts-dev (0.0~git20160503.0.6bb64b3-1) ...
Setting up golang-github-hashicorp-logutils-dev (0.0~git20150609.0.0dc08b1-1) ...
Setting up libfile-stripnondeterminism-perl (1.6.3-1) ...
Setting up golang-github-mattn-go-isatty-dev (0.0.8-2) ...
Setting up golang-github-hashicorp-go-reap-dev (0.0~git20160113.0.2d85522-3) ...
Setting up golang-github-digitalocean-godo-dev (1.1.0-1) ...
Setting up golang-github-hashicorp-go-memdb-dev (0.0~git20180224.1289e7ff-1) ...
Setting up libprotoc17:armhf (3.6.1.3-2+rpi1) ...
Setting up protobuf-compiler (3.6.1.3-2+rpi1) ...
Setting up libtool (2.4.6-11) ...
Setting up golang-go (2:1.13~1+b11) ...
Setting up golang-github-mattn-go-colorable-dev (0.0.9-3) ...
Setting up iproute2 (5.4.0-1) ...
Setting up golang-github-posener-complete-dev (1.1+git20180108.57878c9-3) ...
Setting up golang-github-docker-go-units-dev (0.4.0-1) ...
Setting up m4 (1.4.18-4) ...
Setting up golang-any (2:1.13~1+b11) ...
Setting up libprotobuf-dev:armhf (3.6.1.3-2+rpi1) ...
Setting up ca-certificates (20190110) ...
Updating certificates in /etc/ssl/certs...
128 added, 0 removed; done.
Setting up golang-goprotobuf-dev (1.3.2-2) ...
Setting up libjs-jquery-ui (1.12.1+dfsg-5) ...
Setting up golang-github-kr-text-dev (0.1.0-1) ...
Setting up golang-github-elazarl-go-bindata-assetfs-dev (1.0.0-1) ...
Setting up bsdmainutils (11.1.2) ...
update-alternatives: using /usr/bin/bsd-write to provide /usr/bin/write (write) in auto mode
update-alternatives: using /usr/bin/bsd-from to provide /usr/bin/from (from) in auto mode
Setting up libcroco3:armhf (0.6.13-1) ...
Setting up gogoprotobuf (1.2.1+git20190611.dadb6258-1) ...
Setting up autoconf (2.69-11) ...
Setting up dh-strip-nondeterminism (1.6.3-1) ...
Setting up dwz (0.13-4) ...
Setting up groff-base (1.22.4-3) ...
Setting up golang-github-prometheus-client-model-dev (0.0.2+git20171117.99fa1f4-1) ...
Setting up golang-github-docopt-docopt-go-dev (0.6.2+git20160216.0.784ddc5-1) ...
Setting up golang-github-hashicorp-go-checkpoint-dev (0.0~git20171009.1545e56-2) ...
Setting up automake (1:1.16.1-4) ...
update-alternatives: using /usr/bin/automake-1.16 to provide /usr/bin/automake (automake) in auto mode
Setting up golang-github-kr-pretty-dev (0.1.0-1) ...
Setting up gettext (0.19.8.1-10) ...
Setting up golang-github-peterbourgon-diskv-dev (3.0.0-1) ...
Setting up golang-github-fatih-color-dev (1.5.0-1) ...
Setting up golang-github-hashicorp-go-sockaddr-dev (0.0~git20170627.41949a1+ds-2) ...
Setting up golang-github-garyburd-redigo-dev (0.0~git20150901.0.d8dbe4d-2) ...
Setting up golang-protobuf-extensions-dev (1.0.1-1) ...
Setting up golang-gogoprotobuf-dev (1.2.1+git20190611.dadb6258-1) ...
Setting up golang-gopkg-check.v1-dev (0.0+git20180628.788fd78-1) ...
Setting up man-db (2.9.0-1) ...
Not building database; man-db/auto-update is not 'true'.
Setting up golang-golang-x-tools (1:0.0~git20191118.07fc4c7+ds-1) ...
Setting up golang-github-mitchellh-reflectwalk-dev (0.0~git20170726.63d60e9-4) ...
Setting up golang-github-denverdino-aliyungo-dev (0.0~git20180921.13fa8aa-2) ...
Setting up intltool-debian (0.35.0+20060710.5) ...
Setting up golang-gopkg-mgo.v2-dev (2016.08.01-6) ...
Setting up golang-github-mitchellh-cli-dev (1.0.0-1) ...
Setting up golang-github-hashicorp-hil-dev (0.0~git20160711.1e86c6b-1) ...
Setting up golang-github-gogo-googleapis-dev (1.2.0-1) ...
Setting up golang-gopkg-yaml.v2-dev (2.2.2-1) ...
Setting up golang-github-imdario-mergo-dev (0.3.5-1) ...
Setting up po-debconf (1.0.21) ...
Setting up golang-gomega-dev (1.0+git20160910.d59fa0a-1) ...
Setting up golang-github-mitchellh-copystructure-dev (0.0~git20161013.0.5af94ae-2) ...
Setting up golang-github-stretchr-testify-dev (1.4.0+ds-1) ...
Setting up golang-github-shirou-gopsutil-dev (2.18.06-1) ...
Setting up golang-github-alecthomas-units-dev (0.0~git20151022.0.2efee85-4) ...
Setting up golang-github-ghodss-yaml-dev (1.0.0-1) ...
Setting up golang-github-jmespath-go-jmespath-dev (0.2.2-3) ...
Setting up golang-github-hashicorp-go-hclog-dev (0.10.0-1) ...
Setting up golang-github-urfave-cli-dev (1.20.0-1) ...
Setting up golang-github-sirupsen-logrus-dev (1.4.2-1) ...
Setting up golang-ginkgo-dev (1.2.0+git20161006.acfa16a-1) ...
Setting up golang-gopkg-alecthomas-kingpin.v2-dev (2.2.6-1) ...
Setting up golang-github-xeipuuv-gojsonschema-dev (0.0~git20170210.0.6b67b3f-2) ...
Setting up golang-github-nytimes-gziphandler-dev (1.1.1-1) ...
Setting up golang-github-json-iterator-go-dev (1.1.4-1) ...
Setting up golang-github-hashicorp-go-retryablehttp-dev (0.6.4-1) ...
Setting up golang-github-aws-aws-sdk-go-dev (1.21.6+dfsg-2) ...
Setting up golang-github-opencontainers-specs-dev (1.0.1+git20190408.a1b50f6-1) ...
Setting up golang-github-syndtr-goleveldb-dev (0.0~git20170725.0.b89cc31-2) ...
Setting up golang-github-circonus-labs-circonus-gometrics-dev (2.3.1-2) ...
Setting up golang-github-gregjones-httpcache-dev (0.0~git20180305.9cad4c3-1) ...
Setting up golang-google-genproto-dev (0.0~git20190801.fa694d8-2) ...
Setting up dh-autoreconf (19) ...
Setting up golang-github-coreos-go-systemd-dev (20-1) ...
Setting up golang-github-opencontainers-runc-dev (1.0.0~rc9+dfsg1-1+rpi1) ...
Setting up golang-golang-x-text-dev (0.3.2-1) ...
Setting up debhelper (12.7.1) ...
Setting up golang-github-sap-go-hdb-dev (0.14.1-2) ...
Setting up golang-golang-x-net-dev (1:0.0+git20191112.2180aed+dfsg-1) ...
Setting up golang-github-vmware-govmomi-dev (0.15.0-1) ...
Setting up golang-golang-x-crypto-dev (1:0.0~git20190701.4def268-2) ...
Setting up golang-golang-x-oauth2-dev (0.0~git20190604.0f29369-2) ...
Setting up golang-golang-x-time-dev (0.0~git20161028.0.f51c127-2) ...
Setting up golang-github-opentracing-opentracing-go-dev (1.0.2-1) ...
Setting up dh-golang (1.43) ...
Setting up golang-github-gophercloud-gophercloud-dev (0.6.0-1) ...
Setting up golang-github-miekg-dns-dev (1.0.4+ds-1) ...
Setting up golang-github-coreos-pkg-dev (4-2) ...
Setting up golang-github-mwitkow-go-conntrack-dev (0.0~git20190716.2f06839-1) ...
Setting up golang-google-cloud-compute-metadata-dev (0.43.0-1) ...
Setting up golang-golang-x-tools-dev (1:0.0~git20191118.07fc4c7+ds-1) ...
Setting up golang-github-docker-go-connections-dev (0.4.0-1) ...
Setting up golang-github-packethost-packngo-dev (0.2.0-2) ...
Setting up golang-golang-x-oauth2-google-dev (0.0~git20190604.0f29369-2) ...
Setting up golang-gopkg-square-go-jose.v2-dev (2.3.1-1) ...
Setting up golang-github-azure-go-autorest-dev (10.15.5-1) ...
Setting up golang-github-googleapis-gnostic-dev (0.2.0-1) ...
Setting up golang-google-grpc-dev (1.22.1-1) ...
Setting up golang-github-ugorji-go-codec-dev (1.1.7-1) ...
Setting up golang-gopkg-vmihailenco-msgpack.v2-dev (3.3.3-1) ...
Setting up golang-go.opencensus-dev (0.22.0-1) ...
Setting up golang-github-hashicorp-mdns-dev (1.0.1-1) ...
Setting up golang-github-go-kit-kit-dev (0.6.0-2) ...
Setting up golang-github-hashicorp-go-msgpack-dev (0.5.5-1) ...
Setting up golang-github-hashicorp-net-rpc-msgpackrpc-dev (0.0~git20151116.0.a14192a-1) ...
Setting up golang-github-prometheus-common-dev (0.7.0-1) ...
Setting up golang-google-api-dev (0.7.0-2) ...
Setting up golang-github-prometheus-client-golang-dev (1.2.1-3) ...
Setting up golang-github-hashicorp-go-discover-dev (0.0+git20190905.34a6505-2) ...
Setting up golang-github-armon-go-metrics-dev (0.0~git20190430.ec5e00d-1) ...
Setting up golang-github-hashicorp-raft-dev (1.1.1-2) ...
Setting up golang-github-hashicorp-scada-client-dev (0.0~git20160601.0.6e89678-2) ...
Setting up golang-github-hashicorp-memberlist-dev (0.1.5-2) ...
Setting up golang-github-hashicorp-go-raftchunking-dev (0.6.2-2) ...
Setting up golang-github-hashicorp-raft-boltdb-dev (0.0~git20171010.6e5ba93-3) ...
Setting up golang-github-hashicorp-serf-dev (0.8.5~ds1-1) ...
Setting up sbuild-build-depends-consul-dummy (0.invalid.0) ...
Processing triggers for libc-bin (2.29-2+rpi1) ...
Processing triggers for ca-certificates (20190110) ...
Updating certificates in /etc/ssl/certs...
0 added, 0 removed; done.
Running hooks in /etc/ca-certificates/update.d...
done.

+------------------------------------------------------------------------------+
| Build environment                                                            |
+------------------------------------------------------------------------------+

Kernel: Linux 4.9.0-0.bpo.1-armmp armhf (armv7l)
Toolchain package versions: binutils_2.33.1-2+rpi1 dpkg-dev_1.19.7 g++-9_9.2.1-17+rpi1 gcc-9_9.2.1-17+rpi1 libc6-dev_2.29-2+rpi1 libstdc++-9-dev_9.2.1-17+rpi1 libstdc++6_9.2.1-17+rpi1 linux-libc-dev_5.2.17-1+rpi1+b2
Package versions: adduser_3.118 apt_1.8.4 autoconf_2.69-11 automake_1:1.16.1-4 autopoint_0.19.8.1-10 autotools-dev_20180224.1 base-files_11+rpi1 base-passwd_3.5.46 bash_5.0-5 bash-completion_1:2.8-6 binutils_2.33.1-2+rpi1 binutils-arm-linux-gnueabihf_2.33.1-2+rpi1 binutils-common_2.33.1-2+rpi1 bsdmainutils_11.1.2 bsdutils_1:2.34-0.1 build-essential_12.8 bzip2_1.0.8-2 ca-certificates_20190110 coreutils_8.30-3 cpp_4:9.2.1-3+rpi1 cpp-9_9.2.1-17+rpi1 dash_0.5.10.2-6 debconf_1.5.73 debhelper_12.7.1 debianutils_4.9 dh-autoreconf_19 dh-golang_1.43 dh-strip-nondeterminism_1.6.3-1 diffutils_1:3.7-3 dirmngr_2.2.17-3+b1 dpkg_1.19.7 dpkg-dev_1.19.7 dwz_0.13-4 e2fsprogs_1.45.4-1 fakeroot_1.24-1 fdisk_2.34-0.1 file_1:5.37-6 findutils_4.7.0-1 g++_4:9.2.1-3+rpi1 g++-9_9.2.1-17+rpi1 gcc_4:9.2.1-3+rpi1 gcc-4.9-base_4.9.4-2+rpi1+b19 gcc-5-base_5.5.0-8 gcc-6-base_6.5.0-1+rpi1+b3 gcc-9_9.2.1-17+rpi1 gcc-9-base_9.2.1-17+rpi1 gettext_0.19.8.1-10 gettext-base_0.19.8.1-10 gnupg_2.2.17-3 gnupg-l10n_2.2.17-3 gnupg-utils_2.2.17-3+b1 gogoprotobuf_1.2.1+git20190611.dadb6258-1 golang-1.13-go_1.13.4-1+rpi1 golang-1.13-src_1.13.4-1+rpi1 golang-any_2:1.13~1+b11 golang-dbus-dev_5.0.2-1 golang-ginkgo-dev_1.2.0+git20161006.acfa16a-1 golang-github-alecthomas-units-dev_0.0~git20151022.0.2efee85-4 golang-github-armon-circbuf-dev_0.0~git20150827.0.bbbad09-2 golang-github-armon-go-metrics-dev_0.0~git20190430.ec5e00d-1 golang-github-armon-go-radix-dev_1.0.0-1 golang-github-asaskevich-govalidator-dev_9+git20180720.0.f9ffefc3-1 golang-github-aws-aws-sdk-go-dev_1.21.6+dfsg-2 golang-github-azure-go-autorest-dev_10.15.5-1 golang-github-beorn7-perks-dev_0.0~git20160804.0.4c0e845-1 golang-github-bgentry-speakeasy-dev_0.1.0-1 golang-github-boltdb-bolt-dev_1.3.1-6 golang-github-bradfitz-gomemcache-dev_0.0~git20141109-3 golang-github-cespare-xxhash-dev_2.1.0-1 golang-github-circonus-labs-circonus-gometrics-dev_2.3.1-2 golang-github-circonus-labs-circonusllhist-dev_0.0~git20160526.0.d724266-2 golang-github-coreos-go-systemd-dev_20-1 golang-github-coreos-pkg-dev_4-2 golang-github-cyphar-filepath-securejoin-dev_0.2.2-1 golang-github-datadog-datadog-go-dev_2.1.0-2 golang-github-davecgh-go-spew-dev_1.1.1-2 golang-github-denverdino-aliyungo-dev_0.0~git20180921.13fa8aa-2 golang-github-dgrijalva-jwt-go-dev_3.2.0-1 golang-github-dgrijalva-jwt-go-v3-dev_3.2.0-2 golang-github-digitalocean-godo-dev_1.1.0-1 golang-github-dimchansky-utfbom-dev_0.0~git20170328.6c6132f-1 golang-github-docker-go-connections-dev_0.4.0-1 golang-github-docker-go-units-dev_0.4.0-1 golang-github-docopt-docopt-go-dev_0.6.2+git20160216.0.784ddc5-1 golang-github-elazarl-go-bindata-assetfs-dev_1.0.0-1 golang-github-fatih-color-dev_1.5.0-1 golang-github-garyburd-redigo-dev_0.0~git20150901.0.d8dbe4d-2 golang-github-ghodss-yaml-dev_1.0.0-1 golang-github-go-ini-ini-dev_1.32.0-2 golang-github-go-kit-kit-dev_0.6.0-2 golang-github-go-logfmt-logfmt-dev_0.3.0-1 golang-github-go-stack-stack-dev_1.5.2-2 golang-github-go-test-deep-dev_1.0.3-1 golang-github-gogo-googleapis-dev_1.2.0-1 golang-github-gogo-protobuf-dev_1.2.1+git20190611.dadb6258-1 golang-github-golang-mock-dev_1.3.1-2 golang-github-golang-snappy-dev_0.0+git20160529.d9eb7a3-3 golang-github-google-btree-dev_1.0.0-1 golang-github-google-go-cmp-dev_0.3.1-1 golang-github-google-go-querystring-dev_1.0.0-1 golang-github-google-gofuzz-dev_0.0~git20170612.24818f7-1 golang-github-googleapis-gnostic-dev_0.2.0-1 golang-github-gophercloud-gophercloud-dev_0.6.0-1 golang-github-gregjones-httpcache-dev_0.0~git20180305.9cad4c3-1 
golang-github-hashicorp-errwrap-dev_1.0.0-1 golang-github-hashicorp-go-checkpoint-dev_0.0~git20171009.1545e56-2 golang-github-hashicorp-go-cleanhttp-dev_0.5.1-1 golang-github-hashicorp-go-discover-dev_0.0+git20190905.34a6505-2 golang-github-hashicorp-go-hclog-dev_0.10.0-1 golang-github-hashicorp-go-immutable-radix-dev_1.1.0-1 golang-github-hashicorp-go-memdb-dev_0.0~git20180224.1289e7ff-1 golang-github-hashicorp-go-msgpack-dev_0.5.5-1 golang-github-hashicorp-go-multierror-dev_1.0.0-1 golang-github-hashicorp-go-raftchunking-dev_0.6.2-2 golang-github-hashicorp-go-reap-dev_0.0~git20160113.0.2d85522-3 golang-github-hashicorp-go-retryablehttp-dev_0.6.4-1 golang-github-hashicorp-go-rootcerts-dev_0.0~git20160503.0.6bb64b3-1 golang-github-hashicorp-go-sockaddr-dev_0.0~git20170627.41949a1+ds-2 golang-github-hashicorp-go-syslog-dev_0.0~git20150218.0.42a2b57-1 golang-github-hashicorp-go-uuid-dev_1.0.1-1 golang-github-hashicorp-go-version-dev_1.2.0-1 golang-github-hashicorp-golang-lru-dev_0.5.0-1 golang-github-hashicorp-hcl-dev_1.0.0-1 golang-github-hashicorp-hil-dev_0.0~git20160711.1e86c6b-1 golang-github-hashicorp-logutils-dev_0.0~git20150609.0.0dc08b1-1 golang-github-hashicorp-mdns-dev_1.0.1-1 golang-github-hashicorp-memberlist-dev_0.1.5-2 golang-github-hashicorp-net-rpc-msgpackrpc-dev_0.0~git20151116.0.a14192a-1 golang-github-hashicorp-raft-boltdb-dev_0.0~git20171010.6e5ba93-3 golang-github-hashicorp-raft-dev_1.1.1-2 golang-github-hashicorp-scada-client-dev_0.0~git20160601.0.6e89678-2 golang-github-hashicorp-serf-dev_0.8.5~ds1-1 golang-github-hashicorp-yamux-dev_0.0+git20190923.df201c7-1 golang-github-imdario-mergo-dev_0.3.5-1 golang-github-inconshreveable-muxado-dev_0.0~git20140312.0.f693c7e-2 golang-github-jeffail-gabs-dev_2.1.0-2 golang-github-jefferai-jsonx-dev_1.0.1-2 golang-github-jmespath-go-jmespath-dev_0.2.2-3 golang-github-jpillora-backoff-dev_1.0.0-1 golang-github-json-iterator-go-dev_1.1.4-1 golang-github-julienschmidt-httprouter-dev_1.1-5 golang-github-kr-pretty-dev_0.1.0-1 golang-github-kr-pty-dev_1.1.6-1 golang-github-kr-text-dev_0.1.0-1 golang-github-mattn-go-colorable-dev_0.0.9-3 golang-github-mattn-go-isatty-dev_0.0.8-2 golang-github-miekg-dns-dev_1.0.4+ds-1 golang-github-mitchellh-cli-dev_1.0.0-1 golang-github-mitchellh-copystructure-dev_0.0~git20161013.0.5af94ae-2 golang-github-mitchellh-go-homedir-dev_1.1.0-1 golang-github-mitchellh-go-testing-interface-dev_1.0.0-1 golang-github-mitchellh-hashstructure-dev_1.0.0-1 golang-github-mitchellh-mapstructure-dev_1.1.2-1 golang-github-mitchellh-reflectwalk-dev_0.0~git20170726.63d60e9-4 golang-github-modern-go-concurrent-dev_1.0.3-1 golang-github-modern-go-reflect2-dev_1.0.0-1 golang-github-mwitkow-go-conntrack-dev_0.0~git20190716.2f06839-1 golang-github-nytimes-gziphandler-dev_1.1.1-1 golang-github-opencontainers-runc-dev_1.0.0~rc9+dfsg1-1+rpi1 golang-github-opencontainers-selinux-dev_1.3.0-2 golang-github-opencontainers-specs-dev_1.0.1+git20190408.a1b50f6-1 golang-github-opentracing-opentracing-go-dev_1.0.2-1 golang-github-packethost-packngo-dev_0.2.0-2 golang-github-pascaldekloe-goe-dev_0.1.0-2 golang-github-peterbourgon-diskv-dev_3.0.0-1 golang-github-pkg-errors-dev_0.8.1-1 golang-github-pmezard-go-difflib-dev_1.0.0-2 golang-github-posener-complete-dev_1.1+git20180108.57878c9-3 golang-github-prometheus-client-golang-dev_1.2.1-3 golang-github-prometheus-client-model-dev_0.0.2+git20171117.99fa1f4-1 golang-github-prometheus-common-dev_0.7.0-1 golang-github-ryanuber-columnize-dev_2.1.1-1 golang-github-ryanuber-go-glob-dev_1.0.0-2 
golang-github-sap-go-hdb-dev_0.14.1-2 golang-github-seccomp-libseccomp-golang-dev_0.9.1-1 golang-github-shirou-gopsutil-dev_2.18.06-1 golang-github-sirupsen-logrus-dev_1.4.2-1 golang-github-spf13-pflag-dev_1.0.3-1 golang-github-stretchr-objx-dev_0.1.1+git20180825.ef50b0d-1 golang-github-stretchr-testify-dev_1.4.0+ds-1 golang-github-syndtr-goleveldb-dev_0.0~git20170725.0.b89cc31-2 golang-github-tent-http-link-go-dev_0.0~git20130702.0.ac974c6-6 golang-github-tv42-httpunix-dev_0.0~git20150427.b75d861-2 golang-github-ugorji-go-codec-dev_1.1.7-1 golang-github-ugorji-go-msgpack-dev_0.0~git20130605.792643-5 golang-github-urfave-cli-dev_1.20.0-1 golang-github-vishvananda-netlink-dev_1.0.0+git20181030.023a6da-1 golang-github-vishvananda-netns-dev_0.0~git20170707.0.86bef33-1 golang-github-vmware-govmomi-dev_0.15.0-1 golang-github-xeipuuv-gojsonpointer-dev_0.0~git20151027.0.e0fe6f6-2 golang-github-xeipuuv-gojsonreference-dev_0.0~git20150808.0.e02fc20-2 golang-github-xeipuuv-gojsonschema-dev_0.0~git20170210.0.6b67b3f-2 golang-glog-dev_0.0~git20160126.23def4e-3 golang-go_2:1.13~1+b11 golang-go.opencensus-dev_0.22.0-1 golang-gocapability-dev_0.0+git20180916.d983527-1 golang-gogoprotobuf-dev_1.2.1+git20190611.dadb6258-1 golang-golang-x-crypto-dev_1:0.0~git20190701.4def268-2 golang-golang-x-net-dev_1:0.0+git20191112.2180aed+dfsg-1 golang-golang-x-oauth2-dev_0.0~git20190604.0f29369-2 golang-golang-x-oauth2-google-dev_0.0~git20190604.0f29369-2 golang-golang-x-sync-dev_0.0~git20190423.1122301-1 golang-golang-x-sys-dev_0.0~git20190726.fc99dfb-1 golang-golang-x-text-dev_0.3.2-1 golang-golang-x-time-dev_0.0~git20161028.0.f51c127-2 golang-golang-x-tools_1:0.0~git20191118.07fc4c7+ds-1 golang-golang-x-tools-dev_1:0.0~git20191118.07fc4c7+ds-1 golang-golang-x-xerrors-dev_0.0~git20190717.a985d34-1 golang-gomega-dev_1.0+git20160910.d59fa0a-1 golang-google-api-dev_0.7.0-2 golang-google-cloud-compute-metadata-dev_0.43.0-1 golang-google-genproto-dev_0.0~git20190801.fa694d8-2 golang-google-grpc-dev_1.22.1-1 golang-gopkg-alecthomas-kingpin.v2-dev_2.2.6-1 golang-gopkg-check.v1-dev_0.0+git20180628.788fd78-1 golang-gopkg-inf.v0-dev_0.9.0-3 golang-gopkg-mgo.v2-dev_2016.08.01-6 golang-gopkg-square-go-jose.v2-dev_2.3.1-1 golang-gopkg-tomb.v2-dev_0.0~git20161208.d5d1b58-3 golang-gopkg-vmihailenco-msgpack.v2-dev_3.3.3-1 golang-gopkg-yaml.v2-dev_2.2.2-1 golang-goprotobuf-dev_1.3.2-2 golang-procfs-dev_0.0.3-1 golang-protobuf-extensions-dev_1.0.1-1 golang-src_2:1.13~1+b11 gpg_2.2.17-3+b1 gpg-agent_2.2.17-3+b1 gpg-wks-client_2.2.17-3+b1 gpg-wks-server_2.2.17-3+b1 gpgconf_2.2.17-3+b1 gpgsm_2.2.17-3+b1 gpgv_2.2.17-3+b1 grep_3.3-1 groff-base_1.22.4-3 gzip_1.9-3 hostname_3.23 init-system-helpers_1.57 intltool-debian_0.35.0+20060710.5 iproute2_5.4.0-1 iputils-ping_3:20190709-2 libacl1_2.2.53-5 libapt-pkg5.0_1.8.4 libarchive-zip-perl_1.67-1 libasan5_9.2.1-17+rpi1 libassuan0_2.5.3-7 libatomic1_9.2.1-17+rpi1 libattr1_1:2.4.48-5 libaudit-common_1:2.8.5-2 libaudit1_1:2.8.5-2 libbinutils_2.33.1-2+rpi1 libblkid1_2.34-0.1 libbsd0_0.10.0-1 libbz2-1.0_1.0.8-2 libc-bin_2.29-2+rpi1 libc-dev-bin_2.29-2+rpi1 libc6_2.29-2+rpi1 libc6-dev_2.29-2+rpi1 libcap-ng0_0.7.9-2.1 libcap2_1:2.27-1 libcap2-bin_1:2.27-1 libcc1-0_9.2.1-17+rpi1 libcom-err2_1.45.4-1 libcroco3_0.6.13-1 libdb5.3_5.3.28+dfsg1-0.6 libdebconfclient0_0.250 libdebhelper-perl_12.7.1 libdpkg-perl_1.19.7 libelf1_0.176-1.1 libext2fs2_1.45.4-1 libfakeroot_1.24-1 libfdisk1_2.34-0.1 libffi6_3.2.1-9 libfile-stripnondeterminism-perl_1.6.3-1 libgcc-9-dev_9.2.1-17+rpi1 libgcc1_1:9.2.1-17+rpi1 
libgcrypt20_1.8.5-3 libgdbm-compat4_1.18.1-5 libgdbm6_1.18.1-5 libglib2.0-0_2.62.3-2 libgmp10_2:6.1.2+dfsg-4 libgnutls30_3.6.10-4 libgomp1_9.2.1-17+rpi1 libgpg-error0_1.36-7 libhogweed5_3.5.1+really3.5.1-2 libicu63_63.2-2 libidn2-0_2.2.0-2 libisl19_0.20-2 libisl21_0.21-2 libjs-jquery_3.3.1~dfsg-3 libjs-jquery-ui_1.12.1+dfsg-5 libksba8_1.3.5-2 libldap-2.4-2_2.4.48+dfsg-1+b2 libldap-common_2.4.48+dfsg-1 liblz4-1_1.9.2-1 liblzma5_5.2.4-1 libmagic-mgc_1:5.37-6 libmagic1_1:5.37-6 libmnl0_1.0.4-2 libmount1_2.34-0.1 libmpc3_1.1.0-1 libmpfr6_4.0.2-1 libncurses6_6.1+20191019-1 libncursesw6_6.1+20191019-1 libnettle7_3.5.1+really3.5.1-2 libnpth0_1.6-1 libp11-kit0_0.23.18.1-2 libpam-cap_1:2.27-1 libpam-modules_1.3.1-5 libpam-modules-bin_1.3.1-5 libpam-runtime_1.3.1-5 libpam0g_1.3.1-5 libpcre2-8-0_10.32-5 libpcre3_2:8.39-12 libperl5.30_5.30.0-9 libpipeline1_1.5.1-2 libprocps7_2:3.3.15-2 libprotobuf-dev_3.6.1.3-2+rpi1 libprotobuf-lite17_3.6.1.3-2+rpi1 libprotobuf17_3.6.1.3-2+rpi1 libprotoc17_3.6.1.3-2+rpi1 libreadline7_7.0-5 libreadline8_8.0-3 libsasl2-2_2.1.27+dfsg-1+b1 libsasl2-dev_2.1.27+dfsg-1+b1 libsasl2-modules-db_2.1.27+dfsg-1+b1 libseccomp-dev_2.4.2-2+rpi1 libseccomp2_2.4.2-2+rpi1 libselinux1_2.9-2 libsemanage-common_2.9-3 libsemanage1_2.9-3 libsepol1_2.9-2 libsigsegv2_2.12-2 libsmartcols1_2.34-0.1 libsqlite3-0_3.30.1-1 libss2_1.45.4-1 libssl1.1_1.1.1d-2 libstdc++-9-dev_9.2.1-17+rpi1 libstdc++6_9.2.1-17+rpi1 libsub-override-perl_0.09-2 libsystemd-dev_243-8+rpi1 libsystemd0_243-8+rpi1 libtasn1-6_4.14-3 libtinfo5_6.1+20191019-1 libtinfo6_6.1+20191019-1 libtool_2.4.6-11 libubsan1_9.2.1-17+rpi1 libuchardet0_0.0.6-3 libudev1_242-7+rpi1 libunistring2_0.9.10-2 libuuid1_2.34-0.1 libxml2_2.9.4+dfsg1-8 libxtables12_1.8.3-2 libzstd1_1.4.3+dfsg-1+rpi1 linux-libc-dev_5.2.17-1+rpi1+b2 login_1:4.7-2 logsave_1.45.4-1 lsb-base_11.1.0+rpi1 lsof_4.93.2+dfsg-1 m4_1.4.18-4 make_4.2.1-1.2 man-db_2.9.0-1 mawk_1.3.3-17 mockery_0.0~git20181123.e78b021-2 mount_2.34-0.1 ncurses-base_6.1+20191019-1 ncurses-bin_6.1+20191019-1 netbase_5.6 openssl_1.1.1d-2 passwd_1:4.7-2 patch_2.7.6-6 perl_5.30.0-9 perl-base_5.30.0-9 perl-modules-5.30_5.30.0-9 pinentry-curses_1.1.0-3 pkg-config_0.29-6 po-debconf_1.0.21 procps_2:3.3.15-2 protobuf-compiler_3.6.1.3-2+rpi1 raspbian-archive-keyring_20120528.2 readline-common_8.0-3 sbuild-build-depends-consul-dummy_0.invalid.0 sbuild-build-depends-core-dummy_0.invalid.0 sed_4.7-1 sensible-utils_0.0.12+nmu1 sysvinit-utils_2.96-1 tar_1.30+dfsg-6 tzdata_2019c-3 util-linux_2.34-0.1 xz-utils_5.2.4-1 zlib1g_1:1.2.11.dfsg-1 zlib1g-dev_1:1.2.11.dfsg-1

+------------------------------------------------------------------------------+
| Build                                                                        |
+------------------------------------------------------------------------------+


Unpack source
-------------

gpgv: unknown type of key resource 'trustedkeys.kbx'
gpgv: keyblock resource '/sbuild-nonexistent/.gnupg/trustedkeys.kbx': General error
gpgv: Signature made Sun Dec  1 00:03:51 2019 UTC
gpgv:                using RSA key 50BC7CF939D20C272A6B065652B6BBD953968D1B
gpgv: Can't check signature: No public key
dpkg-source: warning: failed to verify signature on ./consul_1.5.2+dfsg1-6.dsc
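
Note: this verification failure is expected inside the buildd chroot. HOME points at /sbuild-nonexistent, so gpgv finds no trustedkeys.kbx, and trust comes from the archive that served the source instead. Outside the chroot the .dsc could be checked against an explicit keyring, e.g. with the following illustrative command (assuming the debian-keyring package is installed and contains the signer's key):

    gpgv --keyring /usr/share/keyrings/debian-keyring.gpg consul_1.5.2+dfsg1-6.dsc
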
dpkg-source: info: extracting consul in /<<BUILDDIR>>/consul-1.5.2+dfsg1
dpkg-source: info: unpacking consul_1.5.2+dfsg1.orig.tar.xz
dpkg-source: info: unpacking consul_1.5.2+dfsg1-6.debian.tar.xz
dpkg-source: info: using patch list from debian/patches/series
dpkg-source: info: applying provider-no-k8s.patch
dpkg-source: info: applying t-skip-unreliable-tests.patch
dpkg-source: info: applying vendor-envoyproxy.patch
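
dpkg-source takes the patch list from debian/patches/series; judging from the three patches applied above, that file presumably contains exactly:

    provider-no-k8s.patch
    t-skip-unreliable-tests.patch
    vendor-envoyproxy.patch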

Check disc space
----------------

Sufficient free space for build

User Environment
----------------

APT_CONFIG=/var/lib/sbuild/apt.conf
DEB_BUILD_OPTIONS=parallel=4
HOME=/sbuild-nonexistent
LC_ALL=POSIX
LOGNAME=buildd
PATH=/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/usr/games
SCHROOT_ALIAS_NAME=bullseye-staging-armhf-sbuild
SCHROOT_CHROOT_NAME=bullseye-staging-armhf-sbuild
SCHROOT_COMMAND=env
SCHROOT_GID=109
SCHROOT_GROUP=buildd
SCHROOT_SESSION_ID=bullseye-staging-armhf-sbuild-8476812f-5823-4527-9126-cc5f34c49e67
SCHROOT_UID=104
SCHROOT_USER=buildd
SHELL=/bin/sh
TERM=linux
USER=buildd
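
DEB_BUILD_OPTIONS=parallel=4 above is what lets the upcoming dh run use four build jobs. Packages that need the number themselves usually extract it with the stock Debian Policy snippet below; this is shown purely as an illustration and is not necessarily present in consul's debian/rules:

  ifneq (,$(filter parallel=%,$(DEB_BUILD_OPTIONS)))
    NUMJOBS = $(patsubst parallel=%,%,$(filter parallel=%,$(DEB_BUILD_OPTIONS)))
    MAKEFLAGS += -j$(NUMJOBS)
  endif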

dpkg-buildpackage
-----------------

dpkg-buildpackage: info: source package consul
dpkg-buildpackage: info: source version 1.5.2+dfsg1-6
dpkg-buildpackage: info: source distribution unstable
 dpkg-source --before-build .
dpkg-buildpackage: info: host architecture armhf
 fakeroot debian/rules clean
dh clean --buildsystem=golang --with=golang,bash-completion --builddirectory=_build
   dh_auto_clean -O--buildsystem=golang -O--builddirectory=_build
   dh_autoreconf_clean -O--buildsystem=golang -O--builddirectory=_build
   debian/rules override_dh_clean
make[1]: Entering directory '/<<BUILDDIR>>/consul-1.5.2+dfsg1'
dh_clean
## Remove Files-Excluded (when built from checkout or non-DFSG tarball):
rm -f -rv `perl -0nE 'say $1 if m{^Files\-Excluded\:\s*(.*?)(?:\n\n|Files:|Comment:)}sm;' debian/copyright`
find vendor -type d -empty -delete -print
vendor/github.com/Azure
vendor/github.com/DataDog
vendor/github.com/Jeffail
vendor/github.com/Microsoft
vendor/github.com/NYTimes
vendor/github.com/SAP
vendor/github.com/SermoDigital
vendor/github.com/StackExchange
vendor/github.com/armon
vendor/github.com/asaskevich
vendor/github.com/aws
vendor/github.com/beorn7
vendor/github.com/bgentry
vendor/github.com/boltdb
vendor/github.com/circonus-labs
vendor/github.com/davecgh
vendor/github.com/denisenkom
vendor/github.com/denverdino
vendor/github.com/dgrijalva
vendor/github.com/digitalocean
vendor/github.com/docker
vendor/github.com/elazarl
vendor/github.com/fatih
vendor/github.com/ghodss
vendor/github.com/go-ini
vendor/github.com/go-ole
vendor/github.com/go-sql-driver
vendor/github.com/gocql
vendor/github.com/gogo
vendor/github.com/golang
vendor/github.com/google
vendor/github.com/googleapis
vendor/github.com/gophercloud
vendor/github.com/gregjones
vendor/github.com/hailocab
vendor/github.com/imdario
vendor/github.com/jefferai
vendor/github.com/jmespath
vendor/github.com/joyent
vendor/github.com/json-iterator
vendor/github.com/keybase
vendor/github.com/kr
vendor/github.com/lib
vendor/github.com/mattn
vendor/github.com/matttproud
vendor/github.com/miekg
vendor/github.com/mitchellh
vendor/github.com/modern-go
vendor/github.com/nicolai86
vendor/github.com/packethost
vendor/github.com/pascaldekloe
vendor/github.com/patrickmn
vendor/github.com/peterbourgon
vendor/github.com/pkg
vendor/github.com/pmezard
vendor/github.com/posener
vendor/github.com/prometheus
vendor/github.com/renier
vendor/github.com/ryanuber
vendor/github.com/shirou
vendor/github.com/sirupsen
vendor/github.com/softlayer
vendor/github.com/spf13
vendor/github.com/stretchr
vendor/github.com/tv42
vendor/github.com/vmware
vendor/golang.org
vendor/gopkg.in/square
vendor/gopkg.in
rm -f -r test/integration
make[1]: Leaving directory '/<<BUILDDIR>>/consul-1.5.2+dfsg1'
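
From the commands echoed in this clean run, the override_dh_clean target in debian/rules can be reconstructed roughly as follows (an approximation inferred from the log, with make's $$ escaping restored, not a verbatim copy of the packaging):

override_dh_clean:
	dh_clean
	## Remove Files-Excluded (when built from checkout or non-DFSG tarball):
	rm -f -rv `perl -0nE 'say $$1 if m{^Files\-Excluded\:\s*(.*?)(?:\n\n|Files:|Comment:)}sm;' debian/copyright`
	find vendor -type d -empty -delete -print
	rm -f -r test/integration
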
 debian/rules build-arch
dh build-arch --buildsystem=golang --with=golang,bash-completion --builddirectory=_build
   dh_update_autotools_config -a -O--buildsystem=golang -O--builddirectory=_build
   dh_autoreconf -a -O--buildsystem=golang -O--builddirectory=_build
   debian/rules override_dh_auto_configure
make[1]: Entering directory '/<<BUILDDIR>>/consul-1.5.2+dfsg1'
dh_auto_configure
mkdir -v -p _build/src/github.com/keybase/
mkdir: created directory '_build/src/github.com/keybase/'
ln -sv /usr/share/gocode/src/golang.org/x/crypto  _build/src/github.com/keybase/go-crypto
'_build/src/github.com/keybase/go-crypto' -> '/usr/share/gocode/src/golang.org/x/crypto'
mkdir -v -p _build/src/github.com/SermoDigital/
mkdir: created directory '_build/src/github.com/SermoDigital/'
ln -sv /usr/share/gocode/src/gopkg.in/square/go-jose.v1  _build/src/github.com/SermoDigital/jose
'_build/src/github.com/SermoDigital/jose' -> '/usr/share/gocode/src/gopkg.in/square/go-jose.v1'
make[1]: Leaving directory '/<<BUILDDIR>>/consul-1.5.2+dfsg1'
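
The configure override that created those symlinks can likewise be reconstructed from the log; it points two vendored import paths at the closest Debian-packaged equivalents under /usr/share/gocode (again an approximation, not the verbatim rules file):

override_dh_auto_configure:
	dh_auto_configure
	mkdir -v -p _build/src/github.com/keybase/
	ln -sv /usr/share/gocode/src/golang.org/x/crypto _build/src/github.com/keybase/go-crypto
	mkdir -v -p _build/src/github.com/SermoDigital/
	ln -sv /usr/share/gocode/src/gopkg.in/square/go-jose.v1 _build/src/github.com/SermoDigital/jose
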
   debian/rules override_dh_auto_build
make[1]: Entering directory '/<<BUILDDIR>>/consul-1.5.2+dfsg1'
export GOPATH=/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build \
        && /usr/bin/make -C _build/src/github.com/hashicorp/consul --makefile=/<<BUILDDIR>>/consul-1.5.2+dfsg1/GNUmakefile proto
make[2]: Entering directory '/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul'
protoc agent/connect/ca/plugin/*.proto --gofast_out=plugins=grpc:../../..
bash: git: command not found
bash: git: command not found
failed to initialize build cache at /sbuild-nonexistent/.cache/go-build: mkdir /sbuild-nonexistent: permission denied
bash: git: command not found
bash: git: command not found
bash: git: command not found
bash: git: command not found
make[2]: Leaving directory '/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul'
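
The "git: command not found" and build-cache errors above come from the sbuild environment rather than from the source: git is not installed in the chroot, and with HOME=/sbuild-nonexistent Go cannot create its default cache directory. The auxiliary proto target finishes anyway and the build continues with dh_auto_build below. If the cache error ever needed to be avoided, one common workaround is to point GOCACHE at a writable directory inside the build tree, for example with a line like the following in debian/rules (a hypothetical addition, not something this package is shown to do):

export GOCACHE := $(CURDIR)/_build/go-build
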
dh_auto_build -v
	cd _build && go generate -v github.com/hashicorp/consul github.com/hashicorp/consul/acl github.com/hashicorp/consul/agent github.com/hashicorp/consul/agent/ae github.com/hashicorp/consul/agent/cache github.com/hashicorp/consul/agent/cache-types github.com/hashicorp/consul/agent/checks github.com/hashicorp/consul/agent/config github.com/hashicorp/consul/agent/connect github.com/hashicorp/consul/agent/connect/ca github.com/hashicorp/consul/agent/connect/ca/plugin github.com/hashicorp/consul/agent/consul github.com/hashicorp/consul/agent/consul/authmethod github.com/hashicorp/consul/agent/consul/authmethod/kubeauth github.com/hashicorp/consul/agent/consul/authmethod/testauth github.com/hashicorp/consul/agent/consul/autopilot github.com/hashicorp/consul/agent/consul/fsm github.com/hashicorp/consul/agent/consul/prepared_query github.com/hashicorp/consul/agent/consul/state github.com/hashicorp/consul/agent/debug github.com/hashicorp/consul/agent/exec github.com/hashicorp/consul/agent/local github.com/hashicorp/consul/agent/metadata github.com/hashicorp/consul/agent/mock github.com/hashicorp/consul/agent/pool github.com/hashicorp/consul/agent/proxycfg github.com/hashicorp/consul/agent/proxyprocess github.com/hashicorp/consul/agent/router github.com/hashicorp/consul/agent/structs github.com/hashicorp/consul/agent/systemd github.com/hashicorp/consul/agent/token github.com/hashicorp/consul/agent/xds github.com/hashicorp/consul/api github.com/hashicorp/consul/api/watch github.com/hashicorp/consul/command github.com/hashicorp/consul/command/acl github.com/hashicorp/consul/command/acl/agenttokens github.com/hashicorp/consul/command/acl/authmethod github.com/hashicorp/consul/command/acl/authmethod/create github.com/hashicorp/consul/command/acl/authmethod/delete github.com/hashicorp/consul/command/acl/authmethod/list github.com/hashicorp/consul/command/acl/authmethod/read github.com/hashicorp/consul/command/acl/authmethod/update github.com/hashicorp/consul/command/acl/bindingrule github.com/hashicorp/consul/command/acl/bindingrule/create github.com/hashicorp/consul/command/acl/bindingrule/delete github.com/hashicorp/consul/command/acl/bindingrule/list github.com/hashicorp/consul/command/acl/bindingrule/read github.com/hashicorp/consul/command/acl/bindingrule/update github.com/hashicorp/consul/command/acl/bootstrap github.com/hashicorp/consul/command/acl/policy github.com/hashicorp/consul/command/acl/policy/create github.com/hashicorp/consul/command/acl/policy/delete github.com/hashicorp/consul/command/acl/policy/list github.com/hashicorp/consul/command/acl/policy/read github.com/hashicorp/consul/command/acl/policy/update github.com/hashicorp/consul/command/acl/role github.com/hashicorp/consul/command/acl/role/create github.com/hashicorp/consul/command/acl/role/delete github.com/hashicorp/consul/command/acl/role/list github.com/hashicorp/consul/command/acl/role/read github.com/hashicorp/consul/command/acl/role/update github.com/hashicorp/consul/command/acl/rules github.com/hashicorp/consul/command/acl/token github.com/hashicorp/consul/command/acl/token/clone github.com/hashicorp/consul/command/acl/token/create github.com/hashicorp/consul/command/acl/token/delete github.com/hashicorp/consul/command/acl/token/list github.com/hashicorp/consul/command/acl/token/read github.com/hashicorp/consul/command/acl/token/update github.com/hashicorp/consul/command/agent github.com/hashicorp/consul/command/catalog github.com/hashicorp/consul/command/catalog/list/dc github.com/hashicorp/consul/command/catalog/list/nodes 
github.com/hashicorp/consul/command/catalog/list/services github.com/hashicorp/consul/command/config github.com/hashicorp/consul/command/config/delete github.com/hashicorp/consul/command/config/list github.com/hashicorp/consul/command/config/read github.com/hashicorp/consul/command/config/write github.com/hashicorp/consul/command/connect github.com/hashicorp/consul/command/connect/ca github.com/hashicorp/consul/command/connect/ca/get github.com/hashicorp/consul/command/connect/ca/set github.com/hashicorp/consul/command/connect/envoy github.com/hashicorp/consul/command/connect/envoy/pipe-bootstrap github.com/hashicorp/consul/command/connect/proxy github.com/hashicorp/consul/command/debug github.com/hashicorp/consul/command/event github.com/hashicorp/consul/command/exec github.com/hashicorp/consul/command/flags github.com/hashicorp/consul/command/forceleave github.com/hashicorp/consul/command/helpers github.com/hashicorp/consul/command/info github.com/hashicorp/consul/command/intention github.com/hashicorp/consul/command/intention/check github.com/hashicorp/consul/command/intention/create github.com/hashicorp/consul/command/intention/delete github.com/hashicorp/consul/command/intention/finder github.com/hashicorp/consul/command/intention/get github.com/hashicorp/consul/command/intention/match github.com/hashicorp/consul/command/join github.com/hashicorp/consul/command/keygen github.com/hashicorp/consul/command/keyring github.com/hashicorp/consul/command/kv github.com/hashicorp/consul/command/kv/del github.com/hashicorp/consul/command/kv/exp github.com/hashicorp/consul/command/kv/get github.com/hashicorp/consul/command/kv/imp github.com/hashicorp/consul/command/kv/impexp github.com/hashicorp/consul/command/kv/put github.com/hashicorp/consul/command/leave github.com/hashicorp/consul/command/lock github.com/hashicorp/consul/command/login github.com/hashicorp/consul/command/logout github.com/hashicorp/consul/command/maint github.com/hashicorp/consul/command/members github.com/hashicorp/consul/command/monitor github.com/hashicorp/consul/command/operator github.com/hashicorp/consul/command/operator/autopilot github.com/hashicorp/consul/command/operator/autopilot/get github.com/hashicorp/consul/command/operator/autopilot/set github.com/hashicorp/consul/command/operator/raft github.com/hashicorp/consul/command/operator/raft/listpeers github.com/hashicorp/consul/command/operator/raft/removepeer github.com/hashicorp/consul/command/reload github.com/hashicorp/consul/command/rtt github.com/hashicorp/consul/command/services github.com/hashicorp/consul/command/services/deregister github.com/hashicorp/consul/command/services/register github.com/hashicorp/consul/command/snapshot github.com/hashicorp/consul/command/snapshot/inspect github.com/hashicorp/consul/command/snapshot/restore github.com/hashicorp/consul/command/snapshot/save github.com/hashicorp/consul/command/tls github.com/hashicorp/consul/command/tls/ca github.com/hashicorp/consul/command/tls/ca/create github.com/hashicorp/consul/command/tls/cert github.com/hashicorp/consul/command/tls/cert/create github.com/hashicorp/consul/command/validate github.com/hashicorp/consul/command/version github.com/hashicorp/consul/command/watch github.com/hashicorp/consul/connect github.com/hashicorp/consul/connect/certgen github.com/hashicorp/consul/connect/proxy github.com/hashicorp/consul/ipaddr github.com/hashicorp/consul/lib github.com/hashicorp/consul/lib/file github.com/hashicorp/consul/lib/semaphore github.com/hashicorp/consul/logger 
github.com/hashicorp/consul/sdk/freeport github.com/hashicorp/consul/sdk/testutil github.com/hashicorp/consul/sdk/testutil/retry github.com/hashicorp/consul/sentinel github.com/hashicorp/consul/service_os github.com/hashicorp/consul/snapshot github.com/hashicorp/consul/testrpc github.com/hashicorp/consul/tlsutil github.com/hashicorp/consul/types github.com/hashicorp/consul/version
src/github.com/hashicorp/consul/main.go
src/github.com/hashicorp/consul/main_test.go
src/github.com/hashicorp/consul/acl/acl.go
src/github.com/hashicorp/consul/acl/acl_test.go
src/github.com/hashicorp/consul/acl/errors.go
src/github.com/hashicorp/consul/acl/policy.go
src/github.com/hashicorp/consul/acl/policy_test.go
src/github.com/hashicorp/consul/agent/acl.go
src/github.com/hashicorp/consul/agent/acl_endpoint.go
src/github.com/hashicorp/consul/agent/acl_endpoint_legacy.go
src/github.com/hashicorp/consul/agent/acl_endpoint_legacy_test.go
src/github.com/hashicorp/consul/agent/acl_endpoint_test.go
src/github.com/hashicorp/consul/agent/acl_test.go
src/github.com/hashicorp/consul/agent/agent.go
src/github.com/hashicorp/consul/agent/agent_endpoint.go
src/github.com/hashicorp/consul/agent/agent_endpoint_test.go
src/github.com/hashicorp/consul/agent/agent_test.go
src/github.com/hashicorp/consul/agent/bindata_assetfs.go
src/github.com/hashicorp/consul/agent/blacklist.go
src/github.com/hashicorp/consul/agent/blacklist_test.go
src/github.com/hashicorp/consul/agent/catalog_endpoint.go
src/github.com/hashicorp/consul/agent/catalog_endpoint_test.go
src/github.com/hashicorp/consul/agent/check.go
src/github.com/hashicorp/consul/agent/config.go
src/github.com/hashicorp/consul/agent/config_endpoint.go
src/github.com/hashicorp/consul/agent/config_endpoint_test.go
src/github.com/hashicorp/consul/agent/connect_auth.go
src/github.com/hashicorp/consul/agent/connect_ca_endpoint.go
src/github.com/hashicorp/consul/agent/connect_ca_endpoint_test.go
src/github.com/hashicorp/consul/agent/coordinate_endpoint.go
src/github.com/hashicorp/consul/agent/coordinate_endpoint_test.go
src/github.com/hashicorp/consul/agent/dns.go
src/github.com/hashicorp/consul/agent/dns_test.go
src/github.com/hashicorp/consul/agent/enterprise_delegate_oss.go
src/github.com/hashicorp/consul/agent/event_endpoint.go
src/github.com/hashicorp/consul/agent/event_endpoint_test.go
src/github.com/hashicorp/consul/agent/health_endpoint.go
src/github.com/hashicorp/consul/agent/health_endpoint_test.go
src/github.com/hashicorp/consul/agent/http.go
src/github.com/hashicorp/consul/agent/http_oss.go
src/github.com/hashicorp/consul/agent/http_oss_test.go
src/github.com/hashicorp/consul/agent/http_test.go
src/github.com/hashicorp/consul/agent/intentions_endpoint.go
src/github.com/hashicorp/consul/agent/intentions_endpoint_test.go
src/github.com/hashicorp/consul/agent/keyring.go
src/github.com/hashicorp/consul/agent/keyring_test.go
src/github.com/hashicorp/consul/agent/kvs_endpoint.go
src/github.com/hashicorp/consul/agent/kvs_endpoint_test.go
src/github.com/hashicorp/consul/agent/notify.go
src/github.com/hashicorp/consul/agent/notify_test.go
src/github.com/hashicorp/consul/agent/operator_endpoint.go
src/github.com/hashicorp/consul/agent/operator_endpoint_test.go
src/github.com/hashicorp/consul/agent/prepared_query_endpoint.go
src/github.com/hashicorp/consul/agent/prepared_query_endpoint_test.go
src/github.com/hashicorp/consul/agent/remote_exec.go
src/github.com/hashicorp/consul/agent/remote_exec_test.go
src/github.com/hashicorp/consul/agent/retry_join.go
src/github.com/hashicorp/consul/agent/service_manager.go
src/github.com/hashicorp/consul/agent/service_manager_test.go
src/github.com/hashicorp/consul/agent/session_endpoint.go
src/github.com/hashicorp/consul/agent/session_endpoint_test.go
src/github.com/hashicorp/consul/agent/sidecar_service.go
src/github.com/hashicorp/consul/agent/sidecar_service_test.go
src/github.com/hashicorp/consul/agent/signal_unix.go
src/github.com/hashicorp/consul/agent/snapshot_endpoint.go
src/github.com/hashicorp/consul/agent/snapshot_endpoint_test.go
src/github.com/hashicorp/consul/agent/status_endpoint.go
src/github.com/hashicorp/consul/agent/status_endpoint_test.go
src/github.com/hashicorp/consul/agent/testagent.go
src/github.com/hashicorp/consul/agent/testagent_test.go
src/github.com/hashicorp/consul/agent/translate_addr.go
src/github.com/hashicorp/consul/agent/txn_endpoint.go
src/github.com/hashicorp/consul/agent/txn_endpoint_test.go
src/github.com/hashicorp/consul/agent/ui_endpoint.go
src/github.com/hashicorp/consul/agent/ui_endpoint_test.go
src/github.com/hashicorp/consul/agent/user_event.go
src/github.com/hashicorp/consul/agent/user_event_test.go
src/github.com/hashicorp/consul/agent/util.go
src/github.com/hashicorp/consul/agent/util_test.go
src/github.com/hashicorp/consul/agent/watch_handler.go
src/github.com/hashicorp/consul/agent/watch_handler_test.go
src/github.com/hashicorp/consul/agent/ae/ae.go
src/github.com/hashicorp/consul/agent/ae/ae_test.go
src/github.com/hashicorp/consul/agent/ae/trigger.go
src/github.com/hashicorp/consul/agent/cache/cache.go
Generating mock for: Request in file: /<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/cache/mock_Request.go
Generating mock for: Type in file: /<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/cache/mock_Type.go
src/github.com/hashicorp/consul/agent/cache/cache_test.go
src/github.com/hashicorp/consul/agent/cache/entry.go
src/github.com/hashicorp/consul/agent/cache/entry_test.go
src/github.com/hashicorp/consul/agent/cache/mock_Request.go
src/github.com/hashicorp/consul/agent/cache/mock_Type.go
src/github.com/hashicorp/consul/agent/cache/request.go
src/github.com/hashicorp/consul/agent/cache/testing.go
src/github.com/hashicorp/consul/agent/cache/type.go
src/github.com/hashicorp/consul/agent/cache/watch.go
src/github.com/hashicorp/consul/agent/cache/watch_test.go
src/github.com/hashicorp/consul/agent/cache-types/catalog_services.go
src/github.com/hashicorp/consul/agent/cache-types/catalog_services_test.go
src/github.com/hashicorp/consul/agent/cache-types/connect_ca_leaf.go
src/github.com/hashicorp/consul/agent/cache-types/connect_ca_leaf_test.go
src/github.com/hashicorp/consul/agent/cache-types/connect_ca_root.go
src/github.com/hashicorp/consul/agent/cache-types/connect_ca_root_test.go
src/github.com/hashicorp/consul/agent/cache-types/health_services.go
src/github.com/hashicorp/consul/agent/cache-types/health_services_test.go
src/github.com/hashicorp/consul/agent/cache-types/intention_match.go
src/github.com/hashicorp/consul/agent/cache-types/intention_match_test.go
src/github.com/hashicorp/consul/agent/cache-types/mock_RPC.go
src/github.com/hashicorp/consul/agent/cache-types/node_services.go
src/github.com/hashicorp/consul/agent/cache-types/node_services_test.go
src/github.com/hashicorp/consul/agent/cache-types/prepared_query.go
src/github.com/hashicorp/consul/agent/cache-types/prepared_query_test.go
src/github.com/hashicorp/consul/agent/cache-types/resolved_service_config.go
src/github.com/hashicorp/consul/agent/cache-types/resolved_service_config_test.go
src/github.com/hashicorp/consul/agent/cache-types/rpc.go
Generating mock for: RPC in file: /<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/cache-types/mock_RPC.go
src/github.com/hashicorp/consul/agent/cache-types/testing.go
src/github.com/hashicorp/consul/agent/checks/alias.go
src/github.com/hashicorp/consul/agent/checks/alias_test.go
src/github.com/hashicorp/consul/agent/checks/check.go
src/github.com/hashicorp/consul/agent/checks/check_test.go
src/github.com/hashicorp/consul/agent/checks/docker.go
src/github.com/hashicorp/consul/agent/checks/docker_unix.go
src/github.com/hashicorp/consul/agent/checks/grpc.go
src/github.com/hashicorp/consul/agent/checks/grpc_test.go
src/github.com/hashicorp/consul/agent/config/builder.go
src/github.com/hashicorp/consul/agent/config/config.go
src/github.com/hashicorp/consul/agent/config/default.go
src/github.com/hashicorp/consul/agent/config/default_oss.go
src/github.com/hashicorp/consul/agent/config/doc.go
src/github.com/hashicorp/consul/agent/config/flags.go
src/github.com/hashicorp/consul/agent/config/flags_test.go
src/github.com/hashicorp/consul/agent/config/flagset.go
src/github.com/hashicorp/consul/agent/config/merge.go
src/github.com/hashicorp/consul/agent/config/merge_test.go
src/github.com/hashicorp/consul/agent/config/patch_hcl.go
src/github.com/hashicorp/consul/agent/config/patch_hcl_test.go
src/github.com/hashicorp/consul/agent/config/runtime.go
src/github.com/hashicorp/consul/agent/config/runtime_test.go
src/github.com/hashicorp/consul/agent/config/segment_oss.go
src/github.com/hashicorp/consul/agent/config/segment_oss_test.go
src/github.com/hashicorp/consul/agent/connect/csr.go
src/github.com/hashicorp/consul/agent/connect/generate.go
src/github.com/hashicorp/consul/agent/connect/parsing.go
src/github.com/hashicorp/consul/agent/connect/testing_ca.go
src/github.com/hashicorp/consul/agent/connect/testing_ca_test.go
src/github.com/hashicorp/consul/agent/connect/testing_spiffe.go
src/github.com/hashicorp/consul/agent/connect/uri.go
src/github.com/hashicorp/consul/agent/connect/uri_agent.go
src/github.com/hashicorp/consul/agent/connect/uri_agent_test.go
src/github.com/hashicorp/consul/agent/connect/uri_service.go
src/github.com/hashicorp/consul/agent/connect/uri_service_test.go
src/github.com/hashicorp/consul/agent/connect/uri_signing.go
src/github.com/hashicorp/consul/agent/connect/uri_signing_test.go
src/github.com/hashicorp/consul/agent/connect/uri_test.go
src/github.com/hashicorp/consul/agent/connect/ca/mock_Provider.go
src/github.com/hashicorp/consul/agent/connect/ca/provider.go
Generating mock for: Provider in file: /<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/connect/ca/mock_Provider.go
src/github.com/hashicorp/consul/agent/connect/ca/provider_consul.go
src/github.com/hashicorp/consul/agent/connect/ca/provider_consul_config.go
src/github.com/hashicorp/consul/agent/connect/ca/provider_consul_test.go
src/github.com/hashicorp/consul/agent/connect/ca/provider_vault.go
src/github.com/hashicorp/consul/agent/connect/ca/provider_vault_test.go
src/github.com/hashicorp/consul/agent/connect/ca/plugin/client.go
src/github.com/hashicorp/consul/agent/connect/ca/plugin/plugin.go
src/github.com/hashicorp/consul/agent/connect/ca/plugin/plugin_test.go
src/github.com/hashicorp/consul/agent/connect/ca/plugin/provider.pb.go
src/github.com/hashicorp/consul/agent/connect/ca/plugin/serve.go
src/github.com/hashicorp/consul/agent/connect/ca/plugin/transport_grpc.go
src/github.com/hashicorp/consul/agent/connect/ca/plugin/transport_netrpc.go
src/github.com/hashicorp/consul/agent/consul/acl.go
src/github.com/hashicorp/consul/agent/consul/acl_authmethod.go
src/github.com/hashicorp/consul/agent/consul/acl_authmethod_test.go
src/github.com/hashicorp/consul/agent/consul/acl_client.go
src/github.com/hashicorp/consul/agent/consul/acl_endpoint.go
src/github.com/hashicorp/consul/agent/consul/acl_endpoint_legacy.go
src/github.com/hashicorp/consul/agent/consul/acl_endpoint_test.go
src/github.com/hashicorp/consul/agent/consul/acl_replication.go
src/github.com/hashicorp/consul/agent/consul/acl_replication_legacy.go
src/github.com/hashicorp/consul/agent/consul/acl_replication_legacy_test.go
src/github.com/hashicorp/consul/agent/consul/acl_replication_test.go
src/github.com/hashicorp/consul/agent/consul/acl_replication_types.go
src/github.com/hashicorp/consul/agent/consul/acl_server.go
src/github.com/hashicorp/consul/agent/consul/acl_test.go
src/github.com/hashicorp/consul/agent/consul/acl_token_exp.go
src/github.com/hashicorp/consul/agent/consul/acl_token_exp_test.go
src/github.com/hashicorp/consul/agent/consul/auto_encrypt.go
src/github.com/hashicorp/consul/agent/consul/auto_encrypt_endpoint.go
src/github.com/hashicorp/consul/agent/consul/auto_encrypt_endpoint_test.go
src/github.com/hashicorp/consul/agent/consul/autopilot.go
src/github.com/hashicorp/consul/agent/consul/autopilot_oss.go
src/github.com/hashicorp/consul/agent/consul/autopilot_test.go
src/github.com/hashicorp/consul/agent/consul/catalog_endpoint.go
src/github.com/hashicorp/consul/agent/consul/catalog_endpoint_test.go
src/github.com/hashicorp/consul/agent/consul/client.go
src/github.com/hashicorp/consul/agent/consul/client_serf.go
src/github.com/hashicorp/consul/agent/consul/client_test.go
src/github.com/hashicorp/consul/agent/consul/config.go
src/github.com/hashicorp/consul/agent/consul/config_endpoint.go
src/github.com/hashicorp/consul/agent/consul/config_endpoint_test.go
src/github.com/hashicorp/consul/agent/consul/config_replication.go
src/github.com/hashicorp/consul/agent/consul/config_replication_test.go
src/github.com/hashicorp/consul/agent/consul/connect_ca_endpoint.go
src/github.com/hashicorp/consul/agent/consul/connect_ca_endpoint_test.go
src/github.com/hashicorp/consul/agent/consul/consul_ca_delegate.go
src/github.com/hashicorp/consul/agent/consul/coordinate_endpoint.go
src/github.com/hashicorp/consul/agent/consul/coordinate_endpoint_test.go
src/github.com/hashicorp/consul/agent/consul/enterprise_client_oss.go
src/github.com/hashicorp/consul/agent/consul/enterprise_server_oss.go
src/github.com/hashicorp/consul/agent/consul/filter.go
src/github.com/hashicorp/consul/agent/consul/filter_test.go
src/github.com/hashicorp/consul/agent/consul/flood.go
src/github.com/hashicorp/consul/agent/consul/health_endpoint.go
src/github.com/hashicorp/consul/agent/consul/health_endpoint_test.go
src/github.com/hashicorp/consul/agent/consul/helper_test.go
src/github.com/hashicorp/consul/agent/consul/intention_endpoint.go
src/github.com/hashicorp/consul/agent/consul/intention_endpoint_test.go
src/github.com/hashicorp/consul/agent/consul/internal_endpoint.go
src/github.com/hashicorp/consul/agent/consul/internal_endpoint_test.go
src/github.com/hashicorp/consul/agent/consul/issue_test.go
src/github.com/hashicorp/consul/agent/consul/kvs_endpoint.go
src/github.com/hashicorp/consul/agent/consul/kvs_endpoint_test.go
src/github.com/hashicorp/consul/agent/consul/leader.go
src/github.com/hashicorp/consul/agent/consul/leader_oss.go
src/github.com/hashicorp/consul/agent/consul/leader_test.go
src/github.com/hashicorp/consul/agent/consul/merge.go
src/github.com/hashicorp/consul/agent/consul/merge_test.go
src/github.com/hashicorp/consul/agent/consul/operator_autopilot_endpoint.go
src/github.com/hashicorp/consul/agent/consul/operator_autopilot_endpoint_test.go
src/github.com/hashicorp/consul/agent/consul/operator_endpoint.go
src/github.com/hashicorp/consul/agent/consul/operator_raft_endpoint.go
src/github.com/hashicorp/consul/agent/consul/operator_raft_endpoint_test.go
src/github.com/hashicorp/consul/agent/consul/prepared_query_endpoint.go
src/github.com/hashicorp/consul/agent/consul/prepared_query_endpoint_test.go
src/github.com/hashicorp/consul/agent/consul/raft_rpc.go
src/github.com/hashicorp/consul/agent/consul/replication.go
src/github.com/hashicorp/consul/agent/consul/rpc.go
src/github.com/hashicorp/consul/agent/consul/rpc_test.go
src/github.com/hashicorp/consul/agent/consul/rtt.go
src/github.com/hashicorp/consul/agent/consul/rtt_test.go
src/github.com/hashicorp/consul/agent/consul/segment_oss.go
src/github.com/hashicorp/consul/agent/consul/serf_test.go
src/github.com/hashicorp/consul/agent/consul/server.go
src/github.com/hashicorp/consul/agent/consul/server_lookup.go
src/github.com/hashicorp/consul/agent/consul/server_lookup_test.go
src/github.com/hashicorp/consul/agent/consul/server_oss.go
src/github.com/hashicorp/consul/agent/consul/server_serf.go
src/github.com/hashicorp/consul/agent/consul/server_test.go
src/github.com/hashicorp/consul/agent/consul/session_endpoint.go
src/github.com/hashicorp/consul/agent/consul/session_endpoint_test.go
src/github.com/hashicorp/consul/agent/consul/session_timers.go
src/github.com/hashicorp/consul/agent/consul/session_timers_test.go
src/github.com/hashicorp/consul/agent/consul/session_ttl.go
src/github.com/hashicorp/consul/agent/consul/session_ttl_test.go
src/github.com/hashicorp/consul/agent/consul/snapshot_endpoint.go
src/github.com/hashicorp/consul/agent/consul/snapshot_endpoint_test.go
src/github.com/hashicorp/consul/agent/consul/stats_fetcher.go
src/github.com/hashicorp/consul/agent/consul/stats_fetcher_test.go
src/github.com/hashicorp/consul/agent/consul/status_endpoint.go
src/github.com/hashicorp/consul/agent/consul/status_endpoint_test.go
src/github.com/hashicorp/consul/agent/consul/txn_endpoint.go
src/github.com/hashicorp/consul/agent/consul/txn_endpoint_test.go
src/github.com/hashicorp/consul/agent/consul/util.go
src/github.com/hashicorp/consul/agent/consul/util_test.go
src/github.com/hashicorp/consul/agent/consul/authmethod/authmethods.go
src/github.com/hashicorp/consul/agent/consul/authmethod/kubeauth/k8s.go
src/github.com/hashicorp/consul/agent/consul/authmethod/kubeauth/k8s_test.go
src/github.com/hashicorp/consul/agent/consul/authmethod/kubeauth/testing.go
src/github.com/hashicorp/consul/agent/consul/authmethod/testauth/testing.go
src/github.com/hashicorp/consul/agent/consul/autopilot/autopilot.go
src/github.com/hashicorp/consul/agent/consul/autopilot/autopilot_test.go
src/github.com/hashicorp/consul/agent/consul/autopilot/promotion.go
src/github.com/hashicorp/consul/agent/consul/autopilot/promotion_test.go
src/github.com/hashicorp/consul/agent/consul/autopilot/structs.go
src/github.com/hashicorp/consul/agent/consul/autopilot/structs_test.go
src/github.com/hashicorp/consul/agent/consul/fsm/commands_oss.go
src/github.com/hashicorp/consul/agent/consul/fsm/commands_oss_test.go
src/github.com/hashicorp/consul/agent/consul/fsm/fsm.go
src/github.com/hashicorp/consul/agent/consul/fsm/fsm_test.go
src/github.com/hashicorp/consul/agent/consul/fsm/snapshot.go
src/github.com/hashicorp/consul/agent/consul/fsm/snapshot_oss.go
src/github.com/hashicorp/consul/agent/consul/fsm/snapshot_oss_test.go
src/github.com/hashicorp/consul/agent/consul/prepared_query/template.go
src/github.com/hashicorp/consul/agent/consul/prepared_query/template_test.go
src/github.com/hashicorp/consul/agent/consul/prepared_query/walk.go
src/github.com/hashicorp/consul/agent/consul/prepared_query/walk_test.go
src/github.com/hashicorp/consul/agent/consul/state/acl.go
src/github.com/hashicorp/consul/agent/consul/state/acl_test.go
src/github.com/hashicorp/consul/agent/consul/state/autopilot.go
src/github.com/hashicorp/consul/agent/consul/state/autopilot_test.go
src/github.com/hashicorp/consul/agent/consul/state/catalog.go
src/github.com/hashicorp/consul/agent/consul/state/catalog_test.go
src/github.com/hashicorp/consul/agent/consul/state/config_entry.go
src/github.com/hashicorp/consul/agent/consul/state/config_entry_test.go
src/github.com/hashicorp/consul/agent/consul/state/connect_ca.go
src/github.com/hashicorp/consul/agent/consul/state/connect_ca_test.go
src/github.com/hashicorp/consul/agent/consul/state/coordinate.go
src/github.com/hashicorp/consul/agent/consul/state/coordinate_test.go
src/github.com/hashicorp/consul/agent/consul/state/delay.go
src/github.com/hashicorp/consul/agent/consul/state/delay_test.go
src/github.com/hashicorp/consul/agent/consul/state/graveyard.go
src/github.com/hashicorp/consul/agent/consul/state/graveyard_test.go
src/github.com/hashicorp/consul/agent/consul/state/index_connect.go
src/github.com/hashicorp/consul/agent/consul/state/index_connect_test.go
src/github.com/hashicorp/consul/agent/consul/state/intention.go
src/github.com/hashicorp/consul/agent/consul/state/intention_test.go
src/github.com/hashicorp/consul/agent/consul/state/kvs.go
src/github.com/hashicorp/consul/agent/consul/state/kvs_test.go
src/github.com/hashicorp/consul/agent/consul/state/prepared_query.go
src/github.com/hashicorp/consul/agent/consul/state/prepared_query_index.go
src/github.com/hashicorp/consul/agent/consul/state/prepared_query_index_test.go
src/github.com/hashicorp/consul/agent/consul/state/prepared_query_test.go
src/github.com/hashicorp/consul/agent/consul/state/schema.go
src/github.com/hashicorp/consul/agent/consul/state/schema_test.go
src/github.com/hashicorp/consul/agent/consul/state/session.go
src/github.com/hashicorp/consul/agent/consul/state/session_test.go
src/github.com/hashicorp/consul/agent/consul/state/state_store.go
src/github.com/hashicorp/consul/agent/consul/state/state_store_test.go
src/github.com/hashicorp/consul/agent/consul/state/tombstone_gc.go
src/github.com/hashicorp/consul/agent/consul/state/tombstone_gc_test.go
src/github.com/hashicorp/consul/agent/consul/state/txn.go
src/github.com/hashicorp/consul/agent/consul/state/txn_test.go
src/github.com/hashicorp/consul/agent/debug/host.go
src/github.com/hashicorp/consul/agent/debug/host_test.go
src/github.com/hashicorp/consul/agent/exec/exec.go
src/github.com/hashicorp/consul/agent/exec/exec_unix.go
src/github.com/hashicorp/consul/agent/local/state.go
src/github.com/hashicorp/consul/agent/local/testing.go
src/github.com/hashicorp/consul/agent/local/state_test.go
src/github.com/hashicorp/consul/agent/metadata/build.go
src/github.com/hashicorp/consul/agent/metadata/build_test.go
src/github.com/hashicorp/consul/agent/metadata/server.go
src/github.com/hashicorp/consul/agent/metadata/server_internal_test.go
src/github.com/hashicorp/consul/agent/metadata/server_test.go
src/github.com/hashicorp/consul/agent/mock/notify.go
src/github.com/hashicorp/consul/agent/pool/conn.go
src/github.com/hashicorp/consul/agent/pool/pool.go
src/github.com/hashicorp/consul/agent/proxycfg/manager.go
src/github.com/hashicorp/consul/agent/proxycfg/manager_test.go
src/github.com/hashicorp/consul/agent/proxycfg/proxycfg.go
src/github.com/hashicorp/consul/agent/proxycfg/snapshot.go
src/github.com/hashicorp/consul/agent/proxycfg/state.go
src/github.com/hashicorp/consul/agent/proxycfg/state_test.go
src/github.com/hashicorp/consul/agent/proxycfg/testing.go
src/github.com/hashicorp/consul/agent/proxyprocess/daemon.go
src/github.com/hashicorp/consul/agent/proxyprocess/daemon_test.go
src/github.com/hashicorp/consul/agent/proxyprocess/exitstatus_syscall.go
src/github.com/hashicorp/consul/agent/proxyprocess/manager.go
src/github.com/hashicorp/consul/agent/proxyprocess/manager_test.go
src/github.com/hashicorp/consul/agent/proxyprocess/noop.go
src/github.com/hashicorp/consul/agent/proxyprocess/noop_test.go
src/github.com/hashicorp/consul/agent/proxyprocess/process.go
src/github.com/hashicorp/consul/agent/proxyprocess/process_unix.go
src/github.com/hashicorp/consul/agent/proxyprocess/proxy.go
src/github.com/hashicorp/consul/agent/proxyprocess/proxy_test.go
src/github.com/hashicorp/consul/agent/proxyprocess/root.go
src/github.com/hashicorp/consul/agent/proxyprocess/snapshot.go
src/github.com/hashicorp/consul/agent/proxyprocess/test.go
src/github.com/hashicorp/consul/agent/router/manager.go
src/github.com/hashicorp/consul/agent/router/manager_internal_test.go
src/github.com/hashicorp/consul/agent/router/router.go
src/github.com/hashicorp/consul/agent/router/router_test.go
src/github.com/hashicorp/consul/agent/router/serf_adapter.go
src/github.com/hashicorp/consul/agent/router/serf_flooder.go
src/github.com/hashicorp/consul/agent/router/manager_test.go
src/github.com/hashicorp/consul/agent/structs/acl.go
src/github.com/hashicorp/consul/agent/structs/acl_cache.go
src/github.com/hashicorp/consul/agent/structs/acl_cache_test.go
src/github.com/hashicorp/consul/agent/structs/acl_legacy.go
src/github.com/hashicorp/consul/agent/structs/acl_legacy_test.go
src/github.com/hashicorp/consul/agent/structs/acl_test.go
src/github.com/hashicorp/consul/agent/structs/auto_encrypt.go
src/github.com/hashicorp/consul/agent/structs/catalog.go
src/github.com/hashicorp/consul/agent/structs/check_definition.go
src/github.com/hashicorp/consul/agent/structs/check_definition_test.go
src/github.com/hashicorp/consul/agent/structs/check_type.go
src/github.com/hashicorp/consul/agent/structs/config_entry.go
src/github.com/hashicorp/consul/agent/structs/config_entry_test.go
src/github.com/hashicorp/consul/agent/structs/connect.go
src/github.com/hashicorp/consul/agent/structs/connect_ca.go
src/github.com/hashicorp/consul/agent/structs/connect_ca_test.go
src/github.com/hashicorp/consul/agent/structs/connect_proxy_config.go
src/github.com/hashicorp/consul/agent/structs/connect_proxy_config_test.go
src/github.com/hashicorp/consul/agent/structs/connect_test.go
src/github.com/hashicorp/consul/agent/structs/errors.go
src/github.com/hashicorp/consul/agent/structs/intention.go
src/github.com/hashicorp/consul/agent/structs/intention_test.go
src/github.com/hashicorp/consul/agent/structs/operator.go
src/github.com/hashicorp/consul/agent/structs/prepared_query.go
src/github.com/hashicorp/consul/agent/structs/prepared_query_test.go
src/github.com/hashicorp/consul/agent/structs/sanitize_oss.go
src/github.com/hashicorp/consul/agent/structs/service_definition.go
src/github.com/hashicorp/consul/agent/structs/service_definition_test.go
src/github.com/hashicorp/consul/agent/structs/snapshot.go
src/github.com/hashicorp/consul/agent/structs/structs.go
src/github.com/hashicorp/consul/agent/structs/structs_filtering_test.go
src/github.com/hashicorp/consul/agent/structs/structs_test.go
src/github.com/hashicorp/consul/agent/structs/testing_catalog.go
src/github.com/hashicorp/consul/agent/structs/testing_connect_proxy_config.go
src/github.com/hashicorp/consul/agent/structs/testing_intention.go
src/github.com/hashicorp/consul/agent/structs/testing_service_definition.go
src/github.com/hashicorp/consul/agent/structs/txn.go
src/github.com/hashicorp/consul/agent/systemd/notify.go
src/github.com/hashicorp/consul/agent/token/store.go
src/github.com/hashicorp/consul/agent/token/store_test.go
src/github.com/hashicorp/consul/agent/xds/clusters.go
src/github.com/hashicorp/consul/agent/xds/clusters_test.go
src/github.com/hashicorp/consul/agent/xds/config.go
src/github.com/hashicorp/consul/agent/xds/config_test.go
src/github.com/hashicorp/consul/agent/xds/endpoints.go
src/github.com/hashicorp/consul/agent/xds/endpoints_test.go
src/github.com/hashicorp/consul/agent/xds/golden_test.go
src/github.com/hashicorp/consul/agent/xds/listeners.go
src/github.com/hashicorp/consul/agent/xds/listeners_test.go
src/github.com/hashicorp/consul/agent/xds/response.go
src/github.com/hashicorp/consul/agent/xds/routes.go
src/github.com/hashicorp/consul/agent/xds/server.go
src/github.com/hashicorp/consul/agent/xds/server_test.go
src/github.com/hashicorp/consul/agent/xds/testing.go
src/github.com/hashicorp/consul/agent/xds/xds.go
src/github.com/hashicorp/consul/api/acl.go
src/github.com/hashicorp/consul/api/acl_test.go
src/github.com/hashicorp/consul/api/agent.go
src/github.com/hashicorp/consul/api/agent_test.go
src/github.com/hashicorp/consul/api/api.go
src/github.com/hashicorp/consul/api/api_test.go
src/github.com/hashicorp/consul/api/catalog.go
src/github.com/hashicorp/consul/api/catalog_test.go
src/github.com/hashicorp/consul/api/config_entry.go
src/github.com/hashicorp/consul/api/config_entry_test.go
src/github.com/hashicorp/consul/api/connect.go
src/github.com/hashicorp/consul/api/connect_ca.go
src/github.com/hashicorp/consul/api/connect_ca_test.go
src/github.com/hashicorp/consul/api/connect_intention.go
src/github.com/hashicorp/consul/api/connect_intention_test.go
src/github.com/hashicorp/consul/api/coordinate.go
src/github.com/hashicorp/consul/api/coordinate_test.go
src/github.com/hashicorp/consul/api/debug.go
src/github.com/hashicorp/consul/api/debug_test.go
src/github.com/hashicorp/consul/api/event.go
src/github.com/hashicorp/consul/api/event_test.go
src/github.com/hashicorp/consul/api/health.go
src/github.com/hashicorp/consul/api/health_test.go
src/github.com/hashicorp/consul/api/kv.go
src/github.com/hashicorp/consul/api/kv_test.go
src/github.com/hashicorp/consul/api/lock.go
src/github.com/hashicorp/consul/api/lock_test.go
src/github.com/hashicorp/consul/api/operator.go
src/github.com/hashicorp/consul/api/operator_area.go
src/github.com/hashicorp/consul/api/operator_autopilot.go
src/github.com/hashicorp/consul/api/operator_autopilot_test.go
src/github.com/hashicorp/consul/api/operator_keyring.go
src/github.com/hashicorp/consul/api/operator_keyring_test.go
src/github.com/hashicorp/consul/api/operator_raft.go
src/github.com/hashicorp/consul/api/operator_raft_test.go
src/github.com/hashicorp/consul/api/operator_segment.go
src/github.com/hashicorp/consul/api/prepared_query.go
src/github.com/hashicorp/consul/api/prepared_query_test.go
src/github.com/hashicorp/consul/api/raw.go
src/github.com/hashicorp/consul/api/semaphore.go
src/github.com/hashicorp/consul/api/semaphore_test.go
src/github.com/hashicorp/consul/api/session.go
src/github.com/hashicorp/consul/api/session_test.go
src/github.com/hashicorp/consul/api/snapshot.go
src/github.com/hashicorp/consul/api/snapshot_test.go
src/github.com/hashicorp/consul/api/status.go
src/github.com/hashicorp/consul/api/status_test.go
src/github.com/hashicorp/consul/api/txn.go
src/github.com/hashicorp/consul/api/txn_test.go
src/github.com/hashicorp/consul/api/watch/funcs.go
src/github.com/hashicorp/consul/api/watch/plan.go
src/github.com/hashicorp/consul/api/watch/plan_test.go
src/github.com/hashicorp/consul/api/watch/watch.go
src/github.com/hashicorp/consul/api/watch/watch_test.go
src/github.com/hashicorp/consul/api/watch/funcs_test.go
src/github.com/hashicorp/consul/command/commands_oss.go
src/github.com/hashicorp/consul/command/registry.go
src/github.com/hashicorp/consul/command/acl/acl.go
src/github.com/hashicorp/consul/command/acl/acl_helpers.go
src/github.com/hashicorp/consul/command/acl/agenttokens/agent_tokens.go
src/github.com/hashicorp/consul/command/acl/agenttokens/agent_tokens_test.go
src/github.com/hashicorp/consul/command/acl/authmethod/authmethod.go
src/github.com/hashicorp/consul/command/acl/authmethod/create/authmethod_create.go
src/github.com/hashicorp/consul/command/acl/authmethod/create/authmethod_create_test.go
src/github.com/hashicorp/consul/command/acl/authmethod/delete/authmethod_delete.go
src/github.com/hashicorp/consul/command/acl/authmethod/delete/authmethod_delete_test.go
src/github.com/hashicorp/consul/command/acl/authmethod/list/authmethod_list.go
src/github.com/hashicorp/consul/command/acl/authmethod/list/authmethod_list_test.go
src/github.com/hashicorp/consul/command/acl/authmethod/read/authmethod_read.go
src/github.com/hashicorp/consul/command/acl/authmethod/read/authmethod_read_test.go
src/github.com/hashicorp/consul/command/acl/authmethod/update/authmethod_update.go
src/github.com/hashicorp/consul/command/acl/authmethod/update/authmethod_update_test.go
src/github.com/hashicorp/consul/command/acl/bindingrule/bindingrule.go
src/github.com/hashicorp/consul/command/acl/bindingrule/create/bindingrule_create.go
src/github.com/hashicorp/consul/command/acl/bindingrule/create/bindingrule_create_test.go
src/github.com/hashicorp/consul/command/acl/bindingrule/delete/bindingrule_delete.go
src/github.com/hashicorp/consul/command/acl/bindingrule/delete/bindingrule_delete_test.go
src/github.com/hashicorp/consul/command/acl/bindingrule/list/bindingrule_list.go
src/github.com/hashicorp/consul/command/acl/bindingrule/list/bindingrule_list_test.go
src/github.com/hashicorp/consul/command/acl/bindingrule/read/bindingrule_read.go
src/github.com/hashicorp/consul/command/acl/bindingrule/read/bindingrule_read_test.go
src/github.com/hashicorp/consul/command/acl/bindingrule/update/bindingrule_update.go
src/github.com/hashicorp/consul/command/acl/bindingrule/update/bindingrule_update_test.go
src/github.com/hashicorp/consul/command/acl/bootstrap/bootstrap.go
src/github.com/hashicorp/consul/command/acl/bootstrap/bootstrap_test.go
src/github.com/hashicorp/consul/command/acl/policy/policy.go
src/github.com/hashicorp/consul/command/acl/policy/create/policy_create.go
src/github.com/hashicorp/consul/command/acl/policy/create/policy_create_test.go
src/github.com/hashicorp/consul/command/acl/policy/delete/policy_delete.go
src/github.com/hashicorp/consul/command/acl/policy/delete/policy_delete_test.go
src/github.com/hashicorp/consul/command/acl/policy/list/policy_list.go
src/github.com/hashicorp/consul/command/acl/policy/list/policy_list_test.go
src/github.com/hashicorp/consul/command/acl/policy/read/policy_read.go
src/github.com/hashicorp/consul/command/acl/policy/read/policy_read_test.go
src/github.com/hashicorp/consul/command/acl/policy/update/policy_update.go
src/github.com/hashicorp/consul/command/acl/policy/update/policy_update_test.go
src/github.com/hashicorp/consul/command/acl/role/role.go
src/github.com/hashicorp/consul/command/acl/role/create/role_create.go
src/github.com/hashicorp/consul/command/acl/role/create/role_create_test.go
src/github.com/hashicorp/consul/command/acl/role/delete/role_delete.go
src/github.com/hashicorp/consul/command/acl/role/delete/role_delete_test.go
src/github.com/hashicorp/consul/command/acl/role/list/role_list.go
src/github.com/hashicorp/consul/command/acl/role/list/role_list_test.go
src/github.com/hashicorp/consul/command/acl/role/read/role_read.go
src/github.com/hashicorp/consul/command/acl/role/read/role_read_test.go
src/github.com/hashicorp/consul/command/acl/role/update/role_update.go
src/github.com/hashicorp/consul/command/acl/role/update/role_update_test.go
src/github.com/hashicorp/consul/command/acl/rules/translate.go
src/github.com/hashicorp/consul/command/acl/rules/translate_test.go
src/github.com/hashicorp/consul/command/acl/token/token.go
src/github.com/hashicorp/consul/command/acl/token/clone/token_clone.go
src/github.com/hashicorp/consul/command/acl/token/clone/token_clone_test.go
src/github.com/hashicorp/consul/command/acl/token/create/token_create.go
src/github.com/hashicorp/consul/command/acl/token/create/token_create_test.go
src/github.com/hashicorp/consul/command/acl/token/delete/token_delete.go
src/github.com/hashicorp/consul/command/acl/token/delete/token_delete_test.go
src/github.com/hashicorp/consul/command/acl/token/list/token_list.go
src/github.com/hashicorp/consul/command/acl/token/list/token_list_test.go
src/github.com/hashicorp/consul/command/acl/token/read/token_read.go
src/github.com/hashicorp/consul/command/acl/token/read/token_read_test.go
src/github.com/hashicorp/consul/command/acl/token/update/token_update.go
src/github.com/hashicorp/consul/command/acl/token/update/token_update_test.go
src/github.com/hashicorp/consul/command/agent/agent.go
src/github.com/hashicorp/consul/command/agent/agent_test.go
src/github.com/hashicorp/consul/command/catalog/catalog.go
src/github.com/hashicorp/consul/command/catalog/catalog_test.go
src/github.com/hashicorp/consul/command/catalog/list/dc/catalog_list_datacenters.go
src/github.com/hashicorp/consul/command/catalog/list/dc/catalog_list_datacenters_test.go
src/github.com/hashicorp/consul/command/catalog/list/nodes/catalog_list_nodes.go
src/github.com/hashicorp/consul/command/catalog/list/nodes/catalog_list_nodes_test.go
src/github.com/hashicorp/consul/command/catalog/list/services/catalog_list_services.go
src/github.com/hashicorp/consul/command/catalog/list/services/catalog_list_services_test.go
src/github.com/hashicorp/consul/command/config/config.go
src/github.com/hashicorp/consul/command/config/delete/config_delete.go
src/github.com/hashicorp/consul/command/config/delete/config_delete_test.go
src/github.com/hashicorp/consul/command/config/list/config_list.go
src/github.com/hashicorp/consul/command/config/list/config_list_test.go
src/github.com/hashicorp/consul/command/config/read/config_read.go
src/github.com/hashicorp/consul/command/config/read/config_read_test.go
src/github.com/hashicorp/consul/command/config/write/config_write.go
src/github.com/hashicorp/consul/command/config/write/config_write_test.go
src/github.com/hashicorp/consul/command/connect/connect.go
src/github.com/hashicorp/consul/command/connect/connect_test.go
src/github.com/hashicorp/consul/command/connect/ca/ca.go
src/github.com/hashicorp/consul/command/connect/ca/ca_test.go
src/github.com/hashicorp/consul/command/connect/ca/get/connect_ca_get.go
src/github.com/hashicorp/consul/command/connect/ca/get/connect_ca_get_test.go
src/github.com/hashicorp/consul/command/connect/ca/set/connect_ca_set.go
src/github.com/hashicorp/consul/command/connect/ca/set/connect_ca_set_test.go
src/github.com/hashicorp/consul/command/connect/envoy/bootstrap_config.go
src/github.com/hashicorp/consul/command/connect/envoy/bootstrap_config_test.go
src/github.com/hashicorp/consul/command/connect/envoy/bootstrap_tpl.go
src/github.com/hashicorp/consul/command/connect/envoy/envoy.go
src/github.com/hashicorp/consul/command/connect/envoy/envoy_test.go
src/github.com/hashicorp/consul/command/connect/envoy/exec_test.go
src/github.com/hashicorp/consul/command/connect/envoy/exec_unix.go
src/github.com/hashicorp/consul/command/connect/envoy/pipe-bootstrap/connect_envoy_pipe-bootstrap.go
src/github.com/hashicorp/consul/command/connect/envoy/pipe-bootstrap/connect_envoy_pipe-bootstrap_test.go
src/github.com/hashicorp/consul/command/connect/proxy/flag_upstreams.go
src/github.com/hashicorp/consul/command/connect/proxy/flag_upstreams_test.go
src/github.com/hashicorp/consul/command/connect/proxy/proxy.go
src/github.com/hashicorp/consul/command/connect/proxy/proxy_test.go
src/github.com/hashicorp/consul/command/connect/proxy/register.go
src/github.com/hashicorp/consul/command/connect/proxy/register_test.go
src/github.com/hashicorp/consul/command/debug/debug.go
src/github.com/hashicorp/consul/command/debug/debug_test.go
src/github.com/hashicorp/consul/command/event/event.go
src/github.com/hashicorp/consul/command/event/event_test.go
src/github.com/hashicorp/consul/command/exec/exec.go
src/github.com/hashicorp/consul/command/exec/exec_test.go
src/github.com/hashicorp/consul/command/flags/config.go
src/github.com/hashicorp/consul/command/flags/config_test.go
src/github.com/hashicorp/consul/command/flags/flag_map_value.go
src/github.com/hashicorp/consul/command/flags/flag_map_value_test.go
src/github.com/hashicorp/consul/command/flags/flag_slice_value.go
src/github.com/hashicorp/consul/command/flags/flag_slice_value_test.go
src/github.com/hashicorp/consul/command/flags/http.go
src/github.com/hashicorp/consul/command/flags/http_test.go
src/github.com/hashicorp/consul/command/flags/merge.go
src/github.com/hashicorp/consul/command/flags/usage.go
src/github.com/hashicorp/consul/command/forceleave/forceleave.go
src/github.com/hashicorp/consul/command/forceleave/forceleave_test.go
src/github.com/hashicorp/consul/command/helpers/helpers.go
src/github.com/hashicorp/consul/command/info/info.go
src/github.com/hashicorp/consul/command/info/info_test.go
src/github.com/hashicorp/consul/command/intention/intention.go
src/github.com/hashicorp/consul/command/intention/intention_test.go
src/github.com/hashicorp/consul/command/intention/check/check.go
src/github.com/hashicorp/consul/command/intention/check/check_test.go
src/github.com/hashicorp/consul/command/intention/create/create.go
src/github.com/hashicorp/consul/command/intention/create/create_test.go
src/github.com/hashicorp/consul/command/intention/delete/delete.go
src/github.com/hashicorp/consul/command/intention/delete/delete_test.go
src/github.com/hashicorp/consul/command/intention/finder/finder.go
src/github.com/hashicorp/consul/command/intention/finder/finder_test.go
src/github.com/hashicorp/consul/command/intention/get/get.go
src/github.com/hashicorp/consul/command/intention/get/get_test.go
src/github.com/hashicorp/consul/command/intention/match/match.go
src/github.com/hashicorp/consul/command/intention/match/match_test.go
src/github.com/hashicorp/consul/command/join/join.go
src/github.com/hashicorp/consul/command/join/join_test.go
src/github.com/hashicorp/consul/command/keygen/keygen.go
src/github.com/hashicorp/consul/command/keygen/keygen_test.go
src/github.com/hashicorp/consul/command/keyring/keyring.go
src/github.com/hashicorp/consul/command/keyring/keyring_test.go
src/github.com/hashicorp/consul/command/kv/kv.go
src/github.com/hashicorp/consul/command/kv/kv_test.go
src/github.com/hashicorp/consul/command/kv/del/kv_delete.go
src/github.com/hashicorp/consul/command/kv/del/kv_delete_test.go
src/github.com/hashicorp/consul/command/kv/exp/kv_export.go
src/github.com/hashicorp/consul/command/kv/exp/kv_export_test.go
src/github.com/hashicorp/consul/command/kv/get/kv_get.go
src/github.com/hashicorp/consul/command/kv/get/kv_get_test.go
src/github.com/hashicorp/consul/command/kv/imp/kv_import.go
src/github.com/hashicorp/consul/command/kv/imp/kv_import_test.go
src/github.com/hashicorp/consul/command/kv/impexp/kvimpexp.go
src/github.com/hashicorp/consul/command/kv/put/kv_put.go
src/github.com/hashicorp/consul/command/kv/put/kv_put_test.go
src/github.com/hashicorp/consul/command/leave/leave.go
src/github.com/hashicorp/consul/command/leave/leave_test.go
src/github.com/hashicorp/consul/command/lock/lock.go
src/github.com/hashicorp/consul/command/lock/lock_test.go
src/github.com/hashicorp/consul/command/lock/util_unix.go
src/github.com/hashicorp/consul/command/login/login.go
src/github.com/hashicorp/consul/command/login/login_test.go
src/github.com/hashicorp/consul/command/logout/logout.go
src/github.com/hashicorp/consul/command/logout/logout_test.go
src/github.com/hashicorp/consul/command/maint/maint.go
src/github.com/hashicorp/consul/command/maint/maint_test.go
src/github.com/hashicorp/consul/command/members/members.go
src/github.com/hashicorp/consul/command/members/members_test.go
src/github.com/hashicorp/consul/command/monitor/monitor.go
src/github.com/hashicorp/consul/command/monitor/monitor_test.go
src/github.com/hashicorp/consul/command/operator/operator.go
src/github.com/hashicorp/consul/command/operator/operator_test.go
src/github.com/hashicorp/consul/command/operator/autopilot/operator_autopilot.go
src/github.com/hashicorp/consul/command/operator/autopilot/operator_autopilot_test.go
src/github.com/hashicorp/consul/command/operator/autopilot/get/operator_autopilot_get.go
src/github.com/hashicorp/consul/command/operator/autopilot/get/operator_autopilot_get_test.go
src/github.com/hashicorp/consul/command/operator/autopilot/set/operator_autopilot_set.go
src/github.com/hashicorp/consul/command/operator/autopilot/set/operator_autopilot_set_test.go
src/github.com/hashicorp/consul/command/operator/raft/operator_raft.go
src/github.com/hashicorp/consul/command/operator/raft/operator_raft_test.go
src/github.com/hashicorp/consul/command/operator/raft/listpeers/operator_raft_list.go
src/github.com/hashicorp/consul/command/operator/raft/listpeers/operator_raft_list_test.go
src/github.com/hashicorp/consul/command/operator/raft/removepeer/operator_raft_remove.go
src/github.com/hashicorp/consul/command/operator/raft/removepeer/operator_raft_remove_test.go
src/github.com/hashicorp/consul/command/reload/reload.go
src/github.com/hashicorp/consul/command/reload/reload_test.go
src/github.com/hashicorp/consul/command/rtt/rtt.go
src/github.com/hashicorp/consul/command/rtt/rtt_test.go
src/github.com/hashicorp/consul/command/services/config.go
src/github.com/hashicorp/consul/command/services/config_test.go
src/github.com/hashicorp/consul/command/services/services.go
src/github.com/hashicorp/consul/command/services/services_test.go
src/github.com/hashicorp/consul/command/services/deregister/deregister.go
src/github.com/hashicorp/consul/command/services/deregister/deregister_test.go
src/github.com/hashicorp/consul/command/services/register/register.go
src/github.com/hashicorp/consul/command/services/register/register_test.go
src/github.com/hashicorp/consul/command/snapshot/snapshot_command.go
src/github.com/hashicorp/consul/command/snapshot/snapshot_command_test.go
src/github.com/hashicorp/consul/command/snapshot/inspect/snapshot_inspect.go
src/github.com/hashicorp/consul/command/snapshot/inspect/snapshot_inspect_test.go
src/github.com/hashicorp/consul/command/snapshot/restore/snapshot_restore.go
src/github.com/hashicorp/consul/command/snapshot/restore/snapshot_restore_test.go
src/github.com/hashicorp/consul/command/snapshot/save/snapshot_save.go
src/github.com/hashicorp/consul/command/snapshot/save/snapshot_save_test.go
src/github.com/hashicorp/consul/command/tls/tls.go
src/github.com/hashicorp/consul/command/tls/tls_test.go
src/github.com/hashicorp/consul/command/tls/ca/tls_ca.go
src/github.com/hashicorp/consul/command/tls/ca/tls_ca_test.go
src/github.com/hashicorp/consul/command/tls/ca/create/tls_ca_create.go
src/github.com/hashicorp/consul/command/tls/ca/create/tls_ca_create_test.go
src/github.com/hashicorp/consul/command/tls/cert/tls_cert.go
src/github.com/hashicorp/consul/command/tls/cert/tls_cert_test.go
src/github.com/hashicorp/consul/command/tls/cert/create/tls_cert_create.go
src/github.com/hashicorp/consul/command/tls/cert/create/tls_cert_create_test.go
src/github.com/hashicorp/consul/command/validate/validate.go
src/github.com/hashicorp/consul/command/validate/validate_test.go
src/github.com/hashicorp/consul/command/version/version.go
src/github.com/hashicorp/consul/command/version/version_test.go
src/github.com/hashicorp/consul/command/watch/watch.go
src/github.com/hashicorp/consul/command/watch/watch_test.go
src/github.com/hashicorp/consul/connect/example_test.go
src/github.com/hashicorp/consul/connect/resolver.go
src/github.com/hashicorp/consul/connect/resolver_test.go
src/github.com/hashicorp/consul/connect/service.go
src/github.com/hashicorp/consul/connect/service_test.go
src/github.com/hashicorp/consul/connect/testing.go
src/github.com/hashicorp/consul/connect/tls.go
src/github.com/hashicorp/consul/connect/tls_test.go
src/github.com/hashicorp/consul/connect/certgen/certgen.go
src/github.com/hashicorp/consul/connect/proxy/config.go
src/github.com/hashicorp/consul/connect/proxy/config_test.go
src/github.com/hashicorp/consul/connect/proxy/conn.go
src/github.com/hashicorp/consul/connect/proxy/conn_test.go
src/github.com/hashicorp/consul/connect/proxy/listener.go
src/github.com/hashicorp/consul/connect/proxy/listener_test.go
src/github.com/hashicorp/consul/connect/proxy/proxy.go
src/github.com/hashicorp/consul/connect/proxy/proxy_test.go
src/github.com/hashicorp/consul/connect/proxy/testing.go
src/github.com/hashicorp/consul/ipaddr/detect.go
src/github.com/hashicorp/consul/ipaddr/detect_test.go
src/github.com/hashicorp/consul/ipaddr/ipaddr.go
src/github.com/hashicorp/consul/ipaddr/ipaddr_test.go
src/github.com/hashicorp/consul/lib/cluster.go
src/github.com/hashicorp/consul/lib/cluster_test.go
src/github.com/hashicorp/consul/lib/eof.go
src/github.com/hashicorp/consul/lib/map_walker.go
src/github.com/hashicorp/consul/lib/map_walker_test.go
src/github.com/hashicorp/consul/lib/math.go
src/github.com/hashicorp/consul/lib/path.go
src/github.com/hashicorp/consul/lib/rand.go
src/github.com/hashicorp/consul/lib/retry.go
src/github.com/hashicorp/consul/lib/retry_test.go
src/github.com/hashicorp/consul/lib/rtt.go
src/github.com/hashicorp/consul/lib/rtt_test.go
src/github.com/hashicorp/consul/lib/serf.go
src/github.com/hashicorp/consul/lib/stop_context.go
src/github.com/hashicorp/consul/lib/string.go
src/github.com/hashicorp/consul/lib/string_test.go
src/github.com/hashicorp/consul/lib/telemetry.go
src/github.com/hashicorp/consul/lib/telemetry_test.go
src/github.com/hashicorp/consul/lib/translate.go
src/github.com/hashicorp/consul/lib/translate_test.go
src/github.com/hashicorp/consul/lib/useragent.go
src/github.com/hashicorp/consul/lib/useragent_test.go
src/github.com/hashicorp/consul/lib/uuid.go
src/github.com/hashicorp/consul/lib/math_test.go
src/github.com/hashicorp/consul/lib/file/atomic.go
src/github.com/hashicorp/consul/lib/file/atomic_test.go
src/github.com/hashicorp/consul/lib/semaphore/semaphore.go
src/github.com/hashicorp/consul/lib/semaphore/semaphore_test.go
src/github.com/hashicorp/consul/logger/gated_writer.go
src/github.com/hashicorp/consul/logger/gated_writer_test.go
src/github.com/hashicorp/consul/logger/grpc.go
src/github.com/hashicorp/consul/logger/grpc_test.go
src/github.com/hashicorp/consul/logger/log_levels.go
src/github.com/hashicorp/consul/logger/log_writer.go
src/github.com/hashicorp/consul/logger/log_writer_test.go
src/github.com/hashicorp/consul/logger/logfile.go
src/github.com/hashicorp/consul/logger/logfile_test.go
src/github.com/hashicorp/consul/logger/logger.go
src/github.com/hashicorp/consul/logger/syslog.go
src/github.com/hashicorp/consul/sdk/freeport/freeport.go
src/github.com/hashicorp/consul/sdk/testutil/io.go
src/github.com/hashicorp/consul/sdk/testutil/server.go
src/github.com/hashicorp/consul/sdk/testutil/server_methods.go
src/github.com/hashicorp/consul/sdk/testutil/server_wrapper.go
src/github.com/hashicorp/consul/sdk/testutil/testlog.go
src/github.com/hashicorp/consul/sdk/testutil/retry/retry.go
src/github.com/hashicorp/consul/sdk/testutil/retry/retry_test.go
src/github.com/hashicorp/consul/sentinel/evaluator.go
src/github.com/hashicorp/consul/sentinel/scope.go
src/github.com/hashicorp/consul/sentinel/sentinel_oss.go
src/github.com/hashicorp/consul/service_os/service.go
src/github.com/hashicorp/consul/snapshot/archive.go
src/github.com/hashicorp/consul/snapshot/archive_test.go
src/github.com/hashicorp/consul/snapshot/snapshot.go
src/github.com/hashicorp/consul/snapshot/snapshot_test.go
src/github.com/hashicorp/consul/testrpc/wait.go
src/github.com/hashicorp/consul/tlsutil/config.go
src/github.com/hashicorp/consul/tlsutil/config_test.go
src/github.com/hashicorp/consul/tlsutil/generate.go
src/github.com/hashicorp/consul/tlsutil/generate_test.go
src/github.com/hashicorp/consul/types/area.go
src/github.com/hashicorp/consul/types/checks.go
src/github.com/hashicorp/consul/types/node_id.go
src/github.com/hashicorp/consul/version/version.go
	cd _build && go install -gcflags=all=\"-trimpath=/<<BUILDDIR>>/consul-1.5.2\+dfsg1/_build/src\" -asmflags=all=\"-trimpath=/<<BUILDDIR>>/consul-1.5.2\+dfsg1/_build/src\" -v -p 4 github.com/hashicorp/consul github.com/hashicorp/consul/acl github.com/hashicorp/consul/agent github.com/hashicorp/consul/agent/ae github.com/hashicorp/consul/agent/cache github.com/hashicorp/consul/agent/cache-types github.com/hashicorp/consul/agent/checks github.com/hashicorp/consul/agent/config github.com/hashicorp/consul/agent/connect github.com/hashicorp/consul/agent/connect/ca github.com/hashicorp/consul/agent/connect/ca/plugin github.com/hashicorp/consul/agent/consul github.com/hashicorp/consul/agent/consul/authmethod github.com/hashicorp/consul/agent/consul/authmethod/kubeauth github.com/hashicorp/consul/agent/consul/authmethod/testauth github.com/hashicorp/consul/agent/consul/autopilot github.com/hashicorp/consul/agent/consul/fsm github.com/hashicorp/consul/agent/consul/prepared_query github.com/hashicorp/consul/agent/consul/state github.com/hashicorp/consul/agent/debug github.com/hashicorp/consul/agent/exec github.com/hashicorp/consul/agent/local github.com/hashicorp/consul/agent/metadata github.com/hashicorp/consul/agent/mock github.com/hashicorp/consul/agent/pool github.com/hashicorp/consul/agent/proxycfg github.com/hashicorp/consul/agent/proxyprocess github.com/hashicorp/consul/agent/router github.com/hashicorp/consul/agent/structs github.com/hashicorp/consul/agent/systemd github.com/hashicorp/consul/agent/token github.com/hashicorp/consul/agent/xds github.com/hashicorp/consul/api github.com/hashicorp/consul/api/watch github.com/hashicorp/consul/command github.com/hashicorp/consul/command/acl github.com/hashicorp/consul/command/acl/agenttokens github.com/hashicorp/consul/command/acl/authmethod github.com/hashicorp/consul/command/acl/authmethod/create github.com/hashicorp/consul/command/acl/authmethod/delete github.com/hashicorp/consul/command/acl/authmethod/list github.com/hashicorp/consul/command/acl/authmethod/read github.com/hashicorp/consul/command/acl/authmethod/update github.com/hashicorp/consul/command/acl/bindingrule github.com/hashicorp/consul/command/acl/bindingrule/create github.com/hashicorp/consul/command/acl/bindingrule/delete github.com/hashicorp/consul/command/acl/bindingrule/list github.com/hashicorp/consul/command/acl/bindingrule/read github.com/hashicorp/consul/command/acl/bindingrule/update github.com/hashicorp/consul/command/acl/bootstrap github.com/hashicorp/consul/command/acl/policy github.com/hashicorp/consul/command/acl/policy/create github.com/hashicorp/consul/command/acl/policy/delete github.com/hashicorp/consul/command/acl/policy/list github.com/hashicorp/consul/command/acl/policy/read github.com/hashicorp/consul/command/acl/policy/update github.com/hashicorp/consul/command/acl/role github.com/hashicorp/consul/command/acl/role/create github.com/hashicorp/consul/command/acl/role/delete github.com/hashicorp/consul/command/acl/role/list github.com/hashicorp/consul/command/acl/role/read github.com/hashicorp/consul/command/acl/role/update github.com/hashicorp/consul/command/acl/rules github.com/hashicorp/consul/command/acl/token github.com/hashicorp/consul/command/acl/token/clone github.com/hashicorp/consul/command/acl/token/create github.com/hashicorp/consul/command/acl/token/delete github.com/hashicorp/consul/command/acl/token/list github.com/hashicorp/consul/command/acl/token/read github.com/hashicorp/consul/command/acl/token/update github.com/hashicorp/consul/command/agent 
github.com/hashicorp/consul/command/catalog github.com/hashicorp/consul/command/catalog/list/dc github.com/hashicorp/consul/command/catalog/list/nodes github.com/hashicorp/consul/command/catalog/list/services github.com/hashicorp/consul/command/config github.com/hashicorp/consul/command/config/delete github.com/hashicorp/consul/command/config/list github.com/hashicorp/consul/command/config/read github.com/hashicorp/consul/command/config/write github.com/hashicorp/consul/command/connect github.com/hashicorp/consul/command/connect/ca github.com/hashicorp/consul/command/connect/ca/get github.com/hashicorp/consul/command/connect/ca/set github.com/hashicorp/consul/command/connect/envoy github.com/hashicorp/consul/command/connect/envoy/pipe-bootstrap github.com/hashicorp/consul/command/connect/proxy github.com/hashicorp/consul/command/debug github.com/hashicorp/consul/command/event github.com/hashicorp/consul/command/exec github.com/hashicorp/consul/command/flags github.com/hashicorp/consul/command/forceleave github.com/hashicorp/consul/command/helpers github.com/hashicorp/consul/command/info github.com/hashicorp/consul/command/intention github.com/hashicorp/consul/command/intention/check github.com/hashicorp/consul/command/intention/create github.com/hashicorp/consul/command/intention/delete github.com/hashicorp/consul/command/intention/finder github.com/hashicorp/consul/command/intention/get github.com/hashicorp/consul/command/intention/match github.com/hashicorp/consul/command/join github.com/hashicorp/consul/command/keygen github.com/hashicorp/consul/command/keyring github.com/hashicorp/consul/command/kv github.com/hashicorp/consul/command/kv/del github.com/hashicorp/consul/command/kv/exp github.com/hashicorp/consul/command/kv/get github.com/hashicorp/consul/command/kv/imp github.com/hashicorp/consul/command/kv/impexp github.com/hashicorp/consul/command/kv/put github.com/hashicorp/consul/command/leave github.com/hashicorp/consul/command/lock github.com/hashicorp/consul/command/login github.com/hashicorp/consul/command/logout github.com/hashicorp/consul/command/maint github.com/hashicorp/consul/command/members github.com/hashicorp/consul/command/monitor github.com/hashicorp/consul/command/operator github.com/hashicorp/consul/command/operator/autopilot github.com/hashicorp/consul/command/operator/autopilot/get github.com/hashicorp/consul/command/operator/autopilot/set github.com/hashicorp/consul/command/operator/raft github.com/hashicorp/consul/command/operator/raft/listpeers github.com/hashicorp/consul/command/operator/raft/removepeer github.com/hashicorp/consul/command/reload github.com/hashicorp/consul/command/rtt github.com/hashicorp/consul/command/services github.com/hashicorp/consul/command/services/deregister github.com/hashicorp/consul/command/services/register github.com/hashicorp/consul/command/snapshot github.com/hashicorp/consul/command/snapshot/inspect github.com/hashicorp/consul/command/snapshot/restore github.com/hashicorp/consul/command/snapshot/save github.com/hashicorp/consul/command/tls github.com/hashicorp/consul/command/tls/ca github.com/hashicorp/consul/command/tls/ca/create github.com/hashicorp/consul/command/tls/cert github.com/hashicorp/consul/command/tls/cert/create github.com/hashicorp/consul/command/validate github.com/hashicorp/consul/command/version github.com/hashicorp/consul/command/watch github.com/hashicorp/consul/connect github.com/hashicorp/consul/connect/certgen github.com/hashicorp/consul/connect/proxy github.com/hashicorp/consul/ipaddr 
github.com/hashicorp/consul/lib github.com/hashicorp/consul/lib/file github.com/hashicorp/consul/lib/semaphore github.com/hashicorp/consul/logger github.com/hashicorp/consul/sdk/freeport github.com/hashicorp/consul/sdk/testutil github.com/hashicorp/consul/sdk/testutil/retry github.com/hashicorp/consul/sentinel github.com/hashicorp/consul/service_os github.com/hashicorp/consul/snapshot github.com/hashicorp/consul/testrpc github.com/hashicorp/consul/tlsutil github.com/hashicorp/consul/types github.com/hashicorp/consul/version
internal/cpu
unicode/utf8
runtime/internal/sys
math/bits
runtime/internal/math
internal/bytealg
internal/race
runtime/internal/atomic
sync/atomic
math
internal/testlog
unicode
encoding
unicode/utf16
container/list
runtime
crypto/internal/subtle
crypto/subtle
vendor/golang.org/x/crypto/cryptobyte/asn1
internal/nettrace
runtime/cgo
vendor/golang.org/x/crypto/internal/subtle
github.com/circonus-labs/circonus-gometrics/api/config
golang.org/x/net/internal/iana
github.com/hashicorp/consul/types
github.com/hashicorp/consul/vendor/k8s.io/apimachinery/pkg/selection
github.com/hashicorp/consul/vendor/k8s.io/client-go/util/integer
github.com/aws/aws-sdk-go/aws/client/metadata
go.opencensus.io/trace/internal
go.opencensus.io
go.opencensus.io/internal/tagencoding
github.com/hashicorp/consul/service_os
github.com/hashicorp/consul/vendor/github.com/oklog/run
internal/reflectlite
sync
internal/singleflight
google.golang.org/grpc/internal/grpcsync
math/rand
github.com/hashicorp/consul/agent/token
golang.org/x/sync/singleflight
errors
sort
io
internal/oserror
strconv
syscall
vendor/golang.org/x/net/dns/dnsmessage
bytes
strings
reflect
bufio
github.com/armon/go-radix
hash
crypto/internal/randutil
time
internal/syscall/unix
crypto
crypto/hmac
crypto/rc4
vendor/golang.org/x/crypto/hkdf
hash/crc32
vendor/golang.org/x/text/transform
path
github.com/hashicorp/hcl/hcl/strconv
github.com/hashicorp/golang-lru/simplelru
regexp/syntax
github.com/hashicorp/go-immutable-radix
text/tabwriter
container/heap
github.com/beorn7/perks/quantile
github.com/prometheus/common/internal/bitbucket.org/ww/goautoneg
html
internal/poll
context
regexp
encoding/base32
hash/crc64
hash/fnv
os
github.com/kr/text
internal/fmtsort
encoding/binary
github.com/hashicorp/errwrap
github.com/mitchellh/reflectwalk
github.com/posener/complete/match
google.golang.org/grpc/internal/grpcrand
google.golang.org/grpc/encoding
google.golang.org/grpc/internal/backoff
github.com/mitchellh/copystructure
golang.org/x/text/transform
encoding/base64
crypto/cipher
fmt
crypto/sha512
crypto/ed25519/internal/edwards25519
crypto/md5
crypto/aes
crypto/des
crypto/sha1
crypto/sha256
encoding/pem
path/filepath
net
vendor/golang.org/x/crypto/internal/chacha20
io/ioutil
vendor/golang.org/x/crypto/poly1305
encoding/json
math/big
encoding/hex
net/url
vendor/golang.org/x/crypto/chacha20poly1305
vendor/golang.org/x/crypto/curve25519
compress/flate
log
vendor/golang.org/x/text/unicode/bidi
compress/gzip
vendor/golang.org/x/text/secure/bidirule
vendor/golang.org/x/text/unicode/norm
crypto/elliptic
encoding/asn1
crypto/rand
crypto/ecdsa
crypto/ed25519
crypto/rsa
crypto/dsa
crypto/x509/pkix
vendor/golang.org/x/crypto/cryptobyte
vendor/golang.org/x/net/idna
vendor/golang.org/x/net/http2/hpack
mime
mime/quotedprintable
net/http/internal
os/signal
github.com/hashicorp/hcl/hcl/token
golang.org/x/crypto/blake2b
github.com/hashicorp/hcl/hcl/ast
github.com/hashicorp/hcl/hcl/scanner
github.com/hashicorp/hcl/json/token
github.com/pkg/errors
github.com/hashicorp/hcl/json/scanner
github.com/circonus-labs/circonusllhist
github.com/hashicorp/hcl/hcl/parser
github.com/hashicorp/hcl/json/parser
github.com/cespare/xxhash
github.com/hashicorp/hcl/hcl/printer
github.com/hashicorp/hcl
github.com/golang/protobuf/proto
github.com/prometheus/common/model
github.com/prometheus/procfs/internal/fs
runtime/debug
github.com/hashicorp/consul/version
github.com/hashicorp/go-uuid
encoding/gob
crypto/x509
net/textproto
vendor/golang.org/x/net/http/httpguts
vendor/golang.org/x/net/http/httpproxy
mime/multipart
github.com/mitchellh/mapstructure
github.com/DataDog/datadog-go/statsd
crypto/tls
github.com/prometheus/procfs
go/token
text/template/parse
compress/lzw
github.com/google/btree
text/template
github.com/hashicorp/go-multierror
os/exec
github.com/prometheus/client_model/go
github.com/prometheus/client_golang/prometheus/internal
github.com/matttproud/golang_protobuf_extensions/pbutil
golang.org/x/crypto/ed25519
golang.org/x/net/bpf
github.com/hashicorp/go-sockaddr
golang.org/x/sys/unix
html/template
net/http/httptrace
github.com/hashicorp/go-rootcerts
net/http
text/scanner
github.com/hashicorp/memberlist/vendor/github.com/sean-/seed
github.com/hashicorp/yamux
github.com/mitchellh/go-testing-interface
github.com/davecgh/go-spew/spew
golang.org/x/net/internal/socket
github.com/pmezard/go-difflib/difflib
github.com/stretchr/objx
gopkg.in/yaml.v2
golang.org/x/net/ipv4
golang.org/x/net/ipv6
flag
github.com/hashicorp/go-version
github.com/mattn/go-isatty
github.com/mattn/go-colorable
github.com/fatih/color
github.com/miekg/dns
github.com/hashicorp/go-hclog
runtime/trace
testing
github.com/hashicorp/golang-lru
github.com/mitchellh/hashstructure
github.com/bgentry/speakeasy
os/user
github.com/hashicorp/consul/command/helpers
github.com/armon/circbuf
golang.org/x/net/internal/socks
golang.org/x/net/proxy
github.com/hashicorp/consul/agent/exec
golang.org/x/net/internal/timeseries
google.golang.org/grpc/grpclog
google.golang.org/grpc/connectivity
google.golang.org/grpc/credentials/internal
google.golang.org/grpc/credentials
google.golang.org/grpc/internal
google.golang.org/grpc/metadata
google.golang.org/grpc/serviceconfig
google.golang.org/grpc/resolver
google.golang.org/grpc/balancer
google.golang.org/grpc/balancer/base
google.golang.org/grpc/balancer/roundrobin
google.golang.org/grpc/codes
google.golang.org/grpc/encoding/proto
google.golang.org/grpc/internal/balancerload
github.com/golang/protobuf/ptypes/any
github.com/golang/protobuf/ptypes/duration
github.com/golang/protobuf/ptypes/timestamp
github.com/posener/complete/cmd/install
github.com/golang/protobuf/ptypes
google.golang.org/grpc/binarylog/grpc_binarylog_v1
github.com/posener/complete/cmd
github.com/posener/complete
github.com/hashicorp/go-cleanhttp
github.com/armon/go-metrics
github.com/hashicorp/go-retryablehttp
github.com/tv42/httpunix
expvar
github.com/circonus-labs/circonus-gometrics/api
github.com/prometheus/common/expfmt
github.com/hashicorp/serf/coordinate
github.com/hashicorp/consul/api
github.com/armon/go-metrics/datadog
github.com/prometheus/client_golang/prometheus
github.com/circonus-labs/circonus-gometrics/checkmgr
github.com/circonus-labs/circonus-gometrics
net/rpc
github.com/armon/go-metrics/circonus
github.com/hashicorp/go-msgpack/codec
net/http/httptest
github.com/armon/go-metrics/prometheus
github.com/mitchellh/cli
github.com/stretchr/testify/assert
github.com/hashicorp/consul/sentinel
github.com/hashicorp/consul/acl
github.com/hashicorp/consul/command/flags
github.com/hashicorp/consul/command/acl/agenttokens
github.com/hashicorp/consul/command/acl/authmethod
github.com/hashicorp/consul/command/acl/authmethod/delete
github.com/hashicorp/consul/command/acl/bindingrule
github.com/hashicorp/consul/command/acl/policy
github.com/hashicorp/consul/command/acl/role
github.com/hashicorp/consul/command/acl/token
github.com/NYTimes/gziphandler
github.com/stretchr/testify/mock
github.com/hashicorp/memberlist
github.com/stretchr/testify/require
github.com/hashicorp/raft
github.com/hashicorp/consul/vendor/github.com/coredns/coredns/plugin/pkg/dnsutil
github.com/elazarl/go-bindata-assetfs
github.com/docker/go-connections/sockets
golang.org/x/net/trace
google.golang.org/genproto/googleapis/rpc/status
google.golang.org/grpc/status
google.golang.org/grpc/internal/binarylog
github.com/hashicorp/serf/serf
google.golang.org/grpc/internal/channelz
google.golang.org/grpc/internal/envconfig
golang.org/x/text/unicode/bidi
golang.org/x/text/unicode/norm
golang.org/x/text/secure/bidirule
golang.org/x/net/http2/hpack
google.golang.org/grpc/internal/syscall
google.golang.org/grpc/keepalive
google.golang.org/grpc/peer
google.golang.org/grpc/stats
github.com/hashicorp/consul/lib
google.golang.org/grpc/tap
google.golang.org/grpc/naming
google.golang.org/grpc/resolver/dns
github.com/hashicorp/consul/agent/consul/autopilot
google.golang.org/grpc/resolver/passthrough
github.com/hashicorp/consul/agent/cache
github.com/hashicorp/consul/agent/ae
golang.org/x/net/idna
net/http/httputil
golang.org/x/net/context
github.com/hashicorp/hil/ast
github.com/hashicorp/hil
github.com/hashicorp/consul/agent/structs
golang.org/x/net/http/httpguts
github.com/hashicorp/go-memdb
golang.org/x/net/http2
github.com/hashicorp/consul/vendor/github.com/hashicorp/vault/helper/hclutil
github.com/golang/snappy
github.com/ryanuber/go-glob
github.com/hashicorp/consul/vendor/github.com/hashicorp/vault/helper/strutil
github.com/hashicorp/consul/vendor/github.com/hashicorp/vault/helper/parseutil
github.com/hashicorp/consul/vendor/github.com/hashicorp/vault/helper/compressutil
golang.org/x/time/rate
github.com/hashicorp/consul/vendor/github.com/hashicorp/vault/helper/jsonutil
golang.org/x/crypto/pbkdf2
gopkg.in/square/go-jose.v2/cipher
gopkg.in/square/go-jose.v2/json
github.com/gogo/protobuf/proto
github.com/hashicorp/consul/command/acl
github.com/hashicorp/consul/agent/connect
github.com/hashicorp/consul/command/acl/authmethod/create
github.com/hashicorp/consul/command/acl/authmethod/list
github.com/hashicorp/consul/command/acl/authmethod/read
github.com/hashicorp/consul/command/acl/authmethod/update
github.com/hashicorp/consul/command/acl/bindingrule/create
github.com/hashicorp/consul/command/acl/bindingrule/delete
github.com/hashicorp/consul/command/acl/bindingrule/list
github.com/hashicorp/consul/command/acl/bindingrule/read
github.com/hashicorp/consul/command/acl/bindingrule/update
github.com/hashicorp/consul/command/acl/bootstrap
github.com/hashicorp/consul/command/acl/policy/create
github.com/hashicorp/consul/command/acl/policy/delete
github.com/hashicorp/consul/command/acl/policy/list
github.com/hashicorp/consul/command/acl/policy/read
github.com/hashicorp/consul/command/acl/policy/update
github.com/hashicorp/consul/command/acl/role/create
github.com/hashicorp/consul/command/acl/role/delete
github.com/hashicorp/consul/command/acl/role/list
github.com/hashicorp/consul/command/acl/role/read
github.com/hashicorp/consul/command/acl/role/update
github.com/hashicorp/consul/command/acl/rules
github.com/hashicorp/consul/command/acl/token/clone
github.com/hashicorp/consul/command/acl/token/create
github.com/hashicorp/consul/command/acl/token/delete
github.com/hashicorp/consul/command/acl/token/list
github.com/hashicorp/consul/command/acl/token/read
github.com/hashicorp/consul/command/acl/token/update
google.golang.org/grpc/internal/transport
github.com/hashicorp/consul/agent/consul/prepared_query
github.com/hashicorp/consul/vendor/github.com/hashicorp/vault/api
github.com/hashicorp/consul/agent/consul/state
github.com/hashicorp/consul/agent/consul/authmethod
gopkg.in/square/go-jose.v2
google.golang.org/grpc
github.com/hashicorp/consul/agent/connect/ca
gopkg.in/square/go-jose.v2/jwt
github.com/gogo/protobuf/sortkeys
github.com/google/gofuzz
gopkg.in/inf.v0
github.com/hashicorp/consul/vendor/k8s.io/apimachinery/third_party/forked/golang/reflect
github.com/hashicorp/consul/vendor/k8s.io/apimachinery/pkg/fields
github.com/hashicorp/consul/vendor/k8s.io/apimachinery/pkg/api/resource
github.com/hashicorp/consul/vendor/k8s.io/apimachinery/pkg/conversion
github.com/golang/glog
github.com/hashicorp/consul/vendor/k8s.io/apimachinery/pkg/util/sets
github.com/hashicorp/consul/vendor/k8s.io/apimachinery/pkg/util/errors
go/scanner
internal/lazyregexp
google.golang.org/grpc/health/grpc_health_v1
github.com/hashicorp/consul/vendor/k8s.io/apimachinery/pkg/conversion/queryparams
github.com/hashicorp/consul/vendor/k8s.io/apimachinery/pkg/runtime/schema
go/ast
github.com/hashicorp/consul/agent/checks
github.com/hashicorp/consul/vendor/k8s.io/apimachinery/pkg/util/validation/field
github.com/hashicorp/consul/vendor/k8s.io/apimachinery/pkg/util/json
github.com/hashicorp/consul/vendor/k8s.io/apimachinery/pkg/util/runtime
github.com/hashicorp/consul/vendor/k8s.io/apimachinery/pkg/util/validation
github.com/hashicorp/consul/vendor/k8s.io/apimachinery/pkg/types
github.com/hashicorp/consul/vendor/k8s.io/apimachinery/pkg/util/intstr
github.com/hashicorp/consul/vendor/k8s.io/apimachinery/pkg/labels
github.com/hashicorp/consul/vendor/k8s.io/apimachinery/pkg/util/net
github.com/hashicorp/consul/vendor/k8s.io/apimachinery/pkg/util/wait
github.com/googleapis/gnostic/extensions
github.com/googleapis/gnostic/compiler
github.com/gregjones/httpcache
hash/adler32
compress/zlib
github.com/googleapis/gnostic/OpenAPIv2
github.com/peterbourgon/diskv
go/doc
go/parser
github.com/gregjones/httpcache/diskcache
github.com/ghodss/yaml
github.com/modern-go/concurrent
github.com/hashicorp/consul/vendor/k8s.io/apimachinery/pkg/util/framer
github.com/hashicorp/consul/vendor/k8s.io/apimachinery/pkg/runtime
github.com/hashicorp/consul/vendor/k8s.io/apimachinery/pkg/util/yaml
github.com/modern-go/reflect2
github.com/hashicorp/consul/vendor/k8s.io/apimachinery/pkg/version
github.com/hashicorp/consul/vendor/k8s.io/client-go/pkg/version
golang.org/x/crypto/ssh/terminal
github.com/hashicorp/consul/vendor/k8s.io/client-go/transport
github.com/hashicorp/consul/vendor/k8s.io/client-go/util/connrotation
github.com/hashicorp/consul/vendor/k8s.io/client-go/tools/metrics
github.com/hashicorp/consul/vendor/k8s.io/client-go/util/cert
github.com/hashicorp/consul/vendor/k8s.io/apimachinery/pkg/util/clock
github.com/hashicorp/consul/vendor/k8s.io/client-go/util/flowcontrol
github.com/hashicorp/consul/agent/consul/fsm
github.com/hashicorp/consul/vendor/k8s.io/apimachinery/pkg/watch
github.com/hashicorp/consul/vendor/k8s.io/apimachinery/pkg/apis/meta/v1
github.com/json-iterator/go
github.com/hashicorp/consul/vendor/k8s.io/apimachinery/pkg/runtime/serializer/recognizer
github.com/hashicorp/consul/vendor/k8s.io/apimachinery/pkg/runtime/serializer/protobuf
github.com/hashicorp/consul/vendor/k8s.io/apimachinery/pkg/runtime/serializer/streaming
github.com/hashicorp/consul/vendor/k8s.io/client-go/tools/clientcmd/api
github.com/hashicorp/consul/agent/metadata
github.com/hashicorp/consul/tlsutil
github.com/hashicorp/net-rpc-msgpackrpc
github.com/hashicorp/consul/agent/pool
github.com/hashicorp/consul/agent/router
github.com/hashicorp/consul/ipaddr
github.com/hashicorp/consul/lib/semaphore
archive/tar
github.com/hashicorp/consul/vendor/k8s.io/apimachinery/pkg/runtime/serializer/json
github.com/hashicorp/consul/vendor/k8s.io/api/authentication/v1
github.com/hashicorp/consul/vendor/github.com/hashicorp/go-bexpr
github.com/hashicorp/consul/vendor/k8s.io/api/core/v1
github.com/hashicorp/consul/vendor/k8s.io/apimachinery/pkg/api/errors
github.com/hashicorp/consul/vendor/k8s.io/apimachinery/pkg/apis/meta/v1/unstructured
github.com/hashicorp/consul/vendor/k8s.io/api/admissionregistration/v1alpha1
github.com/hashicorp/consul/vendor/k8s.io/apimachinery/pkg/runtime/serializer/versioning
github.com/hashicorp/consul/vendor/k8s.io/api/admissionregistration/v1beta1
github.com/hashicorp/consul/vendor/k8s.io/api/authentication/v1beta1
github.com/hashicorp/consul/vendor/k8s.io/apimachinery/pkg/runtime/serializer
github.com/hashicorp/consul/vendor/k8s.io/api/authorization/v1
github.com/hashicorp/consul/vendor/k8s.io/api/authorization/v1beta1
github.com/hashicorp/consul/vendor/k8s.io/api/certificates/v1beta1
github.com/hashicorp/consul/vendor/k8s.io/api/rbac/v1
github.com/hashicorp/consul/vendor/k8s.io/api/rbac/v1alpha1
github.com/hashicorp/consul/vendor/k8s.io/api/rbac/v1beta1
github.com/hashicorp/consul/vendor/k8s.io/api/scheduling/v1alpha1
github.com/hashicorp/consul/vendor/k8s.io/api/scheduling/v1beta1
github.com/hashicorp/consul/vendor/k8s.io/api/storage/v1alpha1
github.com/hashicorp/consul/vendor/k8s.io/client-go/pkg/apis/clientauthentication
github.com/hashicorp/consul/vendor/k8s.io/client-go/rest/watch
github.com/hashicorp/consul/vendor/k8s.io/apimachinery/pkg/apis/meta/v1beta1
github.com/hashicorp/consul/vendor/k8s.io/client-go/pkg/apis/clientauthentication/v1alpha1
github.com/hashicorp/consul/vendor/k8s.io/client-go/pkg/apis/clientauthentication/v1beta1
github.com/hashicorp/consul/snapshot
github.com/hashicorp/consul/vendor/k8s.io/client-go/plugin/pkg/client/auth/exec
github.com/hashicorp/consul/vendor/k8s.io/apimachinery/pkg/api/meta
github.com/boltdb/bolt
github.com/hashicorp/consul/vendor/k8s.io/client-go/rest
github.com/hashicorp/go-sockaddr/template
github.com/shirou/gopsutil/internal/common
github.com/hashicorp/raft-boltdb
github.com/hashicorp/consul/agent/local
github.com/hashicorp/consul/lib/file
github.com/hashicorp/consul/agent/systemd
github.com/golang/protobuf/protoc-gen-go/descriptor
github.com/hashicorp/consul/agent/proxyprocess
github.com/shirou/gopsutil/cpu
github.com/shirou/gopsutil/disk
github.com/shirou/gopsutil/host
github.com/shirou/gopsutil/mem
github.com/hashicorp/consul/vendor/github.com/envoyproxy/protoc-gen-validate/validate
github.com/gogo/protobuf/protoc-gen-gogo/descriptor
github.com/hashicorp/consul/agent/debug
github.com/gogo/protobuf/types
net/mail
github.com/hashicorp/consul/api/watch
github.com/gogo/protobuf/gogoproto
github.com/gogo/googleapis/google/api
log/syslog
github.com/hashicorp/go-syslog
github.com/hashicorp/logutils
github.com/hashicorp/consul/logger
github.com/hashicorp/consul/sdk/freeport
github.com/hashicorp/consul/sdk/testutil/retry
encoding/xml
github.com/denverdino/aliyungo/util
github.com/aws/aws-sdk-go/aws/awserr
github.com/aws/aws-sdk-go/internal/ini
github.com/denverdino/aliyungo/common
github.com/aws/aws-sdk-go/internal/shareddefaults
github.com/aws/aws-sdk-go/aws/credentials
github.com/hashicorp/consul/vendor/github.com/envoyproxy/go-control-plane/envoy/type
github.com/gogo/googleapis/google/rpc
github.com/gogo/protobuf/jsonpb
github.com/hashicorp/consul/vendor/github.com/envoyproxy/go-control-plane/envoy/api/v2/core
github.com/hashicorp/consul/vendor/github.com/envoyproxy/go-control-plane/pkg/util
github.com/denverdino/aliyungo/ecs
github.com/aws/aws-sdk-go/aws/endpoints
github.com/hashicorp/go-discover/provider/aliyun
github.com/aws/aws-sdk-go/internal/sdkio
github.com/jmespath/go-jmespath
github.com/aws/aws-sdk-go/aws/awsutil
github.com/aws/aws-sdk-go/internal/sdkrand
github.com/aws/aws-sdk-go/internal/sdkuri
github.com/aws/aws-sdk-go/aws/credentials/processcreds
golang.org/x/net/context/ctxhttp
golang.org/x/oauth2/internal
golang.org/x/oauth2
github.com/aws/aws-sdk-go/aws
cloud.google.com/go/compute/metadata
golang.org/x/oauth2/jws
github.com/aws/aws-sdk-go/aws/request
golang.org/x/oauth2/jwt
golang.org/x/oauth2/google
google.golang.org/api/googleapi/internal/uritemplates
google.golang.org/api/googleapi
github.com/hashicorp/consul/vendor/k8s.io/api/apps/v1
github.com/hashicorp/consul/vendor/k8s.io/api/apps/v1beta1
github.com/hashicorp/consul/vendor/k8s.io/api/apps/v1beta2
github.com/hashicorp/consul/vendor/k8s.io/api/autoscaling/v1
github.com/hashicorp/consul/vendor/k8s.io/api/autoscaling/v2beta1
github.com/hashicorp/consul/vendor/k8s.io/api/batch/v1
github.com/hashicorp/consul/vendor/k8s.io/api/events/v1beta1
github.com/hashicorp/consul/vendor/k8s.io/api/batch/v1beta1
github.com/hashicorp/consul/vendor/k8s.io/api/batch/v2alpha1
github.com/hashicorp/consul/vendor/k8s.io/api/extensions/v1beta1
github.com/hashicorp/consul/vendor/k8s.io/api/networking/v1
github.com/hashicorp/consul/vendor/k8s.io/api/policy/v1beta1
github.com/hashicorp/consul/vendor/k8s.io/api/settings/v1alpha1
github.com/hashicorp/consul/vendor/k8s.io/api/storage/v1
github.com/hashicorp/consul/vendor/k8s.io/api/storage/v1beta1
github.com/hashicorp/consul/vendor/k8s.io/client-go/tools/reference
github.com/hashicorp/consul/vendor/github.com/envoyproxy/go-control-plane/envoy/api/v2/auth
github.com/hashicorp/consul/vendor/github.com/envoyproxy/go-control-plane/envoy/api/v2/cluster
github.com/hashicorp/consul/vendor/github.com/envoyproxy/go-control-plane/envoy/api/v2/endpoint
github.com/hashicorp/consul/vendor/github.com/envoyproxy/go-control-plane/envoy/api/v2/route
github.com/hashicorp/consul/vendor/github.com/envoyproxy/go-control-plane/envoy/api/v2/listener
github.com/hashicorp/consul/vendor/github.com/envoyproxy/go-control-plane/envoy/config/filter/network/ext_authz/v2
github.com/hashicorp/consul/vendor/github.com/envoyproxy/go-control-plane/envoy/service/auth/v2
github.com/aws/aws-sdk-go/aws/corehandlers
github.com/hashicorp/consul/vendor/k8s.io/client-go/kubernetes/scheme
github.com/aws/aws-sdk-go/aws/client
github.com/hashicorp/consul/vendor/k8s.io/client-go/discovery
github.com/hashicorp/consul/vendor/k8s.io/client-go/kubernetes/typed/admissionregistration/v1alpha1
github.com/hashicorp/consul/vendor/k8s.io/client-go/kubernetes/typed/admissionregistration/v1beta1
github.com/hashicorp/consul/vendor/k8s.io/client-go/kubernetes/typed/apps/v1
github.com/hashicorp/consul/vendor/k8s.io/client-go/kubernetes/typed/apps/v1beta1
github.com/hashicorp/consul/vendor/k8s.io/client-go/kubernetes/typed/apps/v1beta2
github.com/hashicorp/consul/vendor/k8s.io/client-go/kubernetes/typed/authentication/v1
github.com/hashicorp/consul/vendor/k8s.io/client-go/kubernetes/typed/authentication/v1beta1
github.com/hashicorp/consul/vendor/k8s.io/client-go/kubernetes/typed/authorization/v1beta1
github.com/hashicorp/consul/vendor/k8s.io/client-go/kubernetes/typed/authorization/v1
github.com/hashicorp/consul/vendor/k8s.io/client-go/kubernetes/typed/autoscaling/v1
github.com/hashicorp/consul/vendor/k8s.io/client-go/kubernetes/typed/autoscaling/v2beta1
github.com/hashicorp/consul/vendor/k8s.io/client-go/kubernetes/typed/batch/v1
github.com/hashicorp/consul/vendor/k8s.io/client-go/kubernetes/typed/batch/v1beta1
github.com/hashicorp/consul/vendor/k8s.io/client-go/kubernetes/typed/batch/v2alpha1
github.com/hashicorp/consul/vendor/k8s.io/client-go/kubernetes/typed/certificates/v1beta1
github.com/hashicorp/consul/vendor/k8s.io/client-go/kubernetes/typed/core/v1
github.com/hashicorp/consul/vendor/k8s.io/client-go/kubernetes/typed/events/v1beta1
github.com/hashicorp/consul/vendor/k8s.io/client-go/kubernetes/typed/extensions/v1beta1
github.com/hashicorp/consul/vendor/k8s.io/client-go/kubernetes/typed/networking/v1
github.com/hashicorp/consul/vendor/k8s.io/client-go/kubernetes/typed/policy/v1beta1
github.com/hashicorp/consul/vendor/k8s.io/client-go/kubernetes/typed/rbac/v1
github.com/hashicorp/consul/vendor/k8s.io/client-go/kubernetes/typed/rbac/v1alpha1
github.com/hashicorp/consul/vendor/k8s.io/client-go/kubernetes/typed/rbac/v1beta1
github.com/hashicorp/consul/vendor/k8s.io/client-go/kubernetes/typed/scheduling/v1alpha1
github.com/hashicorp/consul/vendor/k8s.io/client-go/kubernetes/typed/scheduling/v1beta1
github.com/hashicorp/consul/vendor/k8s.io/client-go/kubernetes/typed/settings/v1alpha1
github.com/hashicorp/consul/vendor/k8s.io/client-go/kubernetes/typed/storage/v1
github.com/hashicorp/consul/vendor/k8s.io/client-go/kubernetes/typed/storage/v1alpha1
github.com/hashicorp/consul/vendor/k8s.io/client-go/kubernetes/typed/storage/v1beta1
github.com/hashicorp/consul/vendor/github.com/envoyproxy/go-control-plane/envoy/config/filter/accesslog/v2
github.com/hashicorp/consul/vendor/github.com/envoyproxy/go-control-plane/envoy/api/v2
github.com/hashicorp/consul/vendor/github.com/envoyproxy/go-control-plane/envoy/service/auth/v2alpha
github.com/aws/aws-sdk-go/aws/ec2metadata
github.com/hashicorp/consul/vendor/k8s.io/client-go/kubernetes
github.com/aws/aws-sdk-go/aws/credentials/ec2rolecreds
github.com/aws/aws-sdk-go/private/protocol
github.com/aws/aws-sdk-go/private/protocol/json/jsonutil
github.com/hashicorp/consul/agent/consul/authmethod/kubeauth
github.com/aws/aws-sdk-go/aws/credentials/endpointcreds
github.com/aws/aws-sdk-go/aws/defaults
github.com/hashicorp/consul/vendor/github.com/envoyproxy/go-control-plane/envoy/config/filter/network/tcp_proxy/v2
github.com/aws/aws-sdk-go/private/protocol/rest
github.com/hashicorp/consul/agent/consul
github.com/aws/aws-sdk-go/aws/signer/v4
github.com/aws/aws-sdk-go/private/protocol/query/queryutil
github.com/aws/aws-sdk-go/private/protocol/xml/xmlutil
github.com/aws/aws-sdk-go/aws/csm
google.golang.org/api/gensupport
github.com/aws/aws-sdk-go/private/protocol/query
github.com/aws/aws-sdk-go/service/sts
github.com/aws/aws-sdk-go/private/protocol/ec2query
github.com/aws/aws-sdk-go/service/ec2
github.com/aws/aws-sdk-go/service/sts/stsiface
github.com/aws/aws-sdk-go/aws/credentials/stscreds
github.com/aws/aws-sdk-go/aws/session
google.golang.org/api/internal
google.golang.org/api/option
go.opencensus.io/internal
go.opencensus.io/trace/tracestate
go.opencensus.io/trace
github.com/hashicorp/consul/vendor/github.com/envoyproxy/go-control-plane/envoy/config/filter/network/http_connection_manager/v2
github.com/hashicorp/consul/vendor/github.com/envoyproxy/go-control-plane/envoy/service/discovery/v2
go.opencensus.io/trace/propagation
go.opencensus.io/plugin/ochttp/propagation/b3
go.opencensus.io/resource
go.opencensus.io/metric/metricdata
runtime/pprof
github.com/hashicorp/consul/agent/cache-types
github.com/hashicorp/consul/agent/config
github.com/hashicorp/consul/agent/proxycfg
go.opencensus.io/tag
github.com/hashicorp/consul/agent/xds
go.opencensus.io/stats/internal
go.opencensus.io/stats
go.opencensus.io/metric/metricproducer
go.opencensus.io/stats/view
google.golang.org/api/googleapi/transport
google.golang.org/api/transport/http/internal/propagation
go.opencensus.io/plugin/ochttp
github.com/hashicorp/mdns
github.com/hashicorp/go-discover/provider/mdns
github.com/gophercloud/gophercloud
github.com/packethost/packngo
google.golang.org/api/transport/http
google.golang.org/api/compute/v1
github.com/gophercloud/gophercloud/pagination
github.com/gophercloud/gophercloud/openstack/identity/v2/tenants
github.com/gophercloud/gophercloud/openstack/identity/v3/tokens
github.com/gophercloud/gophercloud/openstack/identity/v2/tokens
github.com/gophercloud/gophercloud/openstack/utils
github.com/gophercloud/gophercloud/openstack/compute/v2/flavors
github.com/gophercloud/gophercloud/openstack
github.com/gophercloud/gophercloud/openstack/compute/v2/images
github.com/hashicorp/go-discover/provider/packet
github.com/gophercloud/gophercloud/openstack/compute/v2/servers
github.com/imdario/mergo
github.com/prometheus/client_golang/prometheus/promhttp
github.com/hashicorp/go-discover/provider/os
net/http/pprof
github.com/hashicorp/go-checkpoint
github.com/hashicorp/consul/command/catalog
github.com/hashicorp/consul/command/catalog/list/dc
github.com/ryanuber/columnize
github.com/hashicorp/consul/command/catalog/list/services
github.com/hashicorp/consul/command/catalog/list/nodes
github.com/hashicorp/consul/command/config
github.com/hashicorp/consul/command/config/delete
github.com/hashicorp/consul/command/config/list
github.com/hashicorp/consul/command/config/read
github.com/hashicorp/consul/command/config/write
github.com/hashicorp/consul/command/connect
github.com/hashicorp/consul/command/connect/ca
github.com/hashicorp/consul/command/connect/ca/get
github.com/hashicorp/consul/command/connect/ca/set
github.com/hashicorp/consul/connect
github.com/hashicorp/consul/command/connect/envoy/pipe-bootstrap
github.com/hashicorp/consul/command/debug
github.com/hashicorp/consul/connect/proxy
github.com/hashicorp/consul/command/event
github.com/hashicorp/consul/command/exec
github.com/hashicorp/consul/command/connect/proxy
github.com/hashicorp/consul/command/forceleave
github.com/hashicorp/consul/command/info
github.com/hashicorp/consul/command/connect/envoy
github.com/hashicorp/consul/command/intention
github.com/hashicorp/consul/command/intention/check
github.com/hashicorp/consul/command/intention/finder
github.com/hashicorp/consul/command/intention/create
github.com/hashicorp/consul/command/intention/delete
github.com/hashicorp/consul/command/intention/get
github.com/hashicorp/consul/command/intention/match
github.com/hashicorp/consul/command/join
github.com/hashicorp/consul/command/keygen
github.com/hashicorp/consul/command/kv
github.com/hashicorp/consul/command/kv/del
github.com/hashicorp/consul/command/kv/impexp
github.com/hashicorp/consul/command/kv/exp
github.com/hashicorp/consul/command/kv/get
github.com/hashicorp/consul/command/kv/imp
github.com/hashicorp/consul/command/kv/put
github.com/hashicorp/consul/command/leave
github.com/hashicorp/consul/command/login
github.com/hashicorp/consul/command/logout
github.com/hashicorp/consul/command/maint
github.com/hashicorp/consul/command/members
github.com/hashicorp/consul/command/monitor
github.com/hashicorp/consul/command/operator
github.com/hashicorp/consul/command/operator/autopilot
github.com/hashicorp/consul/command/operator/autopilot/get
github.com/hashicorp/consul/command/operator/autopilot/set
github.com/hashicorp/consul/command/operator/raft
github.com/hashicorp/consul/command/operator/raft/listpeers
github.com/hashicorp/consul/command/operator/raft/removepeer
github.com/hashicorp/consul/command/reload
github.com/hashicorp/consul/command/rtt
github.com/hashicorp/consul/command/services
github.com/hashicorp/consul/command/services/deregister
github.com/hashicorp/consul/command/services/register
github.com/hashicorp/consul/command/snapshot
github.com/hashicorp/consul/command/snapshot/inspect
github.com/hashicorp/consul/command/snapshot/restore
github.com/hashicorp/consul/command/snapshot/save
github.com/hashicorp/consul/command/tls
github.com/hashicorp/consul/command/tls/ca
github.com/hashicorp/consul/command/tls/ca/create
github.com/hashicorp/consul/command/tls/cert
github.com/hashicorp/consul/command/tls/cert/create
github.com/hashicorp/consul/command/validate
github.com/hashicorp/consul/command/version
google.golang.org/grpc/health
github.com/hashicorp/consul/agent/consul/authmethod/testauth
github.com/hashicorp/consul/agent/mock
github.com/hashicorp/consul/vendor/github.com/hashicorp/go-plugin
github.com/hashicorp/consul/connect/certgen
github.com/hashicorp/consul/agent/connect/ca/plugin
github.com/hashicorp/consul/sdk/testutil
github.com/hashicorp/consul/testrpc
github.com/hashicorp/go-discover/provider/aws
github.com/hashicorp/go-discover/provider/gce
github.com/hashicorp/go-discover
github.com/hashicorp/consul/agent
github.com/hashicorp/consul/command/keyring
github.com/hashicorp/consul/command/agent
github.com/hashicorp/consul/command/lock
github.com/hashicorp/consul/command/watch
github.com/hashicorp/consul/command
github.com/hashicorp/consul
make[1]: Leaving directory '/<<BUILDDIR>>/consul-1.5.2+dfsg1'
   debian/rules override_dh_auto_test
make[1]: Entering directory '/<<BUILDDIR>>/consul-1.5.2+dfsg1'
PATH="/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/bin:${PATH}" \
        DH_GOLANG_EXCLUDES="test/integration api agent/cache agent/checks agent/connect agent/consul command/tls" \
        dh_auto_test -v --max-parallel=4 -- -short -failfast -timeout 7m
	cd _build && go test -vet=off -v -p 4 -short -failfast -timeout 7m github.com/hashicorp/consul github.com/hashicorp/consul/acl github.com/hashicorp/consul/agent github.com/hashicorp/consul/agent/ae github.com/hashicorp/consul/agent/config github.com/hashicorp/consul/agent/debug github.com/hashicorp/consul/agent/exec github.com/hashicorp/consul/agent/local github.com/hashicorp/consul/agent/metadata github.com/hashicorp/consul/agent/mock github.com/hashicorp/consul/agent/pool github.com/hashicorp/consul/agent/proxycfg github.com/hashicorp/consul/agent/proxyprocess github.com/hashicorp/consul/agent/router github.com/hashicorp/consul/agent/structs github.com/hashicorp/consul/agent/systemd github.com/hashicorp/consul/agent/token github.com/hashicorp/consul/agent/xds github.com/hashicorp/consul/command github.com/hashicorp/consul/command/acl github.com/hashicorp/consul/command/acl/agenttokens github.com/hashicorp/consul/command/acl/authmethod github.com/hashicorp/consul/command/acl/authmethod/create github.com/hashicorp/consul/command/acl/authmethod/delete github.com/hashicorp/consul/command/acl/authmethod/list github.com/hashicorp/consul/command/acl/authmethod/read github.com/hashicorp/consul/command/acl/authmethod/update github.com/hashicorp/consul/command/acl/bindingrule github.com/hashicorp/consul/command/acl/bindingrule/create github.com/hashicorp/consul/command/acl/bindingrule/delete github.com/hashicorp/consul/command/acl/bindingrule/list github.com/hashicorp/consul/command/acl/bindingrule/read github.com/hashicorp/consul/command/acl/bindingrule/update github.com/hashicorp/consul/command/acl/bootstrap github.com/hashicorp/consul/command/acl/policy github.com/hashicorp/consul/command/acl/policy/create github.com/hashicorp/consul/command/acl/policy/delete github.com/hashicorp/consul/command/acl/policy/list github.com/hashicorp/consul/command/acl/policy/read github.com/hashicorp/consul/command/acl/policy/update github.com/hashicorp/consul/command/acl/role github.com/hashicorp/consul/command/acl/role/create github.com/hashicorp/consul/command/acl/role/delete github.com/hashicorp/consul/command/acl/role/list github.com/hashicorp/consul/command/acl/role/read github.com/hashicorp/consul/command/acl/role/update github.com/hashicorp/consul/command/acl/rules github.com/hashicorp/consul/command/acl/token github.com/hashicorp/consul/command/acl/token/clone github.com/hashicorp/consul/command/acl/token/create github.com/hashicorp/consul/command/acl/token/delete github.com/hashicorp/consul/command/acl/token/list github.com/hashicorp/consul/command/acl/token/read github.com/hashicorp/consul/command/acl/token/update github.com/hashicorp/consul/command/agent github.com/hashicorp/consul/command/catalog github.com/hashicorp/consul/command/catalog/list/dc github.com/hashicorp/consul/command/catalog/list/nodes github.com/hashicorp/consul/command/catalog/list/services github.com/hashicorp/consul/command/config github.com/hashicorp/consul/command/config/delete github.com/hashicorp/consul/command/config/list github.com/hashicorp/consul/command/config/read github.com/hashicorp/consul/command/config/write github.com/hashicorp/consul/command/connect github.com/hashicorp/consul/command/connect/ca github.com/hashicorp/consul/command/connect/ca/get github.com/hashicorp/consul/command/connect/ca/set github.com/hashicorp/consul/command/connect/envoy github.com/hashicorp/consul/command/connect/envoy/pipe-bootstrap github.com/hashicorp/consul/command/connect/proxy github.com/hashicorp/consul/command/debug 
github.com/hashicorp/consul/command/event github.com/hashicorp/consul/command/exec github.com/hashicorp/consul/command/flags github.com/hashicorp/consul/command/forceleave github.com/hashicorp/consul/command/helpers github.com/hashicorp/consul/command/info github.com/hashicorp/consul/command/intention github.com/hashicorp/consul/command/intention/check github.com/hashicorp/consul/command/intention/create github.com/hashicorp/consul/command/intention/delete github.com/hashicorp/consul/command/intention/finder github.com/hashicorp/consul/command/intention/get github.com/hashicorp/consul/command/intention/match github.com/hashicorp/consul/command/join github.com/hashicorp/consul/command/keygen github.com/hashicorp/consul/command/keyring github.com/hashicorp/consul/command/kv github.com/hashicorp/consul/command/kv/del github.com/hashicorp/consul/command/kv/exp github.com/hashicorp/consul/command/kv/get github.com/hashicorp/consul/command/kv/imp github.com/hashicorp/consul/command/kv/impexp github.com/hashicorp/consul/command/kv/put github.com/hashicorp/consul/command/leave github.com/hashicorp/consul/command/lock github.com/hashicorp/consul/command/login github.com/hashicorp/consul/command/logout github.com/hashicorp/consul/command/maint github.com/hashicorp/consul/command/members github.com/hashicorp/consul/command/monitor github.com/hashicorp/consul/command/operator github.com/hashicorp/consul/command/operator/autopilot github.com/hashicorp/consul/command/operator/autopilot/get github.com/hashicorp/consul/command/operator/autopilot/set github.com/hashicorp/consul/command/operator/raft github.com/hashicorp/consul/command/operator/raft/listpeers github.com/hashicorp/consul/command/operator/raft/removepeer github.com/hashicorp/consul/command/reload github.com/hashicorp/consul/command/rtt github.com/hashicorp/consul/command/services github.com/hashicorp/consul/command/services/deregister github.com/hashicorp/consul/command/services/register github.com/hashicorp/consul/command/snapshot github.com/hashicorp/consul/command/snapshot/inspect github.com/hashicorp/consul/command/snapshot/restore github.com/hashicorp/consul/command/snapshot/save github.com/hashicorp/consul/command/validate github.com/hashicorp/consul/command/version github.com/hashicorp/consul/command/watch github.com/hashicorp/consul/connect github.com/hashicorp/consul/connect/certgen github.com/hashicorp/consul/connect/proxy github.com/hashicorp/consul/ipaddr github.com/hashicorp/consul/lib github.com/hashicorp/consul/lib/file github.com/hashicorp/consul/lib/semaphore github.com/hashicorp/consul/logger github.com/hashicorp/consul/sdk/freeport github.com/hashicorp/consul/sdk/testutil github.com/hashicorp/consul/sdk/testutil/retry github.com/hashicorp/consul/sentinel github.com/hashicorp/consul/service_os github.com/hashicorp/consul/snapshot github.com/hashicorp/consul/testrpc github.com/hashicorp/consul/tlsutil github.com/hashicorp/consul/types github.com/hashicorp/consul/version
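Note: dh_auto_test passes the -short, -failfast and -timeout 7m flags above straight through to go test. As a hedged illustration only (not code taken from the Consul sources), a test typically opts out of slow work under -short via testing.Short():

    // file: slow_sketch_test.go (hypothetical example)
    package sketch

    import "testing"

    // TestSlowIntegration is a stand-in test that only shows the pattern
    // the -short flag used by dh_auto_test relies on.
    func TestSlowIntegration(t *testing.T) {
            if testing.Short() {
                    // Skipped when go test is invoked with -short, as in the command above.
                    t.Skip("skipping slow test in -short mode")
            }
            // ... long-running setup and assertions would go here ...
    }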
testing: warning: no tests to run
PASS
ok  	github.com/hashicorp/consul	0.468s [no tests to run]
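The === RUN and --- PASS lines that follow are go test -v reporting nested t.Run subtests. A minimal, hypothetical table-driven sketch (not Consul's actual acl tests) produces the same kind of TestACL/<policy>/<check> output:

    // file: acl_sketch_test.go (hypothetical example)
    package sketch

    import "testing"

    // allowed is a toy policy evaluator used only for this sketch.
    func allowed(policy string) bool {
            return policy == "AllowAll"
    }

    func TestACL(t *testing.T) {
            cases := map[string][]struct {
                    name string
                    want bool
            }{
                    "DenyAll":  {{"DenyKeyRead", false}, {"DenyKeyWrite", false}},
                    "AllowAll": {{"AllowKeyRead", true}, {"AllowKeyWrite", true}},
            }
            for policy, checks := range cases {
                    policy, checks := policy, checks
                    // Each t.Run call appears as "=== RUN   TestACL/<policy>" in -v output,
                    // and each nested check as "=== RUN   TestACL/<policy>/<check>".
                    t.Run(policy, func(t *testing.T) {
                            for _, c := range checks {
                                    c := c
                                    t.Run(c.name, func(t *testing.T) {
                                            if got := allowed(policy); got != c.want {
                                                    t.Fatalf("allowed(%q) = %v, want %v", policy, got, c.want)
                                            }
                                    })
                            }
                    })
            }
    }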
=== RUN   TestACL
=== RUN   TestACL/DenyAll
=== RUN   TestACL/DenyAll/DenyACLRead
=== RUN   TestACL/DenyAll/DenyACLWrite
=== RUN   TestACL/DenyAll/DenyAgentRead
=== RUN   TestACL/DenyAll/DenyAgentWrite
=== RUN   TestACL/DenyAll/DenyEventRead
=== RUN   TestACL/DenyAll/DenyEventWrite
=== RUN   TestACL/DenyAll/DenyIntentionDefaultAllow
=== RUN   TestACL/DenyAll/DenyIntentionRead
=== RUN   TestACL/DenyAll/DenyIntentionWrite
=== RUN   TestACL/DenyAll/DenyKeyRead
=== RUN   TestACL/DenyAll/DenyKeyringRead
=== RUN   TestACL/DenyAll/DenyKeyringWrite
=== RUN   TestACL/DenyAll/DenyKeyWrite
=== RUN   TestACL/DenyAll/DenyNodeRead
=== RUN   TestACL/DenyAll/DenyNodeWrite
=== RUN   TestACL/DenyAll/DenyOperatorRead
=== RUN   TestACL/DenyAll/DenyOperatorWrite
=== RUN   TestACL/DenyAll/DenyPreparedQueryRead
=== RUN   TestACL/DenyAll/DenyPreparedQueryWrite
=== RUN   TestACL/DenyAll/DenyServiceRead
=== RUN   TestACL/DenyAll/DenyServiceWrite
=== RUN   TestACL/DenyAll/DenySessionRead
=== RUN   TestACL/DenyAll/DenySessionWrite
=== RUN   TestACL/DenyAll/DenySnapshot
=== RUN   TestACL/AllowAll
=== RUN   TestACL/AllowAll/DenyACLRead
=== RUN   TestACL/AllowAll/DenyACLWrite
=== RUN   TestACL/AllowAll/AllowAgentRead
=== RUN   TestACL/AllowAll/AllowAgentWrite
=== RUN   TestACL/AllowAll/AllowEventRead
=== RUN   TestACL/AllowAll/AllowEventWrite
=== RUN   TestACL/AllowAll/AllowIntentionDefaultAllow
=== RUN   TestACL/AllowAll/AllowIntentionRead
=== RUN   TestACL/AllowAll/AllowIntentionWrite
=== RUN   TestACL/AllowAll/AllowKeyRead
=== RUN   TestACL/AllowAll/AllowKeyringRead
=== RUN   TestACL/AllowAll/AllowKeyringWrite
=== RUN   TestACL/AllowAll/AllowKeyWrite
=== RUN   TestACL/AllowAll/AllowNodeRead
=== RUN   TestACL/AllowAll/AllowNodeWrite
=== RUN   TestACL/AllowAll/AllowOperatorRead
=== RUN   TestACL/AllowAll/AllowOperatorWrite
=== RUN   TestACL/AllowAll/AllowPreparedQueryRead
=== RUN   TestACL/AllowAll/AllowPreparedQueryWrite
=== RUN   TestACL/AllowAll/AllowServiceRead
=== RUN   TestACL/AllowAll/AllowServiceWrite
=== RUN   TestACL/AllowAll/AllowSessionRead
=== RUN   TestACL/AllowAll/AllowSessionWrite
=== RUN   TestACL/AllowAll/DenySnapshot
=== RUN   TestACL/ManageAll
=== RUN   TestACL/ManageAll/AllowACLRead
=== RUN   TestACL/ManageAll/AllowACLWrite
=== RUN   TestACL/ManageAll/AllowAgentRead
=== RUN   TestACL/ManageAll/AllowAgentWrite
=== RUN   TestACL/ManageAll/AllowEventRead
=== RUN   TestACL/ManageAll/AllowEventWrite
=== RUN   TestACL/ManageAll/AllowIntentionDefaultAllow
=== RUN   TestACL/ManageAll/AllowIntentionRead
=== RUN   TestACL/ManageAll/AllowIntentionWrite
=== RUN   TestACL/ManageAll/AllowKeyRead
=== RUN   TestACL/ManageAll/AllowKeyringRead
=== RUN   TestACL/ManageAll/AllowKeyringWrite
=== RUN   TestACL/ManageAll/AllowKeyWrite
=== RUN   TestACL/ManageAll/AllowNodeRead
=== RUN   TestACL/ManageAll/AllowNodeWrite
=== RUN   TestACL/ManageAll/AllowOperatorRead
=== RUN   TestACL/ManageAll/AllowOperatorWrite
=== RUN   TestACL/ManageAll/AllowPreparedQueryRead
=== RUN   TestACL/ManageAll/AllowPreparedQueryWrite
=== RUN   TestACL/ManageAll/AllowServiceRead
=== RUN   TestACL/ManageAll/AllowServiceWrite
=== RUN   TestACL/ManageAll/AllowSessionRead
=== RUN   TestACL/ManageAll/AllowSessionWrite
=== RUN   TestACL/ManageAll/AllowSnapshot
=== RUN   TestACL/AgentBasicDefaultDeny
=== RUN   TestACL/AgentBasicDefaultDeny/DefaultReadDenied.Prefix(ro)
=== RUN   TestACL/AgentBasicDefaultDeny/DefaultWriteDenied.Prefix(ro)
=== RUN   TestACL/AgentBasicDefaultDeny/ROReadAllowed.Prefix(root)
=== RUN   TestACL/AgentBasicDefaultDeny/ROWriteDenied.Prefix(root)
=== RUN   TestACL/AgentBasicDefaultDeny/ROSuffixReadAllowed.Prefix(root-ro)
=== RUN   TestACL/AgentBasicDefaultDeny/ROSuffixWriteDenied.Prefix(root-ro)
=== RUN   TestACL/AgentBasicDefaultDeny/RWReadAllowed.Prefix(root-rw)
=== RUN   TestACL/AgentBasicDefaultDeny/RWWriteDenied.Prefix(root-rw)
=== RUN   TestACL/AgentBasicDefaultDeny/RWSuffixReadAllowed.Prefix(root-rw-sub)
=== RUN   TestACL/AgentBasicDefaultDeny/RWSuffixWriteAllowed.Prefix(root-rw-sub)
=== RUN   TestACL/AgentBasicDefaultDeny/DenyReadDenied.Prefix(root-nope)
=== RUN   TestACL/AgentBasicDefaultDeny/DenyWriteDenied.Prefix(root-nope)
=== RUN   TestACL/AgentBasicDefaultDeny/DenySuffixReadDenied.Prefix(root-nope-sub)
=== RUN   TestACL/AgentBasicDefaultDeny/DenySuffixWriteDenied.Prefix(root-nope-sub)
=== RUN   TestACL/AgentBasicDefaultAllow
=== RUN   TestACL/AgentBasicDefaultAllow/DefaultReadDenied.Prefix(ro)
=== RUN   TestACL/AgentBasicDefaultAllow/DefaultWriteDenied.Prefix(ro)
=== RUN   TestACL/AgentBasicDefaultAllow/ROReadAllowed.Prefix(root)
=== RUN   TestACL/AgentBasicDefaultAllow/ROWriteDenied.Prefix(root)
=== RUN   TestACL/AgentBasicDefaultAllow/ROSuffixReadAllowed.Prefix(root-ro)
=== RUN   TestACL/AgentBasicDefaultAllow/ROSuffixWriteDenied.Prefix(root-ro)
=== RUN   TestACL/AgentBasicDefaultAllow/RWReadAllowed.Prefix(root-rw)
=== RUN   TestACL/AgentBasicDefaultAllow/RWWriteDenied.Prefix(root-rw)
=== RUN   TestACL/AgentBasicDefaultAllow/RWSuffixReadAllowed.Prefix(root-rw-sub)
=== RUN   TestACL/AgentBasicDefaultAllow/RWSuffixWriteAllowed.Prefix(root-rw-sub)
=== RUN   TestACL/AgentBasicDefaultAllow/DenyReadDenied.Prefix(root-nope)
=== RUN   TestACL/AgentBasicDefaultAllow/DenyWriteDenied.Prefix(root-nope)
=== RUN   TestACL/AgentBasicDefaultAllow/DenySuffixReadDenied.Prefix(root-nope-sub)
=== RUN   TestACL/AgentBasicDefaultAllow/DenySuffixWriteDenied.Prefix(root-nope-sub)
=== RUN   TestACL/PreparedQueryDefaultAllow
=== RUN   TestACL/PreparedQueryDefaultAllow/ReadAllowed.Prefix(foo)
=== RUN   TestACL/PreparedQueryDefaultAllow/WriteAllowed.Prefix(foo)
=== RUN   TestACL/PreparedQueryDefaultAllow/ReadDenied.Prefix(other)
=== RUN   TestACL/PreparedQueryDefaultAllow/WriteDenied.Prefix(other)
=== RUN   TestACL/AgentNestedDefaultDeny
=== RUN   TestACL/AgentNestedDefaultDeny/DefaultReadDenied.Prefix(nope)
=== RUN   TestACL/AgentNestedDefaultDeny/DefaultWriteDenied.Prefix(nope)
=== RUN   TestACL/AgentNestedDefaultDeny/DenyReadDenied.Prefix(root-nope)
=== RUN   TestACL/AgentNestedDefaultDeny/DenyWriteDenied.Prefix(root-nope)
=== RUN   TestACL/AgentNestedDefaultDeny/ROReadAllowed.Prefix(root-ro)
=== RUN   TestACL/AgentNestedDefaultDeny/ROWriteDenied.Prefix(root-ro)
=== RUN   TestACL/AgentNestedDefaultDeny/RWReadAllowed.Prefix(root-rw)
=== RUN   TestACL/AgentNestedDefaultDeny/RWWriteAllowed.Prefix(root-rw)
=== RUN   TestACL/AgentNestedDefaultDeny/DenySuffixReadDenied.Prefix(root-nope-prefix)
=== RUN   TestACL/AgentNestedDefaultDeny/DenySuffixWriteDenied.Prefix(root-nope-prefix)
=== RUN   TestACL/AgentNestedDefaultDeny/ROSuffixReadAllowed.Prefix(root-ro-prefix)
=== RUN   TestACL/AgentNestedDefaultDeny/ROSuffixWriteDenied.Prefix(root-ro-prefix)
=== RUN   TestACL/AgentNestedDefaultDeny/RWSuffixReadAllowed.Prefix(root-rw-prefix)
=== RUN   TestACL/AgentNestedDefaultDeny/RWSuffixWriteAllowed.Prefix(root-rw-prefix)
=== RUN   TestACL/AgentNestedDefaultDeny/ChildDenyReadDenied.Prefix(child-nope)
=== RUN   TestACL/AgentNestedDefaultDeny/ChildDenyWriteDenied.Prefix(child-nope)
=== RUN   TestACL/AgentNestedDefaultDeny/ChildROReadAllowed.Prefix(child-ro)
=== RUN   TestACL/AgentNestedDefaultDeny/ChildROWriteDenied.Prefix(child-ro)
=== RUN   TestACL/AgentNestedDefaultDeny/ChildRWReadAllowed.Prefix(child-rw)
=== RUN   TestACL/AgentNestedDefaultDeny/ChildRWWriteAllowed.Prefix(child-rw)
=== RUN   TestACL/AgentNestedDefaultDeny/ChildDenySuffixReadDenied.Prefix(child-nope-prefix)
=== RUN   TestACL/AgentNestedDefaultDeny/ChildDenySuffixWriteDenied.Prefix(child-nope-prefix)
=== RUN   TestACL/AgentNestedDefaultDeny/ChildROSuffixReadAllowed.Prefix(child-ro-prefix)
=== RUN   TestACL/AgentNestedDefaultDeny/ChildROSuffixWriteDenied.Prefix(child-ro-prefix)
=== RUN   TestACL/AgentNestedDefaultDeny/ChildRWSuffixReadAllowed.Prefix(child-rw-prefix)
=== RUN   TestACL/AgentNestedDefaultDeny/ChildRWSuffixWriteAllowed.Prefix(child-rw-prefix)
=== RUN   TestACL/AgentNestedDefaultDeny/ChildOverrideReadAllowed.Prefix(override)
=== RUN   TestACL/AgentNestedDefaultDeny/ChildOverrideWriteAllowed.Prefix(override)
=== RUN   TestACL/AgentNestedDefaultAllow
=== RUN   TestACL/AgentNestedDefaultAllow/DefaultReadAllowed.Prefix(nope)
=== RUN   TestACL/AgentNestedDefaultAllow/DefaultWriteAllowed.Prefix(nope)
=== RUN   TestACL/AgentNestedDefaultAllow/DenyReadDenied.Prefix(root-nope)
=== RUN   TestACL/AgentNestedDefaultAllow/DenyWriteDenied.Prefix(root-nope)
=== RUN   TestACL/AgentNestedDefaultAllow/ROReadAllowed.Prefix(root-ro)
=== RUN   TestACL/AgentNestedDefaultAllow/ROWriteDenied.Prefix(root-ro)
=== RUN   TestACL/AgentNestedDefaultAllow/RWReadAllowed.Prefix(root-rw)
=== RUN   TestACL/AgentNestedDefaultAllow/RWWriteAllowed.Prefix(root-rw)
=== RUN   TestACL/AgentNestedDefaultAllow/DenySuffixReadDenied.Prefix(root-nope-prefix)
=== RUN   TestACL/AgentNestedDefaultAllow/DenySuffixWriteDenied.Prefix(root-nope-prefix)
=== RUN   TestACL/AgentNestedDefaultAllow/ROSuffixReadAllowed.Prefix(root-ro-prefix)
=== RUN   TestACL/AgentNestedDefaultAllow/ROSuffixWriteDenied.Prefix(root-ro-prefix)
=== RUN   TestACL/AgentNestedDefaultAllow/RWSuffixReadAllowed.Prefix(root-rw-prefix)
=== RUN   TestACL/AgentNestedDefaultAllow/RWSuffixWriteAllowed.Prefix(root-rw-prefix)
=== RUN   TestACL/AgentNestedDefaultAllow/ChildDenyReadDenied.Prefix(child-nope)
=== RUN   TestACL/AgentNestedDefaultAllow/ChildDenyWriteDenied.Prefix(child-nope)
=== RUN   TestACL/AgentNestedDefaultAllow/ChildROReadAllowed.Prefix(child-ro)
=== RUN   TestACL/AgentNestedDefaultAllow/ChildROWriteDenied.Prefix(child-ro)
=== RUN   TestACL/AgentNestedDefaultAllow/ChildRWReadAllowed.Prefix(child-rw)
=== RUN   TestACL/AgentNestedDefaultAllow/ChildRWWriteAllowed.Prefix(child-rw)
=== RUN   TestACL/AgentNestedDefaultAllow/ChildDenySuffixReadDenied.Prefix(child-nope-prefix)
=== RUN   TestACL/AgentNestedDefaultAllow/ChildDenySuffixWriteDenied.Prefix(child-nope-prefix)
=== RUN   TestACL/AgentNestedDefaultAllow/ChildROSuffixReadAllowed.Prefix(child-ro-prefix)
=== RUN   TestACL/AgentNestedDefaultAllow/ChildROSuffixWriteDenied.Prefix(child-ro-prefix)
=== RUN   TestACL/AgentNestedDefaultAllow/ChildRWSuffixReadAllowed.Prefix(child-rw-prefix)
=== RUN   TestACL/AgentNestedDefaultAllow/ChildRWSuffixWriteAllowed.Prefix(child-rw-prefix)
=== RUN   TestACL/AgentNestedDefaultAllow/ChildOverrideReadAllowed.Prefix(override)
=== RUN   TestACL/AgentNestedDefaultAllow/ChildOverrideWriteAllowed.Prefix(override)
=== RUN   TestACL/KeyringDefaultAllowPolicyDeny
=== RUN   TestACL/KeyringDefaultAllowPolicyDeny/ReadDenied
=== RUN   TestACL/KeyringDefaultAllowPolicyDeny/WriteDenied
=== RUN   TestACL/KeyringDefaultAllowPolicyRead
=== RUN   TestACL/KeyringDefaultAllowPolicyRead/ReadAllowed
=== RUN   TestACL/KeyringDefaultAllowPolicyRead/WriteDenied
=== RUN   TestACL/KeyringDefaultAllowPolicyWrite
=== RUN   TestACL/KeyringDefaultAllowPolicyWrite/ReadAllowed
=== RUN   TestACL/KeyringDefaultAllowPolicyWrite/WriteAllowed
=== RUN   TestACL/KeyringDefaultAllowPolicyNone
=== RUN   TestACL/KeyringDefaultAllowPolicyNone/ReadAllowed
=== RUN   TestACL/KeyringDefaultAllowPolicyNone/WriteAllowed
=== RUN   TestACL/KeyringDefaultDenyPolicyDeny
=== RUN   TestACL/KeyringDefaultDenyPolicyDeny/ReadDenied
=== RUN   TestACL/KeyringDefaultDenyPolicyDeny/WriteDenied
=== RUN   TestACL/KeyringDefaultDenyPolicyRead
=== RUN   TestACL/KeyringDefaultDenyPolicyRead/ReadAllowed
=== RUN   TestACL/KeyringDefaultDenyPolicyRead/WriteDenied
=== RUN   TestACL/KeyringDefaultDenyPolicyWrite
=== RUN   TestACL/KeyringDefaultDenyPolicyWrite/ReadAllowed
=== RUN   TestACL/KeyringDefaultDenyPolicyWrite/WriteAllowed
=== RUN   TestACL/KeyringDefaultDenyPolicyNone
=== RUN   TestACL/KeyringDefaultDenyPolicyNone/ReadDenied
=== RUN   TestACL/KeyringDefaultDenyPolicyNone/WriteDenied
=== RUN   TestACL/OperatorDefaultAllowPolicyDeny
=== RUN   TestACL/OperatorDefaultAllowPolicyDeny/ReadDenied
=== RUN   TestACL/OperatorDefaultAllowPolicyDeny/WriteDenied
=== RUN   TestACL/OperatorDefaultAllowPolicyRead
=== RUN   TestACL/OperatorDefaultAllowPolicyRead/ReadAllowed
=== RUN   TestACL/OperatorDefaultAllowPolicyRead/WriteDenied
=== RUN   TestACL/OperatorDefaultAllowPolicyWrite
=== RUN   TestACL/OperatorDefaultAllowPolicyWrite/ReadAllowed
=== RUN   TestACL/OperatorDefaultAllowPolicyWrite/WriteAllowed
=== RUN   TestACL/OperatorDefaultAllowPolicyNone
=== RUN   TestACL/OperatorDefaultAllowPolicyNone/ReadAllowed
=== RUN   TestACL/OperatorDefaultAllowPolicyNone/WriteAllowed
=== RUN   TestACL/OperatorDefaultDenyPolicyDeny
=== RUN   TestACL/OperatorDefaultDenyPolicyDeny/ReadDenied
=== RUN   TestACL/OperatorDefaultDenyPolicyDeny/WriteDenied
=== RUN   TestACL/OperatorDefaultDenyPolicyRead
=== RUN   TestACL/OperatorDefaultDenyPolicyRead/ReadAllowed
=== RUN   TestACL/OperatorDefaultDenyPolicyRead/WriteDenied
=== RUN   TestACL/OperatorDefaultDenyPolicyWrite
=== RUN   TestACL/OperatorDefaultDenyPolicyWrite/ReadAllowed
=== RUN   TestACL/OperatorDefaultDenyPolicyWrite/WriteAllowed
=== RUN   TestACL/OperatorDefaultDenyPolicyNone
=== RUN   TestACL/OperatorDefaultDenyPolicyNone/ReadDenied
=== RUN   TestACL/OperatorDefaultDenyPolicyNone/WriteDenied
=== RUN   TestACL/NodeDefaultDeny
=== RUN   TestACL/NodeDefaultDeny/DefaultReadDenied.Prefix(nope)
=== RUN   TestACL/NodeDefaultDeny/DefaultWriteDenied.Prefix(nope)
=== RUN   TestACL/NodeDefaultDeny/DenyReadDenied.Prefix(root-nope)
=== RUN   TestACL/NodeDefaultDeny/DenyWriteDenied.Prefix(root-nope)
=== RUN   TestACL/NodeDefaultDeny/ROReadAllowed.Prefix(root-ro)
=== RUN   TestACL/NodeDefaultDeny/ROWriteDenied.Prefix(root-ro)
=== RUN   TestACL/NodeDefaultDeny/RWReadAllowed.Prefix(root-rw)
=== RUN   TestACL/NodeDefaultDeny/RWWriteAllowed.Prefix(root-rw)
=== RUN   TestACL/NodeDefaultDeny/DenySuffixReadDenied.Prefix(root-nope-prefix)
=== RUN   TestACL/NodeDefaultDeny/DenySuffixWriteDenied.Prefix(root-nope-prefix)
=== RUN   TestACL/NodeDefaultDeny/ROSuffixReadAllowed.Prefix(root-ro-prefix)
=== RUN   TestACL/NodeDefaultDeny/ROSuffixWriteDenied.Prefix(root-ro-prefix)
=== RUN   TestACL/NodeDefaultDeny/RWSuffixReadAllowed.Prefix(root-rw-prefix)
=== RUN   TestACL/NodeDefaultDeny/RWSuffixWriteAllowed.Prefix(root-rw-prefix)
=== RUN   TestACL/NodeDefaultDeny/ChildDenyReadDenied.Prefix(child-nope)
=== RUN   TestACL/NodeDefaultDeny/ChildDenyWriteDenied.Prefix(child-nope)
=== RUN   TestACL/NodeDefaultDeny/ChildROReadAllowed.Prefix(child-ro)
=== RUN   TestACL/NodeDefaultDeny/ChildROWriteDenied.Prefix(child-ro)
=== RUN   TestACL/NodeDefaultDeny/ChildRWReadAllowed.Prefix(child-rw)
=== RUN   TestACL/NodeDefaultDeny/ChildRWWriteAllowed.Prefix(child-rw)
=== RUN   TestACL/NodeDefaultDeny/ChildDenySuffixReadDenied.Prefix(child-nope-prefix)
=== RUN   TestACL/NodeDefaultDeny/ChildDenySuffixWriteDenied.Prefix(child-nope-prefix)
=== RUN   TestACL/NodeDefaultDeny/ChildROSuffixReadAllowed.Prefix(child-ro-prefix)
=== RUN   TestACL/NodeDefaultDeny/ChildROSuffixWriteDenied.Prefix(child-ro-prefix)
=== RUN   TestACL/NodeDefaultDeny/ChildRWSuffixReadAllowed.Prefix(child-rw-prefix)
=== RUN   TestACL/NodeDefaultDeny/ChildRWSuffixWriteAllowed.Prefix(child-rw-prefix)
=== RUN   TestACL/NodeDefaultDeny/ChildOverrideReadAllowed.Prefix(override)
=== RUN   TestACL/NodeDefaultDeny/ChildOverrideWriteAllowed.Prefix(override)
=== RUN   TestACL/NodeDefaultAllow
=== RUN   TestACL/NodeDefaultAllow/DefaultReadAllowed.Prefix(nope)
=== RUN   TestACL/NodeDefaultAllow/DefaultWriteAllowed.Prefix(nope)
=== RUN   TestACL/NodeDefaultAllow/DenyReadDenied.Prefix(root-nope)
=== RUN   TestACL/NodeDefaultAllow/DenyWriteDenied.Prefix(root-nope)
=== RUN   TestACL/NodeDefaultAllow/ROReadAllowed.Prefix(root-ro)
=== RUN   TestACL/NodeDefaultAllow/ROWriteDenied.Prefix(root-ro)
=== RUN   TestACL/NodeDefaultAllow/RWReadAllowed.Prefix(root-rw)
=== RUN   TestACL/NodeDefaultAllow/RWWriteAllowed.Prefix(root-rw)
=== RUN   TestACL/NodeDefaultAllow/DenySuffixReadDenied.Prefix(root-nope-prefix)
=== RUN   TestACL/NodeDefaultAllow/DenySuffixWriteDenied.Prefix(root-nope-prefix)
=== RUN   TestACL/NodeDefaultAllow/ROSuffixReadAllowed.Prefix(root-ro-prefix)
=== RUN   TestACL/NodeDefaultAllow/ROSuffixWriteDenied.Prefix(root-ro-prefix)
=== RUN   TestACL/NodeDefaultAllow/RWSuffixReadAllowed.Prefix(root-rw-prefix)
=== RUN   TestACL/NodeDefaultAllow/RWSuffixWriteAllowed.Prefix(root-rw-prefix)
=== RUN   TestACL/NodeDefaultAllow/ChildDenyReadDenied.Prefix(child-nope)
=== RUN   TestACL/NodeDefaultAllow/ChildDenyWriteDenied.Prefix(child-nope)
=== RUN   TestACL/NodeDefaultAllow/ChildROReadAllowed.Prefix(child-ro)
=== RUN   TestACL/NodeDefaultAllow/ChildROWriteDenied.Prefix(child-ro)
=== RUN   TestACL/NodeDefaultAllow/ChildRWReadAllowed.Prefix(child-rw)
=== RUN   TestACL/NodeDefaultAllow/ChildRWWriteAllowed.Prefix(child-rw)
=== RUN   TestACL/NodeDefaultAllow/ChildDenySuffixReadDenied.Prefix(child-nope-prefix)
=== RUN   TestACL/NodeDefaultAllow/ChildDenySuffixWriteDenied.Prefix(child-nope-prefix)
=== RUN   TestACL/NodeDefaultAllow/ChildROSuffixReadAllowed.Prefix(child-ro-prefix)
=== RUN   TestACL/NodeDefaultAllow/ChildROSuffixWriteDenied.Prefix(child-ro-prefix)
=== RUN   TestACL/NodeDefaultAllow/ChildRWSuffixReadAllowed.Prefix(child-rw-prefix)
=== RUN   TestACL/NodeDefaultAllow/ChildRWSuffixWriteAllowed.Prefix(child-rw-prefix)
=== RUN   TestACL/NodeDefaultAllow/ChildOverrideReadAllowed.Prefix(override)
=== RUN   TestACL/NodeDefaultAllow/ChildOverrideWriteAllowed.Prefix(override)
=== RUN   TestACL/SessionDefaultDeny
=== RUN   TestACL/SessionDefaultDeny/DefaultReadDenied.Prefix(nope)
=== RUN   TestACL/SessionDefaultDeny/DefaultWriteDenied.Prefix(nope)
=== RUN   TestACL/SessionDefaultDeny/DenyReadDenied.Prefix(root-nope)
=== RUN   TestACL/SessionDefaultDeny/DenyWriteDenied.Prefix(root-nope)
=== RUN   TestACL/SessionDefaultDeny/ROReadAllowed.Prefix(root-ro)
=== RUN   TestACL/SessionDefaultDeny/ROWriteDenied.Prefix(root-ro)
=== RUN   TestACL/SessionDefaultDeny/RWReadAllowed.Prefix(root-rw)
=== RUN   TestACL/SessionDefaultDeny/RWWriteAllowed.Prefix(root-rw)
=== RUN   TestACL/SessionDefaultDeny/DenySuffixReadDenied.Prefix(root-nope-prefix)
=== RUN   TestACL/SessionDefaultDeny/DenySuffixWriteDenied.Prefix(root-nope-prefix)
=== RUN   TestACL/SessionDefaultDeny/ROSuffixReadAllowed.Prefix(root-ro-prefix)
=== RUN   TestACL/SessionDefaultDeny/ROSuffixWriteDenied.Prefix(root-ro-prefix)
=== RUN   TestACL/SessionDefaultDeny/RWSuffixReadAllowed.Prefix(root-rw-prefix)
=== RUN   TestACL/SessionDefaultDeny/RWSuffixWriteAllowed.Prefix(root-rw-prefix)
=== RUN   TestACL/SessionDefaultDeny/ChildDenyReadDenied.Prefix(child-nope)
=== RUN   TestACL/SessionDefaultDeny/ChildDenyWriteDenied.Prefix(child-nope)
=== RUN   TestACL/SessionDefaultDeny/ChildROReadAllowed.Prefix(child-ro)
=== RUN   TestACL/SessionDefaultDeny/ChildROWriteDenied.Prefix(child-ro)
=== RUN   TestACL/SessionDefaultDeny/ChildRWReadAllowed.Prefix(child-rw)
=== RUN   TestACL/SessionDefaultDeny/ChildRWWriteAllowed.Prefix(child-rw)
=== RUN   TestACL/SessionDefaultDeny/ChildDenySuffixReadDenied.Prefix(child-nope-prefix)
=== RUN   TestACL/SessionDefaultDeny/ChildDenySuffixWriteDenied.Prefix(child-nope-prefix)
=== RUN   TestACL/SessionDefaultDeny/ChildROSuffixReadAllowed.Prefix(child-ro-prefix)
=== RUN   TestACL/SessionDefaultDeny/ChildROSuffixWriteDenied.Prefix(child-ro-prefix)
=== RUN   TestACL/SessionDefaultDeny/ChildRWSuffixReadAllowed.Prefix(child-rw-prefix)
=== RUN   TestACL/SessionDefaultDeny/ChildRWSuffixWriteAllowed.Prefix(child-rw-prefix)
=== RUN   TestACL/SessionDefaultDeny/ChildOverrideReadAllowed.Prefix(override)
=== RUN   TestACL/SessionDefaultDeny/ChildOverrideWriteAllowed.Prefix(override)
=== RUN   TestACL/SessionDefaultAllow
=== RUN   TestACL/SessionDefaultAllow/DefaultReadAllowed.Prefix(nope)
=== RUN   TestACL/SessionDefaultAllow/DefaultWriteAllowed.Prefix(nope)
=== RUN   TestACL/SessionDefaultAllow/DenyReadDenied.Prefix(root-nope)
=== RUN   TestACL/SessionDefaultAllow/DenyWriteDenied.Prefix(root-nope)
=== RUN   TestACL/SessionDefaultAllow/ROReadAllowed.Prefix(root-ro)
=== RUN   TestACL/SessionDefaultAllow/ROWriteDenied.Prefix(root-ro)
=== RUN   TestACL/SessionDefaultAllow/RWReadAllowed.Prefix(root-rw)
=== RUN   TestACL/SessionDefaultAllow/RWWriteAllowed.Prefix(root-rw)
=== RUN   TestACL/SessionDefaultAllow/DenySuffixReadDenied.Prefix(root-nope-prefix)
=== RUN   TestACL/SessionDefaultAllow/DenySuffixWriteDenied.Prefix(root-nope-prefix)
=== RUN   TestACL/SessionDefaultAllow/ROSuffixReadAllowed.Prefix(root-ro-prefix)
=== RUN   TestACL/SessionDefaultAllow/ROSuffixWriteDenied.Prefix(root-ro-prefix)
=== RUN   TestACL/SessionDefaultAllow/RWSuffixReadAllowed.Prefix(root-rw-prefix)
=== RUN   TestACL/SessionDefaultAllow/RWSuffixWriteAllowed.Prefix(root-rw-prefix)
=== RUN   TestACL/SessionDefaultAllow/ChildDenyReadDenied.Prefix(child-nope)
=== RUN   TestACL/SessionDefaultAllow/ChildDenyWriteDenied.Prefix(child-nope)
=== RUN   TestACL/SessionDefaultAllow/ChildROReadAllowed.Prefix(child-ro)
=== RUN   TestACL/SessionDefaultAllow/ChildROWriteDenied.Prefix(child-ro)
=== RUN   TestACL/SessionDefaultAllow/ChildRWReadAllowed.Prefix(child-rw)
=== RUN   TestACL/SessionDefaultAllow/ChildRWWriteAllowed.Prefix(child-rw)
=== RUN   TestACL/SessionDefaultAllow/ChildDenySuffixReadDenied.Prefix(child-nope-prefix)
=== RUN   TestACL/SessionDefaultAllow/ChildDenySuffixWriteDenied.Prefix(child-nope-prefix)
=== RUN   TestACL/SessionDefaultAllow/ChildROSuffixReadAllowed.Prefix(child-ro-prefix)
=== RUN   TestACL/SessionDefaultAllow/ChildROSuffixWriteDenied.Prefix(child-ro-prefix)
=== RUN   TestACL/SessionDefaultAllow/ChildRWSuffixReadAllowed.Prefix(child-rw-prefix)
=== RUN   TestACL/SessionDefaultAllow/ChildRWSuffixWriteAllowed.Prefix(child-rw-prefix)
=== RUN   TestACL/SessionDefaultAllow/ChildOverrideReadAllowed.Prefix(override)
=== RUN   TestACL/SessionDefaultAllow/ChildOverrideWriteAllowed.Prefix(override)
=== RUN   TestACL/Parent
=== RUN   TestACL/Parent/KeyReadDenied.Prefix(other)
=== RUN   TestACL/Parent/KeyWriteDenied.Prefix(other)
=== RUN   TestACL/Parent/KeyWritePrefixDenied.Prefix(other)
=== RUN   TestACL/Parent/KeyReadAllowed.Prefix(foo/test)
=== RUN   TestACL/Parent/KeyWriteAllowed.Prefix(foo/test)
=== RUN   TestACL/Parent/KeyWritePrefixAllowed.Prefix(foo/test)
=== RUN   TestACL/Parent/KeyReadAllowed.Prefix(foo/priv/test)
=== RUN   TestACL/Parent/KeyWriteDenied.Prefix(foo/priv/test)
=== RUN   TestACL/Parent/KeyWritePrefixDenied.Prefix(foo/priv/test)
=== RUN   TestACL/Parent/KeyReadDenied.Prefix(bar/any)
=== RUN   TestACL/Parent/KeyWriteDenied.Prefix(bar/any)
=== RUN   TestACL/Parent/KeyWritePrefixDenied.Prefix(bar/any)
=== RUN   TestACL/Parent/KeyReadAllowed.Prefix(zip/test)
=== RUN   TestACL/Parent/KeyWriteDenied.Prefix(zip/test)
=== RUN   TestACL/Parent/KeyWritePrefixDenied.Prefix(zip/test)
=== RUN   TestACL/Parent/ServiceReadDenied.Prefix(fail)
=== RUN   TestACL/Parent/ServiceWriteDenied.Prefix(fail)
=== RUN   TestACL/Parent/ServiceReadAllowed.Prefix(other)
=== RUN   TestACL/Parent/ServiceWriteAllowed.Prefix(other)
=== RUN   TestACL/Parent/ServiceReadAllowed.Prefix(foo)
=== RUN   TestACL/Parent/ServiceWriteDenied.Prefix(foo)
=== RUN   TestACL/Parent/ServiceReadDenied.Prefix(bar)
=== RUN   TestACL/Parent/ServiceWriteDenied.Prefix(bar)
=== RUN   TestACL/Parent/PreparedQueryReadAllowed.Prefix(foo)
=== RUN   TestACL/Parent/PreparedQueryWriteDenied.Prefix(foo)
=== RUN   TestACL/Parent/PreparedQueryReadAllowed.Prefix(foobar)
=== RUN   TestACL/Parent/PreparedQueryWriteDenied.Prefix(foobar)
=== RUN   TestACL/Parent/PreparedQueryReadDenied.Prefix(bar)
=== RUN   TestACL/Parent/PreparedQueryWriteDenied.Prefix(bar)
=== RUN   TestACL/Parent/PreparedQueryReadDenied.Prefix(barbaz)
=== RUN   TestACL/Parent/PreparedQueryWriteDenied.Prefix(barbaz)
=== RUN   TestACL/Parent/PreparedQueryReadDenied.Prefix(baz)
=== RUN   TestACL/Parent/PreparedQueryWriteDenied.Prefix(baz)
=== RUN   TestACL/Parent/PreparedQueryReadDenied.Prefix(nope)
=== RUN   TestACL/Parent/PreparedQueryWriteDenied.Prefix(nope)
=== RUN   TestACL/Parent/ACLReadDenied
=== RUN   TestACL/Parent/ACLWriteDenied
=== RUN   TestACL/Parent/SnapshotDenied
=== RUN   TestACL/Parent/IntentionDefaultAllowDenied
=== RUN   TestACL/ComplexDefaultAllow
=== RUN   TestACL/ComplexDefaultAllow/KeyReadAllowed.Prefix(other)
=== RUN   TestACL/ComplexDefaultAllow/KeyWriteAllowed.Prefix(other)
=== RUN   TestACL/ComplexDefaultAllow/KeyWritePrefixAllowed.Prefix(other)
=== RUN   TestACL/ComplexDefaultAllow/KeyListAllowed.Prefix(other)
=== RUN   TestACL/ComplexDefaultAllow/KeyReadAllowed.Prefix(foo/test)
=== RUN   TestACL/ComplexDefaultAllow/KeyWriteAllowed.Prefix(foo/test)
=== RUN   TestACL/ComplexDefaultAllow/KeyWritePrefixAllowed.Prefix(foo/test)
=== RUN   TestACL/ComplexDefaultAllow/KeyListAllowed.Prefix(foo/test)
=== RUN   TestACL/ComplexDefaultAllow/KeyReadDenied.Prefix(foo/priv/test)
=== RUN   TestACL/ComplexDefaultAllow/KeyWriteDenied.Prefix(foo/priv/test)
=== RUN   TestACL/ComplexDefaultAllow/KeyWritePrefixDenied.Prefix(foo/priv/test)
=== RUN   TestACL/ComplexDefaultAllow/KeyListDenied.Prefix(foo/priv/test)
=== RUN   TestACL/ComplexDefaultAllow/KeyReadDenied.Prefix(bar/any)
=== RUN   TestACL/ComplexDefaultAllow/KeyWriteDenied.Prefix(bar/any)
=== RUN   TestACL/ComplexDefaultAllow/KeyWritePrefixDenied.Prefix(bar/any)
=== RUN   TestACL/ComplexDefaultAllow/KeyListDenied.Prefix(bar/any)
=== RUN   TestACL/ComplexDefaultAllow/KeyReadAllowed.Prefix(zip/test)
=== RUN   TestACL/ComplexDefaultAllow/KeyWriteDenied.Prefix(zip/test)
=== RUN   TestACL/ComplexDefaultAllow/KeyWritePrefixDenied.Prefix(zip/test)
=== RUN   TestACL/ComplexDefaultAllow/KeyListDenied.Prefix(zip/test)
=== RUN   TestACL/ComplexDefaultAllow/KeyReadAllowed.Prefix(foo/)
=== RUN   TestACL/ComplexDefaultAllow/KeyWriteAllowed.Prefix(foo/)
=== RUN   TestACL/ComplexDefaultAllow/KeyWritePrefixDenied.Prefix(foo/)
=== RUN   TestACL/ComplexDefaultAllow/KeyListAllowed.Prefix(foo/)
=== RUN   TestACL/ComplexDefaultAllow/KeyReadAllowed
=== RUN   TestACL/ComplexDefaultAllow/KeyWriteAllowed
=== RUN   TestACL/ComplexDefaultAllow/KeyWritePrefixDenied
=== RUN   TestACL/ComplexDefaultAllow/KeyListAllowed
=== RUN   TestACL/ComplexDefaultAllow/KeyReadAllowed.Prefix(zap/test)
=== RUN   TestACL/ComplexDefaultAllow/KeyWriteDenied.Prefix(zap/test)
=== RUN   TestACL/ComplexDefaultAllow/KeyWritePrefixDenied.Prefix(zap/test)
=== RUN   TestACL/ComplexDefaultAllow/KeyListAllowed.Prefix(zap/test)
=== RUN   TestACL/ComplexDefaultAllow/IntentionReadAllowed.Prefix(other)
=== RUN   TestACL/ComplexDefaultAllow/IntentionWriteDenied.Prefix(other)
=== RUN   TestACL/ComplexDefaultAllow/IntentionReadAllowed.Prefix(foo)
=== RUN   TestACL/ComplexDefaultAllow/IntentionWriteDenied.Prefix(foo)
=== RUN   TestACL/ComplexDefaultAllow/IntentionReadDenied.Prefix(bar)
=== RUN   TestACL/ComplexDefaultAllow/IntentionWriteDenied.Prefix(bar)
=== RUN   TestACL/ComplexDefaultAllow/IntentionReadAllowed.Prefix(foobar)
=== RUN   TestACL/ComplexDefaultAllow/IntentionWriteDenied.Prefix(foobar)
=== RUN   TestACL/ComplexDefaultAllow/IntentionReadDenied.Prefix(barfo)
=== RUN   TestACL/ComplexDefaultAllow/IntentionWriteDenied.Prefix(barfo)
=== RUN   TestACL/ComplexDefaultAllow/IntentionReadAllowed.Prefix(barfoo)
=== RUN   TestACL/ComplexDefaultAllow/IntentionWriteAllowed.Prefix(barfoo)
=== RUN   TestACL/ComplexDefaultAllow/IntentionReadAllowed.Prefix(barfoo2)
=== RUN   TestACL/ComplexDefaultAllow/IntentionWriteAllowed.Prefix(barfoo2)
=== RUN   TestACL/ComplexDefaultAllow/IntentionReadDenied.Prefix(intbaz)
=== RUN   TestACL/ComplexDefaultAllow/IntentionWriteDenied.Prefix(intbaz)
=== RUN   TestACL/ComplexDefaultAllow/IntentionDefaultAllowAllowed
=== RUN   TestACL/ComplexDefaultAllow/ServiceReadAllowed.Prefix(other)
=== RUN   TestACL/ComplexDefaultAllow/ServiceWriteAllowed.Prefix(other)
=== RUN   TestACL/ComplexDefaultAllow/ServiceReadAllowed.Prefix(foo)
=== RUN   TestACL/ComplexDefaultAllow/ServiceWriteDenied.Prefix(foo)
=== RUN   TestACL/ComplexDefaultAllow/ServiceReadDenied.Prefix(bar)
=== RUN   TestACL/ComplexDefaultAllow/ServiceWriteDenied.Prefix(bar)
=== RUN   TestACL/ComplexDefaultAllow/ServiceReadAllowed.Prefix(foobar)
=== RUN   TestACL/ComplexDefaultAllow/ServiceWriteDenied.Prefix(foobar)
=== RUN   TestACL/ComplexDefaultAllow/ServiceReadDenied.Prefix(barfo)
=== RUN   TestACL/ComplexDefaultAllow/ServiceWriteDenied.Prefix(barfo)
=== RUN   TestACL/ComplexDefaultAllow/ServiceReadAllowed.Prefix(barfoo)
=== RUN   TestACL/ComplexDefaultAllow/ServiceWriteAllowed.Prefix(barfoo)
=== RUN   TestACL/ComplexDefaultAllow/ServiceReadAllowed.Prefix(barfoo2)
=== RUN   TestACL/ComplexDefaultAllow/ServiceWriteAllowed.Prefix(barfoo2)
=== RUN   TestACL/ComplexDefaultAllow/EventReadAllowed.Prefix(foo)
=== RUN   TestACL/ComplexDefaultAllow/EventWriteAllowed.Prefix(foo)
=== RUN   TestACL/ComplexDefaultAllow/EventReadAllowed.Prefix(foobar)
=== RUN   TestACL/ComplexDefaultAllow/EventWriteAllowed.Prefix(foobar)
=== RUN   TestACL/ComplexDefaultAllow/EventReadDenied.Prefix(bar)
=== RUN   TestACL/ComplexDefaultAllow/EventWriteDenied.Prefix(bar)
=== RUN   TestACL/ComplexDefaultAllow/EventReadDenied.Prefix(barbaz)
=== RUN   TestACL/ComplexDefaultAllow/EventWriteDenied.Prefix(barbaz)
=== RUN   TestACL/ComplexDefaultAllow/EventReadAllowed.Prefix(baz)
=== RUN   TestACL/ComplexDefaultAllow/EventWriteDenied.Prefix(baz)
=== RUN   TestACL/ComplexDefaultAllow/PreparedQueryReadAllowed.Prefix(foo)
=== RUN   TestACL/ComplexDefaultAllow/PreparedQueryWriteAllowed.Prefix(foo)
=== RUN   TestACL/ComplexDefaultAllow/PreparedQueryReadAllowed.Prefix(foobar)
=== RUN   TestACL/ComplexDefaultAllow/PreparedQueryWriteAllowed.Prefix(foobar)
=== RUN   TestACL/ComplexDefaultAllow/PreparedQueryReadDenied.Prefix(bar)
=== RUN   TestACL/ComplexDefaultAllow/PreparedQueryWriteDenied.Prefix(bar)
=== RUN   TestACL/ComplexDefaultAllow/PreparedQueryReadDenied.Prefix(barbaz)
=== RUN   TestACL/ComplexDefaultAllow/PreparedQueryWriteDenied.Prefix(barbaz)
=== RUN   TestACL/ComplexDefaultAllow/PreparedQueryReadAllowed.Prefix(baz)
=== RUN   TestACL/ComplexDefaultAllow/PreparedQueryWriteDenied.Prefix(baz)
=== RUN   TestACL/ComplexDefaultAllow/PreparedQueryReadAllowed.Prefix(nope)
=== RUN   TestACL/ComplexDefaultAllow/PreparedQueryWriteDenied.Prefix(nope)
=== RUN   TestACL/ComplexDefaultAllow/PreparedQueryReadAllowed.Prefix(zoo)
=== RUN   TestACL/ComplexDefaultAllow/PreparedQueryWriteAllowed.Prefix(zoo)
=== RUN   TestACL/ComplexDefaultAllow/PreparedQueryReadAllowed.Prefix(zookeeper)
=== RUN   TestACL/ComplexDefaultAllow/PreparedQueryWriteAllowed.Prefix(zookeeper)
=== RUN   TestACL/ExactMatchPrecedence
=== RUN   TestACL/ExactMatchPrecedence/AgentReadPrefixAllowed.Prefix(fo)
=== RUN   TestACL/ExactMatchPrecedence/AgentWritePrefixDenied.Prefix(fo)
=== RUN   TestACL/ExactMatchPrecedence/AgentReadPrefixAllowed.Prefix(for)
=== RUN   TestACL/ExactMatchPrecedence/AgentWritePrefixDenied.Prefix(for)
=== RUN   TestACL/ExactMatchPrecedence/AgentReadAllowed.Prefix(foo)
=== RUN   TestACL/ExactMatchPrecedence/AgentWriteAllowed.Prefix(foo)
=== RUN   TestACL/ExactMatchPrecedence/AgentReadPrefixAllowed.Prefix(foot)
=== RUN   TestACL/ExactMatchPrecedence/AgentWritePrefixDenied.Prefix(foot)
=== RUN   TestACL/ExactMatchPrecedence/AgentReadPrefixAllowed.Prefix(foot2)
=== RUN   TestACL/ExactMatchPrecedence/AgentWritePrefixDenied.Prefix(foot2)
=== RUN   TestACL/ExactMatchPrecedence/AgentReadPrefixAllowed.Prefix(food)
=== RUN   TestACL/ExactMatchPrecedence/AgentWritePrefixDenied.Prefix(food)
=== RUN   TestACL/ExactMatchPrecedence/AgentReadDenied.Prefix(football)
=== RUN   TestACL/ExactMatchPrecedence/AgentWriteDenied.Prefix(football)
=== RUN   TestACL/ExactMatchPrecedence/KeyReadPrefixAllowed.Prefix(fo)
=== RUN   TestACL/ExactMatchPrecedence/KeyWritePrefixDenied.Prefix(fo)
=== RUN   TestACL/ExactMatchPrecedence/KeyReadPrefixAllowed.Prefix(for)
=== RUN   TestACL/ExactMatchPrecedence/KeyWritePrefixDenied.Prefix(for)
=== RUN   TestACL/ExactMatchPrecedence/KeyReadAllowed.Prefix(foo)
=== RUN   TestACL/ExactMatchPrecedence/KeyWriteAllowed.Prefix(foo)
=== RUN   TestACL/ExactMatchPrecedence/KeyReadPrefixAllowed.Prefix(foot)
=== RUN   TestACL/ExactMatchPrecedence/KeyWritePrefixDenied.Prefix(foot)
=== RUN   TestACL/ExactMatchPrecedence/KeyReadPrefixAllowed.Prefix(foot2)
=== RUN   TestACL/ExactMatchPrecedence/KeyWritePrefixDenied.Prefix(foot2)
=== RUN   TestACL/ExactMatchPrecedence/KeyReadPrefixAllowed.Prefix(food)
=== RUN   TestACL/ExactMatchPrecedence/KeyWritePrefixDenied.Prefix(food)
=== RUN   TestACL/ExactMatchPrecedence/KeyReadDenied.Prefix(football)
=== RUN   TestACL/ExactMatchPrecedence/KeyWriteDenied.Prefix(football)
=== RUN   TestACL/ExactMatchPrecedence/NodeReadPrefixAllowed.Prefix(fo)
=== RUN   TestACL/ExactMatchPrecedence/NodeWritePrefixDenied.Prefix(fo)
=== RUN   TestACL/ExactMatchPrecedence/NodeReadPrefixAllowed.Prefix(for)
=== RUN   TestACL/ExactMatchPrecedence/NodeWritePrefixDenied.Prefix(for)
=== RUN   TestACL/ExactMatchPrecedence/NodeReadAllowed.Prefix(foo)
=== RUN   TestACL/ExactMatchPrecedence/NodeWriteAllowed.Prefix(foo)
=== RUN   TestACL/ExactMatchPrecedence/NodeReadPrefixAllowed.Prefix(foot)
=== RUN   TestACL/ExactMatchPrecedence/NodeWritePrefixDenied.Prefix(foot)
=== RUN   TestACL/ExactMatchPrecedence/NodeReadPrefixAllowed.Prefix(foot2)
=== RUN   TestACL/ExactMatchPrecedence/NodeWritePrefixDenied.Prefix(foot2)
=== RUN   TestACL/ExactMatchPrecedence/NodeReadPrefixAllowed.Prefix(food)
=== RUN   TestACL/ExactMatchPrecedence/NodeWritePrefixDenied.Prefix(food)
=== RUN   TestACL/ExactMatchPrecedence/NodeReadDenied.Prefix(football)
=== RUN   TestACL/ExactMatchPrecedence/NodeWriteDenied.Prefix(football)
=== RUN   TestACL/ExactMatchPrecedence/ServiceReadPrefixAllowed.Prefix(fo)
=== RUN   TestACL/ExactMatchPrecedence/ServiceWritePrefixDenied.Prefix(fo)
=== RUN   TestACL/ExactMatchPrecedence/ServiceReadPrefixAllowed.Prefix(for)
=== RUN   TestACL/ExactMatchPrecedence/ServiceWritePrefixDenied.Prefix(for)
=== RUN   TestACL/ExactMatchPrecedence/ServiceReadAllowed.Prefix(foo)
=== RUN   TestACL/ExactMatchPrecedence/ServiceWriteAllowed.Prefix(foo)
=== RUN   TestACL/ExactMatchPrecedence/ServiceReadPrefixAllowed.Prefix(foot)
=== RUN   TestACL/ExactMatchPrecedence/ServiceWritePrefixDenied.Prefix(foot)
=== RUN   TestACL/ExactMatchPrecedence/ServiceReadPrefixAllowed.Prefix(foot2)
=== RUN   TestACL/ExactMatchPrecedence/ServiceWritePrefixDenied.Prefix(foot2)
=== RUN   TestACL/ExactMatchPrecedence/ServiceReadPrefixAllowed.Prefix(food)
=== RUN   TestACL/ExactMatchPrecedence/ServiceWritePrefixDenied.Prefix(food)
=== RUN   TestACL/ExactMatchPrecedence/ServiceReadDenied.Prefix(football)
=== RUN   TestACL/ExactMatchPrecedence/ServiceWriteDenied.Prefix(football)
=== RUN   TestACL/ExactMatchPrecedence/NodeReadPrefixAllowed.Prefix(fo)#01
=== RUN   TestACL/ExactMatchPrecedence/NodeWritePrefixDenied.Prefix(fo)#01
=== RUN   TestACL/ExactMatchPrecedence/NodeReadPrefixAllowed.Prefix(for)#01
=== RUN   TestACL/ExactMatchPrecedence/NodeWritePrefixDenied.Prefix(for)#01
=== RUN   TestACL/ExactMatchPrecedence/NodeReadAllowed.Prefix(foo)#01
=== RUN   TestACL/ExactMatchPrecedence/NodeWriteAllowed.Prefix(foo)#01
=== RUN   TestACL/ExactMatchPrecedence/NodeReadPrefixAllowed.Prefix(foot)#01
=== RUN   TestACL/ExactMatchPrecedence/NodeWritePrefixDenied.Prefix(foot)#01
=== RUN   TestACL/ExactMatchPrecedence/NodeReadPrefixAllowed.Prefix(foot2)#01
=== RUN   TestACL/ExactMatchPrecedence/NodeWritePrefixDenied.Prefix(foot2)#01
=== RUN   TestACL/ExactMatchPrecedence/NodeReadPrefixAllowed.Prefix(food)#01
=== RUN   TestACL/ExactMatchPrecedence/NodeWritePrefixDenied.Prefix(food)#01
=== RUN   TestACL/ExactMatchPrecedence/NodeReadDenied.Prefix(football)#01
=== RUN   TestACL/ExactMatchPrecedence/NodeWriteDenied.Prefix(football)#01
=== RUN   TestACL/ExactMatchPrecedence/IntentionReadPrefixAllowed.Prefix(fo)
=== RUN   TestACL/ExactMatchPrecedence/IntentionWritePrefixDenied.Prefix(fo)
=== RUN   TestACL/ExactMatchPrecedence/IntentionReadPrefixAllowed.Prefix(for)
=== RUN   TestACL/ExactMatchPrecedence/IntentionWritePrefixDenied.Prefix(for)
=== RUN   TestACL/ExactMatchPrecedence/IntentionReadAllowed.Prefix(foo)
=== RUN   TestACL/ExactMatchPrecedence/IntentionWriteAllowed.Prefix(foo)
=== RUN   TestACL/ExactMatchPrecedence/IntentionReadPrefixAllowed.Prefix(foot)
=== RUN   TestACL/ExactMatchPrecedence/IntentionWritePrefixDenied.Prefix(foot)
=== RUN   TestACL/ExactMatchPrecedence/IntentionReadPrefixAllowed.Prefix(foot2)
=== RUN   TestACL/ExactMatchPrecedence/IntentionWritePrefixDenied.Prefix(foot2)
=== RUN   TestACL/ExactMatchPrecedence/IntentionReadPrefixAllowed.Prefix(food)
=== RUN   TestACL/ExactMatchPrecedence/IntentionWritePrefixDenied.Prefix(food)
=== RUN   TestACL/ExactMatchPrecedence/IntentionReadDenied.Prefix(football)
=== RUN   TestACL/ExactMatchPrecedence/IntentionWriteDenied.Prefix(football)
=== RUN   TestACL/ExactMatchPrecedence/SessionReadPrefixAllowed.Prefix(fo)
=== RUN   TestACL/ExactMatchPrecedence/SessionWritePrefixDenied.Prefix(fo)
=== RUN   TestACL/ExactMatchPrecedence/SessionReadPrefixAllowed.Prefix(for)
=== RUN   TestACL/ExactMatchPrecedence/SessionWritePrefixDenied.Prefix(for)
=== RUN   TestACL/ExactMatchPrecedence/SessionReadAllowed.Prefix(foo)
=== RUN   TestACL/ExactMatchPrecedence/SessionWriteAllowed.Prefix(foo)
=== RUN   TestACL/ExactMatchPrecedence/SessionReadPrefixAllowed.Prefix(foot)
=== RUN   TestACL/ExactMatchPrecedence/SessionWritePrefixDenied.Prefix(foot)
=== RUN   TestACL/ExactMatchPrecedence/SessionReadPrefixAllowed.Prefix(foot2)
=== RUN   TestACL/ExactMatchPrecedence/SessionWritePrefixDenied.Prefix(foot2)
=== RUN   TestACL/ExactMatchPrecedence/SessionReadPrefixAllowed.Prefix(food)
=== RUN   TestACL/ExactMatchPrecedence/SessionWritePrefixDenied.Prefix(food)
=== RUN   TestACL/ExactMatchPrecedence/SessionReadDenied.Prefix(football)
=== RUN   TestACL/ExactMatchPrecedence/SessionWriteDenied.Prefix(football)
=== RUN   TestACL/ExactMatchPrecedence/EventReadPrefixAllowed.Prefix(fo)
=== RUN   TestACL/ExactMatchPrecedence/EventWritePrefixDenied.Prefix(fo)
=== RUN   TestACL/ExactMatchPrecedence/EventReadPrefixAllowed.Prefix(for)
=== RUN   TestACL/ExactMatchPrecedence/EventWritePrefixDenied.Prefix(for)
=== RUN   TestACL/ExactMatchPrecedence/EventReadAllowed.Prefix(foo)
=== RUN   TestACL/ExactMatchPrecedence/EventWriteAllowed.Prefix(foo)
=== RUN   TestACL/ExactMatchPrecedence/EventReadPrefixAllowed.Prefix(foot)
=== RUN   TestACL/ExactMatchPrecedence/EventWritePrefixDenied.Prefix(foot)
=== RUN   TestACL/ExactMatchPrecedence/EventReadPrefixAllowed.Prefix(foot2)
=== RUN   TestACL/ExactMatchPrecedence/EventWritePrefixDenied.Prefix(foot2)
=== RUN   TestACL/ExactMatchPrecedence/EventReadPrefixAllowed.Prefix(food)
=== RUN   TestACL/ExactMatchPrecedence/EventWritePrefixDenied.Prefix(food)
=== RUN   TestACL/ExactMatchPrecedence/EventReadDenied.Prefix(football)
=== RUN   TestACL/ExactMatchPrecedence/EventWriteDenied.Prefix(football)
=== RUN   TestACL/ExactMatchPrecedence/PreparedQueryReadPrefixAllowed.Prefix(fo)
=== RUN   TestACL/ExactMatchPrecedence/PreparedQueryWritePrefixDenied.Prefix(fo)
=== RUN   TestACL/ExactMatchPrecedence/PreparedQueryReadPrefixAllowed.Prefix(for)
=== RUN   TestACL/ExactMatchPrecedence/PreparedQueryWritePrefixDenied.Prefix(for)
=== RUN   TestACL/ExactMatchPrecedence/PreparedQueryReadAllowed.Prefix(foo)
=== RUN   TestACL/ExactMatchPrecedence/PreparedQueryWriteAllowed.Prefix(foo)
=== RUN   TestACL/ExactMatchPrecedence/PreparedQueryReadPrefixAllowed.Prefix(foot)
=== RUN   TestACL/ExactMatchPrecedence/PreparedQueryWritePrefixDenied.Prefix(foot)
=== RUN   TestACL/ExactMatchPrecedence/PreparedQueryReadPrefixAllowed.Prefix(foot2)
=== RUN   TestACL/ExactMatchPrecedence/PreparedQueryWritePrefixDenied.Prefix(foot2)
=== RUN   TestACL/ExactMatchPrecedence/PreparedQueryReadPrefixAllowed.Prefix(food)
=== RUN   TestACL/ExactMatchPrecedence/PreparedQueryWritePrefixDenied.Prefix(food)
=== RUN   TestACL/ExactMatchPrecedence/PreparedQueryReadDenied.Prefix(football)
=== RUN   TestACL/ExactMatchPrecedence/PreparedQueryWriteDenied.Prefix(football)
=== RUN   TestACL/ACLRead
=== RUN   TestACL/ACLRead/ReadAllowed
=== RUN   TestACL/ACLRead/WriteDenied
=== RUN   TestACL/ACLRead#01
=== RUN   TestACL/ACLRead#01/ReadAllowed
=== RUN   TestACL/ACLRead#01/WriteAllowed
=== RUN   TestACL/KeyWritePrefixDefaultDeny
=== RUN   TestACL/KeyWritePrefixDefaultDeny/DeniedTopLevelPrefix.Prefix(foo)
=== RUN   TestACL/KeyWritePrefixDefaultDeny/AllowedTopLevelPrefix.Prefix(baz/)
=== RUN   TestACL/KeyWritePrefixDefaultDeny/AllowedPrefixWithNestedWrite.Prefix(foo/)
=== RUN   TestACL/KeyWritePrefixDefaultDeny/DenyPrefixWithNestedRead.Prefix(bar/)
=== RUN   TestACL/KeyWritePrefixDefaultDeny/DenyNoPrefixMatch.Prefix(te)
=== RUN   TestACL/KeyWritePrefixDefaultAllow
=== RUN   TestACL/KeyWritePrefixDefaultAllow/KeyWritePrefixDenied.Prefix(foo)
=== RUN   TestACL/KeyWritePrefixDefaultAllow/KeyWritePrefixAllowed.Prefix(bar)
--- PASS: TestACL (0.28s)
    --- PASS: TestACL/DenyAll (0.01s)
        --- PASS: TestACL/DenyAll/DenyACLRead (0.00s)
        --- PASS: TestACL/DenyAll/DenyACLWrite (0.00s)
        --- PASS: TestACL/DenyAll/DenyAgentRead (0.00s)
        --- PASS: TestACL/DenyAll/DenyAgentWrite (0.00s)
        --- PASS: TestACL/DenyAll/DenyEventRead (0.00s)
        --- PASS: TestACL/DenyAll/DenyEventWrite (0.00s)
        --- PASS: TestACL/DenyAll/DenyIntentionDefaultAllow (0.00s)
        --- PASS: TestACL/DenyAll/DenyIntentionRead (0.00s)
        --- PASS: TestACL/DenyAll/DenyIntentionWrite (0.00s)
        --- PASS: TestACL/DenyAll/DenyKeyRead (0.00s)
        --- PASS: TestACL/DenyAll/DenyKeyringRead (0.00s)
        --- PASS: TestACL/DenyAll/DenyKeyringWrite (0.00s)
        --- PASS: TestACL/DenyAll/DenyKeyWrite (0.00s)
        --- PASS: TestACL/DenyAll/DenyNodeRead (0.00s)
        --- PASS: TestACL/DenyAll/DenyNodeWrite (0.00s)
        --- PASS: TestACL/DenyAll/DenyOperatorRead (0.00s)
        --- PASS: TestACL/DenyAll/DenyOperatorWrite (0.00s)
        --- PASS: TestACL/DenyAll/DenyPreparedQueryRead (0.00s)
        --- PASS: TestACL/DenyAll/DenyPreparedQueryWrite (0.00s)
        --- PASS: TestACL/DenyAll/DenyServiceRead (0.00s)
        --- PASS: TestACL/DenyAll/DenyServiceWrite (0.00s)
        --- PASS: TestACL/DenyAll/DenySessionRead (0.00s)
        --- PASS: TestACL/DenyAll/DenySessionWrite (0.00s)
        --- PASS: TestACL/DenyAll/DenySnapshot (0.00s)
    --- PASS: TestACL/AllowAll (0.01s)
        --- PASS: TestACL/AllowAll/DenyACLRead (0.00s)
        --- PASS: TestACL/AllowAll/DenyACLWrite (0.00s)
        --- PASS: TestACL/AllowAll/AllowAgentRead (0.00s)
        --- PASS: TestACL/AllowAll/AllowAgentWrite (0.00s)
        --- PASS: TestACL/AllowAll/AllowEventRead (0.00s)
        --- PASS: TestACL/AllowAll/AllowEventWrite (0.00s)
        --- PASS: TestACL/AllowAll/AllowIntentionDefaultAllow (0.00s)
        --- PASS: TestACL/AllowAll/AllowIntentionRead (0.00s)
        --- PASS: TestACL/AllowAll/AllowIntentionWrite (0.00s)
        --- PASS: TestACL/AllowAll/AllowKeyRead (0.00s)
        --- PASS: TestACL/AllowAll/AllowKeyringRead (0.00s)
        --- PASS: TestACL/AllowAll/AllowKeyringWrite (0.00s)
        --- PASS: TestACL/AllowAll/AllowKeyWrite (0.00s)
        --- PASS: TestACL/AllowAll/AllowNodeRead (0.00s)
        --- PASS: TestACL/AllowAll/AllowNodeWrite (0.00s)
        --- PASS: TestACL/AllowAll/AllowOperatorRead (0.00s)
        --- PASS: TestACL/AllowAll/AllowOperatorWrite (0.00s)
        --- PASS: TestACL/AllowAll/AllowPreparedQueryRead (0.00s)
        --- PASS: TestACL/AllowAll/AllowPreparedQueryWrite (0.00s)
        --- PASS: TestACL/AllowAll/AllowServiceRead (0.00s)
        --- PASS: TestACL/AllowAll/AllowServiceWrite (0.00s)
        --- PASS: TestACL/AllowAll/AllowSessionRead (0.00s)
        --- PASS: TestACL/AllowAll/AllowSessionWrite (0.00s)
        --- PASS: TestACL/AllowAll/DenySnapshot (0.00s)
    --- PASS: TestACL/ManageAll (0.02s)
        --- PASS: TestACL/ManageAll/AllowACLRead (0.00s)
        --- PASS: TestACL/ManageAll/AllowACLWrite (0.00s)
        --- PASS: TestACL/ManageAll/AllowAgentRead (0.00s)
        --- PASS: TestACL/ManageAll/AllowAgentWrite (0.00s)
        --- PASS: TestACL/ManageAll/AllowEventRead (0.00s)
        --- PASS: TestACL/ManageAll/AllowEventWrite (0.00s)
        --- PASS: TestACL/ManageAll/AllowIntentionDefaultAllow (0.00s)
        --- PASS: TestACL/ManageAll/AllowIntentionRead (0.00s)
        --- PASS: TestACL/ManageAll/AllowIntentionWrite (0.00s)
        --- PASS: TestACL/ManageAll/AllowKeyRead (0.00s)
        --- PASS: TestACL/ManageAll/AllowKeyringRead (0.00s)
        --- PASS: TestACL/ManageAll/AllowKeyringWrite (0.00s)
        --- PASS: TestACL/ManageAll/AllowKeyWrite (0.00s)
        --- PASS: TestACL/ManageAll/AllowNodeRead (0.00s)
        --- PASS: TestACL/ManageAll/AllowNodeWrite (0.00s)
        --- PASS: TestACL/ManageAll/AllowOperatorRead (0.00s)
        --- PASS: TestACL/ManageAll/AllowOperatorWrite (0.00s)
        --- PASS: TestACL/ManageAll/AllowPreparedQueryRead (0.00s)
        --- PASS: TestACL/ManageAll/AllowPreparedQueryWrite (0.00s)
        --- PASS: TestACL/ManageAll/AllowServiceRead (0.00s)
        --- PASS: TestACL/ManageAll/AllowServiceWrite (0.00s)
        --- PASS: TestACL/ManageAll/AllowSessionRead (0.00s)
        --- PASS: TestACL/ManageAll/AllowSessionWrite (0.00s)
        --- PASS: TestACL/ManageAll/AllowSnapshot (0.00s)
    --- PASS: TestACL/AgentBasicDefaultDeny (0.01s)
        --- PASS: TestACL/AgentBasicDefaultDeny/DefaultReadDenied.Prefix(ro) (0.00s)
        --- PASS: TestACL/AgentBasicDefaultDeny/DefaultWriteDenied.Prefix(ro) (0.00s)
        --- PASS: TestACL/AgentBasicDefaultDeny/ROReadAllowed.Prefix(root) (0.00s)
        --- PASS: TestACL/AgentBasicDefaultDeny/ROWriteDenied.Prefix(root) (0.00s)
        --- PASS: TestACL/AgentBasicDefaultDeny/ROSuffixReadAllowed.Prefix(root-ro) (0.00s)
        --- PASS: TestACL/AgentBasicDefaultDeny/ROSuffixWriteDenied.Prefix(root-ro) (0.00s)
        --- PASS: TestACL/AgentBasicDefaultDeny/RWReadAllowed.Prefix(root-rw) (0.00s)
        --- PASS: TestACL/AgentBasicDefaultDeny/RWWriteDenied.Prefix(root-rw) (0.00s)
        --- PASS: TestACL/AgentBasicDefaultDeny/RWSuffixReadAllowed.Prefix(root-rw-sub) (0.00s)
        --- PASS: TestACL/AgentBasicDefaultDeny/RWSuffixWriteAllowed.Prefix(root-rw-sub) (0.00s)
        --- PASS: TestACL/AgentBasicDefaultDeny/DenyReadDenied.Prefix(root-nope) (0.00s)
        --- PASS: TestACL/AgentBasicDefaultDeny/DenyWriteDenied.Prefix(root-nope) (0.00s)
        --- PASS: TestACL/AgentBasicDefaultDeny/DenySuffixReadDenied.Prefix(root-nope-sub) (0.00s)
        --- PASS: TestACL/AgentBasicDefaultDeny/DenySuffixWriteDenied.Prefix(root-nope-sub) (0.00s)
    --- PASS: TestACL/AgentBasicDefaultAllow (0.01s)
        --- PASS: TestACL/AgentBasicDefaultAllow/DefaultReadDenied.Prefix(ro) (0.00s)
        --- PASS: TestACL/AgentBasicDefaultAllow/DefaultWriteDenied.Prefix(ro) (0.00s)
        --- PASS: TestACL/AgentBasicDefaultAllow/ROReadAllowed.Prefix(root) (0.00s)
        --- PASS: TestACL/AgentBasicDefaultAllow/ROWriteDenied.Prefix(root) (0.00s)
        --- PASS: TestACL/AgentBasicDefaultAllow/ROSuffixReadAllowed.Prefix(root-ro) (0.00s)
        --- PASS: TestACL/AgentBasicDefaultAllow/ROSuffixWriteDenied.Prefix(root-ro) (0.00s)
        --- PASS: TestACL/AgentBasicDefaultAllow/RWReadAllowed.Prefix(root-rw) (0.00s)
        --- PASS: TestACL/AgentBasicDefaultAllow/RWWriteDenied.Prefix(root-rw) (0.00s)
        --- PASS: TestACL/AgentBasicDefaultAllow/RWSuffixReadAllowed.Prefix(root-rw-sub) (0.00s)
        --- PASS: TestACL/AgentBasicDefaultAllow/RWSuffixWriteAllowed.Prefix(root-rw-sub) (0.00s)
        --- PASS: TestACL/AgentBasicDefaultAllow/DenyReadDenied.Prefix(root-nope) (0.00s)
        --- PASS: TestACL/AgentBasicDefaultAllow/DenyWriteDenied.Prefix(root-nope) (0.00s)
        --- PASS: TestACL/AgentBasicDefaultAllow/DenySuffixReadDenied.Prefix(root-nope-sub) (0.00s)
        --- PASS: TestACL/AgentBasicDefaultAllow/DenySuffixWriteDenied.Prefix(root-nope-sub) (0.00s)
    --- PASS: TestACL/PreparedQueryDefaultAllow (0.00s)
        --- PASS: TestACL/PreparedQueryDefaultAllow/ReadAllowed.Prefix(foo) (0.00s)
        --- PASS: TestACL/PreparedQueryDefaultAllow/WriteAllowed.Prefix(foo) (0.00s)
        --- PASS: TestACL/PreparedQueryDefaultAllow/ReadDenied.Prefix(other) (0.00s)
        --- PASS: TestACL/PreparedQueryDefaultAllow/WriteDenied.Prefix(other) (0.00s)
    --- PASS: TestACL/AgentNestedDefaultDeny (0.02s)
        --- PASS: TestACL/AgentNestedDefaultDeny/DefaultReadDenied.Prefix(nope) (0.00s)
        --- PASS: TestACL/AgentNestedDefaultDeny/DefaultWriteDenied.Prefix(nope) (0.00s)
        --- PASS: TestACL/AgentNestedDefaultDeny/DenyReadDenied.Prefix(root-nope) (0.00s)
        --- PASS: TestACL/AgentNestedDefaultDeny/DenyWriteDenied.Prefix(root-nope) (0.00s)
        --- PASS: TestACL/AgentNestedDefaultDeny/ROReadAllowed.Prefix(root-ro) (0.00s)
        --- PASS: TestACL/AgentNestedDefaultDeny/ROWriteDenied.Prefix(root-ro) (0.00s)
        --- PASS: TestACL/AgentNestedDefaultDeny/RWReadAllowed.Prefix(root-rw) (0.00s)
        --- PASS: TestACL/AgentNestedDefaultDeny/RWWriteAllowed.Prefix(root-rw) (0.00s)
        --- PASS: TestACL/AgentNestedDefaultDeny/DenySuffixReadDenied.Prefix(root-nope-prefix) (0.00s)
        --- PASS: TestACL/AgentNestedDefaultDeny/DenySuffixWriteDenied.Prefix(root-nope-prefix) (0.00s)
        --- PASS: TestACL/AgentNestedDefaultDeny/ROSuffixReadAllowed.Prefix(root-ro-prefix) (0.00s)
        --- PASS: TestACL/AgentNestedDefaultDeny/ROSuffixWriteDenied.Prefix(root-ro-prefix) (0.00s)
        --- PASS: TestACL/AgentNestedDefaultDeny/RWSuffixReadAllowed.Prefix(root-rw-prefix) (0.00s)
        --- PASS: TestACL/AgentNestedDefaultDeny/RWSuffixWriteAllowed.Prefix(root-rw-prefix) (0.00s)
        --- PASS: TestACL/AgentNestedDefaultDeny/ChildDenyReadDenied.Prefix(child-nope) (0.00s)
        --- PASS: TestACL/AgentNestedDefaultDeny/ChildDenyWriteDenied.Prefix(child-nope) (0.00s)
        --- PASS: TestACL/AgentNestedDefaultDeny/ChildROReadAllowed.Prefix(child-ro) (0.00s)
        --- PASS: TestACL/AgentNestedDefaultDeny/ChildROWriteDenied.Prefix(child-ro) (0.00s)
        --- PASS: TestACL/AgentNestedDefaultDeny/ChildRWReadAllowed.Prefix(child-rw) (0.00s)
        --- PASS: TestACL/AgentNestedDefaultDeny/ChildRWWriteAllowed.Prefix(child-rw) (0.00s)
        --- PASS: TestACL/AgentNestedDefaultDeny/ChildDenySuffixReadDenied.Prefix(child-nope-prefix) (0.00s)
        --- PASS: TestACL/AgentNestedDefaultDeny/ChildDenySuffixWriteDenied.Prefix(child-nope-prefix) (0.00s)
        --- PASS: TestACL/AgentNestedDefaultDeny/ChildROSuffixReadAllowed.Prefix(child-ro-prefix) (0.00s)
        --- PASS: TestACL/AgentNestedDefaultDeny/ChildROSuffixWriteDenied.Prefix(child-ro-prefix) (0.00s)
        --- PASS: TestACL/AgentNestedDefaultDeny/ChildRWSuffixReadAllowed.Prefix(child-rw-prefix) (0.00s)
        --- PASS: TestACL/AgentNestedDefaultDeny/ChildRWSuffixWriteAllowed.Prefix(child-rw-prefix) (0.00s)
        --- PASS: TestACL/AgentNestedDefaultDeny/ChildOverrideReadAllowed.Prefix(override) (0.00s)
        --- PASS: TestACL/AgentNestedDefaultDeny/ChildOverrideWriteAllowed.Prefix(override) (0.00s)
    --- PASS: TestACL/AgentNestedDefaultAllow (0.02s)
        --- PASS: TestACL/AgentNestedDefaultAllow/DefaultReadAllowed.Prefix(nope) (0.00s)
        --- PASS: TestACL/AgentNestedDefaultAllow/DefaultWriteAllowed.Prefix(nope) (0.00s)
        --- PASS: TestACL/AgentNestedDefaultAllow/DenyReadDenied.Prefix(root-nope) (0.00s)
        --- PASS: TestACL/AgentNestedDefaultAllow/DenyWriteDenied.Prefix(root-nope) (0.00s)
        --- PASS: TestACL/AgentNestedDefaultAllow/ROReadAllowed.Prefix(root-ro) (0.00s)
        --- PASS: TestACL/AgentNestedDefaultAllow/ROWriteDenied.Prefix(root-ro) (0.00s)
        --- PASS: TestACL/AgentNestedDefaultAllow/RWReadAllowed.Prefix(root-rw) (0.00s)
        --- PASS: TestACL/AgentNestedDefaultAllow/RWWriteAllowed.Prefix(root-rw) (0.00s)
        --- PASS: TestACL/AgentNestedDefaultAllow/DenySuffixReadDenied.Prefix(root-nope-prefix) (0.00s)
        --- PASS: TestACL/AgentNestedDefaultAllow/DenySuffixWriteDenied.Prefix(root-nope-prefix) (0.00s)
        --- PASS: TestACL/AgentNestedDefaultAllow/ROSuffixReadAllowed.Prefix(root-ro-prefix) (0.00s)
        --- PASS: TestACL/AgentNestedDefaultAllow/ROSuffixWriteDenied.Prefix(root-ro-prefix) (0.00s)
        --- PASS: TestACL/AgentNestedDefaultAllow/RWSuffixReadAllowed.Prefix(root-rw-prefix) (0.00s)
        --- PASS: TestACL/AgentNestedDefaultAllow/RWSuffixWriteAllowed.Prefix(root-rw-prefix) (0.00s)
        --- PASS: TestACL/AgentNestedDefaultAllow/ChildDenyReadDenied.Prefix(child-nope) (0.00s)
        --- PASS: TestACL/AgentNestedDefaultAllow/ChildDenyWriteDenied.Prefix(child-nope) (0.00s)
        --- PASS: TestACL/AgentNestedDefaultAllow/ChildROReadAllowed.Prefix(child-ro) (0.00s)
        --- PASS: TestACL/AgentNestedDefaultAllow/ChildROWriteDenied.Prefix(child-ro) (0.00s)
        --- PASS: TestACL/AgentNestedDefaultAllow/ChildRWReadAllowed.Prefix(child-rw) (0.00s)
        --- PASS: TestACL/AgentNestedDefaultAllow/ChildRWWriteAllowed.Prefix(child-rw) (0.00s)
        --- PASS: TestACL/AgentNestedDefaultAllow/ChildDenySuffixReadDenied.Prefix(child-nope-prefix) (0.00s)
        --- PASS: TestACL/AgentNestedDefaultAllow/ChildDenySuffixWriteDenied.Prefix(child-nope-prefix) (0.00s)
        --- PASS: TestACL/AgentNestedDefaultAllow/ChildROSuffixReadAllowed.Prefix(child-ro-prefix) (0.00s)
        --- PASS: TestACL/AgentNestedDefaultAllow/ChildROSuffixWriteDenied.Prefix(child-ro-prefix) (0.00s)
        --- PASS: TestACL/AgentNestedDefaultAllow/ChildRWSuffixReadAllowed.Prefix(child-rw-prefix) (0.00s)
        --- PASS: TestACL/AgentNestedDefaultAllow/ChildRWSuffixWriteAllowed.Prefix(child-rw-prefix) (0.00s)
        --- PASS: TestACL/AgentNestedDefaultAllow/ChildOverrideReadAllowed.Prefix(override) (0.00s)
        --- PASS: TestACL/AgentNestedDefaultAllow/ChildOverrideWriteAllowed.Prefix(override) (0.00s)
    --- PASS: TestACL/KeyringDefaultAllowPolicyDeny (0.00s)
        --- PASS: TestACL/KeyringDefaultAllowPolicyDeny/ReadDenied (0.00s)
        --- PASS: TestACL/KeyringDefaultAllowPolicyDeny/WriteDenied (0.00s)
    --- PASS: TestACL/KeyringDefaultAllowPolicyRead (0.00s)
        --- PASS: TestACL/KeyringDefaultAllowPolicyRead/ReadAllowed (0.00s)
        --- PASS: TestACL/KeyringDefaultAllowPolicyRead/WriteDenied (0.00s)
    --- PASS: TestACL/KeyringDefaultAllowPolicyWrite (0.00s)
        --- PASS: TestACL/KeyringDefaultAllowPolicyWrite/ReadAllowed (0.00s)
        --- PASS: TestACL/KeyringDefaultAllowPolicyWrite/WriteAllowed (0.00s)
    --- PASS: TestACL/KeyringDefaultAllowPolicyNone (0.00s)
        --- PASS: TestACL/KeyringDefaultAllowPolicyNone/ReadAllowed (0.00s)
        --- PASS: TestACL/KeyringDefaultAllowPolicyNone/WriteAllowed (0.00s)
    --- PASS: TestACL/KeyringDefaultDenyPolicyDeny (0.00s)
        --- PASS: TestACL/KeyringDefaultDenyPolicyDeny/ReadDenied (0.00s)
        --- PASS: TestACL/KeyringDefaultDenyPolicyDeny/WriteDenied (0.00s)
    --- PASS: TestACL/KeyringDefaultDenyPolicyRead (0.00s)
        --- PASS: TestACL/KeyringDefaultDenyPolicyRead/ReadAllowed (0.00s)
        --- PASS: TestACL/KeyringDefaultDenyPolicyRead/WriteDenied (0.00s)
    --- PASS: TestACL/KeyringDefaultDenyPolicyWrite (0.00s)
        --- PASS: TestACL/KeyringDefaultDenyPolicyWrite/ReadAllowed (0.00s)
        --- PASS: TestACL/KeyringDefaultDenyPolicyWrite/WriteAllowed (0.00s)
    --- PASS: TestACL/KeyringDefaultDenyPolicyNone (0.00s)
        --- PASS: TestACL/KeyringDefaultDenyPolicyNone/ReadDenied (0.00s)
        --- PASS: TestACL/KeyringDefaultDenyPolicyNone/WriteDenied (0.00s)
    --- PASS: TestACL/OperatorDefaultAllowPolicyDeny (0.00s)
        --- PASS: TestACL/OperatorDefaultAllowPolicyDeny/ReadDenied (0.00s)
        --- PASS: TestACL/OperatorDefaultAllowPolicyDeny/WriteDenied (0.00s)
    --- PASS: TestACL/OperatorDefaultAllowPolicyRead (0.00s)
        --- PASS: TestACL/OperatorDefaultAllowPolicyRead/ReadAllowed (0.00s)
        --- PASS: TestACL/OperatorDefaultAllowPolicyRead/WriteDenied (0.00s)
    --- PASS: TestACL/OperatorDefaultAllowPolicyWrite (0.00s)
        --- PASS: TestACL/OperatorDefaultAllowPolicyWrite/ReadAllowed (0.00s)
        --- PASS: TestACL/OperatorDefaultAllowPolicyWrite/WriteAllowed (0.00s)
    --- PASS: TestACL/OperatorDefaultAllowPolicyNone (0.00s)
        --- PASS: TestACL/OperatorDefaultAllowPolicyNone/ReadAllowed (0.00s)
        --- PASS: TestACL/OperatorDefaultAllowPolicyNone/WriteAllowed (0.00s)
    --- PASS: TestACL/OperatorDefaultDenyPolicyDeny (0.00s)
        --- PASS: TestACL/OperatorDefaultDenyPolicyDeny/ReadDenied (0.00s)
        --- PASS: TestACL/OperatorDefaultDenyPolicyDeny/WriteDenied (0.00s)
    --- PASS: TestACL/OperatorDefaultDenyPolicyRead (0.00s)
        --- PASS: TestACL/OperatorDefaultDenyPolicyRead/ReadAllowed (0.00s)
        --- PASS: TestACL/OperatorDefaultDenyPolicyRead/WriteDenied (0.00s)
    --- PASS: TestACL/OperatorDefaultDenyPolicyWrite (0.00s)
        --- PASS: TestACL/OperatorDefaultDenyPolicyWrite/ReadAllowed (0.00s)
        --- PASS: TestACL/OperatorDefaultDenyPolicyWrite/WriteAllowed (0.00s)
    --- PASS: TestACL/OperatorDefaultDenyPolicyNone (0.00s)
        --- PASS: TestACL/OperatorDefaultDenyPolicyNone/ReadDenied (0.00s)
        --- PASS: TestACL/OperatorDefaultDenyPolicyNone/WriteDenied (0.00s)
    --- PASS: TestACL/NodeDefaultDeny (0.01s)
        --- PASS: TestACL/NodeDefaultDeny/DefaultReadDenied.Prefix(nope) (0.00s)
        --- PASS: TestACL/NodeDefaultDeny/DefaultWriteDenied.Prefix(nope) (0.00s)
        --- PASS: TestACL/NodeDefaultDeny/DenyReadDenied.Prefix(root-nope) (0.00s)
        --- PASS: TestACL/NodeDefaultDeny/DenyWriteDenied.Prefix(root-nope) (0.00s)
        --- PASS: TestACL/NodeDefaultDeny/ROReadAllowed.Prefix(root-ro) (0.00s)
        --- PASS: TestACL/NodeDefaultDeny/ROWriteDenied.Prefix(root-ro) (0.00s)
        --- PASS: TestACL/NodeDefaultDeny/RWReadAllowed.Prefix(root-rw) (0.00s)
        --- PASS: TestACL/NodeDefaultDeny/RWWriteAllowed.Prefix(root-rw) (0.00s)
        --- PASS: TestACL/NodeDefaultDeny/DenySuffixReadDenied.Prefix(root-nope-prefix) (0.00s)
        --- PASS: TestACL/NodeDefaultDeny/DenySuffixWriteDenied.Prefix(root-nope-prefix) (0.00s)
        --- PASS: TestACL/NodeDefaultDeny/ROSuffixReadAllowed.Prefix(root-ro-prefix) (0.00s)
        --- PASS: TestACL/NodeDefaultDeny/ROSuffixWriteDenied.Prefix(root-ro-prefix) (0.00s)
        --- PASS: TestACL/NodeDefaultDeny/RWSuffixReadAllowed.Prefix(root-rw-prefix) (0.00s)
        --- PASS: TestACL/NodeDefaultDeny/RWSuffixWriteAllowed.Prefix(root-rw-prefix) (0.00s)
        --- PASS: TestACL/NodeDefaultDeny/ChildDenyReadDenied.Prefix(child-nope) (0.00s)
        --- PASS: TestACL/NodeDefaultDeny/ChildDenyWriteDenied.Prefix(child-nope) (0.00s)
        --- PASS: TestACL/NodeDefaultDeny/ChildROReadAllowed.Prefix(child-ro) (0.00s)
        --- PASS: TestACL/NodeDefaultDeny/ChildROWriteDenied.Prefix(child-ro) (0.00s)
        --- PASS: TestACL/NodeDefaultDeny/ChildRWReadAllowed.Prefix(child-rw) (0.00s)
        --- PASS: TestACL/NodeDefaultDeny/ChildRWWriteAllowed.Prefix(child-rw) (0.00s)
        --- PASS: TestACL/NodeDefaultDeny/ChildDenySuffixReadDenied.Prefix(child-nope-prefix) (0.00s)
        --- PASS: TestACL/NodeDefaultDeny/ChildDenySuffixWriteDenied.Prefix(child-nope-prefix) (0.00s)
        --- PASS: TestACL/NodeDefaultDeny/ChildROSuffixReadAllowed.Prefix(child-ro-prefix) (0.00s)
        --- PASS: TestACL/NodeDefaultDeny/ChildROSuffixWriteDenied.Prefix(child-ro-prefix) (0.00s)
        --- PASS: TestACL/NodeDefaultDeny/ChildRWSuffixReadAllowed.Prefix(child-rw-prefix) (0.00s)
        --- PASS: TestACL/NodeDefaultDeny/ChildRWSuffixWriteAllowed.Prefix(child-rw-prefix) (0.00s)
        --- PASS: TestACL/NodeDefaultDeny/ChildOverrideReadAllowed.Prefix(override) (0.00s)
        --- PASS: TestACL/NodeDefaultDeny/ChildOverrideWriteAllowed.Prefix(override) (0.00s)
    --- PASS: TestACL/NodeDefaultAllow (0.01s)
        --- PASS: TestACL/NodeDefaultAllow/DefaultReadAllowed.Prefix(nope) (0.00s)
        --- PASS: TestACL/NodeDefaultAllow/DefaultWriteAllowed.Prefix(nope) (0.00s)
        --- PASS: TestACL/NodeDefaultAllow/DenyReadDenied.Prefix(root-nope) (0.00s)
        --- PASS: TestACL/NodeDefaultAllow/DenyWriteDenied.Prefix(root-nope) (0.00s)
        --- PASS: TestACL/NodeDefaultAllow/ROReadAllowed.Prefix(root-ro) (0.00s)
        --- PASS: TestACL/NodeDefaultAllow/ROWriteDenied.Prefix(root-ro) (0.00s)
        --- PASS: TestACL/NodeDefaultAllow/RWReadAllowed.Prefix(root-rw) (0.00s)
        --- PASS: TestACL/NodeDefaultAllow/RWWriteAllowed.Prefix(root-rw) (0.00s)
        --- PASS: TestACL/NodeDefaultAllow/DenySuffixReadDenied.Prefix(root-nope-prefix) (0.00s)
        --- PASS: TestACL/NodeDefaultAllow/DenySuffixWriteDenied.Prefix(root-nope-prefix) (0.00s)
        --- PASS: TestACL/NodeDefaultAllow/ROSuffixReadAllowed.Prefix(root-ro-prefix) (0.00s)
        --- PASS: TestACL/NodeDefaultAllow/ROSuffixWriteDenied.Prefix(root-ro-prefix) (0.00s)
        --- PASS: TestACL/NodeDefaultAllow/RWSuffixReadAllowed.Prefix(root-rw-prefix) (0.00s)
        --- PASS: TestACL/NodeDefaultAllow/RWSuffixWriteAllowed.Prefix(root-rw-prefix) (0.00s)
        --- PASS: TestACL/NodeDefaultAllow/ChildDenyReadDenied.Prefix(child-nope) (0.00s)
        --- PASS: TestACL/NodeDefaultAllow/ChildDenyWriteDenied.Prefix(child-nope) (0.00s)
        --- PASS: TestACL/NodeDefaultAllow/ChildROReadAllowed.Prefix(child-ro) (0.00s)
        --- PASS: TestACL/NodeDefaultAllow/ChildROWriteDenied.Prefix(child-ro) (0.00s)
        --- PASS: TestACL/NodeDefaultAllow/ChildRWReadAllowed.Prefix(child-rw) (0.00s)
        --- PASS: TestACL/NodeDefaultAllow/ChildRWWriteAllowed.Prefix(child-rw) (0.00s)
        --- PASS: TestACL/NodeDefaultAllow/ChildDenySuffixReadDenied.Prefix(child-nope-prefix) (0.00s)
        --- PASS: TestACL/NodeDefaultAllow/ChildDenySuffixWriteDenied.Prefix(child-nope-prefix) (0.00s)
        --- PASS: TestACL/NodeDefaultAllow/ChildROSuffixReadAllowed.Prefix(child-ro-prefix) (0.00s)
        --- PASS: TestACL/NodeDefaultAllow/ChildROSuffixWriteDenied.Prefix(child-ro-prefix) (0.00s)
        --- PASS: TestACL/NodeDefaultAllow/ChildRWSuffixReadAllowed.Prefix(child-rw-prefix) (0.00s)
        --- PASS: TestACL/NodeDefaultAllow/ChildRWSuffixWriteAllowed.Prefix(child-rw-prefix) (0.00s)
        --- PASS: TestACL/NodeDefaultAllow/ChildOverrideReadAllowed.Prefix(override) (0.00s)
        --- PASS: TestACL/NodeDefaultAllow/ChildOverrideWriteAllowed.Prefix(override) (0.00s)
    --- PASS: TestACL/SessionDefaultDeny (0.02s)
        --- PASS: TestACL/SessionDefaultDeny/DefaultReadDenied.Prefix(nope) (0.00s)
        --- PASS: TestACL/SessionDefaultDeny/DefaultWriteDenied.Prefix(nope) (0.00s)
        --- PASS: TestACL/SessionDefaultDeny/DenyReadDenied.Prefix(root-nope) (0.00s)
        --- PASS: TestACL/SessionDefaultDeny/DenyWriteDenied.Prefix(root-nope) (0.00s)
        --- PASS: TestACL/SessionDefaultDeny/ROReadAllowed.Prefix(root-ro) (0.00s)
        --- PASS: TestACL/SessionDefaultDeny/ROWriteDenied.Prefix(root-ro) (0.00s)
        --- PASS: TestACL/SessionDefaultDeny/RWReadAllowed.Prefix(root-rw) (0.00s)
        --- PASS: TestACL/SessionDefaultDeny/RWWriteAllowed.Prefix(root-rw) (0.00s)
        --- PASS: TestACL/SessionDefaultDeny/DenySuffixReadDenied.Prefix(root-nope-prefix) (0.00s)
        --- PASS: TestACL/SessionDefaultDeny/DenySuffixWriteDenied.Prefix(root-nope-prefix) (0.00s)
        --- PASS: TestACL/SessionDefaultDeny/ROSuffixReadAllowed.Prefix(root-ro-prefix) (0.00s)
        --- PASS: TestACL/SessionDefaultDeny/ROSuffixWriteDenied.Prefix(root-ro-prefix) (0.00s)
        --- PASS: TestACL/SessionDefaultDeny/RWSuffixReadAllowed.Prefix(root-rw-prefix) (0.00s)
        --- PASS: TestACL/SessionDefaultDeny/RWSuffixWriteAllowed.Prefix(root-rw-prefix) (0.00s)
        --- PASS: TestACL/SessionDefaultDeny/ChildDenyReadDenied.Prefix(child-nope) (0.00s)
        --- PASS: TestACL/SessionDefaultDeny/ChildDenyWriteDenied.Prefix(child-nope) (0.00s)
        --- PASS: TestACL/SessionDefaultDeny/ChildROReadAllowed.Prefix(child-ro) (0.00s)
        --- PASS: TestACL/SessionDefaultDeny/ChildROWriteDenied.Prefix(child-ro) (0.00s)
        --- PASS: TestACL/SessionDefaultDeny/ChildRWReadAllowed.Prefix(child-rw) (0.00s)
        --- PASS: TestACL/SessionDefaultDeny/ChildRWWriteAllowed.Prefix(child-rw) (0.00s)
        --- PASS: TestACL/SessionDefaultDeny/ChildDenySuffixReadDenied.Prefix(child-nope-prefix) (0.00s)
        --- PASS: TestACL/SessionDefaultDeny/ChildDenySuffixWriteDenied.Prefix(child-nope-prefix) (0.00s)
        --- PASS: TestACL/SessionDefaultDeny/ChildROSuffixReadAllowed.Prefix(child-ro-prefix) (0.00s)
        --- PASS: TestACL/SessionDefaultDeny/ChildROSuffixWriteDenied.Prefix(child-ro-prefix) (0.00s)
        --- PASS: TestACL/SessionDefaultDeny/ChildRWSuffixReadAllowed.Prefix(child-rw-prefix) (0.00s)
        --- PASS: TestACL/SessionDefaultDeny/ChildRWSuffixWriteAllowed.Prefix(child-rw-prefix) (0.00s)
        --- PASS: TestACL/SessionDefaultDeny/ChildOverrideReadAllowed.Prefix(override) (0.00s)
        --- PASS: TestACL/SessionDefaultDeny/ChildOverrideWriteAllowed.Prefix(override) (0.00s)
    --- PASS: TestACL/SessionDefaultAllow (0.02s)
        --- PASS: TestACL/SessionDefaultAllow/DefaultReadAllowed.Prefix(nope) (0.00s)
        --- PASS: TestACL/SessionDefaultAllow/DefaultWriteAllowed.Prefix(nope) (0.00s)
        --- PASS: TestACL/SessionDefaultAllow/DenyReadDenied.Prefix(root-nope) (0.00s)
        --- PASS: TestACL/SessionDefaultAllow/DenyWriteDenied.Prefix(root-nope) (0.00s)
        --- PASS: TestACL/SessionDefaultAllow/ROReadAllowed.Prefix(root-ro) (0.00s)
        --- PASS: TestACL/SessionDefaultAllow/ROWriteDenied.Prefix(root-ro) (0.00s)
        --- PASS: TestACL/SessionDefaultAllow/RWReadAllowed.Prefix(root-rw) (0.00s)
        --- PASS: TestACL/SessionDefaultAllow/RWWriteAllowed.Prefix(root-rw) (0.00s)
        --- PASS: TestACL/SessionDefaultAllow/DenySuffixReadDenied.Prefix(root-nope-prefix) (0.00s)
        --- PASS: TestACL/SessionDefaultAllow/DenySuffixWriteDenied.Prefix(root-nope-prefix) (0.00s)
        --- PASS: TestACL/SessionDefaultAllow/ROSuffixReadAllowed.Prefix(root-ro-prefix) (0.00s)
        --- PASS: TestACL/SessionDefaultAllow/ROSuffixWriteDenied.Prefix(root-ro-prefix) (0.00s)
        --- PASS: TestACL/SessionDefaultAllow/RWSuffixReadAllowed.Prefix(root-rw-prefix) (0.00s)
        --- PASS: TestACL/SessionDefaultAllow/RWSuffixWriteAllowed.Prefix(root-rw-prefix) (0.00s)
        --- PASS: TestACL/SessionDefaultAllow/ChildDenyReadDenied.Prefix(child-nope) (0.00s)
        --- PASS: TestACL/SessionDefaultAllow/ChildDenyWriteDenied.Prefix(child-nope) (0.00s)
        --- PASS: TestACL/SessionDefaultAllow/ChildROReadAllowed.Prefix(child-ro) (0.00s)
        --- PASS: TestACL/SessionDefaultAllow/ChildROWriteDenied.Prefix(child-ro) (0.00s)
        --- PASS: TestACL/SessionDefaultAllow/ChildRWReadAllowed.Prefix(child-rw) (0.00s)
        --- PASS: TestACL/SessionDefaultAllow/ChildRWWriteAllowed.Prefix(child-rw) (0.00s)
        --- PASS: TestACL/SessionDefaultAllow/ChildDenySuffixReadDenied.Prefix(child-nope-prefix) (0.00s)
        --- PASS: TestACL/SessionDefaultAllow/ChildDenySuffixWriteDenied.Prefix(child-nope-prefix) (0.00s)
        --- PASS: TestACL/SessionDefaultAllow/ChildROSuffixReadAllowed.Prefix(child-ro-prefix) (0.00s)
        --- PASS: TestACL/SessionDefaultAllow/ChildROSuffixWriteDenied.Prefix(child-ro-prefix) (0.00s)
        --- PASS: TestACL/SessionDefaultAllow/ChildRWSuffixReadAllowed.Prefix(child-rw-prefix) (0.00s)
        --- PASS: TestACL/SessionDefaultAllow/ChildRWSuffixWriteAllowed.Prefix(child-rw-prefix) (0.00s)
        --- PASS: TestACL/SessionDefaultAllow/ChildOverrideReadAllowed.Prefix(override) (0.00s)
        --- PASS: TestACL/SessionDefaultAllow/ChildOverrideWriteAllowed.Prefix(override) (0.00s)
    --- PASS: TestACL/Parent (0.02s)
        --- PASS: TestACL/Parent/KeyReadDenied.Prefix(other) (0.00s)
        --- PASS: TestACL/Parent/KeyWriteDenied.Prefix(other) (0.00s)
        --- PASS: TestACL/Parent/KeyWritePrefixDenied.Prefix(other) (0.00s)
        --- PASS: TestACL/Parent/KeyReadAllowed.Prefix(foo/test) (0.00s)
        --- PASS: TestACL/Parent/KeyWriteAllowed.Prefix(foo/test) (0.00s)
        --- PASS: TestACL/Parent/KeyWritePrefixAllowed.Prefix(foo/test) (0.00s)
        --- PASS: TestACL/Parent/KeyReadAllowed.Prefix(foo/priv/test) (0.00s)
        --- PASS: TestACL/Parent/KeyWriteDenied.Prefix(foo/priv/test) (0.00s)
        --- PASS: TestACL/Parent/KeyWritePrefixDenied.Prefix(foo/priv/test) (0.00s)
        --- PASS: TestACL/Parent/KeyReadDenied.Prefix(bar/any) (0.00s)
        --- PASS: TestACL/Parent/KeyWriteDenied.Prefix(bar/any) (0.00s)
        --- PASS: TestACL/Parent/KeyWritePrefixDenied.Prefix(bar/any) (0.00s)
        --- PASS: TestACL/Parent/KeyReadAllowed.Prefix(zip/test) (0.00s)
        --- PASS: TestACL/Parent/KeyWriteDenied.Prefix(zip/test) (0.00s)
        --- PASS: TestACL/Parent/KeyWritePrefixDenied.Prefix(zip/test) (0.00s)
        --- PASS: TestACL/Parent/ServiceReadDenied.Prefix(fail) (0.00s)
        --- PASS: TestACL/Parent/ServiceWriteDenied.Prefix(fail) (0.00s)
        --- PASS: TestACL/Parent/ServiceReadAllowed.Prefix(other) (0.00s)
        --- PASS: TestACL/Parent/ServiceWriteAllowed.Prefix(other) (0.00s)
        --- PASS: TestACL/Parent/ServiceReadAllowed.Prefix(foo) (0.00s)
        --- PASS: TestACL/Parent/ServiceWriteDenied.Prefix(foo) (0.00s)
        --- PASS: TestACL/Parent/ServiceReadDenied.Prefix(bar) (0.00s)
        --- PASS: TestACL/Parent/ServiceWriteDenied.Prefix(bar) (0.00s)
        --- PASS: TestACL/Parent/PreparedQueryReadAllowed.Prefix(foo) (0.00s)
        --- PASS: TestACL/Parent/PreparedQueryWriteDenied.Prefix(foo) (0.00s)
        --- PASS: TestACL/Parent/PreparedQueryReadAllowed.Prefix(foobar) (0.00s)
        --- PASS: TestACL/Parent/PreparedQueryWriteDenied.Prefix(foobar) (0.00s)
        --- PASS: TestACL/Parent/PreparedQueryReadDenied.Prefix(bar) (0.00s)
        --- PASS: TestACL/Parent/PreparedQueryWriteDenied.Prefix(bar) (0.00s)
        --- PASS: TestACL/Parent/PreparedQueryReadDenied.Prefix(barbaz) (0.00s)
        --- PASS: TestACL/Parent/PreparedQueryWriteDenied.Prefix(barbaz) (0.00s)
        --- PASS: TestACL/Parent/PreparedQueryReadDenied.Prefix(baz) (0.00s)
        --- PASS: TestACL/Parent/PreparedQueryWriteDenied.Prefix(baz) (0.00s)
        --- PASS: TestACL/Parent/PreparedQueryReadDenied.Prefix(nope) (0.00s)
        --- PASS: TestACL/Parent/PreparedQueryWriteDenied.Prefix(nope) (0.00s)
        --- PASS: TestACL/Parent/ACLReadDenied (0.00s)
        --- PASS: TestACL/Parent/ACLWriteDenied (0.00s)
        --- PASS: TestACL/Parent/SnapshotDenied (0.00s)
        --- PASS: TestACL/Parent/IntentionDefaultAllowDenied (0.00s)
    --- PASS: TestACL/ComplexDefaultAllow (0.04s)
        --- PASS: TestACL/ComplexDefaultAllow/KeyReadAllowed.Prefix(other) (0.00s)
        --- PASS: TestACL/ComplexDefaultAllow/KeyWriteAllowed.Prefix(other) (0.00s)
        --- PASS: TestACL/ComplexDefaultAllow/KeyWritePrefixAllowed.Prefix(other) (0.00s)
        --- PASS: TestACL/ComplexDefaultAllow/KeyListAllowed.Prefix(other) (0.00s)
        --- PASS: TestACL/ComplexDefaultAllow/KeyReadAllowed.Prefix(foo/test) (0.00s)
        --- PASS: TestACL/ComplexDefaultAllow/KeyWriteAllowed.Prefix(foo/test) (0.00s)
        --- PASS: TestACL/ComplexDefaultAllow/KeyWritePrefixAllowed.Prefix(foo/test) (0.00s)
        --- PASS: TestACL/ComplexDefaultAllow/KeyListAllowed.Prefix(foo/test) (0.00s)
        --- PASS: TestACL/ComplexDefaultAllow/KeyReadDenied.Prefix(foo/priv/test) (0.00s)
        --- PASS: TestACL/ComplexDefaultAllow/KeyWriteDenied.Prefix(foo/priv/test) (0.00s)
        --- PASS: TestACL/ComplexDefaultAllow/KeyWritePrefixDenied.Prefix(foo/priv/test) (0.00s)
        --- PASS: TestACL/ComplexDefaultAllow/KeyListDenied.Prefix(foo/priv/test) (0.00s)
        --- PASS: TestACL/ComplexDefaultAllow/KeyReadDenied.Prefix(bar/any) (0.00s)
        --- PASS: TestACL/ComplexDefaultAllow/KeyWriteDenied.Prefix(bar/any) (0.00s)
        --- PASS: TestACL/ComplexDefaultAllow/KeyWritePrefixDenied.Prefix(bar/any) (0.00s)
        --- PASS: TestACL/ComplexDefaultAllow/KeyListDenied.Prefix(bar/any) (0.00s)
        --- PASS: TestACL/ComplexDefaultAllow/KeyReadAllowed.Prefix(zip/test) (0.00s)
        --- PASS: TestACL/ComplexDefaultAllow/KeyWriteDenied.Prefix(zip/test) (0.00s)
        --- PASS: TestACL/ComplexDefaultAllow/KeyWritePrefixDenied.Prefix(zip/test) (0.00s)
        --- PASS: TestACL/ComplexDefaultAllow/KeyListDenied.Prefix(zip/test) (0.00s)
        --- PASS: TestACL/ComplexDefaultAllow/KeyReadAllowed.Prefix(foo/) (0.00s)
        --- PASS: TestACL/ComplexDefaultAllow/KeyWriteAllowed.Prefix(foo/) (0.00s)
        --- PASS: TestACL/ComplexDefaultAllow/KeyWritePrefixDenied.Prefix(foo/) (0.00s)
        --- PASS: TestACL/ComplexDefaultAllow/KeyListAllowed.Prefix(foo/) (0.00s)
        --- PASS: TestACL/ComplexDefaultAllow/KeyReadAllowed (0.00s)
        --- PASS: TestACL/ComplexDefaultAllow/KeyWriteAllowed (0.00s)
        --- PASS: TestACL/ComplexDefaultAllow/KeyWritePrefixDenied (0.00s)
        --- PASS: TestACL/ComplexDefaultAllow/KeyListAllowed (0.00s)
        --- PASS: TestACL/ComplexDefaultAllow/KeyReadAllowed.Prefix(zap/test) (0.00s)
        --- PASS: TestACL/ComplexDefaultAllow/KeyWriteDenied.Prefix(zap/test) (0.00s)
        --- PASS: TestACL/ComplexDefaultAllow/KeyWritePrefixDenied.Prefix(zap/test) (0.00s)
        --- PASS: TestACL/ComplexDefaultAllow/KeyListAllowed.Prefix(zap/test) (0.00s)
        --- PASS: TestACL/ComplexDefaultAllow/IntentionReadAllowed.Prefix(other) (0.00s)
        --- PASS: TestACL/ComplexDefaultAllow/IntentionWriteDenied.Prefix(other) (0.00s)
        --- PASS: TestACL/ComplexDefaultAllow/IntentionReadAllowed.Prefix(foo) (0.00s)
        --- PASS: TestACL/ComplexDefaultAllow/IntentionWriteDenied.Prefix(foo) (0.00s)
        --- PASS: TestACL/ComplexDefaultAllow/IntentionReadDenied.Prefix(bar) (0.00s)
        --- PASS: TestACL/ComplexDefaultAllow/IntentionWriteDenied.Prefix(bar) (0.00s)
        --- PASS: TestACL/ComplexDefaultAllow/IntentionReadAllowed.Prefix(foobar) (0.00s)
        --- PASS: TestACL/ComplexDefaultAllow/IntentionWriteDenied.Prefix(foobar) (0.00s)
        --- PASS: TestACL/ComplexDefaultAllow/IntentionReadDenied.Prefix(barfo) (0.00s)
        --- PASS: TestACL/ComplexDefaultAllow/IntentionWriteDenied.Prefix(barfo) (0.00s)
        --- PASS: TestACL/ComplexDefaultAllow/IntentionReadAllowed.Prefix(barfoo) (0.00s)
        --- PASS: TestACL/ComplexDefaultAllow/IntentionWriteAllowed.Prefix(barfoo) (0.00s)
        --- PASS: TestACL/ComplexDefaultAllow/IntentionReadAllowed.Prefix(barfoo2) (0.00s)
        --- PASS: TestACL/ComplexDefaultAllow/IntentionWriteAllowed.Prefix(barfoo2) (0.00s)
        --- PASS: TestACL/ComplexDefaultAllow/IntentionReadDenied.Prefix(intbaz) (0.00s)
        --- PASS: TestACL/ComplexDefaultAllow/IntentionWriteDenied.Prefix(intbaz) (0.00s)
        --- PASS: TestACL/ComplexDefaultAllow/IntentionDefaultAllowAllowed (0.00s)
        --- PASS: TestACL/ComplexDefaultAllow/ServiceReadAllowed.Prefix(other) (0.00s)
        --- PASS: TestACL/ComplexDefaultAllow/ServiceWriteAllowed.Prefix(other) (0.00s)
        --- PASS: TestACL/ComplexDefaultAllow/ServiceReadAllowed.Prefix(foo) (0.00s)
        --- PASS: TestACL/ComplexDefaultAllow/ServiceWriteDenied.Prefix(foo) (0.00s)
        --- PASS: TestACL/ComplexDefaultAllow/ServiceReadDenied.Prefix(bar) (0.00s)
        --- PASS: TestACL/ComplexDefaultAllow/ServiceWriteDenied.Prefix(bar) (0.00s)
        --- PASS: TestACL/ComplexDefaultAllow/ServiceReadAllowed.Prefix(foobar) (0.00s)
        --- PASS: TestACL/ComplexDefaultAllow/ServiceWriteDenied.Prefix(foobar) (0.00s)
        --- PASS: TestACL/ComplexDefaultAllow/ServiceReadDenied.Prefix(barfo) (0.00s)
        --- PASS: TestACL/ComplexDefaultAllow/ServiceWriteDenied.Prefix(barfo) (0.00s)
        --- PASS: TestACL/ComplexDefaultAllow/ServiceReadAllowed.Prefix(barfoo) (0.00s)
        --- PASS: TestACL/ComplexDefaultAllow/ServiceWriteAllowed.Prefix(barfoo) (0.00s)
        --- PASS: TestACL/ComplexDefaultAllow/ServiceReadAllowed.Prefix(barfoo2) (0.00s)
        --- PASS: TestACL/ComplexDefaultAllow/ServiceWriteAllowed.Prefix(barfoo2) (0.00s)
        --- PASS: TestACL/ComplexDefaultAllow/EventReadAllowed.Prefix(foo) (0.00s)
        --- PASS: TestACL/ComplexDefaultAllow/EventWriteAllowed.Prefix(foo) (0.00s)
        --- PASS: TestACL/ComplexDefaultAllow/EventReadAllowed.Prefix(foobar) (0.00s)
        --- PASS: TestACL/ComplexDefaultAllow/EventWriteAllowed.Prefix(foobar) (0.00s)
        --- PASS: TestACL/ComplexDefaultAllow/EventReadDenied.Prefix(bar) (0.00s)
        --- PASS: TestACL/ComplexDefaultAllow/EventWriteDenied.Prefix(bar) (0.00s)
        --- PASS: TestACL/ComplexDefaultAllow/EventReadDenied.Prefix(barbaz) (0.00s)
        --- PASS: TestACL/ComplexDefaultAllow/EventWriteDenied.Prefix(barbaz) (0.00s)
        --- PASS: TestACL/ComplexDefaultAllow/EventReadAllowed.Prefix(baz) (0.00s)
        --- PASS: TestACL/ComplexDefaultAllow/EventWriteDenied.Prefix(baz) (0.00s)
        --- PASS: TestACL/ComplexDefaultAllow/PreparedQueryReadAllowed.Prefix(foo) (0.00s)
        --- PASS: TestACL/ComplexDefaultAllow/PreparedQueryWriteAllowed.Prefix(foo) (0.00s)
        --- PASS: TestACL/ComplexDefaultAllow/PreparedQueryReadAllowed.Prefix(foobar) (0.00s)
        --- PASS: TestACL/ComplexDefaultAllow/PreparedQueryWriteAllowed.Prefix(foobar) (0.00s)
        --- PASS: TestACL/ComplexDefaultAllow/PreparedQueryReadDenied.Prefix(bar) (0.00s)
        --- PASS: TestACL/ComplexDefaultAllow/PreparedQueryWriteDenied.Prefix(bar) (0.00s)
        --- PASS: TestACL/ComplexDefaultAllow/PreparedQueryReadDenied.Prefix(barbaz) (0.00s)
        --- PASS: TestACL/ComplexDefaultAllow/PreparedQueryWriteDenied.Prefix(barbaz) (0.00s)
        --- PASS: TestACL/ComplexDefaultAllow/PreparedQueryReadAllowed.Prefix(baz) (0.00s)
        --- PASS: TestACL/ComplexDefaultAllow/PreparedQueryWriteDenied.Prefix(baz) (0.00s)
        --- PASS: TestACL/ComplexDefaultAllow/PreparedQueryReadAllowed.Prefix(nope) (0.00s)
        --- PASS: TestACL/ComplexDefaultAllow/PreparedQueryWriteDenied.Prefix(nope) (0.00s)
        --- PASS: TestACL/ComplexDefaultAllow/PreparedQueryReadAllowed.Prefix(zoo) (0.00s)
        --- PASS: TestACL/ComplexDefaultAllow/PreparedQueryWriteAllowed.Prefix(zoo) (0.00s)
        --- PASS: TestACL/ComplexDefaultAllow/PreparedQueryReadAllowed.Prefix(zookeeper) (0.00s)
        --- PASS: TestACL/ComplexDefaultAllow/PreparedQueryWriteAllowed.Prefix(zookeeper) (0.00s)
    --- PASS: TestACL/ExactMatchPrecedence (0.05s)
        --- PASS: TestACL/ExactMatchPrecedence/AgentReadPrefixAllowed.Prefix(fo) (0.00s)
        --- PASS: TestACL/ExactMatchPrecedence/AgentWritePrefixDenied.Prefix(fo) (0.00s)
        --- PASS: TestACL/ExactMatchPrecedence/AgentReadPrefixAllowed.Prefix(for) (0.00s)
        --- PASS: TestACL/ExactMatchPrecedence/AgentWritePrefixDenied.Prefix(for) (0.00s)
        --- PASS: TestACL/ExactMatchPrecedence/AgentReadAllowed.Prefix(foo) (0.00s)
        --- PASS: TestACL/ExactMatchPrecedence/AgentWriteAllowed.Prefix(foo) (0.00s)
        --- PASS: TestACL/ExactMatchPrecedence/AgentReadPrefixAllowed.Prefix(foot) (0.00s)
        --- PASS: TestACL/ExactMatchPrecedence/AgentWritePrefixDenied.Prefix(foot) (0.00s)
        --- PASS: TestACL/ExactMatchPrecedence/AgentReadPrefixAllowed.Prefix(foot2) (0.00s)
        --- PASS: TestACL/ExactMatchPrecedence/AgentWritePrefixDenied.Prefix(foot2) (0.00s)
        --- PASS: TestACL/ExactMatchPrecedence/AgentReadPrefixAllowed.Prefix(food) (0.00s)
        --- PASS: TestACL/ExactMatchPrecedence/AgentWritePrefixDenied.Prefix(food) (0.00s)
        --- PASS: TestACL/ExactMatchPrecedence/AgentReadDenied.Prefix(football) (0.00s)
        --- PASS: TestACL/ExactMatchPrecedence/AgentWriteDenied.Prefix(football) (0.00s)
        --- PASS: TestACL/ExactMatchPrecedence/KeyReadPrefixAllowed.Prefix(fo) (0.00s)
        --- PASS: TestACL/ExactMatchPrecedence/KeyWritePrefixDenied.Prefix(fo) (0.00s)
        --- PASS: TestACL/ExactMatchPrecedence/KeyReadPrefixAllowed.Prefix(for) (0.00s)
        --- PASS: TestACL/ExactMatchPrecedence/KeyWritePrefixDenied.Prefix(for) (0.00s)
        --- PASS: TestACL/ExactMatchPrecedence/KeyReadAllowed.Prefix(foo) (0.00s)
        --- PASS: TestACL/ExactMatchPrecedence/KeyWriteAllowed.Prefix(foo) (0.00s)
        --- PASS: TestACL/ExactMatchPrecedence/KeyReadPrefixAllowed.Prefix(foot) (0.00s)
        --- PASS: TestACL/ExactMatchPrecedence/KeyWritePrefixDenied.Prefix(foot) (0.00s)
        --- PASS: TestACL/ExactMatchPrecedence/KeyReadPrefixAllowed.Prefix(foot2) (0.00s)
        --- PASS: TestACL/ExactMatchPrecedence/KeyWritePrefixDenied.Prefix(foot2) (0.00s)
        --- PASS: TestACL/ExactMatchPrecedence/KeyReadPrefixAllowed.Prefix(food) (0.00s)
        --- PASS: TestACL/ExactMatchPrecedence/KeyWritePrefixDenied.Prefix(food) (0.00s)
        --- PASS: TestACL/ExactMatchPrecedence/KeyReadDenied.Prefix(football) (0.00s)
        --- PASS: TestACL/ExactMatchPrecedence/KeyWriteDenied.Prefix(football) (0.00s)
        --- PASS: TestACL/ExactMatchPrecedence/NodeReadPrefixAllowed.Prefix(fo) (0.00s)
        --- PASS: TestACL/ExactMatchPrecedence/NodeWritePrefixDenied.Prefix(fo) (0.00s)
        --- PASS: TestACL/ExactMatchPrecedence/NodeReadPrefixAllowed.Prefix(for) (0.00s)
        --- PASS: TestACL/ExactMatchPrecedence/NodeWritePrefixDenied.Prefix(for) (0.00s)
        --- PASS: TestACL/ExactMatchPrecedence/NodeReadAllowed.Prefix(foo) (0.00s)
        --- PASS: TestACL/ExactMatchPrecedence/NodeWriteAllowed.Prefix(foo) (0.00s)
        --- PASS: TestACL/ExactMatchPrecedence/NodeReadPrefixAllowed.Prefix(foot) (0.00s)
        --- PASS: TestACL/ExactMatchPrecedence/NodeWritePrefixDenied.Prefix(foot) (0.00s)
        --- PASS: TestACL/ExactMatchPrecedence/NodeReadPrefixAllowed.Prefix(foot2) (0.00s)
        --- PASS: TestACL/ExactMatchPrecedence/NodeWritePrefixDenied.Prefix(foot2) (0.00s)
        --- PASS: TestACL/ExactMatchPrecedence/NodeReadPrefixAllowed.Prefix(food) (0.00s)
        --- PASS: TestACL/ExactMatchPrecedence/NodeWritePrefixDenied.Prefix(food) (0.00s)
        --- PASS: TestACL/ExactMatchPrecedence/NodeReadDenied.Prefix(football) (0.00s)
        --- PASS: TestACL/ExactMatchPrecedence/NodeWriteDenied.Prefix(football) (0.00s)
        --- PASS: TestACL/ExactMatchPrecedence/ServiceReadPrefixAllowed.Prefix(fo) (0.00s)
        --- PASS: TestACL/ExactMatchPrecedence/ServiceWritePrefixDenied.Prefix(fo) (0.00s)
        --- PASS: TestACL/ExactMatchPrecedence/ServiceReadPrefixAllowed.Prefix(for) (0.00s)
        --- PASS: TestACL/ExactMatchPrecedence/ServiceWritePrefixDenied.Prefix(for) (0.00s)
        --- PASS: TestACL/ExactMatchPrecedence/ServiceReadAllowed.Prefix(foo) (0.00s)
        --- PASS: TestACL/ExactMatchPrecedence/ServiceWriteAllowed.Prefix(foo) (0.00s)
        --- PASS: TestACL/ExactMatchPrecedence/ServiceReadPrefixAllowed.Prefix(foot) (0.00s)
        --- PASS: TestACL/ExactMatchPrecedence/ServiceWritePrefixDenied.Prefix(foot) (0.00s)
        --- PASS: TestACL/ExactMatchPrecedence/ServiceReadPrefixAllowed.Prefix(foot2) (0.00s)
        --- PASS: TestACL/ExactMatchPrecedence/ServiceWritePrefixDenied.Prefix(foot2) (0.00s)
        --- PASS: TestACL/ExactMatchPrecedence/ServiceReadPrefixAllowed.Prefix(food) (0.00s)
        --- PASS: TestACL/ExactMatchPrecedence/ServiceWritePrefixDenied.Prefix(food) (0.00s)
        --- PASS: TestACL/ExactMatchPrecedence/ServiceReadDenied.Prefix(football) (0.00s)
        --- PASS: TestACL/ExactMatchPrecedence/ServiceWriteDenied.Prefix(football) (0.00s)
        --- PASS: TestACL/ExactMatchPrecedence/NodeReadPrefixAllowed.Prefix(fo)#01 (0.00s)
        --- PASS: TestACL/ExactMatchPrecedence/NodeWritePrefixDenied.Prefix(fo)#01 (0.00s)
        --- PASS: TestACL/ExactMatchPrecedence/NodeReadPrefixAllowed.Prefix(for)#01 (0.00s)
        --- PASS: TestACL/ExactMatchPrecedence/NodeWritePrefixDenied.Prefix(for)#01 (0.00s)
        --- PASS: TestACL/ExactMatchPrecedence/NodeReadAllowed.Prefix(foo)#01 (0.00s)
        --- PASS: TestACL/ExactMatchPrecedence/NodeWriteAllowed.Prefix(foo)#01 (0.00s)
        --- PASS: TestACL/ExactMatchPrecedence/NodeReadPrefixAllowed.Prefix(foot)#01 (0.00s)
        --- PASS: TestACL/ExactMatchPrecedence/NodeWritePrefixDenied.Prefix(foot)#01 (0.00s)
        --- PASS: TestACL/ExactMatchPrecedence/NodeReadPrefixAllowed.Prefix(foot2)#01 (0.00s)
        --- PASS: TestACL/ExactMatchPrecedence/NodeWritePrefixDenied.Prefix(foot2)#01 (0.00s)
        --- PASS: TestACL/ExactMatchPrecedence/NodeReadPrefixAllowed.Prefix(food)#01 (0.00s)
        --- PASS: TestACL/ExactMatchPrecedence/NodeWritePrefixDenied.Prefix(food)#01 (0.00s)
        --- PASS: TestACL/ExactMatchPrecedence/NodeReadDenied.Prefix(football)#01 (0.00s)
        --- PASS: TestACL/ExactMatchPrecedence/NodeWriteDenied.Prefix(football)#01 (0.00s)
        --- PASS: TestACL/ExactMatchPrecedence/IntentionReadPrefixAllowed.Prefix(fo) (0.00s)
        --- PASS: TestACL/ExactMatchPrecedence/IntentionWritePrefixDenied.Prefix(fo) (0.00s)
        --- PASS: TestACL/ExactMatchPrecedence/IntentionReadPrefixAllowed.Prefix(for) (0.00s)
        --- PASS: TestACL/ExactMatchPrecedence/IntentionWritePrefixDenied.Prefix(for) (0.00s)
        --- PASS: TestACL/ExactMatchPrecedence/IntentionReadAllowed.Prefix(foo) (0.00s)
        --- PASS: TestACL/ExactMatchPrecedence/IntentionWriteAllowed.Prefix(foo) (0.00s)
        --- PASS: TestACL/ExactMatchPrecedence/IntentionReadPrefixAllowed.Prefix(foot) (0.00s)
        --- PASS: TestACL/ExactMatchPrecedence/IntentionWritePrefixDenied.Prefix(foot) (0.00s)
        --- PASS: TestACL/ExactMatchPrecedence/IntentionReadPrefixAllowed.Prefix(foot2) (0.00s)
        --- PASS: TestACL/ExactMatchPrecedence/IntentionWritePrefixDenied.Prefix(foot2) (0.00s)
        --- PASS: TestACL/ExactMatchPrecedence/IntentionReadPrefixAllowed.Prefix(food) (0.00s)
        --- PASS: TestACL/ExactMatchPrecedence/IntentionWritePrefixDenied.Prefix(food) (0.00s)
        --- PASS: TestACL/ExactMatchPrecedence/IntentionReadDenied.Prefix(football) (0.00s)
        --- PASS: TestACL/ExactMatchPrecedence/IntentionWriteDenied.Prefix(football) (0.00s)
        --- PASS: TestACL/ExactMatchPrecedence/SessionReadPrefixAllowed.Prefix(fo) (0.00s)
        --- PASS: TestACL/ExactMatchPrecedence/SessionWritePrefixDenied.Prefix(fo) (0.00s)
        --- PASS: TestACL/ExactMatchPrecedence/SessionReadPrefixAllowed.Prefix(for) (0.00s)
        --- PASS: TestACL/ExactMatchPrecedence/SessionWritePrefixDenied.Prefix(for) (0.00s)
        --- PASS: TestACL/ExactMatchPrecedence/SessionReadAllowed.Prefix(foo) (0.00s)
        --- PASS: TestACL/ExactMatchPrecedence/SessionWriteAllowed.Prefix(foo) (0.00s)
        --- PASS: TestACL/ExactMatchPrecedence/SessionReadPrefixAllowed.Prefix(foot) (0.00s)
        --- PASS: TestACL/ExactMatchPrecedence/SessionWritePrefixDenied.Prefix(foot) (0.00s)
        --- PASS: TestACL/ExactMatchPrecedence/SessionReadPrefixAllowed.Prefix(foot2) (0.00s)
        --- PASS: TestACL/ExactMatchPrecedence/SessionWritePrefixDenied.Prefix(foot2) (0.00s)
        --- PASS: TestACL/ExactMatchPrecedence/SessionReadPrefixAllowed.Prefix(food) (0.00s)
        --- PASS: TestACL/ExactMatchPrecedence/SessionWritePrefixDenied.Prefix(food) (0.00s)
        --- PASS: TestACL/ExactMatchPrecedence/SessionReadDenied.Prefix(football) (0.00s)
        --- PASS: TestACL/ExactMatchPrecedence/SessionWriteDenied.Prefix(football) (0.00s)
        --- PASS: TestACL/ExactMatchPrecedence/EventReadPrefixAllowed.Prefix(fo) (0.00s)
        --- PASS: TestACL/ExactMatchPrecedence/EventWritePrefixDenied.Prefix(fo) (0.00s)
        --- PASS: TestACL/ExactMatchPrecedence/EventReadPrefixAllowed.Prefix(for) (0.00s)
        --- PASS: TestACL/ExactMatchPrecedence/EventWritePrefixDenied.Prefix(for) (0.00s)
        --- PASS: TestACL/ExactMatchPrecedence/EventReadAllowed.Prefix(foo) (0.00s)
        --- PASS: TestACL/ExactMatchPrecedence/EventWriteAllowed.Prefix(foo) (0.00s)
        --- PASS: TestACL/ExactMatchPrecedence/EventReadPrefixAllowed.Prefix(foot) (0.00s)
        --- PASS: TestACL/ExactMatchPrecedence/EventWritePrefixDenied.Prefix(foot) (0.00s)
        --- PASS: TestACL/ExactMatchPrecedence/EventReadPrefixAllowed.Prefix(foot2) (0.00s)
        --- PASS: TestACL/ExactMatchPrecedence/EventWritePrefixDenied.Prefix(foot2) (0.00s)
        --- PASS: TestACL/ExactMatchPrecedence/EventReadPrefixAllowed.Prefix(food) (0.00s)
        --- PASS: TestACL/ExactMatchPrecedence/EventWritePrefixDenied.Prefix(food) (0.00s)
        --- PASS: TestACL/ExactMatchPrecedence/EventReadDenied.Prefix(football) (0.00s)
        --- PASS: TestACL/ExactMatchPrecedence/EventWriteDenied.Prefix(football) (0.00s)
        --- PASS: TestACL/ExactMatchPrecedence/PreparedQueryReadPrefixAllowed.Prefix(fo) (0.00s)
        --- PASS: TestACL/ExactMatchPrecedence/PreparedQueryWritePrefixDenied.Prefix(fo) (0.00s)
        --- PASS: TestACL/ExactMatchPrecedence/PreparedQueryReadPrefixAllowed.Prefix(for) (0.00s)
        --- PASS: TestACL/ExactMatchPrecedence/PreparedQueryWritePrefixDenied.Prefix(for) (0.00s)
        --- PASS: TestACL/ExactMatchPrecedence/PreparedQueryReadAllowed.Prefix(foo) (0.00s)
        --- PASS: TestACL/ExactMatchPrecedence/PreparedQueryWriteAllowed.Prefix(foo) (0.00s)
        --- PASS: TestACL/ExactMatchPrecedence/PreparedQueryReadPrefixAllowed.Prefix(foot) (0.00s)
        --- PASS: TestACL/ExactMatchPrecedence/PreparedQueryWritePrefixDenied.Prefix(foot) (0.00s)
        --- PASS: TestACL/ExactMatchPrecedence/PreparedQueryReadPrefixAllowed.Prefix(foot2) (0.00s)
        --- PASS: TestACL/ExactMatchPrecedence/PreparedQueryWritePrefixDenied.Prefix(foot2) (0.00s)
        --- PASS: TestACL/ExactMatchPrecedence/PreparedQueryReadPrefixAllowed.Prefix(food) (0.00s)
        --- PASS: TestACL/ExactMatchPrecedence/PreparedQueryWritePrefixDenied.Prefix(food) (0.00s)
        --- PASS: TestACL/ExactMatchPrecedence/PreparedQueryReadDenied.Prefix(football) (0.00s)
        --- PASS: TestACL/ExactMatchPrecedence/PreparedQueryWriteDenied.Prefix(football) (0.00s)
    --- PASS: TestACL/ACLRead (0.00s)
        --- PASS: TestACL/ACLRead/ReadAllowed (0.00s)
        --- PASS: TestACL/ACLRead/WriteDenied (0.00s)
    --- PASS: TestACL/ACLRead#01 (0.00s)
        --- PASS: TestACL/ACLRead#01/ReadAllowed (0.00s)
        --- PASS: TestACL/ACLRead#01/WriteAllowed (0.00s)
    --- PASS: TestACL/KeyWritePrefixDefaultDeny (0.00s)
        --- PASS: TestACL/KeyWritePrefixDefaultDeny/DeniedTopLevelPrefix.Prefix(foo) (0.00s)
        --- PASS: TestACL/KeyWritePrefixDefaultDeny/AllowedTopLevelPrefix.Prefix(baz/) (0.00s)
        --- PASS: TestACL/KeyWritePrefixDefaultDeny/AllowedPrefixWithNestedWrite.Prefix(foo/) (0.00s)
        --- PASS: TestACL/KeyWritePrefixDefaultDeny/DenyPrefixWithNestedRead.Prefix(bar/) (0.00s)
        --- PASS: TestACL/KeyWritePrefixDefaultDeny/DenyNoPrefixMatch.Prefix(te) (0.00s)
    --- PASS: TestACL/KeyWritePrefixDefaultAllow (0.00s)
        --- PASS: TestACL/KeyWritePrefixDefaultAllow/KeyWritePrefixDenied.Prefix(foo) (0.00s)
        --- PASS: TestACL/KeyWritePrefixDefaultAllow/KeyWritePrefixAllowed.Prefix(bar) (0.00s)
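
Note on the ExactMatchPrecedence results above: the pattern of outcomes (an exact "foo" rule granting write, the surrounding "fo" prefix granting read only, "football" denied outright) reflects a lookup order in which an exact rule beats any prefix rule and, among prefix rules, the longest match decides. The following is a minimal, hypothetical Go sketch of that lookup order only; the type names and fixture values are illustrative and this is not Consul's acl implementation.

// Hypothetical sketch: exact match wins over prefix match, longest
// matching prefix wins otherwise, else the default policy applies.
// Illustrative only; not Consul's acl package.
package main

import (
	"fmt"
	"strings"
)

type Access string

const (
	Deny  Access = "deny"
	Read  Access = "read"
	Write Access = "write"
)

type rules struct {
	defaultPolicy Access            // used when nothing matches
	exact         map[string]Access // e.g. "foo" -> write
	prefix        map[string]Access // e.g. "fo"  -> read
}

// resolve returns the access for name: an exact rule beats any prefix
// rule, and among matching prefixes the longest one wins.
func (r rules) resolve(name string) Access {
	if a, ok := r.exact[name]; ok {
		return a
	}
	best, bestLen := r.defaultPolicy, -1
	for p, a := range r.prefix {
		if strings.HasPrefix(name, p) && len(p) > bestLen {
			best, bestLen = a, len(p)
		}
	}
	return best
}

func main() {
	r := rules{
		defaultPolicy: Deny,
		exact:         map[string]Access{"foo": Write, "football": Deny},
		prefix:        map[string]Access{"fo": Read},
	}
	for _, name := range []string{"fo", "foo", "food", "football", "bar"} {
		fmt.Printf("%-9s -> %s\n", name, r.resolve(name))
	}
}

With this illustrative fixture the outputs line up with the pattern above: "foo" is writable via its exact rule, "fo"/"for"/"foot"/"food" fall back to the "fo" prefix and are readable only, and "football" is denied by its own exact rule.
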
=== RUN   TestRootAuthorizer
--- PASS: TestRootAuthorizer (0.00s)
=== RUN   TestACLEnforce
=== RUN   TestACLEnforce/RuleNoneRequireRead
=== RUN   TestACLEnforce/RuleNoneRequireWrite
=== RUN   TestACLEnforce/RuleNoneRequireList
=== RUN   TestACLEnforce/RuleReadRequireRead
=== RUN   TestACLEnforce/RuleReadRequireWrite
=== RUN   TestACLEnforce/RuleReadRequireList
=== RUN   TestACLEnforce/RuleListRequireRead
=== RUN   TestACLEnforce/RuleListRequireWrite
=== RUN   TestACLEnforce/RuleListRequireList
=== RUN   TestACLEnforce/RuleWritetRequireRead
=== RUN   TestACLEnforce/RuleWritetRequireWrite
=== RUN   TestACLEnforce/RuleWritetRequireList
=== RUN   TestACLEnforce/RuleDenyRequireRead
=== RUN   TestACLEnforce/RuleDenyRequireWrite
=== RUN   TestACLEnforce/RuleDenyRequireList
--- PASS: TestACLEnforce (0.01s)
    --- PASS: TestACLEnforce/RuleNoneRequireRead (0.00s)
    --- PASS: TestACLEnforce/RuleNoneRequireWrite (0.00s)
    --- PASS: TestACLEnforce/RuleNoneRequireList (0.00s)
    --- PASS: TestACLEnforce/RuleReadRequireRead (0.00s)
    --- PASS: TestACLEnforce/RuleReadRequireWrite (0.00s)
    --- PASS: TestACLEnforce/RuleReadRequireList (0.00s)
    --- PASS: TestACLEnforce/RuleListRequireRead (0.00s)
    --- PASS: TestACLEnforce/RuleListRequireWrite (0.00s)
    --- PASS: TestACLEnforce/RuleListRequireList (0.00s)
    --- PASS: TestACLEnforce/RuleWritetRequireRead (0.00s)
    --- PASS: TestACLEnforce/RuleWritetRequireWrite (0.00s)
    --- PASS: TestACLEnforce/RuleWritetRequireList (0.00s)
    --- PASS: TestACLEnforce/RuleDenyRequireRead (0.00s)
    --- PASS: TestACLEnforce/RuleDenyRequireWrite (0.00s)
    --- PASS: TestACLEnforce/RuleDenyRequireList (0.00s)
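
The TestACLEnforce matrix above pairs a granted rule (none, read, list, write, deny) with a required access level. As a rough sketch of that kind of check, assuming the conventional ordering read < list < write, with deny refusing everything and an unmatched rule falling back to the default policy, one could write something like the following; this is illustrative and not the Consul acl package API.

// Hypothetical enforcement check; not Consul's acl package.
package main

import "fmt"

type AccessLevel int

const (
	AccessUnset AccessLevel = iota // no rule matched
	AccessDeny
	AccessRead
	AccessList
	AccessWrite
)

// enforce reports whether a granted rule satisfies the required level.
// An unset rule defers to the default policy; deny never satisfies.
func enforce(granted, required AccessLevel, defaultAllow bool) bool {
	if granted == AccessUnset {
		return defaultAllow
	}
	if granted == AccessDeny {
		return false
	}
	return granted >= required
}

func main() {
	fmt.Println(enforce(AccessWrite, AccessRead, false)) // true: write implies read
	fmt.Println(enforce(AccessList, AccessWrite, false)) // false: list does not imply write
	fmt.Println(enforce(AccessDeny, AccessRead, true))   // false: deny always refuses
}
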
=== RUN   TestPolicySourceParse
=== RUN   TestPolicySourceParse/Legacy_Basic
=== RUN   TestPolicySourceParse/Legacy_(JSON)
=== RUN   TestPolicySourceParse/Service_No_Intentions_(Legacy)
=== RUN   TestPolicySourceParse/Service_Intentions_(Legacy)
=== RUN   TestPolicySourceParse/Service_Intention:_invalid_value_(Legacy)
=== RUN   TestPolicySourceParse/Bad_Policy_-_ACL
=== RUN   TestPolicySourceParse/Bad_Policy_-_Agent
=== RUN   TestPolicySourceParse/Bad_Policy_-_Agent_Prefix
=== RUN   TestPolicySourceParse/Bad_Policy_-_Key
=== RUN   TestPolicySourceParse/Bad_Policy_-_Key_Prefix
=== RUN   TestPolicySourceParse/Bad_Policy_-_Node
=== RUN   TestPolicySourceParse/Bad_Policy_-_Node_Prefix
=== RUN   TestPolicySourceParse/Bad_Policy_-_Service
=== RUN   TestPolicySourceParse/Bad_Policy_-_Service_Prefix
=== RUN   TestPolicySourceParse/Bad_Policy_-_Session
=== RUN   TestPolicySourceParse/Bad_Policy_-_Session_Prefix
=== RUN   TestPolicySourceParse/Bad_Policy_-_Event
=== RUN   TestPolicySourceParse/Bad_Policy_-_Event_Prefix
=== RUN   TestPolicySourceParse/Bad_Policy_-_Prepared_Query
=== RUN   TestPolicySourceParse/Bad_Policy_-_Prepared_Query_Prefix
=== RUN   TestPolicySourceParse/Bad_Policy_-_Keyring
=== RUN   TestPolicySourceParse/Bad_Policy_-_Operator
=== RUN   TestPolicySourceParse/Keyring_Empty
=== RUN   TestPolicySourceParse/Operator_Empty
--- PASS: TestPolicySourceParse (0.03s)
    --- PASS: TestPolicySourceParse/Legacy_Basic (0.00s)
    --- PASS: TestPolicySourceParse/Legacy_(JSON) (0.00s)
    --- PASS: TestPolicySourceParse/Service_No_Intentions_(Legacy) (0.00s)
    --- PASS: TestPolicySourceParse/Service_Intentions_(Legacy) (0.00s)
    --- PASS: TestPolicySourceParse/Service_Intention:_invalid_value_(Legacy) (0.00s)
    --- PASS: TestPolicySourceParse/Bad_Policy_-_ACL (0.00s)
    --- PASS: TestPolicySourceParse/Bad_Policy_-_Agent (0.00s)
    --- PASS: TestPolicySourceParse/Bad_Policy_-_Agent_Prefix (0.00s)
    --- PASS: TestPolicySourceParse/Bad_Policy_-_Key (0.00s)
    --- PASS: TestPolicySourceParse/Bad_Policy_-_Key_Prefix (0.00s)
    --- PASS: TestPolicySourceParse/Bad_Policy_-_Node (0.00s)
    --- PASS: TestPolicySourceParse/Bad_Policy_-_Node_Prefix (0.00s)
    --- PASS: TestPolicySourceParse/Bad_Policy_-_Service (0.00s)
    --- PASS: TestPolicySourceParse/Bad_Policy_-_Service_Prefix (0.00s)
    --- PASS: TestPolicySourceParse/Bad_Policy_-_Session (0.00s)
    --- PASS: TestPolicySourceParse/Bad_Policy_-_Session_Prefix (0.00s)
    --- PASS: TestPolicySourceParse/Bad_Policy_-_Event (0.00s)
    --- PASS: TestPolicySourceParse/Bad_Policy_-_Event_Prefix (0.00s)
    --- PASS: TestPolicySourceParse/Bad_Policy_-_Prepared_Query (0.00s)
    --- PASS: TestPolicySourceParse/Bad_Policy_-_Prepared_Query_Prefix (0.00s)
    --- PASS: TestPolicySourceParse/Bad_Policy_-_Keyring (0.00s)
    --- PASS: TestPolicySourceParse/Bad_Policy_-_Operator (0.00s)
    --- PASS: TestPolicySourceParse/Keyring_Empty (0.00s)
    --- PASS: TestPolicySourceParse/Operator_Empty (0.00s)
=== RUN   TestMergePolicies
=== RUN   TestMergePolicies/Agents
=== RUN   TestMergePolicies/Events
=== RUN   TestMergePolicies/Node
=== RUN   TestMergePolicies/Keys
=== RUN   TestMergePolicies/Services
=== RUN   TestMergePolicies/Sessions
=== RUN   TestMergePolicies/Prepared_Queries
=== RUN   TestMergePolicies/Write_Precedence
=== RUN   TestMergePolicies/Deny_Precedence
=== RUN   TestMergePolicies/Read_Precedence
--- PASS: TestMergePolicies (0.02s)
    --- PASS: TestMergePolicies/Agents (0.00s)
    --- PASS: TestMergePolicies/Events (0.00s)
    --- PASS: TestMergePolicies/Node (0.00s)
    --- PASS: TestMergePolicies/Keys (0.00s)
    --- PASS: TestMergePolicies/Services (0.00s)
    --- PASS: TestMergePolicies/Sessions (0.00s)
    --- PASS: TestMergePolicies/Prepared_Queries (0.00s)
    --- PASS: TestMergePolicies/Write_Precedence (0.00s)
    --- PASS: TestMergePolicies/Deny_Precedence (0.00s)
    --- PASS: TestMergePolicies/Read_Precedence (0.00s)
=== RUN   TestRulesTranslate
--- PASS: TestRulesTranslate (0.00s)
=== RUN   TestRulesTranslate_GH5493
--- PASS: TestRulesTranslate_GH5493 (0.00s)
=== RUN   TestPrecedence
=== RUN   TestPrecedence/Deny_Over_Write
=== RUN   TestPrecedence/Deny_Over_List
=== RUN   TestPrecedence/Deny_Over_Read
=== RUN   TestPrecedence/Deny_Over_Unknown
=== RUN   TestPrecedence/Write_Over_List
=== RUN   TestPrecedence/Write_Over_Read
=== RUN   TestPrecedence/Write_Over_Unknown
=== RUN   TestPrecedence/List_Over_Read
=== RUN   TestPrecedence/List_Over_Unknown
=== RUN   TestPrecedence/Read_Over_Unknown
=== RUN   TestPrecedence/Write_Over_Deny
=== RUN   TestPrecedence/List_Over_Deny
=== RUN   TestPrecedence/Read_Over_Deny
=== RUN   TestPrecedence/Deny_Over_Unknown#01
=== RUN   TestPrecedence/List_Over_Write
=== RUN   TestPrecedence/Read_Over_Write
=== RUN   TestPrecedence/Unknown_Over_Write
=== RUN   TestPrecedence/Read_Over_List
=== RUN   TestPrecedence/Unknown_Over_List
=== RUN   TestPrecedence/Unknown_Over_Read
--- PASS: TestPrecedence (0.01s)
    --- PASS: TestPrecedence/Deny_Over_Write (0.00s)
    --- PASS: TestPrecedence/Deny_Over_List (0.00s)
    --- PASS: TestPrecedence/Deny_Over_Read (0.00s)
    --- PASS: TestPrecedence/Deny_Over_Unknown (0.00s)
    --- PASS: TestPrecedence/Write_Over_List (0.00s)
    --- PASS: TestPrecedence/Write_Over_Read (0.00s)
    --- PASS: TestPrecedence/Write_Over_Unknown (0.00s)
    --- PASS: TestPrecedence/List_Over_Read (0.00s)
    --- PASS: TestPrecedence/List_Over_Unknown (0.00s)
    --- PASS: TestPrecedence/Read_Over_Unknown (0.00s)
    --- PASS: TestPrecedence/Write_Over_Deny (0.00s)
    --- PASS: TestPrecedence/List_Over_Deny (0.00s)
    --- PASS: TestPrecedence/Read_Over_Deny (0.00s)
    --- PASS: TestPrecedence/Deny_Over_Unknown#01 (0.00s)
    --- PASS: TestPrecedence/List_Over_Write (0.00s)
    --- PASS: TestPrecedence/Read_Over_Write (0.00s)
    --- PASS: TestPrecedence/Unknown_Over_Write (0.00s)
    --- PASS: TestPrecedence/Read_Over_List (0.00s)
    --- PASS: TestPrecedence/Unknown_Over_List (0.00s)
    --- PASS: TestPrecedence/Unknown_Over_Read (0.00s)
PASS
ok  	github.com/hashicorp/consul/acl	0.658s
=== RUN   TestACL_Legacy_Disabled_Response
=== PAUSE TestACL_Legacy_Disabled_Response
=== RUN   TestACL_Legacy_Update
=== PAUSE TestACL_Legacy_Update
=== RUN   TestACL_Legacy_UpdateUpsert
=== PAUSE TestACL_Legacy_UpdateUpsert
=== RUN   TestACL_Legacy_Destroy
=== PAUSE TestACL_Legacy_Destroy
=== RUN   TestACL_Legacy_Clone
=== PAUSE TestACL_Legacy_Clone
=== RUN   TestACL_Legacy_Get
=== PAUSE TestACL_Legacy_Get
=== RUN   TestACL_Legacy_List
--- SKIP: TestACL_Legacy_List (0.00s)
    acl_endpoint_legacy_test.go:253: DM-skipped
=== RUN   TestACLReplicationStatus
=== PAUSE TestACLReplicationStatus
=== RUN   TestACL_Disabled_Response
=== PAUSE TestACL_Disabled_Response
=== RUN   TestACL_Bootstrap
=== PAUSE TestACL_Bootstrap
=== RUN   TestACL_HTTP
=== PAUSE TestACL_HTTP
=== RUN   TestACL_LoginProcedure_HTTP
=== PAUSE TestACL_LoginProcedure_HTTP
=== RUN   TestACL_Version8
=== PAUSE TestACL_Version8
=== RUN   TestACL_AgentMasterToken
=== PAUSE TestACL_AgentMasterToken
=== RUN   TestACL_RootAuthorizersDenied
=== PAUSE TestACL_RootAuthorizersDenied
=== RUN   TestACL_vetServiceRegister
=== PAUSE TestACL_vetServiceRegister
=== RUN   TestACL_vetServiceUpdate
=== PAUSE TestACL_vetServiceUpdate
=== RUN   TestACL_vetCheckRegister
=== PAUSE TestACL_vetCheckRegister
=== RUN   TestACL_vetCheckUpdate
=== PAUSE TestACL_vetCheckUpdate
=== RUN   TestACL_filterMembers
=== PAUSE TestACL_filterMembers
=== RUN   TestACL_filterServices
=== PAUSE TestACL_filterServices
=== RUN   TestACL_filterChecks
=== PAUSE TestACL_filterChecks
=== RUN   TestAgent_Services
=== PAUSE TestAgent_Services
=== RUN   TestAgent_ServicesFiltered
=== PAUSE TestAgent_ServicesFiltered
=== RUN   TestAgent_Services_ExternalConnectProxy
=== PAUSE TestAgent_Services_ExternalConnectProxy
=== RUN   TestAgent_Services_Sidecar
=== PAUSE TestAgent_Services_Sidecar
=== RUN   TestAgent_Services_ACLFilter
=== PAUSE TestAgent_Services_ACLFilter
=== RUN   TestAgent_Service
--- SKIP: TestAgent_Service (0.00s)
    agent_endpoint_test.go:274: DM-skipped
=== RUN   TestAgent_Service_DeprecatedManagedProxy
=== PAUSE TestAgent_Service_DeprecatedManagedProxy
=== RUN   TestAgent_Checks
=== PAUSE TestAgent_Checks
=== RUN   TestAgent_ChecksWithFilter
=== PAUSE TestAgent_ChecksWithFilter
=== RUN   TestAgent_HealthServiceByID
=== PAUSE TestAgent_HealthServiceByID
=== RUN   TestAgent_HealthServiceByName
=== PAUSE TestAgent_HealthServiceByName
=== RUN   TestAgent_Checks_ACLFilter
=== PAUSE TestAgent_Checks_ACLFilter
=== RUN   TestAgent_Self
=== PAUSE TestAgent_Self
=== RUN   TestAgent_Self_ACLDeny
=== PAUSE TestAgent_Self_ACLDeny
=== RUN   TestAgent_Metrics_ACLDeny
=== PAUSE TestAgent_Metrics_ACLDeny
=== RUN   TestAgent_Reload
=== PAUSE TestAgent_Reload
=== RUN   TestAgent_Reload_ACLDeny
=== PAUSE TestAgent_Reload_ACLDeny
=== RUN   TestAgent_Members
=== PAUSE TestAgent_Members
=== RUN   TestAgent_Members_WAN
=== PAUSE TestAgent_Members_WAN
=== RUN   TestAgent_Members_ACLFilter
=== PAUSE TestAgent_Members_ACLFilter
=== RUN   TestAgent_Join
=== PAUSE TestAgent_Join
=== RUN   TestAgent_Join_WAN
=== PAUSE TestAgent_Join_WAN
=== RUN   TestAgent_Join_ACLDeny
=== PAUSE TestAgent_Join_ACLDeny
=== RUN   TestAgent_JoinLANNotify
=== PAUSE TestAgent_JoinLANNotify
=== RUN   TestAgent_Leave
--- SKIP: TestAgent_Leave (0.00s)
    agent_endpoint_test.go:1581: DM-skipped
=== RUN   TestAgent_Leave_ACLDeny
=== PAUSE TestAgent_Leave_ACLDeny
=== RUN   TestAgent_ForceLeave
--- SKIP: TestAgent_ForceLeave (0.00s)
    agent_endpoint_test.go:1649: DM-skipped
=== RUN   TestAgent_ForceLeave_ACLDeny
=== PAUSE TestAgent_ForceLeave_ACLDeny
=== RUN   TestAgent_RegisterCheck
=== PAUSE TestAgent_RegisterCheck
=== RUN   TestAgent_RegisterCheck_Scripts
=== PAUSE TestAgent_RegisterCheck_Scripts
=== RUN   TestAgent_RegisterCheckScriptsExecDisable
=== PAUSE TestAgent_RegisterCheckScriptsExecDisable
=== RUN   TestAgent_RegisterCheckScriptsExecRemoteDisable
=== PAUSE TestAgent_RegisterCheckScriptsExecRemoteDisable
=== RUN   TestAgent_RegisterCheck_Passing
=== PAUSE TestAgent_RegisterCheck_Passing
=== RUN   TestAgent_RegisterCheck_BadStatus
=== PAUSE TestAgent_RegisterCheck_BadStatus
=== RUN   TestAgent_RegisterCheck_ACLDeny
=== PAUSE TestAgent_RegisterCheck_ACLDeny
=== RUN   TestAgent_DeregisterCheck
=== PAUSE TestAgent_DeregisterCheck
=== RUN   TestAgent_DeregisterCheckACLDeny
=== PAUSE TestAgent_DeregisterCheckACLDeny
=== RUN   TestAgent_PassCheck
=== PAUSE TestAgent_PassCheck
=== RUN   TestAgent_PassCheck_ACLDeny
=== PAUSE TestAgent_PassCheck_ACLDeny
=== RUN   TestAgent_WarnCheck
=== PAUSE TestAgent_WarnCheck
=== RUN   TestAgent_WarnCheck_ACLDeny
=== PAUSE TestAgent_WarnCheck_ACLDeny
=== RUN   TestAgent_FailCheck
=== PAUSE TestAgent_FailCheck
=== RUN   TestAgent_FailCheck_ACLDeny
=== PAUSE TestAgent_FailCheck_ACLDeny
=== RUN   TestAgent_UpdateCheck
=== PAUSE TestAgent_UpdateCheck
=== RUN   TestAgent_UpdateCheck_ACLDeny
=== PAUSE TestAgent_UpdateCheck_ACLDeny
=== RUN   TestAgent_RegisterService
=== PAUSE TestAgent_RegisterService
=== RUN   TestAgent_RegisterService_TranslateKeys
=== PAUSE TestAgent_RegisterService_TranslateKeys
=== RUN   TestAgent_RegisterService_ACLDeny
=== PAUSE TestAgent_RegisterService_ACLDeny
=== RUN   TestAgent_RegisterService_InvalidAddress
=== PAUSE TestAgent_RegisterService_InvalidAddress
=== RUN   TestAgent_RegisterService_ManagedConnectProxy
=== PAUSE TestAgent_RegisterService_ManagedConnectProxy
=== RUN   TestAgent_RegisterService_ManagedConnectProxyDeprecated
=== PAUSE TestAgent_RegisterService_ManagedConnectProxyDeprecated
=== RUN   TestAgent_RegisterService_ManagedConnectProxy_Disabled
=== PAUSE TestAgent_RegisterService_ManagedConnectProxy_Disabled
=== RUN   TestAgent_RegisterService_UnmanagedConnectProxy
=== PAUSE TestAgent_RegisterService_UnmanagedConnectProxy
=== RUN   TestAgent_RegisterServiceDeregisterService_Sidecar
--- SKIP: TestAgent_RegisterServiceDeregisterService_Sidecar (0.00s)
    agent_endpoint_test.go:3107: DM-skipped
=== RUN   TestAgent_RegisterService_UnmanagedConnectProxyInvalid
=== PAUSE TestAgent_RegisterService_UnmanagedConnectProxyInvalid
=== RUN   TestAgent_RegisterService_ConnectNative
=== PAUSE TestAgent_RegisterService_ConnectNative
=== RUN   TestAgent_RegisterService_ScriptCheck_ExecDisable
=== PAUSE TestAgent_RegisterService_ScriptCheck_ExecDisable
=== RUN   TestAgent_RegisterService_ScriptCheck_ExecRemoteDisable
=== PAUSE TestAgent_RegisterService_ScriptCheck_ExecRemoteDisable
=== RUN   TestAgent_DeregisterService
=== PAUSE TestAgent_DeregisterService
=== RUN   TestAgent_DeregisterService_ACLDeny
=== PAUSE TestAgent_DeregisterService_ACLDeny
=== RUN   TestAgent_DeregisterService_withManagedProxy
=== PAUSE TestAgent_DeregisterService_withManagedProxy
=== RUN   TestAgent_DeregisterService_managedProxyDirect
=== PAUSE TestAgent_DeregisterService_managedProxyDirect
=== RUN   TestAgent_ServiceMaintenance_BadRequest
=== PAUSE TestAgent_ServiceMaintenance_BadRequest
=== RUN   TestAgent_ServiceMaintenance_Enable
--- SKIP: TestAgent_ServiceMaintenance_Enable (0.00s)
    agent_endpoint_test.go:3936: DM-skipped
=== RUN   TestAgent_ServiceMaintenance_Disable
=== PAUSE TestAgent_ServiceMaintenance_Disable
=== RUN   TestAgent_ServiceMaintenance_ACLDeny
=== PAUSE TestAgent_ServiceMaintenance_ACLDeny
=== RUN   TestAgent_NodeMaintenance_BadRequest
=== PAUSE TestAgent_NodeMaintenance_BadRequest
=== RUN   TestAgent_NodeMaintenance_Enable
=== PAUSE TestAgent_NodeMaintenance_Enable
=== RUN   TestAgent_NodeMaintenance_Disable
=== PAUSE TestAgent_NodeMaintenance_Disable
=== RUN   TestAgent_NodeMaintenance_ACLDeny
=== PAUSE TestAgent_NodeMaintenance_ACLDeny
=== RUN   TestAgent_RegisterCheck_Service
=== PAUSE TestAgent_RegisterCheck_Service
=== RUN   TestAgent_Monitor
--- SKIP: TestAgent_Monitor (0.00s)
    agent_endpoint_test.go:4189: DM-skipped
=== RUN   TestAgent_Monitor_ACLDeny
=== PAUSE TestAgent_Monitor_ACLDeny
=== RUN   TestAgent_Token
=== PAUSE TestAgent_Token
=== RUN   TestAgentConnectCARoots_empty
=== PAUSE TestAgentConnectCARoots_empty
=== RUN   TestAgentConnectCARoots_list
=== PAUSE TestAgentConnectCARoots_list
=== RUN   TestAgentConnectCALeafCert_aclDefaultDeny
=== PAUSE TestAgentConnectCALeafCert_aclDefaultDeny
=== RUN   TestAgentConnectCALeafCert_aclProxyToken
=== PAUSE TestAgentConnectCALeafCert_aclProxyToken
=== RUN   TestAgentConnectCALeafCert_aclProxyTokenOther
=== PAUSE TestAgentConnectCALeafCert_aclProxyTokenOther
=== RUN   TestAgentConnectCALeafCert_aclServiceWrite
=== PAUSE TestAgentConnectCALeafCert_aclServiceWrite
=== RUN   TestAgentConnectCALeafCert_aclServiceReadDeny
=== PAUSE TestAgentConnectCALeafCert_aclServiceReadDeny
=== RUN   TestAgentConnectCALeafCert_good
=== PAUSE TestAgentConnectCALeafCert_good
=== RUN   TestAgentConnectCALeafCert_goodNotLocal
--- SKIP: TestAgentConnectCALeafCert_goodNotLocal (0.00s)
    agent_endpoint_test.go:4991: DM-skipped
=== RUN   TestAgentConnectProxyConfig_Blocking
--- SKIP: TestAgentConnectProxyConfig_Blocking (0.00s)
    agent_endpoint_test.go:5128: DM-skipped
=== RUN   TestAgentConnectProxyConfig_aclDefaultDeny
=== PAUSE TestAgentConnectProxyConfig_aclDefaultDeny
=== RUN   TestAgentConnectProxyConfig_aclProxyToken
=== PAUSE TestAgentConnectProxyConfig_aclProxyToken
=== RUN   TestAgentConnectProxyConfig_aclServiceWrite
=== PAUSE TestAgentConnectProxyConfig_aclServiceWrite
=== RUN   TestAgentConnectProxyConfig_aclServiceReadDeny
=== PAUSE TestAgentConnectProxyConfig_aclServiceReadDeny
=== RUN   TestAgentConnectProxyConfig_ConfigHandling
--- SKIP: TestAgentConnectProxyConfig_ConfigHandling (0.00s)
    agent_endpoint_test.go:5540: DM-skipped
=== RUN   TestAgentConnectAuthorize_badBody
=== PAUSE TestAgentConnectAuthorize_badBody
=== RUN   TestAgentConnectAuthorize_noTarget
=== PAUSE TestAgentConnectAuthorize_noTarget
=== RUN   TestAgentConnectAuthorize_idInvalidFormat
=== PAUSE TestAgentConnectAuthorize_idInvalidFormat
=== RUN   TestAgentConnectAuthorize_idNotService
=== PAUSE TestAgentConnectAuthorize_idNotService
=== RUN   TestAgentConnectAuthorize_allow
=== PAUSE TestAgentConnectAuthorize_allow
=== RUN   TestAgentConnectAuthorize_deny
=== PAUSE TestAgentConnectAuthorize_deny
=== RUN   TestAgentConnectAuthorize_allowTrustDomain
=== PAUSE TestAgentConnectAuthorize_allowTrustDomain
=== RUN   TestAgentConnectAuthorize_denyWildcard
=== PAUSE TestAgentConnectAuthorize_denyWildcard
=== RUN   TestAgentConnectAuthorize_serviceWrite
=== PAUSE TestAgentConnectAuthorize_serviceWrite
=== RUN   TestAgentConnectAuthorize_defaultDeny
=== PAUSE TestAgentConnectAuthorize_defaultDeny
=== RUN   TestAgentConnectAuthorize_defaultAllow
=== PAUSE TestAgentConnectAuthorize_defaultAllow
=== RUN   TestAgent_Host
=== PAUSE TestAgent_Host
=== RUN   TestAgent_HostBadACL
=== PAUSE TestAgent_HostBadACL
=== RUN   TestAgent_MultiStartStop
=== RUN   TestAgent_MultiStartStop/#00
=== PAUSE TestAgent_MultiStartStop/#00
=== RUN   TestAgent_MultiStartStop/#01
=== PAUSE TestAgent_MultiStartStop/#01
=== RUN   TestAgent_MultiStartStop/#02
=== PAUSE TestAgent_MultiStartStop/#02
=== RUN   TestAgent_MultiStartStop/#03
=== PAUSE TestAgent_MultiStartStop/#03
=== RUN   TestAgent_MultiStartStop/#04
=== PAUSE TestAgent_MultiStartStop/#04
=== RUN   TestAgent_MultiStartStop/#05
=== PAUSE TestAgent_MultiStartStop/#05
=== RUN   TestAgent_MultiStartStop/#06
=== PAUSE TestAgent_MultiStartStop/#06
=== RUN   TestAgent_MultiStartStop/#07
=== PAUSE TestAgent_MultiStartStop/#07
=== RUN   TestAgent_MultiStartStop/#08
=== PAUSE TestAgent_MultiStartStop/#08
=== RUN   TestAgent_MultiStartStop/#09
=== PAUSE TestAgent_MultiStartStop/#09
=== CONT  TestAgent_MultiStartStop/#09
=== CONT  TestAgent_MultiStartStop/#00
=== CONT  TestAgent_MultiStartStop/#08
=== CONT  TestAgent_MultiStartStop/#07
WARNING: bootstrap = true: do not enable unless necessary
TestAgent_MultiStartStop/#07 - 2019/12/06 06:01:11.815450 [WARN] agent: Node name "Node b70f9210-d8c2-0415-797a-462f58489d2e" will not be discoverable via DNS due to invalid characters. Valid characters include all alpha-numerics and dashes.
TestAgent_MultiStartStop/#07 - 2019/12/06 06:01:11.816247 [DEBUG] tlsutil: Update with version 1
WARNING: bootstrap = true: do not enable unless necessary
TestAgent_MultiStartStop/#09 - 2019/12/06 06:01:11.817143 [WARN] agent: Node name "Node b5cb9893-d5a3-6e01-018a-304ed3fc1dd6" will not be discoverable via DNS due to invalid characters. Valid characters include all alpha-numerics and dashes.
TestAgent_MultiStartStop/#09 - 2019/12/06 06:01:11.817596 [DEBUG] tlsutil: Update with version 1
TestAgent_MultiStartStop/#07 - 2019/12/06 06:01:11.829299 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
TestAgent_MultiStartStop/#09 - 2019/12/06 06:01:11.833302 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
WARNING: bootstrap = true: do not enable unless necessary
TestAgent_MultiStartStop/#08 - 2019/12/06 06:01:11.840209 [WARN] agent: Node name "Node ecc97545-d1f1-940e-168d-a15fa6d050ed" will not be discoverable via DNS due to invalid characters. Valid characters include all alpha-numerics and dashes.
TestAgent_MultiStartStop/#08 - 2019/12/06 06:01:11.841101 [DEBUG] tlsutil: Update with version 1
TestAgent_MultiStartStop/#08 - 2019/12/06 06:01:11.845290 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
WARNING: bootstrap = true: do not enable unless necessary
TestAgent_MultiStartStop/#00 - 2019/12/06 06:01:11.871646 [WARN] agent: Node name "Node 1188ae26-2dc6-b46f-ee97-5388a5baef47" will not be discoverable via DNS due to invalid characters. Valid characters include all alpha-numerics and dashes.
TestAgent_MultiStartStop/#00 - 2019/12/06 06:01:11.872475 [DEBUG] tlsutil: Update with version 1
TestAgent_MultiStartStop/#00 - 2019/12/06 06:01:11.876315 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
2019/12/06 06:01:13 [INFO]  raft: Initial configuration (index=1): [{Suffrage:Voter ID:1188ae26-2dc6-b46f-ee97-5388a5baef47 Address:127.0.0.1:34012}]
2019/12/06 06:01:13 [INFO]  raft: Initial configuration (index=1): [{Suffrage:Voter ID:b70f9210-d8c2-0415-797a-462f58489d2e Address:127.0.0.1:34018}]
2019/12/06 06:01:13 [INFO]  raft: Node at 127.0.0.1:34012 [Follower] entering Follower state (Leader: "")
2019/12/06 06:01:13 [INFO]  raft: Initial configuration (index=1): [{Suffrage:Voter ID:b5cb9893-d5a3-6e01-018a-304ed3fc1dd6 Address:127.0.0.1:34006}]
2019/12/06 06:01:13 [INFO]  raft: Node at 127.0.0.1:34018 [Follower] entering Follower state (Leader: "")
2019/12/06 06:01:13 [INFO]  raft: Node at 127.0.0.1:34006 [Follower] entering Follower state (Leader: "")
2019/12/06 06:01:13 [INFO]  raft: Initial configuration (index=1): [{Suffrage:Voter ID:ecc97545-d1f1-940e-168d-a15fa6d050ed Address:127.0.0.1:34024}]
TestAgent_MultiStartStop/#07 - 2019/12/06 06:01:13.322238 [INFO] serf: EventMemberJoin: Node b70f9210-d8c2-0415-797a-462f58489d2e.dc1 127.0.0.1
TestAgent_MultiStartStop/#09 - 2019/12/06 06:01:13.322572 [INFO] serf: EventMemberJoin: Node b5cb9893-d5a3-6e01-018a-304ed3fc1dd6.dc1 127.0.0.1
2019/12/06 06:01:13 [INFO]  raft: Node at 127.0.0.1:34024 [Follower] entering Follower state (Leader: "")
TestAgent_MultiStartStop/#08 - 2019/12/06 06:01:13.327810 [INFO] serf: EventMemberJoin: Node ecc97545-d1f1-940e-168d-a15fa6d050ed.dc1 127.0.0.1
TestAgent_MultiStartStop/#07 - 2019/12/06 06:01:13.330586 [INFO] serf: EventMemberJoin: Node b70f9210-d8c2-0415-797a-462f58489d2e 127.0.0.1
TestAgent_MultiStartStop/#07 - 2019/12/06 06:01:13.331661 [INFO] consul: Handled member-join event for server "Node b70f9210-d8c2-0415-797a-462f58489d2e.dc1" in area "wan"
TestAgent_MultiStartStop/#00 - 2019/12/06 06:01:13.332291 [INFO] serf: EventMemberJoin: Node 1188ae26-2dc6-b46f-ee97-5388a5baef47.dc1 127.0.0.1
TestAgent_MultiStartStop/#09 - 2019/12/06 06:01:13.337177 [INFO] serf: EventMemberJoin: Node b5cb9893-d5a3-6e01-018a-304ed3fc1dd6 127.0.0.1
TestAgent_MultiStartStop/#09 - 2019/12/06 06:01:13.351052 [INFO] consul: Adding LAN server Node b5cb9893-d5a3-6e01-018a-304ed3fc1dd6 (Addr: tcp/127.0.0.1:34006) (DC: dc1)
TestAgent_MultiStartStop/#09 - 2019/12/06 06:01:13.359400 [INFO] consul: Handled member-join event for server "Node b5cb9893-d5a3-6e01-018a-304ed3fc1dd6.dc1" in area "wan"
2019/12/06 06:01:13 [WARN]  raft: Heartbeat timeout from "" reached, starting election
2019/12/06 06:01:13 [INFO]  raft: Node at 127.0.0.1:34012 [Candidate] entering Candidate state in term 2
2019/12/06 06:01:13 [WARN]  raft: Heartbeat timeout from "" reached, starting election
2019/12/06 06:01:13 [INFO]  raft: Node at 127.0.0.1:34006 [Candidate] entering Candidate state in term 2
TestAgent_MultiStartStop/#07 - 2019/12/06 06:01:13.365047 [INFO] consul: Adding LAN server Node b70f9210-d8c2-0415-797a-462f58489d2e (Addr: tcp/127.0.0.1:34018) (DC: dc1)
TestAgent_MultiStartStop/#09 - 2019/12/06 06:01:13.369565 [INFO] agent: Started DNS server 127.0.0.1:34001 (udp)
TestAgent_MultiStartStop/#00 - 2019/12/06 06:01:13.370024 [INFO] serf: EventMemberJoin: Node 1188ae26-2dc6-b46f-ee97-5388a5baef47 127.0.0.1
TestAgent_MultiStartStop/#09 - 2019/12/06 06:01:13.370117 [INFO] agent: Started DNS server 127.0.0.1:34001 (tcp)
TestAgent_MultiStartStop/#09 - 2019/12/06 06:01:13.377706 [INFO] agent: Started HTTP server on 127.0.0.1:34002 (tcp)
TestAgent_MultiStartStop/#09 - 2019/12/06 06:01:13.377885 [INFO] agent: started state syncer
TestAgent_MultiStartStop/#00 - 2019/12/06 06:01:13.383600 [INFO] agent: Started DNS server 127.0.0.1:34007 (udp)
2019/12/06 06:01:13 [WARN]  raft: Heartbeat timeout from "" reached, starting election
2019/12/06 06:01:13 [INFO]  raft: Node at 127.0.0.1:34018 [Candidate] entering Candidate state in term 2
TestAgent_MultiStartStop/#00 - 2019/12/06 06:01:13.388271 [INFO] consul: Adding LAN server Node 1188ae26-2dc6-b46f-ee97-5388a5baef47 (Addr: tcp/127.0.0.1:34012) (DC: dc1)
TestAgent_MultiStartStop/#07 - 2019/12/06 06:01:13.397249 [INFO] agent: Started DNS server 127.0.0.1:34013 (udp)
TestAgent_MultiStartStop/#00 - 2019/12/06 06:01:13.397774 [INFO] consul: Handled member-join event for server "Node 1188ae26-2dc6-b46f-ee97-5388a5baef47.dc1" in area "wan"
TestAgent_MultiStartStop/#07 - 2019/12/06 06:01:13.405828 [INFO] agent: Started DNS server 127.0.0.1:34013 (tcp)
2019/12/06 06:01:13 [WARN]  raft: Heartbeat timeout from "" reached, starting election
2019/12/06 06:01:13 [INFO]  raft: Node at 127.0.0.1:34024 [Candidate] entering Candidate state in term 2
TestAgent_MultiStartStop/#08 - 2019/12/06 06:01:13.406521 [INFO] serf: EventMemberJoin: Node ecc97545-d1f1-940e-168d-a15fa6d050ed 127.0.0.1
TestAgent_MultiStartStop/#07 - 2019/12/06 06:01:13.411102 [INFO] agent: Started HTTP server on 127.0.0.1:34014 (tcp)
TestAgent_MultiStartStop/#07 - 2019/12/06 06:01:13.415817 [INFO] agent: started state syncer
TestAgent_MultiStartStop/#00 - 2019/12/06 06:01:13.417109 [INFO] agent: Started DNS server 127.0.0.1:34007 (tcp)
TestAgent_MultiStartStop/#00 - 2019/12/06 06:01:13.425492 [INFO] agent: Started HTTP server on 127.0.0.1:34008 (tcp)
TestAgent_MultiStartStop/#00 - 2019/12/06 06:01:13.425802 [INFO] agent: started state syncer
TestAgent_MultiStartStop/#08 - 2019/12/06 06:01:13.427993 [INFO] agent: Started DNS server 127.0.0.1:34019 (udp)
TestAgent_MultiStartStop/#08 - 2019/12/06 06:01:13.428079 [INFO] agent: Started DNS server 127.0.0.1:34019 (tcp)
TestAgent_MultiStartStop/#08 - 2019/12/06 06:01:13.428509 [INFO] consul: Adding LAN server Node ecc97545-d1f1-940e-168d-a15fa6d050ed (Addr: tcp/127.0.0.1:34024) (DC: dc1)
TestAgent_MultiStartStop/#08 - 2019/12/06 06:01:13.428914 [INFO] consul: Handled member-join event for server "Node ecc97545-d1f1-940e-168d-a15fa6d050ed.dc1" in area "wan"
TestAgent_MultiStartStop/#08 - 2019/12/06 06:01:13.430547 [INFO] agent: Started HTTP server on 127.0.0.1:34020 (tcp)
TestAgent_MultiStartStop/#08 - 2019/12/06 06:01:13.430673 [INFO] agent: started state syncer
2019/12/06 06:01:14 [INFO]  raft: Election won. Tally: 1
2019/12/06 06:01:14 [INFO]  raft: Node at 127.0.0.1:34012 [Leader] entering Leader state
2019/12/06 06:01:14 [INFO]  raft: Election won. Tally: 1
2019/12/06 06:01:14 [INFO]  raft: Node at 127.0.0.1:34006 [Leader] entering Leader state
TestAgent_MultiStartStop/#00 - 2019/12/06 06:01:14.203325 [INFO] consul: cluster leadership acquired
2019/12/06 06:01:14 [INFO]  raft: Election won. Tally: 1
2019/12/06 06:01:14 [INFO]  raft: Node at 127.0.0.1:34018 [Leader] entering Leader state
TestAgent_MultiStartStop/#00 - 2019/12/06 06:01:14.203956 [INFO] consul: New leader elected: Node 1188ae26-2dc6-b46f-ee97-5388a5baef47
TestAgent_MultiStartStop/#09 - 2019/12/06 06:01:14.204435 [INFO] consul: cluster leadership acquired
TestAgent_MultiStartStop/#09 - 2019/12/06 06:01:14.204889 [INFO] consul: New leader elected: Node b5cb9893-d5a3-6e01-018a-304ed3fc1dd6
TestAgent_MultiStartStop/#07 - 2019/12/06 06:01:14.205693 [INFO] consul: cluster leadership acquired
TestAgent_MultiStartStop/#07 - 2019/12/06 06:01:14.206096 [INFO] consul: New leader elected: Node b70f9210-d8c2-0415-797a-462f58489d2e
2019/12/06 06:01:14 [INFO]  raft: Election won. Tally: 1
2019/12/06 06:01:14 [INFO]  raft: Node at 127.0.0.1:34024 [Leader] entering Leader state
TestAgent_MultiStartStop/#08 - 2019/12/06 06:01:14.212594 [INFO] consul: cluster leadership acquired
TestAgent_MultiStartStop/#08 - 2019/12/06 06:01:14.213032 [INFO] consul: New leader elected: Node ecc97545-d1f1-940e-168d-a15fa6d050ed
TestAgent_MultiStartStop/#07 - 2019/12/06 06:01:14.455921 [INFO] agent: Requesting shutdown
TestAgent_MultiStartStop/#07 - 2019/12/06 06:01:14.456040 [INFO] consul: shutting down server
TestAgent_MultiStartStop/#07 - 2019/12/06 06:01:14.456087 [WARN] serf: Shutdown without a Leave
TestAgent_MultiStartStop/#08 - 2019/12/06 06:01:14.471389 [INFO] agent: Requesting shutdown
TestAgent_MultiStartStop/#08 - 2019/12/06 06:01:14.471514 [INFO] consul: shutting down server
TestAgent_MultiStartStop/#08 - 2019/12/06 06:01:14.471566 [WARN] serf: Shutdown without a Leave
TestAgent_MultiStartStop/#00 - 2019/12/06 06:01:14.485007 [INFO] agent: Requesting shutdown
TestAgent_MultiStartStop/#00 - 2019/12/06 06:01:14.485111 [INFO] consul: shutting down server
TestAgent_MultiStartStop/#00 - 2019/12/06 06:01:14.485163 [WARN] serf: Shutdown without a Leave
TestAgent_MultiStartStop/#00 - 2019/12/06 06:01:14.600444 [WARN] serf: Shutdown without a Leave
TestAgent_MultiStartStop/#08 - 2019/12/06 06:01:14.603223 [WARN] serf: Shutdown without a Leave
TestAgent_MultiStartStop/#07 - 2019/12/06 06:01:14.608743 [WARN] serf: Shutdown without a Leave
TestAgent_MultiStartStop/#09 - 2019/12/06 06:01:14.717187 [INFO] agent: Requesting shutdown
TestAgent_MultiStartStop/#09 - 2019/12/06 06:01:14.717313 [INFO] consul: shutting down server
TestAgent_MultiStartStop/#09 - 2019/12/06 06:01:14.717366 [WARN] serf: Shutdown without a Leave
TestAgent_MultiStartStop/#00 - 2019/12/06 06:01:14.725743 [INFO] agent: Synced node info
TestAgent_MultiStartStop/#07 - 2019/12/06 06:01:14.726504 [INFO] manager: shutting down
TestAgent_MultiStartStop/#07 - 2019/12/06 06:01:14.727085 [INFO] agent: consul server down
TestAgent_MultiStartStop/#07 - 2019/12/06 06:01:14.727143 [INFO] agent: shutdown complete
TestAgent_MultiStartStop/#07 - 2019/12/06 06:01:14.727196 [INFO] agent: Stopping DNS server 127.0.0.1:34013 (tcp)
TestAgent_MultiStartStop/#07 - 2019/12/06 06:01:14.727348 [INFO] agent: Stopping DNS server 127.0.0.1:34013 (udp)
TestAgent_MultiStartStop/#07 - 2019/12/06 06:01:14.727517 [INFO] agent: Stopping HTTP server 127.0.0.1:34014 (tcp)
TestAgent_MultiStartStop/#07 - 2019/12/06 06:01:14.727735 [INFO] agent: Waiting for endpoints to shut down
TestAgent_MultiStartStop/#07 - 2019/12/06 06:01:14.727818 [INFO] agent: Endpoints down
=== CONT  TestAgent_MultiStartStop/#06
TestAgent_MultiStartStop/#00 - 2019/12/06 06:01:14.728363 [INFO] manager: shutting down
TestAgent_MultiStartStop/#07 - 2019/12/06 06:01:14.728541 [ERR] consul: failed to wait for barrier: leadership lost while committing log
TestAgent_MultiStartStop/#07 - 2019/12/06 06:01:14.728654 [WARN] agent: Syncing node info failed. leadership lost while committing log
TestAgent_MultiStartStop/#07 - 2019/12/06 06:01:14.728709 [ERR] agent: failed to sync remote state: leadership lost while committing log
TestAgent_MultiStartStop/#08 - 2019/12/06 06:01:14.730179 [INFO] agent: Synced node info
TestAgent_MultiStartStop/#08 - 2019/12/06 06:01:14.730292 [DEBUG] agent: Node info in sync
TestAgent_MultiStartStop/#08 - 2019/12/06 06:01:14.731964 [INFO] manager: shutting down
TestAgent_MultiStartStop/#09 - 2019/12/06 06:01:14.733101 [INFO] agent: Synced node info
TestAgent_MultiStartStop/#09 - 2019/12/06 06:01:14.769968 [DEBUG] agent: Node info in sync
TestAgent_MultiStartStop/#09 - 2019/12/06 06:01:14.770102 [DEBUG] agent: Node info in sync
WARNING: bootstrap = true: do not enable unless necessary
TestAgent_MultiStartStop/#06 - 2019/12/06 06:01:14.799955 [WARN] agent: Node name "Node 033980d6-ac6f-1a92-9403-b1b56d41fb93" will not be discoverable via DNS due to invalid characters. Valid characters include all alpha-numerics and dashes.
TestAgent_MultiStartStop/#06 - 2019/12/06 06:01:14.800458 [DEBUG] tlsutil: Update with version 1
TestAgent_MultiStartStop/#06 - 2019/12/06 06:01:14.802736 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
TestAgent_MultiStartStop/#09 - 2019/12/06 06:01:14.849683 [WARN] serf: Shutdown without a Leave
TestAgent_MultiStartStop/#09 - 2019/12/06 06:01:14.966315 [INFO] manager: shutting down
TestAgent_MultiStartStop/#09 - 2019/12/06 06:01:14.967191 [INFO] agent: consul server down
TestAgent_MultiStartStop/#09 - 2019/12/06 06:01:14.967267 [INFO] agent: shutdown complete
TestAgent_MultiStartStop/#09 - 2019/12/06 06:01:14.967332 [INFO] agent: Stopping DNS server 127.0.0.1:34001 (tcp)
TestAgent_MultiStartStop/#00 - 2019/12/06 06:01:14.967538 [INFO] agent: consul server down
TestAgent_MultiStartStop/#00 - 2019/12/06 06:01:14.967588 [INFO] agent: shutdown complete
TestAgent_MultiStartStop/#09 - 2019/12/06 06:01:14.967552 [INFO] agent: Stopping DNS server 127.0.0.1:34001 (udp)
TestAgent_MultiStartStop/#00 - 2019/12/06 06:01:14.967642 [INFO] agent: Stopping DNS server 127.0.0.1:34007 (tcp)
TestAgent_MultiStartStop/#09 - 2019/12/06 06:01:14.967856 [INFO] agent: Stopping HTTP server 127.0.0.1:34002 (tcp)
TestAgent_MultiStartStop/#09 - 2019/12/06 06:01:14.967857 [ERR] consul: failed to establish leadership: raft is already shutdown
TestAgent_MultiStartStop/#00 - 2019/12/06 06:01:14.967926 [INFO] agent: Stopping DNS server 127.0.0.1:34007 (udp)
TestAgent_MultiStartStop/#00 - 2019/12/06 06:01:14.967771 [ERR] autopilot: failed to initialize config: leadership lost while committing log
TestAgent_MultiStartStop/#09 - 2019/12/06 06:01:14.968063 [INFO] agent: Waiting for endpoints to shut down
TestAgent_MultiStartStop/#00 - 2019/12/06 06:01:14.968064 [INFO] agent: Stopping HTTP server 127.0.0.1:34008 (tcp)
TestAgent_MultiStartStop/#09 - 2019/12/06 06:01:14.968199 [INFO] agent: Endpoints down
TestAgent_MultiStartStop/#00 - 2019/12/06 06:01:14.968254 [INFO] agent: Waiting for endpoints to shut down
TestAgent_MultiStartStop/#00 - 2019/12/06 06:01:14.968258 [ERR] consul: failed to establish leadership: raft is already shutdown
TestAgent_MultiStartStop/#00 - 2019/12/06 06:01:14.968319 [INFO] agent: Endpoints down
=== CONT  TestAgent_MultiStartStop/#05
=== CONT  TestAgent_MultiStartStop/#04
TestAgent_MultiStartStop/#08 - 2019/12/06 06:01:14.978006 [INFO] agent: consul server down
TestAgent_MultiStartStop/#08 - 2019/12/06 06:01:14.978077 [INFO] agent: shutdown complete
TestAgent_MultiStartStop/#08 - 2019/12/06 06:01:14.978138 [INFO] agent: Stopping DNS server 127.0.0.1:34019 (tcp)
TestAgent_MultiStartStop/#08 - 2019/12/06 06:01:14.978286 [INFO] agent: Stopping DNS server 127.0.0.1:34019 (udp)
TestAgent_MultiStartStop/#08 - 2019/12/06 06:01:14.978467 [INFO] agent: Stopping HTTP server 127.0.0.1:34020 (tcp)
TestAgent_MultiStartStop/#08 - 2019/12/06 06:01:14.978686 [INFO] agent: Waiting for endpoints to shut down
TestAgent_MultiStartStop/#08 - 2019/12/06 06:01:14.978768 [INFO] agent: Endpoints down
=== CONT  TestAgent_MultiStartStop/#03
TestAgent_MultiStartStop/#08 - 2019/12/06 06:01:14.979062 [ERR] autopilot: failed to initialize config: leadership lost while committing log
TestAgent_MultiStartStop/#08 - 2019/12/06 06:01:14.979394 [ERR] consul: failed to establish leadership: raft is already shutdown
TestAgent_MultiStartStop/#08 - 2019/12/06 06:01:14.979628 [ERR] consul: failed to transfer leadership attempt 0/3: raft is already shutdown
WARNING: bootstrap = true: do not enable unless necessary
TestAgent_MultiStartStop/#05 - 2019/12/06 06:01:15.071887 [WARN] agent: Node name "Node 666099ac-b520-515e-5dd7-f40ca413106b" will not be discoverable via DNS due to invalid characters. Valid characters include all alpha-numerics and dashes.
TestAgent_MultiStartStop/#05 - 2019/12/06 06:01:15.072515 [DEBUG] tlsutil: Update with version 1
TestAgent_MultiStartStop/#05 - 2019/12/06 06:01:15.075858 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
WARNING: bootstrap = true: do not enable unless necessary
WARNING: bootstrap = true: do not enable unless necessary
TestAgent_MultiStartStop/#03 - 2019/12/06 06:01:15.128482 [WARN] agent: Node name "Node 70dcf182-8739-516d-9a5c-7144c0fbe47f" will not be discoverable via DNS due to invalid characters. Valid characters include all alpha-numerics and dashes.
TestAgent_MultiStartStop/#03 - 2019/12/06 06:01:15.128915 [DEBUG] tlsutil: Update with version 1
TestAgent_MultiStartStop/#04 - 2019/12/06 06:01:15.128908 [WARN] agent: Node name "Node 7db290c8-6cc5-b481-76c7-dd74d4d08773" will not be discoverable via DNS due to invalid characters. Valid characters include all alpha-numerics and dashes.
TestAgent_MultiStartStop/#03 - 2019/12/06 06:01:15.131093 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
TestAgent_MultiStartStop/#04 - 2019/12/06 06:01:15.131561 [DEBUG] tlsutil: Update with version 1
TestAgent_MultiStartStop/#04 - 2019/12/06 06:01:15.134473 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
2019/12/06 06:01:15 [INFO]  raft: Initial configuration (index=1): [{Suffrage:Voter ID:033980d6-ac6f-1a92-9403-b1b56d41fb93 Address:127.0.0.1:34030}]
2019/12/06 06:01:15 [INFO]  raft: Node at 127.0.0.1:34030 [Follower] entering Follower state (Leader: "")
TestAgent_MultiStartStop/#06 - 2019/12/06 06:01:15.945783 [INFO] serf: EventMemberJoin: Node 033980d6-ac6f-1a92-9403-b1b56d41fb93.dc1 127.0.0.1
TestAgent_MultiStartStop/#06 - 2019/12/06 06:01:15.953561 [INFO] serf: EventMemberJoin: Node 033980d6-ac6f-1a92-9403-b1b56d41fb93 127.0.0.1
TestAgent_MultiStartStop/#06 - 2019/12/06 06:01:15.955049 [INFO] consul: Adding LAN server Node 033980d6-ac6f-1a92-9403-b1b56d41fb93 (Addr: tcp/127.0.0.1:34030) (DC: dc1)
TestAgent_MultiStartStop/#06 - 2019/12/06 06:01:15.955111 [INFO] consul: Handled member-join event for server "Node 033980d6-ac6f-1a92-9403-b1b56d41fb93.dc1" in area "wan"
TestAgent_MultiStartStop/#06 - 2019/12/06 06:01:15.956533 [INFO] agent: Started DNS server 127.0.0.1:34025 (tcp)
TestAgent_MultiStartStop/#06 - 2019/12/06 06:01:15.956611 [INFO] agent: Started DNS server 127.0.0.1:34025 (udp)
TestAgent_MultiStartStop/#06 - 2019/12/06 06:01:15.959018 [INFO] agent: Started HTTP server on 127.0.0.1:34026 (tcp)
TestAgent_MultiStartStop/#06 - 2019/12/06 06:01:15.959120 [INFO] agent: started state syncer
2019/12/06 06:01:15 [WARN]  raft: Heartbeat timeout from "" reached, starting election
2019/12/06 06:01:15 [INFO]  raft: Node at 127.0.0.1:34030 [Candidate] entering Candidate state in term 2
2019/12/06 06:01:16 [INFO]  raft: Initial configuration (index=1): [{Suffrage:Voter ID:666099ac-b520-515e-5dd7-f40ca413106b Address:127.0.0.1:34036}]
2019/12/06 06:01:16 [INFO]  raft: Node at 127.0.0.1:34036 [Follower] entering Follower state (Leader: "")
2019/12/06 06:01:16 [INFO]  raft: Initial configuration (index=1): [{Suffrage:Voter ID:7db290c8-6cc5-b481-76c7-dd74d4d08773 Address:127.0.0.1:34048}]
2019/12/06 06:01:16 [INFO]  raft: Initial configuration (index=1): [{Suffrage:Voter ID:70dcf182-8739-516d-9a5c-7144c0fbe47f Address:127.0.0.1:34042}]
2019/12/06 06:01:16 [INFO]  raft: Node at 127.0.0.1:34048 [Follower] entering Follower state (Leader: "")
2019/12/06 06:01:16 [INFO]  raft: Node at 127.0.0.1:34042 [Follower] entering Follower state (Leader: "")
TestAgent_MultiStartStop/#03 - 2019/12/06 06:01:16.398837 [INFO] serf: EventMemberJoin: Node 70dcf182-8739-516d-9a5c-7144c0fbe47f.dc1 127.0.0.1
TestAgent_MultiStartStop/#04 - 2019/12/06 06:01:16.398970 [INFO] serf: EventMemberJoin: Node 7db290c8-6cc5-b481-76c7-dd74d4d08773.dc1 127.0.0.1
TestAgent_MultiStartStop/#05 - 2019/12/06 06:01:16.402714 [INFO] serf: EventMemberJoin: Node 666099ac-b520-515e-5dd7-f40ca413106b.dc1 127.0.0.1
TestAgent_MultiStartStop/#04 - 2019/12/06 06:01:16.402745 [INFO] serf: EventMemberJoin: Node 7db290c8-6cc5-b481-76c7-dd74d4d08773 127.0.0.1
TestAgent_MultiStartStop/#04 - 2019/12/06 06:01:16.404609 [INFO] consul: Adding LAN server Node 7db290c8-6cc5-b481-76c7-dd74d4d08773 (Addr: tcp/127.0.0.1:34048) (DC: dc1)
TestAgent_MultiStartStop/#04 - 2019/12/06 06:01:16.404899 [INFO] consul: Handled member-join event for server "Node 7db290c8-6cc5-b481-76c7-dd74d4d08773.dc1" in area "wan"
TestAgent_MultiStartStop/#04 - 2019/12/06 06:01:16.405966 [INFO] agent: Started DNS server 127.0.0.1:34043 (udp)
TestAgent_MultiStartStop/#04 - 2019/12/06 06:01:16.406042 [INFO] agent: Started DNS server 127.0.0.1:34043 (tcp)
TestAgent_MultiStartStop/#04 - 2019/12/06 06:01:16.408454 [INFO] agent: Started HTTP server on 127.0.0.1:34044 (tcp)
TestAgent_MultiStartStop/#04 - 2019/12/06 06:01:16.408547 [INFO] agent: started state syncer
TestAgent_MultiStartStop/#03 - 2019/12/06 06:01:16.410575 [INFO] serf: EventMemberJoin: Node 70dcf182-8739-516d-9a5c-7144c0fbe47f 127.0.0.1
TestAgent_MultiStartStop/#03 - 2019/12/06 06:01:16.413018 [INFO] agent: Started DNS server 127.0.0.1:34037 (udp)
TestAgent_MultiStartStop/#03 - 2019/12/06 06:01:16.413844 [INFO] consul: Adding LAN server Node 70dcf182-8739-516d-9a5c-7144c0fbe47f (Addr: tcp/127.0.0.1:34042) (DC: dc1)
TestAgent_MultiStartStop/#03 - 2019/12/06 06:01:16.414392 [INFO] consul: Handled member-join event for server "Node 70dcf182-8739-516d-9a5c-7144c0fbe47f.dc1" in area "wan"
TestAgent_MultiStartStop/#03 - 2019/12/06 06:01:16.415130 [INFO] agent: Started DNS server 127.0.0.1:34037 (tcp)
TestAgent_MultiStartStop/#05 - 2019/12/06 06:01:16.415983 [INFO] serf: EventMemberJoin: Node 666099ac-b520-515e-5dd7-f40ca413106b 127.0.0.1
TestAgent_MultiStartStop/#05 - 2019/12/06 06:01:16.417371 [INFO] agent: Started DNS server 127.0.0.1:34031 (udp)
TestAgent_MultiStartStop/#05 - 2019/12/06 06:01:16.418827 [INFO] consul: Adding LAN server Node 666099ac-b520-515e-5dd7-f40ca413106b (Addr: tcp/127.0.0.1:34036) (DC: dc1)
TestAgent_MultiStartStop/#05 - 2019/12/06 06:01:16.419093 [INFO] consul: Handled member-join event for server "Node 666099ac-b520-515e-5dd7-f40ca413106b.dc1" in area "wan"
TestAgent_MultiStartStop/#05 - 2019/12/06 06:01:16.420250 [INFO] agent: Started DNS server 127.0.0.1:34031 (tcp)
TestAgent_MultiStartStop/#05 - 2019/12/06 06:01:16.422691 [INFO] agent: Started HTTP server on 127.0.0.1:34032 (tcp)
TestAgent_MultiStartStop/#05 - 2019/12/06 06:01:16.423030 [INFO] agent: started state syncer
TestAgent_MultiStartStop/#03 - 2019/12/06 06:01:16.424256 [INFO] agent: Started HTTP server on 127.0.0.1:34038 (tcp)
TestAgent_MultiStartStop/#03 - 2019/12/06 06:01:16.424659 [INFO] agent: started state syncer
2019/12/06 06:01:16 [WARN]  raft: Heartbeat timeout from "" reached, starting election
2019/12/06 06:01:16 [INFO]  raft: Node at 127.0.0.1:34036 [Candidate] entering Candidate state in term 2
2019/12/06 06:01:16 [WARN]  raft: Heartbeat timeout from "" reached, starting election
2019/12/06 06:01:16 [INFO]  raft: Node at 127.0.0.1:34048 [Candidate] entering Candidate state in term 2
2019/12/06 06:01:16 [WARN]  raft: Heartbeat timeout from "" reached, starting election
2019/12/06 06:01:16 [INFO]  raft: Node at 127.0.0.1:34042 [Candidate] entering Candidate state in term 2
2019/12/06 06:01:16 [INFO]  raft: Election won. Tally: 1
2019/12/06 06:01:16 [INFO]  raft: Node at 127.0.0.1:34030 [Leader] entering Leader state
TestAgent_MultiStartStop/#06 - 2019/12/06 06:01:16.726576 [INFO] consul: cluster leadership acquired
TestAgent_MultiStartStop/#06 - 2019/12/06 06:01:16.727149 [INFO] consul: New leader elected: Node 033980d6-ac6f-1a92-9403-b1b56d41fb93
2019/12/06 06:01:17 [INFO]  raft: Election won. Tally: 1
2019/12/06 06:01:17 [INFO]  raft: Node at 127.0.0.1:34042 [Leader] entering Leader state
2019/12/06 06:01:17 [INFO]  raft: Election won. Tally: 1
2019/12/06 06:01:17 [INFO]  raft: Node at 127.0.0.1:34036 [Leader] entering Leader state
TestAgent_MultiStartStop/#03 - 2019/12/06 06:01:17.053230 [INFO] consul: cluster leadership acquired
TestAgent_MultiStartStop/#03 - 2019/12/06 06:01:17.053687 [INFO] consul: New leader elected: Node 70dcf182-8739-516d-9a5c-7144c0fbe47f
TestAgent_MultiStartStop/#05 - 2019/12/06 06:01:17.053930 [INFO] consul: cluster leadership acquired
TestAgent_MultiStartStop/#05 - 2019/12/06 06:01:17.054282 [INFO] consul: New leader elected: Node 666099ac-b520-515e-5dd7-f40ca413106b
2019/12/06 06:01:17 [INFO]  raft: Election won. Tally: 1
2019/12/06 06:01:17 [INFO]  raft: Node at 127.0.0.1:34048 [Leader] entering Leader state
TestAgent_MultiStartStop/#04 - 2019/12/06 06:01:17.054964 [INFO] consul: cluster leadership acquired
TestAgent_MultiStartStop/#04 - 2019/12/06 06:01:17.055430 [INFO] consul: New leader elected: Node 7db290c8-6cc5-b481-76c7-dd74d4d08773
TestAgent_MultiStartStop/#06 - 2019/12/06 06:01:17.143615 [INFO] agent: Requesting shutdown
TestAgent_MultiStartStop/#06 - 2019/12/06 06:01:17.143726 [INFO] consul: shutting down server
TestAgent_MultiStartStop/#06 - 2019/12/06 06:01:17.143773 [WARN] serf: Shutdown without a Leave
TestAgent_MultiStartStop/#06 - 2019/12/06 06:01:17.224554 [WARN] serf: Shutdown without a Leave
TestAgent_MultiStartStop/#05 - 2019/12/06 06:01:17.346577 [INFO] agent: Requesting shutdown
TestAgent_MultiStartStop/#05 - 2019/12/06 06:01:17.346689 [INFO] consul: shutting down server
TestAgent_MultiStartStop/#05 - 2019/12/06 06:01:17.346737 [WARN] serf: Shutdown without a Leave
TestAgent_MultiStartStop/#06 - 2019/12/06 06:01:17.349566 [INFO] manager: shutting down
TestAgent_MultiStartStop/#06 - 2019/12/06 06:01:17.349744 [ERR] autopilot: failed to initialize config: raft is already shutdown
TestAgent_MultiStartStop/#06 - 2019/12/06 06:01:17.350094 [ERR] consul: failed to establish leadership: raft is already shutdown
TestAgent_MultiStartStop/#06 - 2019/12/06 06:01:17.350603 [INFO] agent: consul server down
TestAgent_MultiStartStop/#06 - 2019/12/06 06:01:17.350666 [INFO] agent: shutdown complete
TestAgent_MultiStartStop/#06 - 2019/12/06 06:01:17.350738 [INFO] agent: Stopping DNS server 127.0.0.1:34025 (tcp)
TestAgent_MultiStartStop/#06 - 2019/12/06 06:01:17.350922 [INFO] agent: Stopping DNS server 127.0.0.1:34025 (udp)
TestAgent_MultiStartStop/#06 - 2019/12/06 06:01:17.351107 [INFO] agent: Stopping HTTP server 127.0.0.1:34026 (tcp)
TestAgent_MultiStartStop/#06 - 2019/12/06 06:01:17.351352 [INFO] agent: Waiting for endpoints to shut down
TestAgent_MultiStartStop/#06 - 2019/12/06 06:01:17.351440 [INFO] agent: Endpoints down
=== CONT  TestAgent_MultiStartStop/#02
TestAgent_MultiStartStop/#06 - 2019/12/06 06:01:17.356390 [WARN] agent: Syncing node info failed. leadership lost while committing log
TestAgent_MultiStartStop/#06 - 2019/12/06 06:01:17.356469 [ERR] agent: failed to sync remote state: leadership lost while committing log
TestAgent_MultiStartStop/#05 - 2019/12/06 06:01:17.446787 [INFO] agent: Synced node info
TestAgent_MultiStartStop/#05 - 2019/12/06 06:01:17.447337 [WARN] serf: Shutdown without a Leave
TestAgent_MultiStartStop/#04 - 2019/12/06 06:01:17.453020 [INFO] agent: Synced node info
TestAgent_MultiStartStop/#04 - 2019/12/06 06:01:17.462262 [INFO] agent: Requesting shutdown
TestAgent_MultiStartStop/#04 - 2019/12/06 06:01:17.462368 [INFO] consul: shutting down server
TestAgent_MultiStartStop/#04 - 2019/12/06 06:01:17.462423 [WARN] serf: Shutdown without a Leave
WARNING: bootstrap = true: do not enable unless necessary
TestAgent_MultiStartStop/#02 - 2019/12/06 06:01:17.522461 [WARN] agent: Node name "Node 8cb36257-c038-8125-6813-a28fdf036933" will not be discoverable via DNS due to invalid characters. Valid characters include all alpha-numerics and dashes.
TestAgent_MultiStartStop/#02 - 2019/12/06 06:01:17.523075 [DEBUG] tlsutil: Update with version 1
TestAgent_MultiStartStop/#02 - 2019/12/06 06:01:17.525836 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
TestAgent_MultiStartStop/#04 - 2019/12/06 06:01:17.574610 [WARN] serf: Shutdown without a Leave
TestAgent_MultiStartStop/#05 - 2019/12/06 06:01:17.577544 [INFO] manager: shutting down
TestAgent_MultiStartStop/#05 - 2019/12/06 06:01:17.666714 [INFO] agent: consul server down
TestAgent_MultiStartStop/#05 - 2019/12/06 06:01:17.666821 [INFO] agent: shutdown complete
TestAgent_MultiStartStop/#05 - 2019/12/06 06:01:17.666904 [INFO] agent: Stopping DNS server 127.0.0.1:34031 (tcp)
TestAgent_MultiStartStop/#05 - 2019/12/06 06:01:17.667087 [INFO] agent: Stopping DNS server 127.0.0.1:34031 (udp)
TestAgent_MultiStartStop/#05 - 2019/12/06 06:01:17.667297 [INFO] agent: Stopping HTTP server 127.0.0.1:34032 (tcp)
TestAgent_MultiStartStop/#05 - 2019/12/06 06:01:17.667551 [INFO] agent: Waiting for endpoints to shut down
TestAgent_MultiStartStop/#05 - 2019/12/06 06:01:17.667649 [INFO] agent: Endpoints down
=== CONT  TestAgent_MultiStartStop/#01
TestAgent_MultiStartStop/#05 - 2019/12/06 06:01:17.668994 [ERR] autopilot: failed to initialize config: leadership lost while committing log
TestAgent_MultiStartStop/#04 - 2019/12/06 06:01:17.669354 [INFO] manager: shutting down
TestAgent_MultiStartStop/#05 - 2019/12/06 06:01:17.669401 [ERR] consul: failed to establish leadership: raft is already shutdown
TestAgent_MultiStartStop/#03 - 2019/12/06 06:01:17.703918 [INFO] agent: Requesting shutdown
TestAgent_MultiStartStop/#03 - 2019/12/06 06:01:17.704015 [INFO] consul: shutting down server
TestAgent_MultiStartStop/#03 - 2019/12/06 06:01:17.704065 [WARN] serf: Shutdown without a Leave
WARNING: bootstrap = true: do not enable unless necessary
TestAgent_MultiStartStop/#01 - 2019/12/06 06:01:17.733523 [WARN] agent: Node name "Node f77758c2-0996-248f-f551-565a81ec077b" will not be discoverable via DNS due to invalid characters. Valid characters include all alpha-numerics and dashes.
TestAgent_MultiStartStop/#01 - 2019/12/06 06:01:17.734063 [DEBUG] tlsutil: Update with version 1
TestAgent_MultiStartStop/#01 - 2019/12/06 06:01:17.737861 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
TestAgent_MultiStartStop/#03 - 2019/12/06 06:01:17.751712 [INFO] agent: Synced node info
TestAgent_MultiStartStop/#03 - 2019/12/06 06:01:17.874560 [WARN] serf: Shutdown without a Leave
TestAgent_MultiStartStop/#04 - 2019/12/06 06:01:17.958287 [INFO] agent: consul server down
TestAgent_MultiStartStop/#04 - 2019/12/06 06:01:17.958442 [INFO] agent: shutdown complete
TestAgent_MultiStartStop/#04 - 2019/12/06 06:01:17.958552 [INFO] agent: Stopping DNS server 127.0.0.1:34043 (tcp)
TestAgent_MultiStartStop/#04 - 2019/12/06 06:01:17.958728 [INFO] agent: Stopping DNS server 127.0.0.1:34043 (udp)
TestAgent_MultiStartStop/#03 - 2019/12/06 06:01:17.958763 [INFO] manager: shutting down
TestAgent_MultiStartStop/#04 - 2019/12/06 06:01:17.958939 [INFO] agent: Stopping HTTP server 127.0.0.1:34044 (tcp)
TestAgent_MultiStartStop/#04 - 2019/12/06 06:01:17.959239 [INFO] agent: Waiting for endpoints to shut down
TestAgent_MultiStartStop/#04 - 2019/12/06 06:01:17.959345 [INFO] agent: Endpoints down
TestAgent_MultiStartStop/#04 - 2019/12/06 06:01:17.958945 [ERR] consul: failed to establish leadership: leadership lost while committing log
TestAgent_MultiStartStop/#04 - 2019/12/06 06:01:17.960442 [ERR] consul: failed to transfer leadership attempt 0/3: raft is already shutdown
TestAgent_MultiStartStop/#03 - 2019/12/06 06:01:18.060780 [ERR] autopilot: failed to initialize config: leadership lost while committing log
TestAgent_MultiStartStop/#03 - 2019/12/06 06:01:18.061100 [INFO] agent: consul server down
TestAgent_MultiStartStop/#03 - 2019/12/06 06:01:18.061171 [INFO] agent: shutdown complete
TestAgent_MultiStartStop/#03 - 2019/12/06 06:01:18.061240 [INFO] agent: Stopping DNS server 127.0.0.1:34037 (tcp)
TestAgent_MultiStartStop/#03 - 2019/12/06 06:01:18.061408 [INFO] agent: Stopping DNS server 127.0.0.1:34037 (udp)
TestAgent_MultiStartStop/#03 - 2019/12/06 06:01:18.061575 [INFO] agent: Stopping HTTP server 127.0.0.1:34038 (tcp)
TestAgent_MultiStartStop/#03 - 2019/12/06 06:01:18.061829 [INFO] agent: Waiting for endpoints to shut down
TestAgent_MultiStartStop/#03 - 2019/12/06 06:01:18.061901 [INFO] agent: Endpoints down
TestAgent_MultiStartStop/#03 - 2019/12/06 06:01:18.063421 [ERR] consul: failed to establish leadership: raft is already shutdown
TestAgent_MultiStartStop/#03 - 2019/12/06 06:01:18.063662 [ERR] consul: failed to transfer leadership attempt 0/3: raft is already shutdown
TestAgent_MultiStartStop/#03 - 2019/12/06 06:01:18.063735 [ERR] consul: failed to transfer leadership attempt 1/3: raft is already shutdown
TestAgent_MultiStartStop/#03 - 2019/12/06 06:01:18.063785 [ERR] consul: failed to transfer leadership attempt 2/3: raft is already shutdown
TestAgent_MultiStartStop/#03 - 2019/12/06 06:01:18.063837 [ERR] consul: failed to transfer leadership in 3 attempts
2019/12/06 06:01:18 [INFO]  raft: Initial configuration (index=1): [{Suffrage:Voter ID:8cb36257-c038-8125-6813-a28fdf036933 Address:127.0.0.1:34054}]
2019/12/06 06:01:18 [INFO]  raft: Node at 127.0.0.1:34054 [Follower] entering Follower state (Leader: "")
TestAgent_MultiStartStop/#02 - 2019/12/06 06:01:18.742837 [INFO] serf: EventMemberJoin: Node 8cb36257-c038-8125-6813-a28fdf036933.dc1 127.0.0.1
TestAgent_MultiStartStop/#02 - 2019/12/06 06:01:18.748388 [INFO] serf: EventMemberJoin: Node 8cb36257-c038-8125-6813-a28fdf036933 127.0.0.1
TestAgent_MultiStartStop/#02 - 2019/12/06 06:01:18.749430 [INFO] consul: Adding LAN server Node 8cb36257-c038-8125-6813-a28fdf036933 (Addr: tcp/127.0.0.1:34054) (DC: dc1)
TestAgent_MultiStartStop/#02 - 2019/12/06 06:01:18.749672 [INFO] consul: Handled member-join event for server "Node 8cb36257-c038-8125-6813-a28fdf036933.dc1" in area "wan"
TestAgent_MultiStartStop/#02 - 2019/12/06 06:01:18.750317 [INFO] agent: Started DNS server 127.0.0.1:34049 (udp)
TestAgent_MultiStartStop/#02 - 2019/12/06 06:01:18.750388 [INFO] agent: Started DNS server 127.0.0.1:34049 (tcp)
TestAgent_MultiStartStop/#02 - 2019/12/06 06:01:18.752927 [INFO] agent: Started HTTP server on 127.0.0.1:34050 (tcp)
TestAgent_MultiStartStop/#02 - 2019/12/06 06:01:18.753025 [INFO] agent: started state syncer
2019/12/06 06:01:18 [WARN]  raft: Heartbeat timeout from "" reached, starting election
2019/12/06 06:01:18 [INFO]  raft: Node at 127.0.0.1:34054 [Candidate] entering Candidate state in term 2
2019/12/06 06:01:18 [INFO]  raft: Initial configuration (index=1): [{Suffrage:Voter ID:f77758c2-0996-248f-f551-565a81ec077b Address:127.0.0.1:34060}]
2019/12/06 06:01:18 [INFO]  raft: Node at 127.0.0.1:34060 [Follower] entering Follower state (Leader: "")
TestAgent_MultiStartStop/#01 - 2019/12/06 06:01:18.861933 [INFO] serf: EventMemberJoin: Node f77758c2-0996-248f-f551-565a81ec077b.dc1 127.0.0.1
TestAgent_MultiStartStop/#01 - 2019/12/06 06:01:18.865232 [INFO] serf: EventMemberJoin: Node f77758c2-0996-248f-f551-565a81ec077b 127.0.0.1
TestAgent_MultiStartStop/#01 - 2019/12/06 06:01:18.866866 [INFO] agent: Started DNS server 127.0.0.1:34055 (udp)
TestAgent_MultiStartStop/#01 - 2019/12/06 06:01:18.867401 [INFO] consul: Adding LAN server Node f77758c2-0996-248f-f551-565a81ec077b (Addr: tcp/127.0.0.1:34060) (DC: dc1)
TestAgent_MultiStartStop/#01 - 2019/12/06 06:01:18.867648 [INFO] consul: Handled member-join event for server "Node f77758c2-0996-248f-f551-565a81ec077b.dc1" in area "wan"
TestAgent_MultiStartStop/#01 - 2019/12/06 06:01:18.868149 [INFO] agent: Started DNS server 127.0.0.1:34055 (tcp)
TestAgent_MultiStartStop/#01 - 2019/12/06 06:01:18.870686 [INFO] agent: Started HTTP server on 127.0.0.1:34056 (tcp)
TestAgent_MultiStartStop/#01 - 2019/12/06 06:01:18.870805 [INFO] agent: started state syncer
2019/12/06 06:01:18 [WARN]  raft: Heartbeat timeout from "" reached, starting election
2019/12/06 06:01:18 [INFO]  raft: Node at 127.0.0.1:34060 [Candidate] entering Candidate state in term 2
2019/12/06 06:01:19 [INFO]  raft: Election won. Tally: 1
2019/12/06 06:01:19 [INFO]  raft: Node at 127.0.0.1:34054 [Leader] entering Leader state
TestAgent_MultiStartStop/#02 - 2019/12/06 06:01:19.470428 [INFO] consul: cluster leadership acquired
TestAgent_MultiStartStop/#02 - 2019/12/06 06:01:19.470850 [INFO] consul: New leader elected: Node 8cb36257-c038-8125-6813-a28fdf036933
2019/12/06 06:01:19 [INFO]  raft: Election won. Tally: 1
2019/12/06 06:01:19 [INFO]  raft: Node at 127.0.0.1:34060 [Leader] entering Leader state
TestAgent_MultiStartStop/#01 - 2019/12/06 06:01:19.633366 [INFO] consul: cluster leadership acquired
TestAgent_MultiStartStop/#01 - 2019/12/06 06:01:19.633853 [INFO] consul: New leader elected: Node f77758c2-0996-248f-f551-565a81ec077b
TestAgent_MultiStartStop/#02 - 2019/12/06 06:01:19.892093 [INFO] agent: Synced node info
TestAgent_MultiStartStop/#02 - 2019/12/06 06:01:19.892215 [DEBUG] agent: Node info in sync
TestAgent_MultiStartStop/#02 - 2019/12/06 06:01:19.941644 [INFO] agent: Requesting shutdown
TestAgent_MultiStartStop/#02 - 2019/12/06 06:01:19.941755 [INFO] consul: shutting down server
TestAgent_MultiStartStop/#02 - 2019/12/06 06:01:19.941815 [WARN] serf: Shutdown without a Leave
TestAgent_MultiStartStop/#01 - 2019/12/06 06:01:19.981574 [INFO] agent: Requesting shutdown
TestAgent_MultiStartStop/#01 - 2019/12/06 06:01:19.981692 [INFO] consul: shutting down server
TestAgent_MultiStartStop/#01 - 2019/12/06 06:01:19.981744 [WARN] serf: Shutdown without a Leave
TestAgent_MultiStartStop/#02 - 2019/12/06 06:01:20.031734 [DEBUG] agent: Node info in sync
TestAgent_MultiStartStop/#02 - 2019/12/06 06:01:20.107892 [WARN] serf: Shutdown without a Leave
TestAgent_MultiStartStop/#01 - 2019/12/06 06:01:20.108994 [INFO] agent: Synced node info
TestAgent_MultiStartStop/#01 - 2019/12/06 06:01:20.109119 [DEBUG] agent: Node info in sync
TestAgent_MultiStartStop/#01 - 2019/12/06 06:01:20.114497 [WARN] serf: Shutdown without a Leave
TestAgent_MultiStartStop/#01 - 2019/12/06 06:01:20.241388 [INFO] manager: shutting down
TestAgent_MultiStartStop/#02 - 2019/12/06 06:01:20.242999 [INFO] manager: shutting down
TestAgent_MultiStartStop/#01 - 2019/12/06 06:01:20.341732 [INFO] agent: consul server down
TestAgent_MultiStartStop/#01 - 2019/12/06 06:01:20.341860 [INFO] agent: shutdown complete
TestAgent_MultiStartStop/#01 - 2019/12/06 06:01:20.341937 [INFO] agent: Stopping DNS server 127.0.0.1:34055 (tcp)
TestAgent_MultiStartStop/#01 - 2019/12/06 06:01:20.342112 [INFO] agent: Stopping DNS server 127.0.0.1:34055 (udp)
TestAgent_MultiStartStop/#01 - 2019/12/06 06:01:20.342312 [INFO] agent: Stopping HTTP server 127.0.0.1:34056 (tcp)
TestAgent_MultiStartStop/#01 - 2019/12/06 06:01:20.342547 [INFO] agent: Waiting for endpoints to shut down
TestAgent_MultiStartStop/#01 - 2019/12/06 06:01:20.342622 [INFO] agent: Endpoints down
TestAgent_MultiStartStop/#01 - 2019/12/06 06:01:20.342792 [ERR] autopilot: failed to initialize config: leadership lost while committing log
TestAgent_MultiStartStop/#01 - 2019/12/06 06:01:20.343138 [ERR] consul: failed to establish leadership: raft is already shutdown
TestAgent_MultiStartStop/#01 - 2019/12/06 06:01:20.343417 [ERR] consul: failed to transfer leadership attempt 0/3: raft is already shutdown
TestAgent_MultiStartStop/#02 - 2019/12/06 06:01:20.348385 [INFO] agent: consul server down
TestAgent_MultiStartStop/#02 - 2019/12/06 06:01:20.348469 [INFO] agent: shutdown complete
TestAgent_MultiStartStop/#02 - 2019/12/06 06:01:20.348529 [INFO] agent: Stopping DNS server 127.0.0.1:34049 (tcp)
TestAgent_MultiStartStop/#02 - 2019/12/06 06:01:20.348675 [INFO] agent: Stopping DNS server 127.0.0.1:34049 (udp)
TestAgent_MultiStartStop/#02 - 2019/12/06 06:01:20.348848 [INFO] agent: Stopping HTTP server 127.0.0.1:34050 (tcp)
TestAgent_MultiStartStop/#02 - 2019/12/06 06:01:20.349101 [INFO] agent: Waiting for endpoints to shut down
TestAgent_MultiStartStop/#02 - 2019/12/06 06:01:20.349246 [INFO] agent: Endpoints down
--- PASS: TestAgent_MultiStartStop (0.00s)
    --- PASS: TestAgent_MultiStartStop/#07 (3.11s)
    --- PASS: TestAgent_MultiStartStop/#09 (3.35s)
    --- PASS: TestAgent_MultiStartStop/#00 (3.35s)
    --- PASS: TestAgent_MultiStartStop/#08 (3.36s)
    --- PASS: TestAgent_MultiStartStop/#06 (2.62s)
    --- PASS: TestAgent_MultiStartStop/#05 (2.70s)
    --- PASS: TestAgent_MultiStartStop/#04 (2.99s)
    --- PASS: TestAgent_MultiStartStop/#03 (3.08s)
    --- PASS: TestAgent_MultiStartStop/#01 (2.67s)
    --- PASS: TestAgent_MultiStartStop/#02 (3.00s)
=== RUN   TestAgent_ConnectClusterIDConfig
=== RUN   TestAgent_ConnectClusterIDConfig/default_TestAgent_has_fixed_cluster_id
TestAgent_MultiStartStop/#02 - 2019/12/06 06:01:20.352214 [ERR] consul: failed to establish leadership: leadership lost while committing log
WARNING: bootstrap = true: do not enable unless necessary
test - 2019/12/06 06:01:20.417348 [WARN] agent: Node name "Node 6042daf2-2d7c-bb44-1aa8-590bf3b80570" will not be discoverable via DNS due to invalid characters. Valid characters include all alpha-numerics and dashes.
test - 2019/12/06 06:01:20.418334 [DEBUG] tlsutil: Update with version 1
test - 2019/12/06 06:01:20.420608 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
2019/12/06 06:01:21 [INFO]  raft: Initial configuration (index=1): [{Suffrage:Voter ID:6042daf2-2d7c-bb44-1aa8-590bf3b80570 Address:127.0.0.1:34066}]
2019/12/06 06:01:21 [INFO]  raft: Node at 127.0.0.1:34066 [Follower] entering Follower state (Leader: "")
test - 2019/12/06 06:01:21.507025 [INFO] serf: EventMemberJoin: Node 6042daf2-2d7c-bb44-1aa8-590bf3b80570.dc1 127.0.0.1
test - 2019/12/06 06:01:21.523679 [INFO] serf: EventMemberJoin: Node 6042daf2-2d7c-bb44-1aa8-590bf3b80570 127.0.0.1
test - 2019/12/06 06:01:21.525631 [INFO] agent: Started DNS server 127.0.0.1:34061 (udp)
test - 2019/12/06 06:01:21.526290 [INFO] consul: Handled member-join event for server "Node 6042daf2-2d7c-bb44-1aa8-590bf3b80570.dc1" in area "wan"
test - 2019/12/06 06:01:21.526529 [INFO] agent: Started DNS server 127.0.0.1:34061 (tcp)
test - 2019/12/06 06:01:21.528609 [INFO] consul: Adding LAN server Node 6042daf2-2d7c-bb44-1aa8-590bf3b80570 (Addr: tcp/127.0.0.1:34066) (DC: dc1)
2019/12/06 06:01:21 [WARN]  raft: Heartbeat timeout from "" reached, starting election
2019/12/06 06:01:21 [INFO]  raft: Node at 127.0.0.1:34066 [Candidate] entering Candidate state in term 2
test - 2019/12/06 06:01:21.533794 [INFO] agent: Started HTTP server on 127.0.0.1:34062 (tcp)
test - 2019/12/06 06:01:21.533897 [INFO] agent: started state syncer
2019/12/06 06:01:21 [INFO]  raft: Election won. Tally: 1
2019/12/06 06:01:21 [INFO]  raft: Node at 127.0.0.1:34066 [Leader] entering Leader state
test - 2019/12/06 06:01:21.959003 [INFO] consul: cluster leadership acquired
test - 2019/12/06 06:01:21.959581 [INFO] consul: New leader elected: Node 6042daf2-2d7c-bb44-1aa8-590bf3b80570
test - 2019/12/06 06:01:22.243720 [INFO] agent: Synced node info
test - 2019/12/06 06:01:22.271134 [INFO] agent: Requesting shutdown
test - 2019/12/06 06:01:22.271256 [INFO] consul: shutting down server
test - 2019/12/06 06:01:22.271313 [WARN] serf: Shutdown without a Leave
test - 2019/12/06 06:01:22.332936 [WARN] serf: Shutdown without a Leave
test - 2019/12/06 06:01:22.399725 [INFO] manager: shutting down
test - 2019/12/06 06:01:22.400524 [INFO] agent: consul server down
test - 2019/12/06 06:01:22.400580 [ERR] consul: failed to establish leadership: raft is already shutdown
test - 2019/12/06 06:01:22.400596 [INFO] agent: shutdown complete
test - 2019/12/06 06:01:22.400746 [INFO] agent: Stopping DNS server 127.0.0.1:34061 (tcp)
test - 2019/12/06 06:01:22.400907 [INFO] agent: Stopping DNS server 127.0.0.1:34061 (udp)
test - 2019/12/06 06:01:22.401076 [INFO] agent: Stopping HTTP server 127.0.0.1:34062 (tcp)
test - 2019/12/06 06:01:22.401288 [INFO] agent: Waiting for endpoints to shut down
test - 2019/12/06 06:01:22.401337 [ERR] consul: failed to transfer leadership attempt 0/3: raft is already shutdown
test - 2019/12/06 06:01:22.401404 [ERR] consul: failed to transfer leadership attempt 1/3: raft is already shutdown
test - 2019/12/06 06:01:22.401461 [ERR] consul: failed to transfer leadership attempt 2/3: raft is already shutdown
test - 2019/12/06 06:01:22.401514 [ERR] consul: failed to transfer leadership in 3 attempts
test - 2019/12/06 06:01:22.401359 [INFO] agent: Endpoints down
=== RUN   TestAgent_ConnectClusterIDConfig/no_cluster_ID_specified_sets_to_test_ID
WARNING: bootstrap = true: do not enable unless necessary
test - 2019/12/06 06:01:22.465706 [WARN] agent: Node name "Node 83836365-0565-69cc-ff4a-1ff66ad8daa3" will not be discoverable via DNS due to invalid characters. Valid characters include all alpha-numerics and dashes.
test - 2019/12/06 06:01:22.466274 [DEBUG] tlsutil: Update with version 1
test - 2019/12/06 06:01:22.468641 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
2019/12/06 06:01:23 [INFO]  raft: Initial configuration (index=1): [{Suffrage:Voter ID:83836365-0565-69cc-ff4a-1ff66ad8daa3 Address:127.0.0.1:34072}]
2019/12/06 06:01:23 [INFO]  raft: Node at 127.0.0.1:34072 [Follower] entering Follower state (Leader: "")
test - 2019/12/06 06:01:23.878596 [INFO] serf: EventMemberJoin: Node 83836365-0565-69cc-ff4a-1ff66ad8daa3.dc1 127.0.0.1
test - 2019/12/06 06:01:23.889455 [INFO] serf: EventMemberJoin: Node 83836365-0565-69cc-ff4a-1ff66ad8daa3 127.0.0.1
test - 2019/12/06 06:01:23.891643 [INFO] consul: Adding LAN server Node 83836365-0565-69cc-ff4a-1ff66ad8daa3 (Addr: tcp/127.0.0.1:34072) (DC: dc1)
test - 2019/12/06 06:01:23.891905 [INFO] consul: Handled member-join event for server "Node 83836365-0565-69cc-ff4a-1ff66ad8daa3.dc1" in area "wan"
test - 2019/12/06 06:01:23.893720 [INFO] agent: Started DNS server 127.0.0.1:34067 (tcp)
test - 2019/12/06 06:01:23.894736 [INFO] agent: Started DNS server 127.0.0.1:34067 (udp)
test - 2019/12/06 06:01:23.897276 [INFO] agent: Started HTTP server on 127.0.0.1:34068 (tcp)
test - 2019/12/06 06:01:23.897363 [INFO] agent: started state syncer
2019/12/06 06:01:23 [WARN]  raft: Heartbeat timeout from "" reached, starting election
2019/12/06 06:01:23 [INFO]  raft: Node at 127.0.0.1:34072 [Candidate] entering Candidate state in term 2
2019/12/06 06:01:24 [INFO]  raft: Election won. Tally: 1
2019/12/06 06:01:24 [INFO]  raft: Node at 127.0.0.1:34072 [Leader] entering Leader state
test - 2019/12/06 06:01:24.466666 [INFO] consul: cluster leadership acquired
test - 2019/12/06 06:01:24.467065 [INFO] consul: New leader elected: Node 83836365-0565-69cc-ff4a-1ff66ad8daa3
test - 2019/12/06 06:01:24.526928 [INFO] agent: Requesting shutdown
test - 2019/12/06 06:01:24.527036 [INFO] consul: shutting down server
test - 2019/12/06 06:01:24.527103 [WARN] serf: Shutdown without a Leave
test - 2019/12/06 06:01:24.527339 [ERR] agent: failed to sync remote state: No cluster leader
test - 2019/12/06 06:01:24.691331 [WARN] serf: Shutdown without a Leave
test - 2019/12/06 06:01:24.784735 [INFO] manager: shutting down
test - 2019/12/06 06:01:24.859816 [INFO] agent: consul server down
test - 2019/12/06 06:01:24.859892 [INFO] agent: shutdown complete
test - 2019/12/06 06:01:24.859946 [INFO] agent: Stopping DNS server 127.0.0.1:34067 (tcp)
test - 2019/12/06 06:01:24.860099 [INFO] agent: Stopping DNS server 127.0.0.1:34067 (udp)
test - 2019/12/06 06:01:24.860237 [INFO] agent: Stopping HTTP server 127.0.0.1:34068 (tcp)
test - 2019/12/06 06:01:24.860453 [INFO] agent: Waiting for endpoints to shut down
test - 2019/12/06 06:01:24.860517 [INFO] agent: Endpoints down
=== RUN   TestAgent_ConnectClusterIDConfig/non-UUID_cluster_id_is_fatal
test - 2019/12/06 06:01:24.862523 [ERR] consul: failed to wait for barrier: leadership lost while committing log
WARNING: bootstrap = true: do not enable unless necessary
test - 2019/12/06 06:01:24.928201 [WARN] agent: Node name "Node 119fa935-aa5c-341d-7cc3-70e941235d92" will not be discoverable via DNS due to invalid characters. Valid characters include all alpha-numerics and dashes.
test - 2019/12/06 06:01:24.928542 [ERR] connect CA config cluster_id specified but is not a valid UUID, aborting startup
--- PASS: TestAgent_ConnectClusterIDConfig (4.58s)
    --- PASS: TestAgent_ConnectClusterIDConfig/default_TestAgent_has_fixed_cluster_id (2.05s)
    --- PASS: TestAgent_ConnectClusterIDConfig/no_cluster_ID_specified_sets_to_test_ID (2.46s)
    --- PASS: TestAgent_ConnectClusterIDConfig/non-UUID_cluster_id_is_fatal (0.07s)
=== RUN   TestAgent_StartStop
=== PAUSE TestAgent_StartStop
=== RUN   TestAgent_RPCPing
=== PAUSE TestAgent_RPCPing
=== RUN   TestAgent_TokenStore
=== PAUSE TestAgent_TokenStore
=== RUN   TestAgent_ReconnectConfigSettings
=== PAUSE TestAgent_ReconnectConfigSettings
=== RUN   TestAgent_ReconnectConfigWanDisabled
=== PAUSE TestAgent_ReconnectConfigWanDisabled
=== RUN   TestAgent_setupNodeID
=== PAUSE TestAgent_setupNodeID
=== RUN   TestAgent_makeNodeID
=== PAUSE TestAgent_makeNodeID
=== RUN   TestAgent_AddService
=== PAUSE TestAgent_AddService
=== RUN   TestAgent_AddServiceNoExec
=== PAUSE TestAgent_AddServiceNoExec
=== RUN   TestAgent_AddServiceNoRemoteExec
=== PAUSE TestAgent_AddServiceNoRemoteExec
=== RUN   TestAgent_RemoveService
=== PAUSE TestAgent_RemoveService
=== RUN   TestAgent_RemoveServiceRemovesAllChecks
=== PAUSE TestAgent_RemoveServiceRemovesAllChecks
=== RUN   TestAgent_IndexChurn
=== PAUSE TestAgent_IndexChurn
=== RUN   TestAgent_AddCheck
=== PAUSE TestAgent_AddCheck
=== RUN   TestAgent_AddCheck_StartPassing
=== PAUSE TestAgent_AddCheck_StartPassing
=== RUN   TestAgent_AddCheck_MinInterval
=== PAUSE TestAgent_AddCheck_MinInterval
=== RUN   TestAgent_AddCheck_MissingService
=== PAUSE TestAgent_AddCheck_MissingService
=== RUN   TestAgent_AddCheck_RestoreState
=== PAUSE TestAgent_AddCheck_RestoreState
=== RUN   TestAgent_AddCheck_ExecDisable
=== PAUSE TestAgent_AddCheck_ExecDisable
=== RUN   TestAgent_AddCheck_ExecRemoteDisable
=== PAUSE TestAgent_AddCheck_ExecRemoteDisable
=== RUN   TestAgent_AddCheck_GRPC
=== PAUSE TestAgent_AddCheck_GRPC
=== RUN   TestAgent_RestoreServiceWithAliasCheck
--- SKIP: TestAgent_RestoreServiceWithAliasCheck (0.00s)
    agent_test.go:1149: skipping slow test; set SLOWTEST=1 to run
=== RUN   TestAgent_AddCheck_Alias
=== PAUSE TestAgent_AddCheck_Alias
=== RUN   TestAgent_AddCheck_Alias_setToken
=== PAUSE TestAgent_AddCheck_Alias_setToken
=== RUN   TestAgent_AddCheck_Alias_userToken
=== PAUSE TestAgent_AddCheck_Alias_userToken
=== RUN   TestAgent_AddCheck_Alias_userAndSetToken
=== PAUSE TestAgent_AddCheck_Alias_userAndSetToken
=== RUN   TestAgent_RemoveCheck
=== PAUSE TestAgent_RemoveCheck
=== RUN   TestAgent_HTTPCheck_TLSSkipVerify
=== PAUSE TestAgent_HTTPCheck_TLSSkipVerify
=== RUN   TestAgent_HTTPCheck_EnableAgentTLSForChecks
--- SKIP: TestAgent_HTTPCheck_EnableAgentTLSForChecks (0.00s)
    agent_test.go:1521: DM-skipped
=== RUN   TestAgent_updateTTLCheck
=== PAUSE TestAgent_updateTTLCheck
=== RUN   TestAgent_PersistService
=== PAUSE TestAgent_PersistService
=== RUN   TestAgent_persistedService_compat
=== PAUSE TestAgent_persistedService_compat
=== RUN   TestAgent_PurgeService
=== PAUSE TestAgent_PurgeService
=== RUN   TestAgent_PurgeServiceOnDuplicate
=== PAUSE TestAgent_PurgeServiceOnDuplicate
=== RUN   TestAgent_PersistProxy
=== PAUSE TestAgent_PersistProxy
=== RUN   TestAgent_PurgeProxy
=== PAUSE TestAgent_PurgeProxy
=== RUN   TestAgent_PurgeProxyOnDuplicate
=== PAUSE TestAgent_PurgeProxyOnDuplicate
=== RUN   TestAgent_PersistCheck
=== PAUSE TestAgent_PersistCheck
=== RUN   TestAgent_PurgeCheck
--- SKIP: TestAgent_PurgeCheck (0.00s)
    agent_test.go:2146: DM-skipped
=== RUN   TestAgent_PurgeCheckOnDuplicate
=== PAUSE TestAgent_PurgeCheckOnDuplicate
=== RUN   TestAgent_loadChecks_token
=== PAUSE TestAgent_loadChecks_token
=== RUN   TestAgent_unloadChecks
=== PAUSE TestAgent_unloadChecks
=== RUN   TestAgent_loadServices_token
=== PAUSE TestAgent_loadServices_token
=== RUN   TestAgent_loadServices_sidecar
=== PAUSE TestAgent_loadServices_sidecar
=== RUN   TestAgent_loadServices_sidecarSeparateToken
=== PAUSE TestAgent_loadServices_sidecarSeparateToken
=== RUN   TestAgent_loadServices_sidecarInheritMeta
=== PAUSE TestAgent_loadServices_sidecarInheritMeta
=== RUN   TestAgent_loadServices_sidecarOverrideMeta
=== PAUSE TestAgent_loadServices_sidecarOverrideMeta
=== RUN   TestAgent_unloadServices
=== PAUSE TestAgent_unloadServices
=== RUN   TestAgent_loadProxies
=== PAUSE TestAgent_loadProxies
=== RUN   TestAgent_loadProxies_nilProxy
=== PAUSE TestAgent_loadProxies_nilProxy
=== RUN   TestAgent_unloadProxies
=== PAUSE TestAgent_unloadProxies
=== RUN   TestAgent_Service_MaintenanceMode
=== PAUSE TestAgent_Service_MaintenanceMode
=== RUN   TestAgent_Service_Reap
WARNING: bootstrap = true: do not enable unless necessary
TestAgent_Service_Reap - 2019/12/06 06:01:25.011136 [WARN] agent: Node name "Node ee27c38a-9b9f-502f-335e-08ede8e10f2e" will not be discoverable via DNS due to invalid characters. Valid characters include all alpha-numerics and dashes.
TestAgent_Service_Reap - 2019/12/06 06:01:25.011611 [DEBUG] tlsutil: Update with version 1
TestAgent_Service_Reap - 2019/12/06 06:01:25.013937 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
2019/12/06 06:01:25 [INFO]  raft: Initial configuration (index=1): [{Suffrage:Voter ID:ee27c38a-9b9f-502f-335e-08ede8e10f2e Address:127.0.0.1:34084}]
2019/12/06 06:01:25 [INFO]  raft: Node at 127.0.0.1:34084 [Follower] entering Follower state (Leader: "")
TestAgent_Service_Reap - 2019/12/06 06:01:25.736886 [INFO] serf: EventMemberJoin: Node ee27c38a-9b9f-502f-335e-08ede8e10f2e.dc1 127.0.0.1
TestAgent_Service_Reap - 2019/12/06 06:01:25.746879 [INFO] serf: EventMemberJoin: Node ee27c38a-9b9f-502f-335e-08ede8e10f2e 127.0.0.1
TestAgent_Service_Reap - 2019/12/06 06:01:25.748529 [INFO] agent: Started DNS server 127.0.0.1:34079 (udp)
TestAgent_Service_Reap - 2019/12/06 06:01:25.751046 [INFO] consul: Handled member-join event for server "Node ee27c38a-9b9f-502f-335e-08ede8e10f2e.dc1" in area "wan"
TestAgent_Service_Reap - 2019/12/06 06:01:25.757213 [INFO] consul: Adding LAN server Node ee27c38a-9b9f-502f-335e-08ede8e10f2e (Addr: tcp/127.0.0.1:34084) (DC: dc1)
TestAgent_Service_Reap - 2019/12/06 06:01:25.759753 [INFO] agent: Started DNS server 127.0.0.1:34079 (tcp)
TestAgent_Service_Reap - 2019/12/06 06:01:25.778971 [INFO] agent: Started HTTP server on 127.0.0.1:34080 (tcp)
TestAgent_Service_Reap - 2019/12/06 06:01:25.779104 [INFO] agent: started state syncer
2019/12/06 06:01:25 [WARN]  raft: Heartbeat timeout from "" reached, starting election
2019/12/06 06:01:25 [INFO]  raft: Node at 127.0.0.1:34084 [Candidate] entering Candidate state in term 2
2019/12/06 06:01:26 [INFO]  raft: Election won. Tally: 1
2019/12/06 06:01:26 [INFO]  raft: Node at 127.0.0.1:34084 [Leader] entering Leader state
TestAgent_Service_Reap - 2019/12/06 06:01:26.276137 [INFO] consul: cluster leadership acquired
TestAgent_Service_Reap - 2019/12/06 06:01:26.276600 [INFO] consul: New leader elected: Node ee27c38a-9b9f-502f-335e-08ede8e10f2e
TestAgent_Service_Reap - 2019/12/06 06:01:26.783849 [INFO] agent: Synced node info
TestAgent_Service_Reap - 2019/12/06 06:01:26.784052 [DEBUG] agent: Node info in sync
TestAgent_Service_Reap - 2019/12/06 06:01:26.962597 [DEBUG] agent: Node info in sync
TestAgent_Service_Reap - 2019/12/06 06:01:27.721077 [INFO] connect: initialized primary datacenter CA with provider "consul"
TestAgent_Service_Reap - 2019/12/06 06:01:27.724816 [DEBUG] consul: Skipping self join check for "Node ee27c38a-9b9f-502f-335e-08ede8e10f2e" since the cluster is too small
TestAgent_Service_Reap - 2019/12/06 06:01:27.725035 [INFO] consul: member 'Node ee27c38a-9b9f-502f-335e-08ede8e10f2e' joined, marking health alive
TestAgent_Service_Reap - 2019/12/06 06:01:27.928964 [WARN] agent: Check "service:redis" missed TTL, is now critical
TestAgent_Service_Reap - 2019/12/06 06:01:28.060178 [INFO] agent: Synced service "redis"
TestAgent_Service_Reap - 2019/12/06 06:01:28.060298 [DEBUG] agent: Check "service:redis" in sync
TestAgent_Service_Reap - 2019/12/06 06:01:28.060342 [DEBUG] agent: Node info in sync
TestAgent_Service_Reap - 2019/12/06 06:01:28.060580 [DEBUG] agent: Check "service:redis" status is now passing
TestAgent_Service_Reap - 2019/12/06 06:01:28.060640 [DEBUG] agent: Service "redis" in sync
TestAgent_Service_Reap - 2019/12/06 06:01:28.218037 [INFO] agent: Synced check "service:redis"
TestAgent_Service_Reap - 2019/12/06 06:01:28.218288 [DEBUG] agent: Node info in sync
TestAgent_Service_Reap - 2019/12/06 06:01:28.218727 [DEBUG] agent: Service "redis" in sync
TestAgent_Service_Reap - 2019/12/06 06:01:28.243941 [WARN] agent: Check "service:redis" missed TTL, is now critical
TestAgent_Service_Reap - 2019/12/06 06:01:28.368162 [INFO] agent: Synced check "service:redis"
TestAgent_Service_Reap - 2019/12/06 06:01:28.368405 [DEBUG] agent: Node info in sync
TestAgent_Service_Reap - 2019/12/06 06:01:28.368700 [DEBUG] agent: Service "redis" in sync
TestAgent_Service_Reap - 2019/12/06 06:01:28.520020 [INFO] agent: Synced check "service:redis"
TestAgent_Service_Reap - 2019/12/06 06:01:28.520355 [DEBUG] agent: Node info in sync
TestAgent_Service_Reap - 2019/12/06 06:01:28.736523 [INFO] agent: Deregistered service "redis"
TestAgent_Service_Reap - 2019/12/06 06:01:28.891715 [INFO] agent: Deregistered check "service:redis"
TestAgent_Service_Reap - 2019/12/06 06:01:28.891814 [DEBUG] agent: Node info in sync
TestAgent_Service_Reap - 2019/12/06 06:01:28.891935 [DEBUG] agent: Node info in sync
TestAgent_Service_Reap - 2019/12/06 06:01:28.892783 [DEBUG] agent: removed check "service:redis"
TestAgent_Service_Reap - 2019/12/06 06:01:28.892853 [DEBUG] agent: removed service "redis"
TestAgent_Service_Reap - 2019/12/06 06:01:28.892904 [INFO] agent: Check "service:redis" for service "redis" has been critical for too long; deregistered service
TestAgent_Service_Reap - 2019/12/06 06:01:28.921006 [INFO] agent: Requesting shutdown
TestAgent_Service_Reap - 2019/12/06 06:01:28.921109 [INFO] consul: shutting down server
TestAgent_Service_Reap - 2019/12/06 06:01:28.921173 [WARN] serf: Shutdown without a Leave
TestAgent_Service_Reap - 2019/12/06 06:01:28.974591 [WARN] serf: Shutdown without a Leave
TestAgent_Service_Reap - 2019/12/06 06:01:29.050138 [INFO] manager: shutting down
TestAgent_Service_Reap - 2019/12/06 06:01:29.050566 [INFO] agent: consul server down
TestAgent_Service_Reap - 2019/12/06 06:01:29.050613 [INFO] agent: shutdown complete
TestAgent_Service_Reap - 2019/12/06 06:01:29.050667 [INFO] agent: Stopping DNS server 127.0.0.1:34079 (tcp)
TestAgent_Service_Reap - 2019/12/06 06:01:29.050797 [INFO] agent: Stopping DNS server 127.0.0.1:34079 (udp)
TestAgent_Service_Reap - 2019/12/06 06:01:29.050943 [INFO] agent: Stopping HTTP server 127.0.0.1:34080 (tcp)
TestAgent_Service_Reap - 2019/12/06 06:01:29.051123 [INFO] agent: Waiting for endpoints to shut down
TestAgent_Service_Reap - 2019/12/06 06:01:29.051190 [INFO] agent: Endpoints down
--- PASS: TestAgent_Service_Reap (4.11s)
=== RUN   TestAgent_Service_NoReap
WARNING: bootstrap = true: do not enable unless necessary
TestAgent_Service_NoReap - 2019/12/06 06:01:29.198812 [WARN] agent: Node name "Node f56c9872-3c00-60f8-59cf-9e7edc26188d" will not be discoverable via DNS due to invalid characters. Valid characters include all alpha-numerics and dashes.
TestAgent_Service_NoReap - 2019/12/06 06:01:29.199409 [DEBUG] tlsutil: Update with version 1
TestAgent_Service_NoReap - 2019/12/06 06:01:29.201632 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
2019/12/06 06:01:29 [INFO]  raft: Initial configuration (index=1): [{Suffrage:Voter ID:f56c9872-3c00-60f8-59cf-9e7edc26188d Address:127.0.0.1:34090}]
2019/12/06 06:01:29 [INFO]  raft: Node at 127.0.0.1:34090 [Follower] entering Follower state (Leader: "")
TestAgent_Service_NoReap - 2019/12/06 06:01:29.983193 [INFO] serf: EventMemberJoin: Node f56c9872-3c00-60f8-59cf-9e7edc26188d.dc1 127.0.0.1
TestAgent_Service_NoReap - 2019/12/06 06:01:29.989791 [INFO] serf: EventMemberJoin: Node f56c9872-3c00-60f8-59cf-9e7edc26188d 127.0.0.1
TestAgent_Service_NoReap - 2019/12/06 06:01:29.991795 [INFO] consul: Adding LAN server Node f56c9872-3c00-60f8-59cf-9e7edc26188d (Addr: tcp/127.0.0.1:34090) (DC: dc1)
TestAgent_Service_NoReap - 2019/12/06 06:01:29.992731 [INFO] consul: Handled member-join event for server "Node f56c9872-3c00-60f8-59cf-9e7edc26188d.dc1" in area "wan"
TestAgent_Service_NoReap - 2019/12/06 06:01:29.995019 [INFO] agent: Started DNS server 127.0.0.1:34085 (tcp)
TestAgent_Service_NoReap - 2019/12/06 06:01:29.995849 [INFO] agent: Started DNS server 127.0.0.1:34085 (udp)
TestAgent_Service_NoReap - 2019/12/06 06:01:29.998701 [INFO] agent: Started HTTP server on 127.0.0.1:34086 (tcp)
TestAgent_Service_NoReap - 2019/12/06 06:01:29.998819 [INFO] agent: started state syncer
2019/12/06 06:01:30 [WARN]  raft: Heartbeat timeout from "" reached, starting election
2019/12/06 06:01:30 [INFO]  raft: Node at 127.0.0.1:34090 [Candidate] entering Candidate state in term 2
2019/12/06 06:01:30 [INFO]  raft: Election won. Tally: 1
2019/12/06 06:01:30 [INFO]  raft: Node at 127.0.0.1:34090 [Leader] entering Leader state
TestAgent_Service_NoReap - 2019/12/06 06:01:30.491940 [INFO] consul: cluster leadership acquired
TestAgent_Service_NoReap - 2019/12/06 06:01:30.492391 [INFO] consul: New leader elected: Node f56c9872-3c00-60f8-59cf-9e7edc26188d
TestAgent_Service_NoReap - 2019/12/06 06:01:30.734069 [WARN] agent: Check "service:redis" missed TTL, is now critical
TestAgent_Service_NoReap - 2019/12/06 06:01:30.943859 [INFO] agent: Synced service "redis"
TestAgent_Service_NoReap - 2019/12/06 06:01:30.943960 [DEBUG] agent: Check "service:redis" in sync
TestAgent_Service_NoReap - 2019/12/06 06:01:30.943999 [DEBUG] agent: Node info in sync
TestAgent_Service_NoReap - 2019/12/06 06:01:30.944145 [DEBUG] agent: Service "redis" in sync
TestAgent_Service_NoReap - 2019/12/06 06:01:30.944273 [DEBUG] agent: Check "service:redis" in sync
TestAgent_Service_NoReap - 2019/12/06 06:01:30.944309 [DEBUG] agent: Node info in sync
TestAgent_Service_NoReap - 2019/12/06 06:01:31.144885 [INFO] agent: Requesting shutdown
TestAgent_Service_NoReap - 2019/12/06 06:01:31.145010 [INFO] consul: shutting down server
TestAgent_Service_NoReap - 2019/12/06 06:01:31.145062 [WARN] serf: Shutdown without a Leave
TestAgent_Service_NoReap - 2019/12/06 06:01:31.233053 [WARN] serf: Shutdown without a Leave
TestAgent_Service_NoReap - 2019/12/06 06:01:31.366397 [INFO] manager: shutting down
TestAgent_Service_NoReap - 2019/12/06 06:01:31.425032 [INFO] agent: consul server down
TestAgent_Service_NoReap - 2019/12/06 06:01:31.425115 [INFO] agent: shutdown complete
TestAgent_Service_NoReap - 2019/12/06 06:01:31.425183 [INFO] agent: Stopping DNS server 127.0.0.1:34085 (tcp)
TestAgent_Service_NoReap - 2019/12/06 06:01:31.425348 [INFO] agent: Stopping DNS server 127.0.0.1:34085 (udp)
TestAgent_Service_NoReap - 2019/12/06 06:01:31.425534 [INFO] agent: Stopping HTTP server 127.0.0.1:34086 (tcp)
TestAgent_Service_NoReap - 2019/12/06 06:01:31.425803 [INFO] agent: Waiting for endpoints to shut down
TestAgent_Service_NoReap - 2019/12/06 06:01:31.425882 [INFO] agent: Endpoints down
--- PASS: TestAgent_Service_NoReap (2.37s)
=== RUN   TestAgent_AddService_restoresSnapshot
=== PAUSE TestAgent_AddService_restoresSnapshot
=== RUN   TestAgent_AddCheck_restoresSnapshot
=== PAUSE TestAgent_AddCheck_restoresSnapshot
=== RUN   TestAgent_NodeMaintenanceMode
=== PAUSE TestAgent_NodeMaintenanceMode
=== RUN   TestAgent_checkStateSnapshot
=== PAUSE TestAgent_checkStateSnapshot
=== RUN   TestAgent_loadChecks_checkFails
=== PAUSE TestAgent_loadChecks_checkFails
=== RUN   TestAgent_persistCheckState
=== PAUSE TestAgent_persistCheckState
=== RUN   TestAgent_loadCheckState
=== PAUSE TestAgent_loadCheckState
=== RUN   TestAgent_purgeCheckState
=== PAUSE TestAgent_purgeCheckState
=== RUN   TestAgent_GetCoordinate
=== PAUSE TestAgent_GetCoordinate
=== RUN   TestAgent_reloadWatches
=== PAUSE TestAgent_reloadWatches
=== RUN   TestAgent_reloadWatchesHTTPS
=== PAUSE TestAgent_reloadWatchesHTTPS
=== RUN   TestAgent_AddProxy
--- SKIP: TestAgent_AddProxy (0.00s)
    agent_test.go:3265: DM-skipped
=== RUN   TestAgent_RemoveProxy
=== PAUSE TestAgent_RemoveProxy
=== RUN   TestAgent_ReLoadProxiesFromConfig
=== PAUSE TestAgent_ReLoadProxiesFromConfig
=== RUN   TestAgent_SetupProxyManager
=== PAUSE TestAgent_SetupProxyManager
=== RUN   TestAgent_loadTokens
=== PAUSE TestAgent_loadTokens
=== RUN   TestAgent_ReloadConfigOutgoingRPCConfig
=== PAUSE TestAgent_ReloadConfigOutgoingRPCConfig
=== RUN   TestAgent_ReloadConfigIncomingRPCConfig
=== PAUSE TestAgent_ReloadConfigIncomingRPCConfig
=== RUN   TestAgent_ReloadConfigTLSConfigFailure
=== PAUSE TestAgent_ReloadConfigTLSConfigFailure
=== RUN   TestAgent_consulConfig
=== PAUSE TestAgent_consulConfig
=== RUN   TestBlacklist
=== PAUSE TestBlacklist
=== RUN   TestCatalogRegister_Service_InvalidAddress
=== PAUSE TestCatalogRegister_Service_InvalidAddress
=== RUN   TestCatalogDeregister
TestAgent_Service_NoReap - 2019/12/06 06:01:31.429352 [ERR] consul: failed to establish leadership: leadership lost while committing log
TestAgent_Service_NoReap - 2019/12/06 06:01:31.429560 [ERR] consul: failed to transfer leadership attempt 0/3: raft is already shutdown
=== PAUSE TestCatalogDeregister
TestAgent_Service_NoReap - 2019/12/06 06:01:31.429629 [ERR] consul: failed to transfer leadership attempt 1/3: raft is already shutdown
=== RUN   TestCatalogDatacenters
=== PAUSE TestCatalogDatacenters
=== RUN   TestCatalogNodes
=== PAUSE TestCatalogNodes
=== RUN   TestCatalogNodes_MetaFilter
=== PAUSE TestCatalogNodes_MetaFilter
=== RUN   TestCatalogNodes_Filter
=== PAUSE TestCatalogNodes_Filter
=== RUN   TestCatalogNodes_WanTranslation
--- SKIP: TestCatalogNodes_WanTranslation (0.00s)
    catalog_endpoint_test.go:194: DM-skipped
=== RUN   TestCatalogNodes_Blocking
=== PAUSE TestCatalogNodes_Blocking
=== RUN   TestCatalogNodes_DistanceSort
=== PAUSE TestCatalogNodes_DistanceSort
=== RUN   TestCatalogServices
=== PAUSE TestCatalogServices
=== RUN   TestCatalogServices_NodeMetaFilter
=== PAUSE TestCatalogServices_NodeMetaFilter
=== RUN   TestCatalogServiceNodes
=== PAUSE TestCatalogServiceNodes
=== RUN   TestCatalogServiceNodes_NodeMetaFilter
=== PAUSE TestCatalogServiceNodes_NodeMetaFilter
=== RUN   TestCatalogServiceNodes_Filter
=== PAUSE TestCatalogServiceNodes_Filter
=== RUN   TestCatalogServiceNodes_WanTranslation
--- SKIP: TestCatalogServiceNodes_WanTranslation (0.00s)
    catalog_endpoint_test.go:756: DM-skipped
=== RUN   TestCatalogServiceNodes_DistanceSort
=== PAUSE TestCatalogServiceNodes_DistanceSort
=== RUN   TestCatalogServiceNodes_ConnectProxy
=== PAUSE TestCatalogServiceNodes_ConnectProxy
=== RUN   TestCatalogConnectServiceNodes_good
=== PAUSE TestCatalogConnectServiceNodes_good
=== RUN   TestCatalogConnectServiceNodes_Filter
=== PAUSE TestCatalogConnectServiceNodes_Filter
=== RUN   TestCatalogNodeServices
=== PAUSE TestCatalogNodeServices
=== RUN   TestCatalogNodeServices_Filter
=== PAUSE TestCatalogNodeServices_Filter
=== RUN   TestCatalogNodeServices_ConnectProxy
=== PAUSE TestCatalogNodeServices_ConnectProxy
=== RUN   TestCatalogNodeServices_WanTranslation
--- SKIP: TestCatalogNodeServices_WanTranslation (0.00s)
    catalog_endpoint_test.go:1136: DM-skipped
=== RUN   TestConfig_Get
=== PAUSE TestConfig_Get
=== RUN   TestConfig_Delete
=== PAUSE TestConfig_Delete
=== RUN   TestConfig_Apply
=== PAUSE TestConfig_Apply
=== RUN   TestConfig_Apply_CAS
=== PAUSE TestConfig_Apply_CAS
=== RUN   TestConfig_Apply_Decoding
=== PAUSE TestConfig_Apply_Decoding
=== RUN   TestConnectCARoots_empty
=== PAUSE TestConnectCARoots_empty
=== RUN   TestConnectCARoots_list
=== PAUSE TestConnectCARoots_list
=== RUN   TestConnectCAConfig
=== PAUSE TestConnectCAConfig
=== RUN   TestCoordinate_Disabled_Response
=== PAUSE TestCoordinate_Disabled_Response
=== RUN   TestCoordinate_Datacenters
--- SKIP: TestCoordinate_Datacenters (0.00s)
    coordinate_endpoint_test.go:54: DM-skipped
=== RUN   TestCoordinate_Nodes
=== PAUSE TestCoordinate_Nodes
=== RUN   TestCoordinate_Node
=== PAUSE TestCoordinate_Node
=== RUN   TestCoordinate_Update
=== PAUSE TestCoordinate_Update
=== RUN   TestCoordinate_Update_ACLDeny
=== PAUSE TestCoordinate_Update_ACLDeny
=== RUN   TestRecursorAddr
=== PAUSE TestRecursorAddr
=== RUN   TestEncodeKVasRFC1464
--- PASS: TestEncodeKVasRFC1464 (0.00s)
=== RUN   TestDNS_Over_TCP
=== PAUSE TestDNS_Over_TCP
=== RUN   TestDNS_NodeLookup
--- SKIP: TestDNS_NodeLookup (0.01s)
    dns_test.go:178: DM-skipped
=== RUN   TestDNS_CaseInsensitiveNodeLookup
=== PAUSE TestDNS_CaseInsensitiveNodeLookup
=== RUN   TestDNS_NodeLookup_PeriodName
=== PAUSE TestDNS_NodeLookup_PeriodName
=== RUN   TestDNS_NodeLookup_AAAA
=== PAUSE TestDNS_NodeLookup_AAAA
=== RUN   TestDNSCycleRecursorCheck
=== PAUSE TestDNSCycleRecursorCheck
=== RUN   TestDNSCycleRecursorCheckAllFail
--- SKIP: TestDNSCycleRecursorCheckAllFail (0.00s)
    dns_test.go:423: DM-skipped
=== RUN   TestDNS_NodeLookup_CNAME
=== PAUSE TestDNS_NodeLookup_CNAME
=== RUN   TestDNS_NodeLookup_TXT
WARNING: bootstrap = true: do not enable unless necessary
TestDNS_NodeLookup_TXT - 2019/12/06 06:01:31.580070 [WARN] agent: Node name "Node f4126c14-715e-714f-1931-807da0ffba22" will not be discoverable via DNS due to invalid characters. Valid characters include all alpha-numerics and dashes.
TestDNS_NodeLookup_TXT - 2019/12/06 06:01:31.580586 [DEBUG] tlsutil: Update with version 1
TestDNS_NodeLookup_TXT - 2019/12/06 06:01:31.582959 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
2019/12/06 06:01:32 [INFO]  raft: Initial configuration (index=1): [{Suffrage:Voter ID:f4126c14-715e-714f-1931-807da0ffba22 Address:127.0.0.1:34096}]
2019/12/06 06:01:32 [INFO]  raft: Node at 127.0.0.1:34096 [Follower] entering Follower state (Leader: "")
TestDNS_NodeLookup_TXT - 2019/12/06 06:01:32.336808 [INFO] serf: EventMemberJoin: Node f4126c14-715e-714f-1931-807da0ffba22.dc1 127.0.0.1
TestDNS_NodeLookup_TXT - 2019/12/06 06:01:32.346922 [INFO] serf: EventMemberJoin: Node f4126c14-715e-714f-1931-807da0ffba22 127.0.0.1
TestDNS_NodeLookup_TXT - 2019/12/06 06:01:32.359361 [INFO] consul: Adding LAN server Node f4126c14-715e-714f-1931-807da0ffba22 (Addr: tcp/127.0.0.1:34096) (DC: dc1)
TestDNS_NodeLookup_TXT - 2019/12/06 06:01:32.360348 [INFO] consul: Handled member-join event for server "Node f4126c14-715e-714f-1931-807da0ffba22.dc1" in area "wan"
TestDNS_NodeLookup_TXT - 2019/12/06 06:01:32.362299 [INFO] agent: Started DNS server 127.0.0.1:34091 (udp)
TestDNS_NodeLookup_TXT - 2019/12/06 06:01:32.362874 [INFO] agent: Started DNS server 127.0.0.1:34091 (tcp)
TestDNS_NodeLookup_TXT - 2019/12/06 06:01:32.367056 [INFO] agent: Started HTTP server on 127.0.0.1:34092 (tcp)
TestDNS_NodeLookup_TXT - 2019/12/06 06:01:32.367160 [INFO] agent: started state syncer
2019/12/06 06:01:32 [WARN]  raft: Heartbeat timeout from "" reached, starting election
2019/12/06 06:01:32 [INFO]  raft: Node at 127.0.0.1:34096 [Candidate] entering Candidate state in term 2
2019/12/06 06:01:32 [INFO]  raft: Election won. Tally: 1
2019/12/06 06:01:32 [INFO]  raft: Node at 127.0.0.1:34096 [Leader] entering Leader state
TestDNS_NodeLookup_TXT - 2019/12/06 06:01:32.834081 [INFO] consul: cluster leadership acquired
TestDNS_NodeLookup_TXT - 2019/12/06 06:01:32.834665 [INFO] consul: New leader elected: Node f4126c14-715e-714f-1931-807da0ffba22
TestDNS_NodeLookup_TXT - 2019/12/06 06:01:33.144042 [INFO] agent: Synced node info
TestDNS_NodeLookup_TXT - 2019/12/06 06:01:33.444748 [INFO] agent: Requesting shutdown
TestDNS_NodeLookup_TXT - 2019/12/06 06:01:33.444921 [INFO] consul: shutting down server
TestDNS_NodeLookup_TXT - 2019/12/06 06:01:33.444984 [WARN] serf: Shutdown without a Leave
TestDNS_NodeLookup_TXT - 2019/12/06 06:01:33.445015 [DEBUG] dns: request for name google.node.consul. type TXT class IN (took 1.082691ms) from client 127.0.0.1:57132 (udp)
TestDNS_NodeLookup_TXT - 2019/12/06 06:01:33.558049 [WARN] serf: Shutdown without a Leave
TestDNS_NodeLookup_TXT - 2019/12/06 06:01:33.683111 [INFO] manager: shutting down
TestDNS_NodeLookup_TXT - 2019/12/06 06:01:33.749712 [ERR] consul: failed to establish leadership: leadership lost while committing log
TestDNS_NodeLookup_TXT - 2019/12/06 06:01:33.750013 [INFO] agent: consul server down
TestDNS_NodeLookup_TXT - 2019/12/06 06:01:33.750073 [INFO] agent: shutdown complete
TestDNS_NodeLookup_TXT - 2019/12/06 06:01:33.750134 [INFO] agent: Stopping DNS server 127.0.0.1:34091 (tcp)
TestDNS_NodeLookup_TXT - 2019/12/06 06:01:33.750301 [INFO] agent: Stopping DNS server 127.0.0.1:34091 (udp)
TestDNS_NodeLookup_TXT - 2019/12/06 06:01:33.750481 [INFO] agent: Stopping HTTP server 127.0.0.1:34092 (tcp)
TestDNS_NodeLookup_TXT - 2019/12/06 06:01:33.750704 [INFO] agent: Waiting for endpoints to shut down
TestDNS_NodeLookup_TXT - 2019/12/06 06:01:33.750779 [INFO] agent: Endpoints down
--- PASS: TestDNS_NodeLookup_TXT (2.30s)
=== RUN   TestDNS_NodeLookup_TXT_DontSuppress
WARNING: bootstrap = true: do not enable unless necessary
TestDNS_NodeLookup_TXT_DontSuppress - 2019/12/06 06:01:33.905667 [WARN] agent: Node name "Node 49ea2b26-0f4a-bf91-f44c-3bba256d23cd" will not be discoverable via DNS due to invalid characters. Valid characters include all alpha-numerics and dashes.
TestDNS_NodeLookup_TXT_DontSuppress - 2019/12/06 06:01:33.906262 [DEBUG] tlsutil: Update with version 1
TestDNS_NodeLookup_TXT_DontSuppress - 2019/12/06 06:01:33.908624 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
2019/12/06 06:01:34 [INFO]  raft: Initial configuration (index=1): [{Suffrage:Voter ID:49ea2b26-0f4a-bf91-f44c-3bba256d23cd Address:127.0.0.1:34102}]
2019/12/06 06:01:34 [INFO]  raft: Node at 127.0.0.1:34102 [Follower] entering Follower state (Leader: "")
TestDNS_NodeLookup_TXT_DontSuppress - 2019/12/06 06:01:34.801449 [INFO] serf: EventMemberJoin: Node 49ea2b26-0f4a-bf91-f44c-3bba256d23cd.dc1 127.0.0.1
TestDNS_NodeLookup_TXT_DontSuppress - 2019/12/06 06:01:34.805496 [INFO] serf: EventMemberJoin: Node 49ea2b26-0f4a-bf91-f44c-3bba256d23cd 127.0.0.1
TestDNS_NodeLookup_TXT_DontSuppress - 2019/12/06 06:01:34.806460 [INFO] consul: Adding LAN server Node 49ea2b26-0f4a-bf91-f44c-3bba256d23cd (Addr: tcp/127.0.0.1:34102) (DC: dc1)
TestDNS_NodeLookup_TXT_DontSuppress - 2019/12/06 06:01:34.806817 [INFO] consul: Handled member-join event for server "Node 49ea2b26-0f4a-bf91-f44c-3bba256d23cd.dc1" in area "wan"
TestDNS_NodeLookup_TXT_DontSuppress - 2019/12/06 06:01:34.806885 [INFO] agent: Started DNS server 127.0.0.1:34097 (udp)
TestDNS_NodeLookup_TXT_DontSuppress - 2019/12/06 06:01:34.807242 [INFO] agent: Started DNS server 127.0.0.1:34097 (tcp)
TestDNS_NodeLookup_TXT_DontSuppress - 2019/12/06 06:01:34.809571 [INFO] agent: Started HTTP server on 127.0.0.1:34098 (tcp)
TestDNS_NodeLookup_TXT_DontSuppress - 2019/12/06 06:01:34.809667 [INFO] agent: started state syncer
2019/12/06 06:01:34 [WARN]  raft: Heartbeat timeout from "" reached, starting election
2019/12/06 06:01:34 [INFO]  raft: Node at 127.0.0.1:34102 [Candidate] entering Candidate state in term 2
2019/12/06 06:01:35 [INFO]  raft: Election won. Tally: 1
2019/12/06 06:01:35 [INFO]  raft: Node at 127.0.0.1:34102 [Leader] entering Leader state
TestDNS_NodeLookup_TXT_DontSuppress - 2019/12/06 06:01:35.291773 [INFO] consul: cluster leadership acquired
TestDNS_NodeLookup_TXT_DontSuppress - 2019/12/06 06:01:35.292205 [INFO] consul: New leader elected: Node 49ea2b26-0f4a-bf91-f44c-3bba256d23cd
TestDNS_NodeLookup_TXT_DontSuppress - 2019/12/06 06:01:35.592431 [INFO] agent: Synced node info
TestDNS_NodeLookup_TXT_DontSuppress - 2019/12/06 06:01:35.592563 [DEBUG] agent: Node info in sync
TestDNS_NodeLookup_TXT_DontSuppress - 2019/12/06 06:01:36.172403 [DEBUG] dns: request for name google.node.consul. type TXT class IN (took 1.136693ms) from client 127.0.0.1:41096 (udp)
TestDNS_NodeLookup_TXT_DontSuppress - 2019/12/06 06:01:36.172560 [INFO] agent: Requesting shutdown
TestDNS_NodeLookup_TXT_DontSuppress - 2019/12/06 06:01:36.172684 [INFO] consul: shutting down server
TestDNS_NodeLookup_TXT_DontSuppress - 2019/12/06 06:01:36.172777 [WARN] serf: Shutdown without a Leave
TestDNS_NodeLookup_TXT_DontSuppress - 2019/12/06 06:01:36.424684 [WARN] serf: Shutdown without a Leave
TestDNS_NodeLookup_TXT_DontSuppress - 2019/12/06 06:01:36.508149 [INFO] manager: shutting down
TestDNS_NodeLookup_TXT_DontSuppress - 2019/12/06 06:01:36.600038 [INFO] agent: consul server down
TestDNS_NodeLookup_TXT_DontSuppress - 2019/12/06 06:01:36.600123 [INFO] agent: shutdown complete
TestDNS_NodeLookup_TXT_DontSuppress - 2019/12/06 06:01:36.600177 [INFO] agent: Stopping DNS server 127.0.0.1:34097 (tcp)
TestDNS_NodeLookup_TXT_DontSuppress - 2019/12/06 06:01:36.600307 [INFO] agent: Stopping DNS server 127.0.0.1:34097 (udp)
TestDNS_NodeLookup_TXT_DontSuppress - 2019/12/06 06:01:36.600452 [INFO] agent: Stopping HTTP server 127.0.0.1:34098 (tcp)
TestDNS_NodeLookup_TXT_DontSuppress - 2019/12/06 06:01:36.600643 [INFO] agent: Waiting for endpoints to shut down
TestDNS_NodeLookup_TXT_DontSuppress - 2019/12/06 06:01:36.600708 [INFO] agent: Endpoints down
--- PASS: TestDNS_NodeLookup_TXT_DontSuppress (2.85s)
=== RUN   TestDNS_NodeLookup_ANY
TestDNS_NodeLookup_TXT_DontSuppress - 2019/12/06 06:01:36.604382 [ERR] consul: failed to establish leadership: error configuring provider: leadership lost while committing log
TestDNS_NodeLookup_TXT_DontSuppress - 2019/12/06 06:01:36.604601 [ERR] consul: failed to transfer leadership attempt 0/3: raft is already shutdown
WARNING: bootstrap = true: do not enable unless necessary
TestDNS_NodeLookup_ANY - 2019/12/06 06:01:36.686696 [WARN] agent: Node name "Node 8fe0cf4f-0339-a9c5-35bc-8327c29c4f05" will not be discoverable via DNS due to invalid characters. Valid characters include all alpha-numerics and dashes.
TestDNS_NodeLookup_ANY - 2019/12/06 06:01:36.687181 [DEBUG] tlsutil: Update with version 1
TestDNS_NodeLookup_ANY - 2019/12/06 06:01:36.689474 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
2019/12/06 06:01:37 [INFO]  raft: Initial configuration (index=1): [{Suffrage:Voter ID:8fe0cf4f-0339-a9c5-35bc-8327c29c4f05 Address:127.0.0.1:34108}]
2019/12/06 06:01:37 [INFO]  raft: Node at 127.0.0.1:34108 [Follower] entering Follower state (Leader: "")
TestDNS_NodeLookup_ANY - 2019/12/06 06:01:37.745744 [INFO] serf: EventMemberJoin: Node 8fe0cf4f-0339-a9c5-35bc-8327c29c4f05.dc1 127.0.0.1
TestDNS_NodeLookup_ANY - 2019/12/06 06:01:37.749032 [INFO] serf: EventMemberJoin: Node 8fe0cf4f-0339-a9c5-35bc-8327c29c4f05 127.0.0.1
TestDNS_NodeLookup_ANY - 2019/12/06 06:01:37.749710 [INFO] consul: Adding LAN server Node 8fe0cf4f-0339-a9c5-35bc-8327c29c4f05 (Addr: tcp/127.0.0.1:34108) (DC: dc1)
TestDNS_NodeLookup_ANY - 2019/12/06 06:01:37.749830 [INFO] consul: Handled member-join event for server "Node 8fe0cf4f-0339-a9c5-35bc-8327c29c4f05.dc1" in area "wan"
TestDNS_NodeLookup_ANY - 2019/12/06 06:01:37.750342 [INFO] agent: Started DNS server 127.0.0.1:34103 (tcp)
TestDNS_NodeLookup_ANY - 2019/12/06 06:01:37.750425 [INFO] agent: Started DNS server 127.0.0.1:34103 (udp)
TestDNS_NodeLookup_ANY - 2019/12/06 06:01:37.752864 [INFO] agent: Started HTTP server on 127.0.0.1:34104 (tcp)
TestDNS_NodeLookup_ANY - 2019/12/06 06:01:37.752968 [INFO] agent: started state syncer
2019/12/06 06:01:37 [WARN]  raft: Heartbeat timeout from "" reached, starting election
2019/12/06 06:01:37 [INFO]  raft: Node at 127.0.0.1:34108 [Candidate] entering Candidate state in term 2
2019/12/06 06:01:38 [INFO]  raft: Election won. Tally: 1
2019/12/06 06:01:38 [INFO]  raft: Node at 127.0.0.1:34108 [Leader] entering Leader state
TestDNS_NodeLookup_ANY - 2019/12/06 06:01:38.316995 [INFO] consul: cluster leadership acquired
TestDNS_NodeLookup_ANY - 2019/12/06 06:01:38.317477 [INFO] consul: New leader elected: Node 8fe0cf4f-0339-a9c5-35bc-8327c29c4f05
TestDNS_NodeLookup_ANY - 2019/12/06 06:01:38.610923 [INFO] agent: Synced node info
TestDNS_NodeLookup_ANY - 2019/12/06 06:01:38.914388 [DEBUG] dns: request for name bar.node.consul. type ANY class IN (took 1.115026ms) from client 127.0.0.1:37539 (udp)
TestDNS_NodeLookup_ANY - 2019/12/06 06:01:38.914913 [INFO] agent: Requesting shutdown
TestDNS_NodeLookup_ANY - 2019/12/06 06:01:38.915131 [INFO] consul: shutting down server
TestDNS_NodeLookup_ANY - 2019/12/06 06:01:38.915306 [WARN] serf: Shutdown without a Leave
TestDNS_NodeLookup_ANY - 2019/12/06 06:01:39.158651 [WARN] serf: Shutdown without a Leave
TestDNS_NodeLookup_ANY - 2019/12/06 06:01:39.625024 [INFO] manager: shutting down
TestDNS_NodeLookup_ANY - 2019/12/06 06:01:39.725051 [ERR] consul: failed to establish leadership: leadership lost while committing log
TestDNS_NodeLookup_ANY - 2019/12/06 06:01:39.725719 [INFO] agent: consul server down
TestDNS_NodeLookup_ANY - 2019/12/06 06:01:39.725926 [INFO] agent: shutdown complete
TestDNS_NodeLookup_ANY - 2019/12/06 06:01:39.726123 [INFO] agent: Stopping DNS server 127.0.0.1:34103 (tcp)
TestDNS_NodeLookup_ANY - 2019/12/06 06:01:39.726547 [INFO] agent: Stopping DNS server 127.0.0.1:34103 (udp)
TestDNS_NodeLookup_ANY - 2019/12/06 06:01:39.727053 [INFO] agent: Stopping HTTP server 127.0.0.1:34104 (tcp)
TestDNS_NodeLookup_ANY - 2019/12/06 06:01:39.727632 [INFO] agent: Waiting for endpoints to shut down
TestDNS_NodeLookup_ANY - 2019/12/06 06:01:39.727824 [INFO] agent: Endpoints down
--- PASS: TestDNS_NodeLookup_ANY (3.13s)
=== RUN   TestDNS_NodeLookup_ANY_DontSuppressTXT
WARNING: bootstrap = true: do not enable unless necessary
TestDNS_NodeLookup_ANY_DontSuppressTXT - 2019/12/06 06:01:39.850487 [WARN] agent: Node name "Node efa34b8b-28eb-6499-efea-baf266d157af" will not be discoverable via DNS due to invalid characters. Valid characters include all alpha-numerics and dashes.
TestDNS_NodeLookup_ANY_DontSuppressTXT - 2019/12/06 06:01:39.850954 [DEBUG] tlsutil: Update with version 1
TestDNS_NodeLookup_ANY_DontSuppressTXT - 2019/12/06 06:01:39.853435 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
2019/12/06 06:01:40 [INFO]  raft: Initial configuration (index=1): [{Suffrage:Voter ID:efa34b8b-28eb-6499-efea-baf266d157af Address:127.0.0.1:34114}]
2019/12/06 06:01:40 [INFO]  raft: Node at 127.0.0.1:34114 [Follower] entering Follower state (Leader: "")
TestDNS_NodeLookup_ANY_DontSuppressTXT - 2019/12/06 06:01:40.603833 [INFO] serf: EventMemberJoin: Node efa34b8b-28eb-6499-efea-baf266d157af.dc1 127.0.0.1
TestDNS_NodeLookup_ANY_DontSuppressTXT - 2019/12/06 06:01:40.607237 [INFO] serf: EventMemberJoin: Node efa34b8b-28eb-6499-efea-baf266d157af 127.0.0.1
TestDNS_NodeLookup_ANY_DontSuppressTXT - 2019/12/06 06:01:40.608193 [INFO] consul: Adding LAN server Node efa34b8b-28eb-6499-efea-baf266d157af (Addr: tcp/127.0.0.1:34114) (DC: dc1)
TestDNS_NodeLookup_ANY_DontSuppressTXT - 2019/12/06 06:01:40.608791 [INFO] agent: Started DNS server 127.0.0.1:34109 (udp)
TestDNS_NodeLookup_ANY_DontSuppressTXT - 2019/12/06 06:01:40.609252 [INFO] agent: Started DNS server 127.0.0.1:34109 (tcp)
TestDNS_NodeLookup_ANY_DontSuppressTXT - 2019/12/06 06:01:40.616621 [INFO] agent: Started HTTP server on 127.0.0.1:34110 (tcp)
TestDNS_NodeLookup_ANY_DontSuppressTXT - 2019/12/06 06:01:40.616750 [INFO] agent: started state syncer
TestDNS_NodeLookup_ANY_DontSuppressTXT - 2019/12/06 06:01:40.609057 [INFO] consul: Handled member-join event for server "Node efa34b8b-28eb-6499-efea-baf266d157af.dc1" in area "wan"
2019/12/06 06:01:40 [WARN]  raft: Heartbeat timeout from "" reached, starting election
2019/12/06 06:01:40 [INFO]  raft: Node at 127.0.0.1:34114 [Candidate] entering Candidate state in term 2
2019/12/06 06:01:41 [INFO]  raft: Election won. Tally: 1
2019/12/06 06:01:41 [INFO]  raft: Node at 127.0.0.1:34114 [Leader] entering Leader state
TestDNS_NodeLookup_ANY_DontSuppressTXT - 2019/12/06 06:01:41.148855 [INFO] consul: cluster leadership acquired
TestDNS_NodeLookup_ANY_DontSuppressTXT - 2019/12/06 06:01:41.149360 [INFO] consul: New leader elected: Node efa34b8b-28eb-6499-efea-baf266d157af
TestDNS_NodeLookup_ANY_DontSuppressTXT - 2019/12/06 06:01:42.809003 [INFO] agent: Synced node info
TestDNS_NodeLookup_ANY_DontSuppressTXT - 2019/12/06 06:01:42.809124 [DEBUG] agent: Node info in sync
TestDNS_NodeLookup_ANY_DontSuppressTXT - 2019/12/06 06:01:43.093953 [DEBUG] dns: request for name bar.node.consul. type ANY class IN (took 639.015µs) from client 127.0.0.1:41835 (udp)
TestDNS_NodeLookup_ANY_DontSuppressTXT - 2019/12/06 06:01:43.094320 [INFO] agent: Requesting shutdown
TestDNS_NodeLookup_ANY_DontSuppressTXT - 2019/12/06 06:01:43.094418 [INFO] consul: shutting down server
TestDNS_NodeLookup_ANY_DontSuppressTXT - 2019/12/06 06:01:43.094463 [WARN] serf: Shutdown without a Leave
TestDNS_NodeLookup_ANY_DontSuppressTXT - 2019/12/06 06:01:43.224846 [WARN] serf: Shutdown without a Leave
TestDNS_NodeLookup_ANY_DontSuppressTXT - 2019/12/06 06:01:43.366576 [INFO] manager: shutting down
TestDNS_NodeLookup_ANY_DontSuppressTXT - 2019/12/06 06:01:43.424923 [ERR] consul: failed to establish leadership: leadership lost while committing log
TestDNS_NodeLookup_ANY_DontSuppressTXT - 2019/12/06 06:01:43.425133 [ERR] consul: failed to transfer leadership attempt 0/3: raft is already shutdown
TestDNS_NodeLookup_ANY_DontSuppressTXT - 2019/12/06 06:01:43.425163 [INFO] agent: consul server down
TestDNS_NodeLookup_ANY_DontSuppressTXT - 2019/12/06 06:01:43.425236 [INFO] agent: shutdown complete
TestDNS_NodeLookup_ANY_DontSuppressTXT - 2019/12/06 06:01:43.425300 [INFO] agent: Stopping DNS server 127.0.0.1:34109 (tcp)
TestDNS_NodeLookup_ANY_DontSuppressTXT - 2019/12/06 06:01:43.425429 [INFO] agent: Stopping DNS server 127.0.0.1:34109 (udp)
TestDNS_NodeLookup_ANY_DontSuppressTXT - 2019/12/06 06:01:43.425596 [INFO] agent: Stopping HTTP server 127.0.0.1:34110 (tcp)
TestDNS_NodeLookup_ANY_DontSuppressTXT - 2019/12/06 06:01:43.425800 [INFO] agent: Waiting for endpoints to shut down
TestDNS_NodeLookup_ANY_DontSuppressTXT - 2019/12/06 06:01:43.425861 [INFO] agent: Endpoints down
--- PASS: TestDNS_NodeLookup_ANY_DontSuppressTXT (3.70s)
=== RUN   TestDNS_NodeLookup_A_SuppressTXT
WARNING: bootstrap = true: do not enable unless necessary
TestDNS_NodeLookup_A_SuppressTXT - 2019/12/06 06:01:43.486208 [WARN] agent: Node name "Node 85760075-3a4b-8eb6-49fc-124c628165e3" will not be discoverable via DNS due to invalid characters. Valid characters include all alpha-numerics and dashes.
TestDNS_NodeLookup_A_SuppressTXT - 2019/12/06 06:01:43.486829 [DEBUG] tlsutil: Update with version 1
TestDNS_NodeLookup_A_SuppressTXT - 2019/12/06 06:01:43.489091 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
2019/12/06 06:01:44 [INFO]  raft: Initial configuration (index=1): [{Suffrage:Voter ID:85760075-3a4b-8eb6-49fc-124c628165e3 Address:127.0.0.1:34120}]
2019/12/06 06:01:44 [INFO]  raft: Node at 127.0.0.1:34120 [Follower] entering Follower state (Leader: "")
TestDNS_NodeLookup_A_SuppressTXT - 2019/12/06 06:01:44.229063 [INFO] serf: EventMemberJoin: Node 85760075-3a4b-8eb6-49fc-124c628165e3.dc1 127.0.0.1
TestDNS_NodeLookup_A_SuppressTXT - 2019/12/06 06:01:44.238866 [INFO] serf: EventMemberJoin: Node 85760075-3a4b-8eb6-49fc-124c628165e3 127.0.0.1
TestDNS_NodeLookup_A_SuppressTXT - 2019/12/06 06:01:44.239929 [INFO] consul: Adding LAN server Node 85760075-3a4b-8eb6-49fc-124c628165e3 (Addr: tcp/127.0.0.1:34120) (DC: dc1)
TestDNS_NodeLookup_A_SuppressTXT - 2019/12/06 06:01:44.240030 [INFO] consul: Handled member-join event for server "Node 85760075-3a4b-8eb6-49fc-124c628165e3.dc1" in area "wan"
TestDNS_NodeLookup_A_SuppressTXT - 2019/12/06 06:01:44.241459 [INFO] agent: Started DNS server 127.0.0.1:34115 (tcp)
TestDNS_NodeLookup_A_SuppressTXT - 2019/12/06 06:01:44.242163 [INFO] agent: Started DNS server 127.0.0.1:34115 (udp)
TestDNS_NodeLookup_A_SuppressTXT - 2019/12/06 06:01:44.244729 [INFO] agent: Started HTTP server on 127.0.0.1:34116 (tcp)
TestDNS_NodeLookup_A_SuppressTXT - 2019/12/06 06:01:44.244878 [INFO] agent: started state syncer
2019/12/06 06:01:44 [WARN]  raft: Heartbeat timeout from "" reached, starting election
2019/12/06 06:01:44 [INFO]  raft: Node at 127.0.0.1:34120 [Candidate] entering Candidate state in term 2
2019/12/06 06:01:45 [INFO]  raft: Election won. Tally: 1
2019/12/06 06:01:45 [INFO]  raft: Node at 127.0.0.1:34120 [Leader] entering Leader state
TestDNS_NodeLookup_A_SuppressTXT - 2019/12/06 06:01:45.425296 [INFO] consul: cluster leadership acquired
TestDNS_NodeLookup_A_SuppressTXT - 2019/12/06 06:01:45.425717 [INFO] consul: New leader elected: Node 85760075-3a4b-8eb6-49fc-124c628165e3
TestDNS_NodeLookup_A_SuppressTXT - 2019/12/06 06:01:45.750899 [INFO] agent: Synced node info
TestDNS_NodeLookup_A_SuppressTXT - 2019/12/06 06:01:46.870973 [DEBUG] dns: request for name bar.node.consul. type A class IN (took 582.68µs) from client 127.0.0.1:49705 (udp)
TestDNS_NodeLookup_A_SuppressTXT - 2019/12/06 06:01:46.871678 [INFO] agent: Requesting shutdown
TestDNS_NodeLookup_A_SuppressTXT - 2019/12/06 06:01:46.871783 [INFO] consul: shutting down server
TestDNS_NodeLookup_A_SuppressTXT - 2019/12/06 06:01:46.871832 [WARN] serf: Shutdown without a Leave
TestDNS_NodeLookup_A_SuppressTXT - 2019/12/06 06:01:46.926236 [DEBUG] agent: Node info in sync
TestDNS_NodeLookup_A_SuppressTXT - 2019/12/06 06:01:46.926371 [DEBUG] agent: Node info in sync
TestDNS_NodeLookup_A_SuppressTXT - 2019/12/06 06:01:47.008251 [WARN] serf: Shutdown without a Leave
TestDNS_NodeLookup_A_SuppressTXT - 2019/12/06 06:01:47.083304 [INFO] manager: shutting down
TestDNS_NodeLookup_A_SuppressTXT - 2019/12/06 06:01:47.083334 [ERR] consul: failed to establish leadership: error configuring provider: leadership lost while committing log
TestDNS_NodeLookup_A_SuppressTXT - 2019/12/06 06:01:47.083714 [INFO] agent: consul server down
TestDNS_NodeLookup_A_SuppressTXT - 2019/12/06 06:01:47.083773 [INFO] agent: shutdown complete
TestDNS_NodeLookup_A_SuppressTXT - 2019/12/06 06:01:47.083832 [INFO] agent: Stopping DNS server 127.0.0.1:34115 (tcp)
TestDNS_NodeLookup_A_SuppressTXT - 2019/12/06 06:01:47.083985 [INFO] agent: Stopping DNS server 127.0.0.1:34115 (udp)
TestDNS_NodeLookup_A_SuppressTXT - 2019/12/06 06:01:47.084159 [INFO] agent: Stopping HTTP server 127.0.0.1:34116 (tcp)
TestDNS_NodeLookup_A_SuppressTXT - 2019/12/06 06:01:47.084464 [INFO] agent: Waiting for endpoints to shut down
TestDNS_NodeLookup_A_SuppressTXT - 2019/12/06 06:01:47.084546 [INFO] agent: Endpoints down
--- PASS: TestDNS_NodeLookup_A_SuppressTXT (3.66s)
=== RUN   TestDNS_EDNS0
=== PAUSE TestDNS_EDNS0
=== RUN   TestDNS_EDNS0_ECS
=== PAUSE TestDNS_EDNS0_ECS
=== RUN   TestDNS_ReverseLookup
=== PAUSE TestDNS_ReverseLookup
=== RUN   TestDNS_ReverseLookup_CustomDomain
=== PAUSE TestDNS_ReverseLookup_CustomDomain
=== RUN   TestDNS_ReverseLookup_IPV6
=== PAUSE TestDNS_ReverseLookup_IPV6
=== RUN   TestDNS_ServiceReverseLookup
--- SKIP: TestDNS_ServiceReverseLookup (0.00s)
    dns_test.go:976: DM-skipped
=== RUN   TestDNS_ServiceReverseLookup_IPV6
=== PAUSE TestDNS_ServiceReverseLookup_IPV6
=== RUN   TestDNS_ServiceReverseLookup_CustomDomain
=== PAUSE TestDNS_ServiceReverseLookup_CustomDomain
=== RUN   TestDNS_SOA_Settings
=== PAUSE TestDNS_SOA_Settings
=== RUN   TestDNS_ServiceReverseLookupNodeAddress
=== PAUSE TestDNS_ServiceReverseLookupNodeAddress
=== RUN   TestDNS_ServiceLookupNoMultiCNAME
--- SKIP: TestDNS_ServiceLookupNoMultiCNAME (0.00s)
    dns_test.go:1204: DM-skipped
=== RUN   TestDNS_ServiceLookupPreferNoCNAME
=== PAUSE TestDNS_ServiceLookupPreferNoCNAME
=== RUN   TestDNS_ServiceLookupMultiAddrNoCNAME
=== PAUSE TestDNS_ServiceLookupMultiAddrNoCNAME
=== RUN   TestDNS_ServiceLookup
=== PAUSE TestDNS_ServiceLookup
=== RUN   TestDNS_ServiceLookupWithInternalServiceAddress
=== PAUSE TestDNS_ServiceLookupWithInternalServiceAddress
=== RUN   TestDNS_ConnectServiceLookup
=== PAUSE TestDNS_ConnectServiceLookup
=== RUN   TestDNS_ExternalServiceLookup
=== PAUSE TestDNS_ExternalServiceLookup
=== RUN   TestDNS_InifiniteRecursion
=== PAUSE TestDNS_InifiniteRecursion
=== RUN   TestDNS_ExternalServiceToConsulCNAMELookup
=== PAUSE TestDNS_ExternalServiceToConsulCNAMELookup
=== RUN   TestDNS_NSRecords
--- SKIP: TestDNS_NSRecords (0.00s)
    dns_test.go:1860: DM-skipped
=== RUN   TestDNS_NSRecords_IPV6
=== PAUSE TestDNS_NSRecords_IPV6
=== RUN   TestDNS_ExternalServiceToConsulCNAMENestedLookup
=== PAUSE TestDNS_ExternalServiceToConsulCNAMENestedLookup
=== RUN   TestDNS_ServiceLookup_ServiceAddress_A
=== PAUSE TestDNS_ServiceLookup_ServiceAddress_A
=== RUN   TestDNS_ServiceLookup_ServiceAddress_CNAME
=== PAUSE TestDNS_ServiceLookup_ServiceAddress_CNAME
=== RUN   TestDNS_ServiceLookup_ServiceAddressIPV6
=== PAUSE TestDNS_ServiceLookup_ServiceAddressIPV6
=== RUN   TestDNS_ServiceLookup_WanAddress
--- SKIP: TestDNS_ServiceLookup_WanAddress (0.00s)
    dns_test.go:2354: DM-skipped
=== RUN   TestDNS_CaseInsensitiveServiceLookup
=== PAUSE TestDNS_CaseInsensitiveServiceLookup
=== RUN   TestDNS_ServiceLookup_TagPeriod
=== PAUSE TestDNS_ServiceLookup_TagPeriod
=== RUN   TestDNS_PreparedQueryNearIPEDNS
=== PAUSE TestDNS_PreparedQueryNearIPEDNS
=== RUN   TestDNS_PreparedQueryNearIP
=== PAUSE TestDNS_PreparedQueryNearIP
=== RUN   TestDNS_ServiceLookup_PreparedQueryNamePeriod
=== PAUSE TestDNS_ServiceLookup_PreparedQueryNamePeriod
=== RUN   TestDNS_ServiceLookup_Dedup
--- SKIP: TestDNS_ServiceLookup_Dedup (0.00s)
    dns_test.go:3008: DM-skipped
=== RUN   TestDNS_ServiceLookup_Dedup_SRV
=== PAUSE TestDNS_ServiceLookup_Dedup_SRV
=== RUN   TestDNS_Recurse
=== PAUSE TestDNS_Recurse
=== RUN   TestDNS_Recurse_Truncation
=== PAUSE TestDNS_Recurse_Truncation
=== RUN   TestDNS_RecursorTimeout
=== PAUSE TestDNS_RecursorTimeout
=== RUN   TestDNS_ServiceLookup_FilterCritical
=== PAUSE TestDNS_ServiceLookup_FilterCritical
=== RUN   TestDNS_ServiceLookup_OnlyFailing
=== PAUSE TestDNS_ServiceLookup_OnlyFailing
=== RUN   TestDNS_ServiceLookup_OnlyPassing
=== PAUSE TestDNS_ServiceLookup_OnlyPassing
=== RUN   TestDNS_ServiceLookup_Randomize
=== PAUSE TestDNS_ServiceLookup_Randomize
=== RUN   TestBinarySearch
=== PAUSE TestBinarySearch
=== RUN   TestDNS_TCP_and_UDP_Truncate
--- SKIP: TestDNS_TCP_and_UDP_Truncate (0.00s)
    dns_test.go:3903: DM-skipped
=== RUN   TestDNS_ServiceLookup_Truncate
=== PAUSE TestDNS_ServiceLookup_Truncate
=== RUN   TestDNS_ServiceLookup_LargeResponses
=== PAUSE TestDNS_ServiceLookup_LargeResponses
=== RUN   TestDNS_ServiceLookup_ARecordLimits
--- SKIP: TestDNS_ServiceLookup_ARecordLimits (0.00s)
    dns_test.go:4342: DM-skipped
=== RUN   TestDNS_ServiceLookup_AnswerLimits
=== PAUSE TestDNS_ServiceLookup_AnswerLimits
=== RUN   TestDNS_ServiceLookup_CNAME
--- SKIP: TestDNS_ServiceLookup_CNAME (0.00s)
    dns_test.go:4487: DM-skipped
=== RUN   TestDNS_NodeLookup_TTL
=== PAUSE TestDNS_NodeLookup_TTL
=== RUN   TestDNS_ServiceLookup_TTL
=== PAUSE TestDNS_ServiceLookup_TTL
=== RUN   TestDNS_PreparedQuery_TTL
=== PAUSE TestDNS_PreparedQuery_TTL
=== RUN   TestDNS_PreparedQuery_Failover
--- SKIP: TestDNS_PreparedQuery_Failover (0.00s)
    dns_test.go:4909: DM-skipped
=== RUN   TestDNS_ServiceLookup_SRV_RFC
=== PAUSE TestDNS_ServiceLookup_SRV_RFC
=== RUN   TestDNS_ServiceLookup_SRV_RFC_TCP_Default
=== PAUSE TestDNS_ServiceLookup_SRV_RFC_TCP_Default
=== RUN   TestDNS_ServiceLookup_FilterACL
=== PAUSE TestDNS_ServiceLookup_FilterACL
=== RUN   TestDNS_ServiceLookup_MetaTXT
WARNING: bootstrap = true: do not enable unless necessary
TestDNS_ServiceLookup_MetaTXT - 2019/12/06 06:01:47.265507 [WARN] agent: Node name "Node b5c605b1-0b71-a547-32e7-0c166ea93f2d" will not be discoverable via DNS due to invalid characters. Valid characters include all alpha-numerics and dashes.
TestDNS_ServiceLookup_MetaTXT - 2019/12/06 06:01:47.266096 [DEBUG] tlsutil: Update with version 1
TestDNS_ServiceLookup_MetaTXT - 2019/12/06 06:01:47.268404 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
2019/12/06 06:01:47 [INFO]  raft: Initial configuration (index=1): [{Suffrage:Voter ID:b5c605b1-0b71-a547-32e7-0c166ea93f2d Address:127.0.0.1:34126}]
2019/12/06 06:01:47 [INFO]  raft: Node at 127.0.0.1:34126 [Follower] entering Follower state (Leader: "")
TestDNS_ServiceLookup_MetaTXT - 2019/12/06 06:01:47.987279 [INFO] serf: EventMemberJoin: Node b5c605b1-0b71-a547-32e7-0c166ea93f2d.dc1 127.0.0.1
TestDNS_ServiceLookup_MetaTXT - 2019/12/06 06:01:47.990618 [INFO] serf: EventMemberJoin: Node b5c605b1-0b71-a547-32e7-0c166ea93f2d 127.0.0.1
TestDNS_ServiceLookup_MetaTXT - 2019/12/06 06:01:47.991265 [INFO] consul: Adding LAN server Node b5c605b1-0b71-a547-32e7-0c166ea93f2d (Addr: tcp/127.0.0.1:34126) (DC: dc1)
TestDNS_ServiceLookup_MetaTXT - 2019/12/06 06:01:47.991293 [INFO] consul: Handled member-join event for server "Node b5c605b1-0b71-a547-32e7-0c166ea93f2d.dc1" in area "wan"
TestDNS_ServiceLookup_MetaTXT - 2019/12/06 06:01:47.992014 [INFO] agent: Started DNS server 127.0.0.1:34121 (tcp)
TestDNS_ServiceLookup_MetaTXT - 2019/12/06 06:01:47.992095 [INFO] agent: Started DNS server 127.0.0.1:34121 (udp)
TestDNS_ServiceLookup_MetaTXT - 2019/12/06 06:01:47.994697 [INFO] agent: Started HTTP server on 127.0.0.1:34122 (tcp)
TestDNS_ServiceLookup_MetaTXT - 2019/12/06 06:01:47.994801 [INFO] agent: started state syncer
2019/12/06 06:01:48 [WARN]  raft: Heartbeat timeout from "" reached, starting election
2019/12/06 06:01:48 [INFO]  raft: Node at 127.0.0.1:34126 [Candidate] entering Candidate state in term 2
2019/12/06 06:01:48 [INFO]  raft: Election won. Tally: 1
2019/12/06 06:01:48 [INFO]  raft: Node at 127.0.0.1:34126 [Leader] entering Leader state
TestDNS_ServiceLookup_MetaTXT - 2019/12/06 06:01:48.492054 [INFO] consul: cluster leadership acquired
TestDNS_ServiceLookup_MetaTXT - 2019/12/06 06:01:48.492463 [INFO] consul: New leader elected: Node b5c605b1-0b71-a547-32e7-0c166ea93f2d
TestDNS_ServiceLookup_MetaTXT - 2019/12/06 06:01:50.043090 [INFO] agent: Synced node info
TestDNS_ServiceLookup_MetaTXT - 2019/12/06 06:01:50.346086 [DEBUG] dns: request for name db.service.consul. type SRV class IN (took 1.562369ms) from client 127.0.0.1:45871 (udp)
TestDNS_ServiceLookup_MetaTXT - 2019/12/06 06:01:50.346320 [INFO] agent: Requesting shutdown
TestDNS_ServiceLookup_MetaTXT - 2019/12/06 06:01:50.346416 [INFO] consul: shutting down server
TestDNS_ServiceLookup_MetaTXT - 2019/12/06 06:01:50.346470 [WARN] serf: Shutdown without a Leave
TestDNS_ServiceLookup_MetaTXT - 2019/12/06 06:01:50.408433 [WARN] serf: Shutdown without a Leave
TestDNS_ServiceLookup_MetaTXT - 2019/12/06 06:01:50.475126 [INFO] manager: shutting down
TestDNS_ServiceLookup_MetaTXT - 2019/12/06 06:01:50.650836 [INFO] agent: consul server down
TestDNS_ServiceLookup_MetaTXT - 2019/12/06 06:01:50.651171 [INFO] agent: shutdown complete
TestDNS_ServiceLookup_MetaTXT - 2019/12/06 06:01:50.651364 [INFO] agent: Stopping DNS server 127.0.0.1:34121 (tcp)
TestDNS_ServiceLookup_MetaTXT - 2019/12/06 06:01:50.651605 [ERR] consul: failed to establish leadership: leadership lost while committing log
TestDNS_ServiceLookup_MetaTXT - 2019/12/06 06:01:50.652051 [ERR] consul: failed to transfer leadership attempt 0/3: raft is already shutdown
TestDNS_ServiceLookup_MetaTXT - 2019/12/06 06:01:50.652114 [ERR] consul: failed to transfer leadership attempt 1/3: raft is already shutdown
TestDNS_ServiceLookup_MetaTXT - 2019/12/06 06:01:50.652163 [ERR] consul: failed to transfer leadership attempt 2/3: raft is already shutdown
TestDNS_ServiceLookup_MetaTXT - 2019/12/06 06:01:50.652208 [ERR] consul: failed to transfer leadership in 3 attempts
TestDNS_ServiceLookup_MetaTXT - 2019/12/06 06:01:50.652765 [INFO] agent: Stopping DNS server 127.0.0.1:34121 (udp)
TestDNS_ServiceLookup_MetaTXT - 2019/12/06 06:01:50.653300 [INFO] agent: Stopping HTTP server 127.0.0.1:34122 (tcp)
TestDNS_ServiceLookup_MetaTXT - 2019/12/06 06:01:50.654063 [INFO] agent: Waiting for endpoints to shut down
TestDNS_ServiceLookup_MetaTXT - 2019/12/06 06:01:50.654446 [INFO] agent: Endpoints down
--- PASS: TestDNS_ServiceLookup_MetaTXT (3.55s)
=== RUN   TestDNS_ServiceLookup_SuppressTXT
WARNING: bootstrap = true: do not enable unless necessary
TestDNS_ServiceLookup_SuppressTXT - 2019/12/06 06:01:50.721559 [WARN] agent: Node name "Node fcea2ea0-c149-a4c9-57a9-54f32e6b6dd3" will not be discoverable via DNS due to invalid characters. Valid characters include all alpha-numerics and dashes.
TestDNS_ServiceLookup_SuppressTXT - 2019/12/06 06:01:50.722281 [DEBUG] tlsutil: Update with version 1
TestDNS_ServiceLookup_SuppressTXT - 2019/12/06 06:01:50.724808 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
2019/12/06 06:01:51 [INFO]  raft: Initial configuration (index=1): [{Suffrage:Voter ID:fcea2ea0-c149-a4c9-57a9-54f32e6b6dd3 Address:127.0.0.1:34132}]
2019/12/06 06:01:51 [INFO]  raft: Node at 127.0.0.1:34132 [Follower] entering Follower state (Leader: "")
TestDNS_ServiceLookup_SuppressTXT - 2019/12/06 06:01:51.503588 [INFO] serf: EventMemberJoin: Node fcea2ea0-c149-a4c9-57a9-54f32e6b6dd3.dc1 127.0.0.1
TestDNS_ServiceLookup_SuppressTXT - 2019/12/06 06:01:51.510821 [INFO] serf: EventMemberJoin: Node fcea2ea0-c149-a4c9-57a9-54f32e6b6dd3 127.0.0.1
TestDNS_ServiceLookup_SuppressTXT - 2019/12/06 06:01:51.512286 [INFO] consul: Adding LAN server Node fcea2ea0-c149-a4c9-57a9-54f32e6b6dd3 (Addr: tcp/127.0.0.1:34132) (DC: dc1)
TestDNS_ServiceLookup_SuppressTXT - 2019/12/06 06:01:51.513145 [INFO] consul: Handled member-join event for server "Node fcea2ea0-c149-a4c9-57a9-54f32e6b6dd3.dc1" in area "wan"
TestDNS_ServiceLookup_SuppressTXT - 2019/12/06 06:01:51.514790 [INFO] agent: Started DNS server 127.0.0.1:34127 (tcp)
TestDNS_ServiceLookup_SuppressTXT - 2019/12/06 06:01:51.515450 [INFO] agent: Started DNS server 127.0.0.1:34127 (udp)
TestDNS_ServiceLookup_SuppressTXT - 2019/12/06 06:01:51.518312 [INFO] agent: Started HTTP server on 127.0.0.1:34128 (tcp)
TestDNS_ServiceLookup_SuppressTXT - 2019/12/06 06:01:51.518413 [INFO] agent: started state syncer
2019/12/06 06:01:51 [WARN]  raft: Heartbeat timeout from "" reached, starting election
2019/12/06 06:01:51 [INFO]  raft: Node at 127.0.0.1:34132 [Candidate] entering Candidate state in term 2
2019/12/06 06:01:52 [INFO]  raft: Election won. Tally: 1
2019/12/06 06:01:52 [INFO]  raft: Node at 127.0.0.1:34132 [Leader] entering Leader state
TestDNS_ServiceLookup_SuppressTXT - 2019/12/06 06:01:52.042694 [INFO] consul: cluster leadership acquired
TestDNS_ServiceLookup_SuppressTXT - 2019/12/06 06:01:52.043138 [INFO] consul: New leader elected: Node fcea2ea0-c149-a4c9-57a9-54f32e6b6dd3
TestDNS_ServiceLookup_SuppressTXT - 2019/12/06 06:01:52.367858 [INFO] agent: Synced node info
TestDNS_ServiceLookup_SuppressTXT - 2019/12/06 06:01:52.367969 [DEBUG] agent: Node info in sync
TestDNS_ServiceLookup_SuppressTXT - 2019/12/06 06:01:52.689553 [DEBUG] dns: request for name db.service.consul. type SRV class IN (took 862.02µs) from client 127.0.0.1:59761 (udp)
TestDNS_ServiceLookup_SuppressTXT - 2019/12/06 06:01:52.689720 [INFO] agent: Requesting shutdown
TestDNS_ServiceLookup_SuppressTXT - 2019/12/06 06:01:52.689817 [INFO] consul: shutting down server
TestDNS_ServiceLookup_SuppressTXT - 2019/12/06 06:01:52.689867 [WARN] serf: Shutdown without a Leave
TestDNS_ServiceLookup_SuppressTXT - 2019/12/06 06:01:52.833323 [WARN] serf: Shutdown without a Leave
TestDNS_ServiceLookup_SuppressTXT - 2019/12/06 06:01:52.901801 [INFO] manager: shutting down
TestDNS_ServiceLookup_SuppressTXT - 2019/12/06 06:01:52.983457 [ERR] consul: failed to establish leadership: leadership lost while committing log
TestDNS_ServiceLookup_SuppressTXT - 2019/12/06 06:01:52.984138 [INFO] agent: consul server down
TestDNS_ServiceLookup_SuppressTXT - 2019/12/06 06:01:52.984339 [INFO] agent: shutdown complete
TestDNS_ServiceLookup_SuppressTXT - 2019/12/06 06:01:52.984439 [INFO] agent: Stopping DNS server 127.0.0.1:34127 (tcp)
TestDNS_ServiceLookup_SuppressTXT - 2019/12/06 06:01:52.984816 [INFO] agent: Stopping DNS server 127.0.0.1:34127 (udp)
TestDNS_ServiceLookup_SuppressTXT - 2019/12/06 06:01:52.985140 [INFO] agent: Stopping HTTP server 127.0.0.1:34128 (tcp)
TestDNS_ServiceLookup_SuppressTXT - 2019/12/06 06:01:52.985519 [INFO] agent: Waiting for endpoints to shut down
TestDNS_ServiceLookup_SuppressTXT - 2019/12/06 06:01:52.985650 [INFO] agent: Endpoints down
--- PASS: TestDNS_ServiceLookup_SuppressTXT (2.33s)
=== RUN   TestDNS_AddressLookup
=== PAUSE TestDNS_AddressLookup
=== RUN   TestDNS_AddressLookupIPV6
--- SKIP: TestDNS_AddressLookupIPV6 (0.00s)
    dns_test.go:5350: DM-skipped
=== RUN   TestDNS_NonExistingLookup
=== PAUSE TestDNS_NonExistingLookup
=== RUN   TestDNS_NonExistingLookupEmptyAorAAAA
=== PAUSE TestDNS_NonExistingLookupEmptyAorAAAA
=== RUN   TestDNS_AltDomains_Service
=== PAUSE TestDNS_AltDomains_Service
=== RUN   TestDNS_AltDomains_SOA
=== PAUSE TestDNS_AltDomains_SOA
=== RUN   TestDNS_AltDomains_Overlap
=== PAUSE TestDNS_AltDomains_Overlap
=== RUN   TestDNS_PreparedQuery_AllowStale
=== PAUSE TestDNS_PreparedQuery_AllowStale
=== RUN   TestDNS_InvalidQueries
=== PAUSE TestDNS_InvalidQueries
=== RUN   TestDNS_PreparedQuery_AgentSource
=== PAUSE TestDNS_PreparedQuery_AgentSource
=== RUN   TestDNS_trimUDPResponse_NoTrim
=== PAUSE TestDNS_trimUDPResponse_NoTrim
=== RUN   TestDNS_trimUDPResponse_TrimLimit
=== PAUSE TestDNS_trimUDPResponse_TrimLimit
=== RUN   TestDNS_trimUDPResponse_TrimSize
=== PAUSE TestDNS_trimUDPResponse_TrimSize
=== RUN   TestDNS_trimUDPResponse_TrimSizeEDNS
=== PAUSE TestDNS_trimUDPResponse_TrimSizeEDNS
=== RUN   TestDNS_syncExtra
=== PAUSE TestDNS_syncExtra
=== RUN   TestDNS_Compression_trimUDPResponse
=== PAUSE TestDNS_Compression_trimUDPResponse
=== RUN   TestDNS_Compression_Query
=== PAUSE TestDNS_Compression_Query
=== RUN   TestDNS_Compression_ReverseLookup
=== PAUSE TestDNS_Compression_ReverseLookup
=== RUN   TestDNS_Compression_Recurse
=== PAUSE TestDNS_Compression_Recurse
=== RUN   TestDNSInvalidRegex
=== RUN   TestDNSInvalidRegex/Valid_Hostname
=== RUN   TestDNSInvalidRegex/Valid_Hostname#01
=== RUN   TestDNSInvalidRegex/Invalid_Hostname_with_special_chars
=== RUN   TestDNSInvalidRegex/Invalid_Hostname_with_special_chars_in_the_end
=== RUN   TestDNSInvalidRegex/Whitespace
=== RUN   TestDNSInvalidRegex/Only_special_chars
--- PASS: TestDNSInvalidRegex (0.00s)
    --- PASS: TestDNSInvalidRegex/Valid_Hostname (0.00s)
    --- PASS: TestDNSInvalidRegex/Valid_Hostname#01 (0.00s)
    --- PASS: TestDNSInvalidRegex/Invalid_Hostname_with_special_chars (0.00s)
    --- PASS: TestDNSInvalidRegex/Invalid_Hostname_with_special_chars_in_the_end (0.00s)
    --- PASS: TestDNSInvalidRegex/Whitespace (0.00s)
    --- PASS: TestDNSInvalidRegex/Only_special_chars (0.00s)
=== RUN   TestDNS_formatNodeRecord
--- PASS: TestDNS_formatNodeRecord (0.00s)
=== RUN   TestDNS_ConfigReload
=== PAUSE TestDNS_ConfigReload
=== RUN   TestDNS_ReloadConfig_DuringQuery
=== PAUSE TestDNS_ReloadConfig_DuringQuery
=== RUN   TestEventFire
=== PAUSE TestEventFire
=== RUN   TestEventFire_token
=== PAUSE TestEventFire_token
=== RUN   TestEventList
=== PAUSE TestEventList
=== RUN   TestEventList_Filter
=== PAUSE TestEventList_Filter
=== RUN   TestEventList_ACLFilter
=== PAUSE TestEventList_ACLFilter
=== RUN   TestEventList_Blocking
=== PAUSE TestEventList_Blocking
=== RUN   TestEventList_EventBufOrder
=== PAUSE TestEventList_EventBufOrder
=== RUN   TestUUIDToUint64
=== PAUSE TestUUIDToUint64
=== RUN   TestHealthChecksInState
--- SKIP: TestHealthChecksInState (0.00s)
    health_endpoint_test.go:23: DM-skipped
=== RUN   TestHealthChecksInState_NodeMetaFilter
=== PAUSE TestHealthChecksInState_NodeMetaFilter
=== RUN   TestHealthChecksInState_Filter
=== PAUSE TestHealthChecksInState_Filter
=== RUN   TestHealthChecksInState_DistanceSort
=== PAUSE TestHealthChecksInState_DistanceSort
=== RUN   TestHealthNodeChecks
=== PAUSE TestHealthNodeChecks
=== RUN   TestHealthNodeChecks_Filtering
=== PAUSE TestHealthNodeChecks_Filtering
=== RUN   TestHealthServiceChecks
=== PAUSE TestHealthServiceChecks
=== RUN   TestHealthServiceChecks_NodeMetaFilter
=== PAUSE TestHealthServiceChecks_NodeMetaFilter
=== RUN   TestHealthServiceChecks_Filtering
=== PAUSE TestHealthServiceChecks_Filtering
=== RUN   TestHealthServiceChecks_DistanceSort
=== PAUSE TestHealthServiceChecks_DistanceSort
=== RUN   TestHealthServiceNodes
=== PAUSE TestHealthServiceNodes
=== RUN   TestHealthServiceNodes_NodeMetaFilter
=== PAUSE TestHealthServiceNodes_NodeMetaFilter
=== RUN   TestHealthServiceNodes_Filter
--- SKIP: TestHealthServiceNodes_Filter (0.00s)
    health_endpoint_test.go:737: DM-skipped
=== RUN   TestHealthServiceNodes_DistanceSort
=== PAUSE TestHealthServiceNodes_DistanceSort
=== RUN   TestHealthServiceNodes_PassingFilter
--- SKIP: TestHealthServiceNodes_PassingFilter (0.00s)
    health_endpoint_test.go:879: DM-skipped
=== RUN   TestHealthServiceNodes_WanTranslation
=== PAUSE TestHealthServiceNodes_WanTranslation
=== RUN   TestHealthConnectServiceNodes
=== PAUSE TestHealthConnectServiceNodes
=== RUN   TestHealthConnectServiceNodes_Filter
=== PAUSE TestHealthConnectServiceNodes_Filter
=== RUN   TestHealthConnectServiceNodes_PassingFilter
=== PAUSE TestHealthConnectServiceNodes_PassingFilter
=== RUN   TestFilterNonPassing
=== PAUSE TestFilterNonPassing
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS
WARNING: bootstrap = true: do not enable unless necessary
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/06 06:01:53.068020 [WARN] agent: Node name "Node 3d434417-75ce-e300-2e9e-c74634b48698" will not be discoverable via DNS due to invalid characters. Valid characters include all alpha-numerics and dashes.
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/06 06:01:53.068583 [DEBUG] tlsutil: Update with version 1
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/06 06:01:53.071023 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
2019/12/06 06:01:53 [INFO]  raft: Initial configuration (index=1): [{Suffrage:Voter ID:3d434417-75ce-e300-2e9e-c74634b48698 Address:127.0.0.1:34138}]
2019/12/06 06:01:53 [INFO]  raft: Node at 127.0.0.1:34138 [Follower] entering Follower state (Leader: "")
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/06 06:01:53.838210 [INFO] serf: EventMemberJoin: Node 3d434417-75ce-e300-2e9e-c74634b48698.dc1 127.0.0.1
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/06 06:01:53.845154 [INFO] serf: EventMemberJoin: Node 3d434417-75ce-e300-2e9e-c74634b48698 127.0.0.1
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/06 06:01:53.845943 [INFO] consul: Handled member-join event for server "Node 3d434417-75ce-e300-2e9e-c74634b48698.dc1" in area "wan"
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/06 06:01:53.846265 [INFO] consul: Adding LAN server Node 3d434417-75ce-e300-2e9e-c74634b48698 (Addr: tcp/127.0.0.1:34138) (DC: dc1)
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/06 06:01:53.846405 [INFO] agent: Started DNS server 127.0.0.1:34133 (tcp)
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/06 06:01:53.847049 [INFO] agent: Started DNS server 127.0.0.1:34133 (udp)
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/06 06:01:53.849571 [INFO] agent: Started HTTP server on 127.0.0.1:34134 (tcp)
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/06 06:01:53.849707 [INFO] agent: started state syncer
2019/12/06 06:01:53 [WARN]  raft: Heartbeat timeout from "" reached, starting election
2019/12/06 06:01:53 [INFO]  raft: Node at 127.0.0.1:34138 [Candidate] entering Candidate state in term 2
2019/12/06 06:01:54 [INFO]  raft: Election won. Tally: 1
2019/12/06 06:01:54 [INFO]  raft: Node at 127.0.0.1:34138 [Leader] entering Leader state
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/06 06:01:54.409109 [INFO] consul: cluster leadership acquired
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/06 06:01:54.409579 [INFO] consul: New leader elected: Node 3d434417-75ce-e300-2e9e-c74634b48698
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/06 06:01:54.555706 [ERR] agent: failed to sync remote state: ACL not found
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/06 06:01:54.597403 [INFO] acl: initializing acls
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/06 06:01:54.701028 [INFO] acl: initializing acls
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/06 06:01:54.842468 [INFO] consul: Created ACL 'global-management' policy
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/06 06:01:54.842551 [WARN] consul: Configuring a non-UUID master token is deprecated
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/06 06:01:54.992387 [INFO] consul: Created ACL 'global-management' policy
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/06 06:01:54.992470 [WARN] consul: Configuring a non-UUID master token is deprecated
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/06 06:01:55.368294 [INFO] consul: Bootstrapped ACL master token from configuration
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/06 06:01:55.368465 [INFO] consul: Bootstrapped ACL master token from configuration
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/06 06:01:55.526339 [INFO] consul: Created ACL anonymous token from configuration
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/06 06:01:55.526497 [DEBUG] acl: transitioning out of legacy ACL mode
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/06 06:01:55.528457 [INFO] serf: EventMemberUpdate: Node 3d434417-75ce-e300-2e9e-c74634b48698
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/06 06:01:55.533534 [INFO] serf: EventMemberUpdate: Node 3d434417-75ce-e300-2e9e-c74634b48698.dc1
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/06 06:01:55.684631 [INFO] consul: Created ACL anonymous token from configuration
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/06 06:01:55.685786 [INFO] serf: EventMemberUpdate: Node 3d434417-75ce-e300-2e9e-c74634b48698
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/06 06:01:55.686877 [INFO] serf: EventMemberUpdate: Node 3d434417-75ce-e300-2e9e-c74634b48698.dc1
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/06 06:01:57.800998 [INFO] agent: Synced node info
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/06 06:01:57.801121 [DEBUG] agent: Node info in sync
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/06 06:01:57.801476 [INFO] connect: initialized primary datacenter CA with provider "consul"
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/06 06:01:57.801891 [DEBUG] consul: Skipping self join check for "Node 3d434417-75ce-e300-2e9e-c74634b48698" since the cluster is too small
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/06 06:01:57.802038 [INFO] consul: member 'Node 3d434417-75ce-e300-2e9e-c74634b48698' joined, marking health alive
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/06 06:01:58.027550 [DEBUG] consul: Skipping self join check for "Node 3d434417-75ce-e300-2e9e-c74634b48698" since the cluster is too small
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/06 06:01:58.028016 [DEBUG] consul: Skipping self join check for "Node 3d434417-75ce-e300-2e9e-c74634b48698" since the cluster is too small
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/06 06:01:58.028411 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/GET_/v1/query
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/06 06:01:58.054965 [DEBUG] http: Request GET /v1/query (21.806839ms) from=127.0.0.1:45574
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/PUT_/v1/query
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/06 06:01:58.062129 [ERR] http: Request PUT /v1/query, error: method PUT not allowed from=127.0.0.1:45576
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/06 06:01:58.065710 [DEBUG] http: Request PUT /v1/query (3.631084ms) from=127.0.0.1:45576
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/POST_/v1/query
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/06 06:01:58.090230 [DEBUG] http: Request POST /v1/query (12.359286ms) from=127.0.0.1:45578
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/DELETE_/v1/query
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/06 06:01:58.109790 [ERR] http: Request DELETE /v1/query, error: method DELETE not allowed from=127.0.0.1:45580
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/06 06:01:58.123420 [DEBUG] http: Request DELETE /v1/query (13.69765ms) from=127.0.0.1:45580
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/HEAD_/v1/query
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/06 06:01:58.127645 [ERR] http: Request HEAD /v1/query, error: method HEAD not allowed from=127.0.0.1:45582
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/06 06:01:58.127903 [DEBUG] http: Request HEAD /v1/query (347.341µs) from=127.0.0.1:45582
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/OPTIONS_/v1/query
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/06 06:01:58.130409 [DEBUG] http: Request OPTIONS /v1/query (19.001µs) from=127.0.0.1:45582
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/GET_/v1/query/
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/06 06:01:58.132672 [ERR] http: Request GET /v1/query/, error: failed prepared query lookup: index error: UUID must be 36 characters from=127.0.0.1:45582
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/06 06:01:58.134268 [DEBUG] http: Request GET /v1/query/ (1.991379ms) from=127.0.0.1:45582
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/PUT_/v1/query/
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/06 06:01:58.145381 [ERR] http: Request PUT /v1/query/, error: Prepared Query lookup failed: failed prepared query lookup: index error: UUID must be 36 characters from=127.0.0.1:45584
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/06 06:01:58.146133 [DEBUG] http: Request PUT /v1/query/ (1.294363ms) from=127.0.0.1:45584
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/POST_/v1/query/
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/06 06:01:58.149689 [ERR] http: Request POST /v1/query/, error: method POST not allowed from=127.0.0.1:45586
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/06 06:01:58.150293 [DEBUG] http: Request POST /v1/query/ (592.68µs) from=127.0.0.1:45586
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/DELETE_/v1/query/
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/06 06:01:58.155574 [ERR] http: Request DELETE /v1/query/, error: Prepared Query lookup failed: failed prepared query lookup: index error: UUID must be 36 characters from=127.0.0.1:45588
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/06 06:01:58.156205 [DEBUG] http: Request DELETE /v1/query/ (1.14836ms) from=127.0.0.1:45588
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/HEAD_/v1/query/
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/06 06:01:58.165764 [ERR] http: Request HEAD /v1/query/, error: method HEAD not allowed from=127.0.0.1:45590
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/06 06:01:58.165945 [DEBUG] http: Request HEAD /v1/query/ (201.004µs) from=127.0.0.1:45590
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/OPTIONS_/v1/query/
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/06 06:01:58.173695 [DEBUG] http: Request OPTIONS /v1/query/ (1.481701ms) from=127.0.0.1:45590
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/GET_/v1/query/xxx/execute
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/06 06:01:58.178459 [DEBUG] http: Request GET /v1/query/xxx/execute (1.073692ms) from=127.0.0.1:45592
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/PUT_/v1/query/xxx/execute
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/06 06:01:58.182306 [ERR] http: Request PUT /v1/query/xxx/execute, error: method PUT not allowed from=127.0.0.1:45594
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/06 06:01:58.183181 [DEBUG] http: Request PUT /v1/query/xxx/execute (891.354µs) from=127.0.0.1:45594
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/POST_/v1/query/xxx/execute
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/06 06:01:58.188393 [ERR] http: Request POST /v1/query/xxx/execute, error: method POST not allowed from=127.0.0.1:45596
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/06 06:01:58.190325 [DEBUG] http: Request POST /v1/query/xxx/execute (1.920711ms) from=127.0.0.1:45596
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/DELETE_/v1/query/xxx/execute
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/06 06:01:58.193462 [ERR] http: Request DELETE /v1/query/xxx/execute, error: method DELETE not allowed from=127.0.0.1:45598
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/06 06:01:58.194262 [DEBUG] http: Request DELETE /v1/query/xxx/execute (713.016µs) from=127.0.0.1:45598
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/HEAD_/v1/query/xxx/execute
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/06 06:01:58.198111 [ERR] http: Request HEAD /v1/query/xxx/execute, error: method HEAD not allowed from=127.0.0.1:45600
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/06 06:01:58.198262 [DEBUG] http: Request HEAD /v1/query/xxx/execute (163.337µs) from=127.0.0.1:45600
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/OPTIONS_/v1/query/xxx/execute
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/06 06:01:58.201680 [DEBUG] http: Request OPTIONS /v1/query/xxx/execute (758.684µs) from=127.0.0.1:45600
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/GET_/v1/query/xxx/explain
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/06 06:01:58.206721 [DEBUG] http: Request GET /v1/query/xxx/explain (927.355µs) from=127.0.0.1:45602
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/PUT_/v1/query/xxx/explain
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/06 06:01:58.209977 [ERR] http: Request PUT /v1/query/xxx/explain, error: method PUT not allowed from=127.0.0.1:45604
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/06 06:01:58.210934 [DEBUG] http: Request PUT /v1/query/xxx/explain (964.356µs) from=127.0.0.1:45604
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/POST_/v1/query/xxx/explain
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/06 06:01:58.215143 [ERR] http: Request POST /v1/query/xxx/explain, error: method POST not allowed from=127.0.0.1:45606
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/06 06:01:58.216811 [DEBUG] http: Request POST /v1/query/xxx/explain (1.665039ms) from=127.0.0.1:45606
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/DELETE_/v1/query/xxx/explain
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/06 06:01:58.221339 [ERR] http: Request DELETE /v1/query/xxx/explain, error: method DELETE not allowed from=127.0.0.1:45608
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/06 06:01:58.221924 [DEBUG] http: Request DELETE /v1/query/xxx/explain (609.681µs) from=127.0.0.1:45608
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/HEAD_/v1/query/xxx/explain
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/06 06:01:58.225730 [ERR] http: Request HEAD /v1/query/xxx/explain, error: method HEAD not allowed from=127.0.0.1:45610
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/06 06:01:58.225875 [DEBUG] http: Request HEAD /v1/query/xxx/explain (169.337µs) from=127.0.0.1:45610
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/OPTIONS_/v1/query/xxx/explain
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/06 06:01:58.227814 [DEBUG] http: Request OPTIONS /v1/query/xxx/explain (441.01µs) from=127.0.0.1:45610
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/GET_/v1/agent/services
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/06 06:01:58.237571 [DEBUG] http: Request GET /v1/agent/services (6.598819ms) from=127.0.0.1:45612
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/PUT_/v1/agent/services
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/06 06:01:58.246139 [ERR] http: Request PUT /v1/agent/services, error: method PUT not allowed from=127.0.0.1:45614
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/06 06:01:58.248232 [DEBUG] http: Request PUT /v1/agent/services (2.090715ms) from=127.0.0.1:45614
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/POST_/v1/agent/services
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/06 06:01:58.256860 [ERR] http: Request POST /v1/agent/services, error: method POST not allowed from=127.0.0.1:45616
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/06 06:01:58.257434 [DEBUG] http: Request POST /v1/agent/services (586.68µs) from=127.0.0.1:45616
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/DELETE_/v1/agent/services
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/06 06:01:58.260960 [ERR] http: Request DELETE /v1/agent/services, error: method DELETE not allowed from=127.0.0.1:45618
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/06 06:01:58.261623 [DEBUG] http: Request DELETE /v1/agent/services (674.683µs) from=127.0.0.1:45618
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/HEAD_/v1/agent/services
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/06 06:01:58.265370 [ERR] http: Request HEAD /v1/agent/services, error: method HEAD not allowed from=127.0.0.1:45620
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/06 06:01:58.265696 [DEBUG] http: Request HEAD /v1/agent/services (345.008µs) from=127.0.0.1:45620
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/OPTIONS_/v1/agent/services
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/06 06:01:58.267453 [DEBUG] http: Request OPTIONS /v1/agent/services (19.334µs) from=127.0.0.1:45620
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/GET_/v1/agent/health/service/name/
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/06 06:01:58.270155 [ERR] http: Request GET /v1/agent/health/service/name/, error: Bad request: Missing service Name from=127.0.0.1:45620
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/06 06:01:58.274316 [DEBUG] http: Request GET /v1/agent/health/service/name/ (4.615107ms) from=127.0.0.1:45620
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/PUT_/v1/agent/health/service/name/
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/06 06:01:58.277566 [ERR] http: Request PUT /v1/agent/health/service/name/, error: method PUT not allowed from=127.0.0.1:45622
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/06 06:01:58.278967 [DEBUG] http: Request PUT /v1/agent/health/service/name/ (1.075692ms) from=127.0.0.1:45622
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/POST_/v1/agent/health/service/name/
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/06 06:01:58.283230 [ERR] http: Request POST /v1/agent/health/service/name/, error: method POST not allowed from=127.0.0.1:45624
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/06 06:01:58.283826 [DEBUG] http: Request POST /v1/agent/health/service/name/ (592.681µs) from=127.0.0.1:45624
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/DELETE_/v1/agent/health/service/name/
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/06 06:01:58.286970 [ERR] http: Request DELETE /v1/agent/health/service/name/, error: method DELETE not allowed from=127.0.0.1:45626
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/06 06:01:58.287527 [DEBUG] http: Request DELETE /v1/agent/health/service/name/ (565.68µs) from=127.0.0.1:45626
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/HEAD_/v1/agent/health/service/name/
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/06 06:01:58.300174 [ERR] http: Request HEAD /v1/agent/health/service/name/, error: method HEAD not allowed from=127.0.0.1:45628
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/06 06:01:58.300339 [DEBUG] http: Request HEAD /v1/agent/health/service/name/ (178.671µs) from=127.0.0.1:45628
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/OPTIONS_/v1/agent/health/service/name/
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/06 06:01:58.301825 [DEBUG] http: Request OPTIONS /v1/agent/health/service/name/ (16.334µs) from=127.0.0.1:45628
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/GET_/v1/status/leader
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/06 06:01:58.303638 [DEBUG] http: Request GET /v1/status/leader (450.677µs) from=127.0.0.1:45628
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/PUT_/v1/status/leader
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/06 06:01:58.306522 [ERR] http: Request PUT /v1/status/leader, error: method PUT not allowed from=127.0.0.1:45630
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/06 06:01:58.307063 [DEBUG] http: Request PUT /v1/status/leader (549.68µs) from=127.0.0.1:45630
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/POST_/v1/status/leader
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/06 06:01:58.310031 [ERR] http: Request POST /v1/status/leader, error: method POST not allowed from=127.0.0.1:45632
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/06 06:01:58.310595 [DEBUG] http: Request POST /v1/status/leader (569.68µs) from=127.0.0.1:45632
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/DELETE_/v1/status/leader
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/06 06:01:58.313441 [ERR] http: Request DELETE /v1/status/leader, error: method DELETE not allowed from=127.0.0.1:45634
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/06 06:01:58.314406 [DEBUG] http: Request DELETE /v1/status/leader (927.688µs) from=127.0.0.1:45634
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/HEAD_/v1/status/leader
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/06 06:01:58.317715 [ERR] http: Request HEAD /v1/status/leader, error: method HEAD not allowed from=127.0.0.1:45636
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/06 06:01:58.317878 [DEBUG] http: Request HEAD /v1/status/leader (229.005µs) from=127.0.0.1:45636
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/OPTIONS_/v1/status/leader
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/06 06:01:58.319776 [DEBUG] http: Request OPTIONS /v1/status/leader (16.333µs) from=127.0.0.1:45636
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/GET_/v1/catalog/datacenters
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/06 06:01:58.322475 [DEBUG] http: Request GET /v1/catalog/datacenters (940.021µs) from=127.0.0.1:45636
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/PUT_/v1/catalog/datacenters
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/06 06:01:58.326351 [ERR] http: Request PUT /v1/catalog/datacenters, error: method PUT not allowed from=127.0.0.1:45638
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/06 06:01:58.326865 [DEBUG] http: Request PUT /v1/catalog/datacenters (523.012µs) from=127.0.0.1:45638
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/POST_/v1/catalog/datacenters
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/06 06:01:58.329531 [ERR] http: Request POST /v1/catalog/datacenters, error: method POST not allowed from=127.0.0.1:45640
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/06 06:01:58.330009 [DEBUG] http: Request POST /v1/catalog/datacenters (486.344µs) from=127.0.0.1:45640
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/DELETE_/v1/catalog/datacenters
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/06 06:01:58.332843 [ERR] http: Request DELETE /v1/catalog/datacenters, error: method DELETE not allowed from=127.0.0.1:45642
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/06 06:01:58.333551 [DEBUG] http: Request DELETE /v1/catalog/datacenters (735.35µs) from=127.0.0.1:45642
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/HEAD_/v1/catalog/datacenters
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/06 06:01:58.336663 [ERR] http: Request HEAD /v1/catalog/datacenters, error: method HEAD not allowed from=127.0.0.1:45644
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/06 06:01:58.336818 [DEBUG] http: Request HEAD /v1/catalog/datacenters (172.338µs) from=127.0.0.1:45644
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/OPTIONS_/v1/catalog/datacenters
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/06 06:01:58.338372 [DEBUG] http: Request OPTIONS /v1/catalog/datacenters (15µs) from=127.0.0.1:45644
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/GET_/v1/connect/intentions/match
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/06 06:01:58.340018 [ERR] http: Request GET /v1/connect/intentions/match, error: required query parameter 'by' not set from=127.0.0.1:45644
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/06 06:01:58.340554 [DEBUG] http: Request GET /v1/connect/intentions/match (552.679µs) from=127.0.0.1:45644
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/PUT_/v1/connect/intentions/match
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/06 06:01:58.343818 [ERR] http: Request PUT /v1/connect/intentions/match, error: method PUT not allowed from=127.0.0.1:45646
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/06 06:01:58.344457 [DEBUG] http: Request PUT /v1/connect/intentions/match (641.348µs) from=127.0.0.1:45646
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/POST_/v1/connect/intentions/match
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/06 06:01:58.347649 [ERR] http: Request POST /v1/connect/intentions/match, error: method POST not allowed from=127.0.0.1:45648
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/06 06:01:58.348216 [DEBUG] http: Request POST /v1/connect/intentions/match (569.013µs) from=127.0.0.1:45648
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/DELETE_/v1/connect/intentions/match
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/06 06:01:58.351456 [ERR] http: Request DELETE /v1/connect/intentions/match, error: method DELETE not allowed from=127.0.0.1:45650
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/06 06:01:58.352008 [DEBUG] http: Request DELETE /v1/connect/intentions/match (562.347µs) from=127.0.0.1:45650
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/HEAD_/v1/connect/intentions/match
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/06 06:01:58.354857 [ERR] http: Request HEAD /v1/connect/intentions/match, error: method HEAD not allowed from=127.0.0.1:45652
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/06 06:01:58.355001 [DEBUG] http: Request HEAD /v1/connect/intentions/match (163.004µs) from=127.0.0.1:45652
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/OPTIONS_/v1/connect/intentions/match
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/06 06:01:58.356548 [DEBUG] http: Request OPTIONS /v1/connect/intentions/match (14.334µs) from=127.0.0.1:45652
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/GET_/v1/acl/destroy/
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/06 06:01:58.358016 [ERR] http: Request GET /v1/acl/destroy/, error: method GET not allowed from=127.0.0.1:45652
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/06 06:01:58.358661 [DEBUG] http: Request GET /v1/acl/destroy/ (643.681µs) from=127.0.0.1:45652
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/PUT_/v1/acl/destroy/
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/06 06:01:58.362222 [DEBUG] http: Request PUT /v1/acl/destroy/ (423.01µs) from=127.0.0.1:45654
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/POST_/v1/acl/destroy/
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/06 06:01:58.365387 [ERR] http: Request POST /v1/acl/destroy/, error: method POST not allowed from=127.0.0.1:45656
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/06 06:01:58.366084 [DEBUG] http: Request POST /v1/acl/destroy/ (706.35µs) from=127.0.0.1:45656
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/DELETE_/v1/acl/destroy/
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/06 06:01:58.369026 [ERR] http: Request DELETE /v1/acl/destroy/, error: method DELETE not allowed from=127.0.0.1:45658
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/06 06:01:58.369792 [DEBUG] http: Request DELETE /v1/acl/destroy/ (775.351µs) from=127.0.0.1:45658
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/HEAD_/v1/acl/destroy/
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/06 06:01:58.373464 [ERR] http: Request HEAD /v1/acl/destroy/, error: method HEAD not allowed from=127.0.0.1:45660
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/06 06:01:58.373808 [DEBUG] http: Request HEAD /v1/acl/destroy/ (359.342µs) from=127.0.0.1:45660
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/OPTIONS_/v1/acl/destroy/
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/06 06:01:58.376782 [DEBUG] http: Request OPTIONS /v1/acl/destroy/ (18.334µs) from=127.0.0.1:45660
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/GET_/v1/acl/list
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/06 06:01:58.379016 [ERR] http: Request GET /v1/acl/list, error: Permission denied from=127.0.0.1:45660
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/06 06:01:58.379731 [DEBUG] http: Request GET /v1/acl/list (1.15736ms) from=127.0.0.1:45660
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/PUT_/v1/acl/list
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/06 06:01:58.383120 [ERR] http: Request PUT /v1/acl/list, error: method PUT not allowed from=127.0.0.1:45662
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/06 06:01:58.383701 [DEBUG] http: Request PUT /v1/acl/list (543.345µs) from=127.0.0.1:45662
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/POST_/v1/acl/list
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/06 06:01:58.387439 [ERR] http: Request POST /v1/acl/list, error: method POST not allowed from=127.0.0.1:45664
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/06 06:01:58.388050 [DEBUG] http: Request POST /v1/acl/list (632.015µs) from=127.0.0.1:45664
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/DELETE_/v1/acl/list
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/06 06:01:58.391820 [ERR] http: Request DELETE /v1/acl/list, error: method DELETE not allowed from=127.0.0.1:45666
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/06 06:01:58.392369 [DEBUG] http: Request DELETE /v1/acl/list (554.013µs) from=127.0.0.1:45666
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/HEAD_/v1/acl/list
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/06 06:01:58.395722 [ERR] http: Request HEAD /v1/acl/list, error: method HEAD not allowed from=127.0.0.1:45668
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/06 06:01:58.395864 [DEBUG] http: Request HEAD /v1/acl/list (172.67µs) from=127.0.0.1:45668
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/OPTIONS_/v1/acl/list
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/06 06:01:58.397736 [DEBUG] http: Request OPTIONS /v1/acl/list (13µs) from=127.0.0.1:45668
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/GET_/v1/acl/rules/translate
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/06 06:01:58.399369 [ERR] http: Request GET /v1/acl/rules/translate, error: method GET not allowed from=127.0.0.1:45668
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/06 06:01:58.399956 [DEBUG] http: Request GET /v1/acl/rules/translate (630.347µs) from=127.0.0.1:45668
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/PUT_/v1/acl/rules/translate
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/06 06:01:58.402997 [ERR] http: Request PUT /v1/acl/rules/translate, error: method PUT not allowed from=127.0.0.1:45670
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/06 06:01:58.403633 [DEBUG] http: Request PUT /v1/acl/rules/translate (646.348µs) from=127.0.0.1:45670
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/POST_/v1/acl/rules/translate
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/06 06:01:58.407356 [ERR] http: Request POST /v1/acl/rules/translate, error: Permission denied from=127.0.0.1:45672
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/06 06:01:58.407881 [DEBUG] http: Request POST /v1/acl/rules/translate (673.682µs) from=127.0.0.1:45672
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/DELETE_/v1/acl/rules/translate
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/06 06:01:58.411338 [ERR] http: Request DELETE /v1/acl/rules/translate, error: method DELETE not allowed from=127.0.0.1:45674
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/06 06:01:58.411879 [DEBUG] http: Request DELETE /v1/acl/rules/translate (544.013µs) from=127.0.0.1:45674
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/HEAD_/v1/acl/rules/translate
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/06 06:01:58.415911 [ERR] http: Request HEAD /v1/acl/rules/translate, error: method HEAD not allowed from=127.0.0.1:45676
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/06 06:01:58.416090 [DEBUG] http: Request HEAD /v1/acl/rules/translate (198.671µs) from=127.0.0.1:45676
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/OPTIONS_/v1/acl/rules/translate
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/06 06:01:58.418040 [DEBUG] http: Request OPTIONS /v1/acl/rules/translate (16.334µs) from=127.0.0.1:45676
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/GET_/v1/agent/metrics
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/06 06:01:58.419539 [ERR] http: Request GET /v1/agent/metrics, error: Permission denied from=127.0.0.1:45676
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/06 06:01:58.420105 [DEBUG] http: Request GET /v1/agent/metrics (688.016µs) from=127.0.0.1:45676
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/PUT_/v1/agent/metrics
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/06 06:01:58.422985 [ERR] http: Request PUT /v1/agent/metrics, error: method PUT not allowed from=127.0.0.1:45678
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/06 06:01:58.423521 [DEBUG] http: Request PUT /v1/agent/metrics (556.68µs) from=127.0.0.1:45678
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/POST_/v1/agent/metrics
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/06 06:01:58.426622 [ERR] http: Request POST /v1/agent/metrics, error: method POST not allowed from=127.0.0.1:45680
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/06 06:01:58.427425 [DEBUG] http: Request POST /v1/agent/metrics (804.352µs) from=127.0.0.1:45680
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/DELETE_/v1/agent/metrics
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/06 06:01:58.430441 [ERR] http: Request DELETE /v1/agent/metrics, error: method DELETE not allowed from=127.0.0.1:45682
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/06 06:01:58.431142 [DEBUG] http: Request DELETE /v1/agent/metrics (686.016µs) from=127.0.0.1:45682
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/HEAD_/v1/agent/metrics
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/06 06:01:58.436216 [ERR] http: Request HEAD /v1/agent/metrics, error: method HEAD not allowed from=127.0.0.1:45684
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/06 06:01:58.436687 [DEBUG] http: Request HEAD /v1/agent/metrics (519.678µs) from=127.0.0.1:45684
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/OPTIONS_/v1/agent/metrics
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/06 06:01:58.438659 [DEBUG] http: Request OPTIONS /v1/agent/metrics (17.667µs) from=127.0.0.1:45684
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/GET_/v1/agent/service/
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/06 06:01:58.440897 [DEBUG] http: Request GET /v1/agent/service/ (472.344µs) from=127.0.0.1:45684
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/PUT_/v1/agent/service/
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/06 06:01:58.443947 [ERR] http: Request PUT /v1/agent/service/, error: method PUT not allowed from=127.0.0.1:45686
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/06 06:01:58.444678 [DEBUG] http: Request PUT /v1/agent/service/ (737.017µs) from=127.0.0.1:45686
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/POST_/v1/agent/service/
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/06 06:01:58.447791 [ERR] http: Request POST /v1/agent/service/, error: method POST not allowed from=127.0.0.1:45688
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/06 06:01:58.448329 [DEBUG] http: Request POST /v1/agent/service/ (541.345µs) from=127.0.0.1:45688
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/DELETE_/v1/agent/service/
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/06 06:01:58.451262 [ERR] http: Request DELETE /v1/agent/service/, error: method DELETE not allowed from=127.0.0.1:45690
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/06 06:01:58.451826 [DEBUG] http: Request DELETE /v1/agent/service/ (573.68µs) from=127.0.0.1:45690
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/HEAD_/v1/agent/service/
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/06 06:01:58.454596 [ERR] http: Request HEAD /v1/agent/service/, error: method HEAD not allowed from=127.0.0.1:45692
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/06 06:01:58.454734 [DEBUG] http: Request HEAD /v1/agent/service/ (160.004µs) from=127.0.0.1:45692
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/OPTIONS_/v1/agent/service/
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/06 06:01:58.456289 [DEBUG] http: Request OPTIONS /v1/agent/service/ (13.667µs) from=127.0.0.1:45692
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/GET_/v1/agent/service/register
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/06 06:01:58.457841 [ERR] http: Request GET /v1/agent/service/register, error: method GET not allowed from=127.0.0.1:45692
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/06 06:01:58.458602 [DEBUG] http: Request GET /v1/agent/service/register (759.018µs) from=127.0.0.1:45692
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/PUT_/v1/agent/service/register
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/06 06:01:58.462999 [DEBUG] http: Request PUT /v1/agent/service/register (680.683µs) from=127.0.0.1:45694
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/POST_/v1/agent/service/register
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/06 06:01:58.466267 [ERR] http: Request POST /v1/agent/service/register, error: method POST not allowed from=127.0.0.1:45696
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/06 06:01:58.466816 [DEBUG] http: Request POST /v1/agent/service/register (552.013µs) from=127.0.0.1:45696
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/DELETE_/v1/agent/service/register
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/06 06:01:58.469820 [ERR] http: Request DELETE /v1/agent/service/register, error: method DELETE not allowed from=127.0.0.1:45698
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/06 06:01:58.470577 [DEBUG] http: Request DELETE /v1/agent/service/register (761.684µs) from=127.0.0.1:45698
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/HEAD_/v1/agent/service/register
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/06 06:01:58.473668 [ERR] http: Request HEAD /v1/agent/service/register, error: method HEAD not allowed from=127.0.0.1:45700
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/06 06:01:58.473818 [DEBUG] http: Request HEAD /v1/agent/service/register (163.337µs) from=127.0.0.1:45700
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/OPTIONS_/v1/agent/service/register
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/06 06:01:58.475677 [DEBUG] http: Request OPTIONS /v1/agent/service/register (18µs) from=127.0.0.1:45700
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/GET_/v1/snapshot
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/06 06:01:58.477494 [ERR] http: Request GET /v1/snapshot, error: Permission denied from=127.0.0.1:45700
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/06 06:01:58.478067 [DEBUG] http: Request GET /v1/snapshot (852.686µs) from=127.0.0.1:45700
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/PUT_/v1/snapshot
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/06 06:01:58.481353 [ERR] http: Request PUT /v1/snapshot, error: Permission denied from=127.0.0.1:45702
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/06 06:01:58.481910 [DEBUG] http: Request PUT /v1/snapshot (847.019µs) from=127.0.0.1:45702
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/POST_/v1/snapshot
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/06 06:01:58.484721 [ERR] http: Request POST /v1/snapshot, error: method POST not allowed from=127.0.0.1:45704
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/06 06:01:58.485415 [DEBUG] http: Request POST /v1/snapshot (702.016µs) from=127.0.0.1:45704
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/DELETE_/v1/snapshot
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/06 06:01:58.488966 [ERR] http: Request DELETE /v1/snapshot, error: method DELETE not allowed from=127.0.0.1:45706
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/06 06:01:58.489736 [DEBUG] http: Request DELETE /v1/snapshot (819.352µs) from=127.0.0.1:45706
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/HEAD_/v1/snapshot
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/06 06:01:58.493057 [ERR] http: Request HEAD /v1/snapshot, error: method HEAD not allowed from=127.0.0.1:45708
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/06 06:01:58.493201 [DEBUG] http: Request HEAD /v1/snapshot (164.004µs) from=127.0.0.1:45708
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/OPTIONS_/v1/snapshot
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/06 06:01:58.494603 [DEBUG] http: Request OPTIONS /v1/snapshot (15.667µs) from=127.0.0.1:45708
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/GET_/v1/acl/logout
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/06 06:01:58.495939 [ERR] http: Request GET /v1/acl/logout, error: method GET not allowed from=127.0.0.1:45708
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/06 06:01:58.496448 [DEBUG] http: Request GET /v1/acl/logout (523.012µs) from=127.0.0.1:45708
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/PUT_/v1/acl/logout
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/06 06:01:58.499590 [ERR] http: Request PUT /v1/acl/logout, error: method PUT not allowed from=127.0.0.1:45710
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/06 06:01:58.500273 [DEBUG] http: Request PUT /v1/acl/logout (688.349µs) from=127.0.0.1:45710
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/POST_/v1/acl/logout
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/06 06:01:58.503425 [ERR] http: Request POST /v1/acl/logout, error: ACL not found from=127.0.0.1:45712
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/06 06:01:58.504112 [DEBUG] http: Request POST /v1/acl/logout (685.349µs) from=127.0.0.1:45712
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/DELETE_/v1/acl/logout
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/06 06:01:58.508000 [ERR] http: Request DELETE /v1/acl/logout, error: method DELETE not allowed from=127.0.0.1:45714
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/06 06:01:58.508774 [DEBUG] http: Request DELETE /v1/acl/logout (766.018µs) from=127.0.0.1:45714
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/HEAD_/v1/acl/logout
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/06 06:01:58.513232 [ERR] http: Request HEAD /v1/acl/logout, error: method HEAD not allowed from=127.0.0.1:45716
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/06 06:01:58.513714 [DEBUG] http: Request HEAD /v1/acl/logout (500.344µs) from=127.0.0.1:45716
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/OPTIONS_/v1/acl/logout
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/06 06:01:58.516329 [DEBUG] http: Request OPTIONS /v1/acl/logout (80.335µs) from=127.0.0.1:45716
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/GET_/v1/acl/create
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/06 06:01:58.518634 [ERR] http: Request GET /v1/acl/create, error: method GET not allowed from=127.0.0.1:45716
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/06 06:01:58.519623 [DEBUG] http: Request GET /v1/acl/create (978.689µs) from=127.0.0.1:45716
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/PUT_/v1/acl/create
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/06 06:01:58.523732 [ERR] http: Request PUT /v1/acl/create, error: Permission denied from=127.0.0.1:45718
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/06 06:01:58.524494 [DEBUG] http: Request PUT /v1/acl/create (1.271696ms) from=127.0.0.1:45718
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/POST_/v1/acl/create
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/06 06:01:58.529040 [ERR] http: Request POST /v1/acl/create, error: method POST not allowed from=127.0.0.1:45720
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/06 06:01:58.529977 [DEBUG] http: Request POST /v1/acl/create (951.688µs) from=127.0.0.1:45720
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/DELETE_/v1/acl/create
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/06 06:01:58.534387 [ERR] http: Request DELETE /v1/acl/create, error: method DELETE not allowed from=127.0.0.1:45722
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/06 06:01:58.535255 [DEBUG] http: Request DELETE /v1/acl/create (869.687µs) from=127.0.0.1:45722
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/HEAD_/v1/acl/create
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/06 06:01:58.538728 [ERR] http: Request HEAD /v1/acl/create, error: method HEAD not allowed from=127.0.0.1:45724
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/06 06:01:58.539085 [DEBUG] http: Request HEAD /v1/acl/create (378.009µs) from=127.0.0.1:45724
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/OPTIONS_/v1/acl/create
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/06 06:01:58.540830 [DEBUG] http: Request OPTIONS /v1/acl/create (18µs) from=127.0.0.1:45724
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/GET_/v1/acl/binding-rule
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/06 06:01:58.542719 [ERR] http: Request GET /v1/acl/binding-rule, error: method GET not allowed from=127.0.0.1:45724
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/06 06:01:58.543357 [DEBUG] http: Request GET /v1/acl/binding-rule (635.348µs) from=127.0.0.1:45724
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/PUT_/v1/acl/binding-rule
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/06 06:01:58.547137 [ERR] http: Request PUT /v1/acl/binding-rule, error: Bad request: BindingRule decoding failed: EOF from=127.0.0.1:45726
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/06 06:01:58.548092 [DEBUG] http: Request PUT /v1/acl/binding-rule (1.000024ms) from=127.0.0.1:45726
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/POST_/v1/acl/binding-rule
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/06 06:01:58.552433 [ERR] http: Request POST /v1/acl/binding-rule, error: method POST not allowed from=127.0.0.1:45728
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/06 06:01:58.553301 [DEBUG] http: Request POST /v1/acl/binding-rule (861.353µs) from=127.0.0.1:45728
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/DELETE_/v1/acl/binding-rule
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/06 06:01:58.557054 [ERR] http: Request DELETE /v1/acl/binding-rule, error: method DELETE not allowed from=127.0.0.1:45730
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/06 06:01:58.557680 [DEBUG] http: Request DELETE /v1/acl/binding-rule (634.348µs) from=127.0.0.1:45730
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/HEAD_/v1/acl/binding-rule
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/06 06:01:58.560680 [ERR] http: Request HEAD /v1/acl/binding-rule, error: method HEAD not allowed from=127.0.0.1:45732
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/06 06:01:58.560834 [DEBUG] http: Request HEAD /v1/acl/binding-rule (175.338µs) from=127.0.0.1:45732
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/OPTIONS_/v1/acl/binding-rule
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/06 06:01:58.562400 [DEBUG] http: Request OPTIONS /v1/acl/binding-rule (18µs) from=127.0.0.1:45732
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/GET_/v1/connect/intentions/check
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/06 06:01:58.563852 [ERR] http: Request GET /v1/connect/intentions/check, error: required query parameter 'source' not set from=127.0.0.1:45732
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/06 06:01:58.564574 [DEBUG] http: Request GET /v1/connect/intentions/check (735.35µs) from=127.0.0.1:45732
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/PUT_/v1/connect/intentions/check
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/06 06:01:58.572030 [ERR] http: Request PUT /v1/connect/intentions/check, error: method PUT not allowed from=127.0.0.1:45734
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/06 06:01:58.572982 [DEBUG] http: Request PUT /v1/connect/intentions/check (883.687µs) from=127.0.0.1:45734
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/POST_/v1/connect/intentions/check
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/06 06:01:58.576421 [ERR] http: Request POST /v1/connect/intentions/check, error: method POST not allowed from=127.0.0.1:45736
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/06 06:01:58.577069 [DEBUG] http: Request POST /v1/connect/intentions/check (654.349µs) from=127.0.0.1:45736
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/DELETE_/v1/connect/intentions/check
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/06 06:01:58.582307 [ERR] http: Request DELETE /v1/connect/intentions/check, error: method DELETE not allowed from=127.0.0.1:45738
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/06 06:01:58.582951 [DEBUG] http: Request DELETE /v1/connect/intentions/check (642.682µs) from=127.0.0.1:45738
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/HEAD_/v1/connect/intentions/check
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/06 06:01:58.586254 [ERR] http: Request HEAD /v1/connect/intentions/check, error: method HEAD not allowed from=127.0.0.1:45740
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/06 06:01:58.586516 [DEBUG] http: Request HEAD /v1/connect/intentions/check (386.009µs) from=127.0.0.1:45740
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/OPTIONS_/v1/connect/intentions/check
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/06 06:01:58.588389 [DEBUG] http: Request OPTIONS /v1/connect/intentions/check (16.667µs) from=127.0.0.1:45740
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/GET_/v1/acl/policy/
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/06 06:01:58.590224 [ERR] http: Request GET /v1/acl/policy/, error: Bad request: Missing policy ID from=127.0.0.1:45740
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/06 06:01:58.590939 [DEBUG] http: Request GET /v1/acl/policy/ (708.683µs) from=127.0.0.1:45740
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/PUT_/v1/acl/policy/
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/06 06:01:58.594326 [ERR] http: Request PUT /v1/acl/policy/, error: Bad request: Policy decoding failed: EOF from=127.0.0.1:45742
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/06 06:01:58.595096 [DEBUG] http: Request PUT /v1/acl/policy/ (929.021µs) from=127.0.0.1:45742
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/POST_/v1/acl/policy/
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/06 06:01:58.599163 [ERR] http: Request POST /v1/acl/policy/, error: method POST not allowed from=127.0.0.1:45744
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/06 06:01:58.599908 [DEBUG] http: Request POST /v1/acl/policy/ (740.017µs) from=127.0.0.1:45744
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/DELETE_/v1/acl/policy/
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/06 06:01:58.603653 [ERR] http: Request DELETE /v1/acl/policy/, error: Bad request: Missing policy ID from=127.0.0.1:45746
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/06 06:01:58.604306 [DEBUG] http: Request DELETE /v1/acl/policy/ (565.68µs) from=127.0.0.1:45746
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/HEAD_/v1/acl/policy/
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/06 06:01:58.607733 [ERR] http: Request HEAD /v1/acl/policy/, error: method HEAD not allowed from=127.0.0.1:45748
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/06 06:01:58.607881 [DEBUG] http: Request HEAD /v1/acl/policy/ (164.337µs) from=127.0.0.1:45748
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/OPTIONS_/v1/acl/policy/
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/06 06:01:58.613408 [DEBUG] http: Request OPTIONS /v1/acl/policy/ (21.667µs) from=127.0.0.1:45748
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/GET_/v1/acl/binding-rule/
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/06 06:01:58.616032 [ERR] http: Request GET /v1/acl/binding-rule/, error: Bad request: Missing binding rule ID from=127.0.0.1:45748
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/06 06:01:58.617230 [DEBUG] http: Request GET /v1/acl/binding-rule/ (1.15736ms) from=127.0.0.1:45748
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/PUT_/v1/acl/binding-rule/
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/06 06:01:58.621005 [ERR] http: Request PUT /v1/acl/binding-rule/, error: Bad request: BindingRule decoding failed: EOF from=127.0.0.1:45750
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/06 06:01:58.621648 [DEBUG] http: Request PUT /v1/acl/binding-rule/ (673.015µs) from=127.0.0.1:45750
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/POST_/v1/acl/binding-rule/
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/06 06:01:58.625456 [ERR] http: Request POST /v1/acl/binding-rule/, error: method POST not allowed from=127.0.0.1:45752
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/06 06:01:58.626062 [DEBUG] http: Request POST /v1/acl/binding-rule/ (619.681µs) from=127.0.0.1:45752
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/DELETE_/v1/acl/binding-rule/
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/06 06:01:58.629151 [ERR] http: Request DELETE /v1/acl/binding-rule/, error: Bad request: Missing binding rule ID from=127.0.0.1:45754
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/06 06:01:58.629817 [DEBUG] http: Request DELETE /v1/acl/binding-rule/ (681.016µs) from=127.0.0.1:45754
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/HEAD_/v1/acl/binding-rule/
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/06 06:01:58.632994 [ERR] http: Request HEAD /v1/acl/binding-rule/, error: method HEAD not allowed from=127.0.0.1:45756
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/06 06:01:58.633212 [DEBUG] http: Request HEAD /v1/acl/binding-rule/ (239.005µs) from=127.0.0.1:45756
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/OPTIONS_/v1/acl/binding-rule/
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/06 06:01:58.635224 [DEBUG] http: Request OPTIONS /v1/acl/binding-rule/ (19.334µs) from=127.0.0.1:45756
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/GET_/v1/catalog/nodes
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/06 06:01:58.637331 [DEBUG] consul: dropping node "Node 3d434417-75ce-e300-2e9e-c74634b48698" from result due to ACLs
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/06 06:01:58.638177 [DEBUG] http: Request GET /v1/catalog/nodes (1.382699ms) from=127.0.0.1:45756
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/PUT_/v1/catalog/nodes
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/06 06:01:58.645194 [ERR] http: Request PUT /v1/catalog/nodes, error: method PUT not allowed from=127.0.0.1:45758
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/06 06:01:58.646088 [DEBUG] http: Request PUT /v1/catalog/nodes (945.688µs) from=127.0.0.1:45758
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/POST_/v1/catalog/nodes
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/06 06:01:58.649346 [ERR] http: Request POST /v1/catalog/nodes, error: method POST not allowed from=127.0.0.1:45760
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/06 06:01:58.650130 [DEBUG] http: Request POST /v1/catalog/nodes (802.352µs) from=127.0.0.1:45760
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/DELETE_/v1/catalog/nodes
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/06 06:01:58.653188 [ERR] http: Request DELETE /v1/catalog/nodes, error: method DELETE not allowed from=127.0.0.1:45762
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/06 06:01:58.653995 [DEBUG] http: Request DELETE /v1/catalog/nodes (808.019µs) from=127.0.0.1:45762
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/HEAD_/v1/catalog/nodes
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/06 06:01:58.657838 [ERR] http: Request HEAD /v1/catalog/nodes, error: method HEAD not allowed from=127.0.0.1:45764
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/06 06:01:58.658052 [DEBUG] http: Request HEAD /v1/catalog/nodes (288.34µs) from=127.0.0.1:45764
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/OPTIONS_/v1/catalog/nodes
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/06 06:01:58.660134 [DEBUG] http: Request OPTIONS /v1/catalog/nodes (18µs) from=127.0.0.1:45764
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/GET_/v1/catalog/node/
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/06 06:01:58.662574 [DEBUG] http: Request GET /v1/catalog/node/ (647.681µs) from=127.0.0.1:45764
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/PUT_/v1/catalog/node/
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/06 06:01:58.665615 [ERR] http: Request PUT /v1/catalog/node/, error: method PUT not allowed from=127.0.0.1:45766
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/06 06:01:58.666419 [DEBUG] http: Request PUT /v1/catalog/node/ (804.686µs) from=127.0.0.1:45766
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/POST_/v1/catalog/node/
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/06 06:01:58.669902 [ERR] http: Request POST /v1/catalog/node/, error: method POST not allowed from=127.0.0.1:45768
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/06 06:01:58.670505 [DEBUG] http: Request POST /v1/catalog/node/ (611.681µs) from=127.0.0.1:45768
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/DELETE_/v1/catalog/node/
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/06 06:01:58.673698 [ERR] http: Request DELETE /v1/catalog/node/, error: method DELETE not allowed from=127.0.0.1:45770
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/06 06:01:58.674588 [DEBUG] http: Request DELETE /v1/catalog/node/ (887.353µs) from=127.0.0.1:45770
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/HEAD_/v1/catalog/node/
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/06 06:01:58.678430 [ERR] http: Request HEAD /v1/catalog/node/, error: method HEAD not allowed from=127.0.0.1:45772
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/06 06:01:58.678654 [DEBUG] http: Request HEAD /v1/catalog/node/ (296.34µs) from=127.0.0.1:45772
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/OPTIONS_/v1/catalog/node/
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/06 06:01:58.680858 [DEBUG] http: Request OPTIONS /v1/catalog/node/ (16µs) from=127.0.0.1:45772
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/GET_/v1/acl/tokens
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/06 06:01:58.683368 [ERR] http: Request GET /v1/acl/tokens, error: Permission denied from=127.0.0.1:45772
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/06 06:01:58.684419 [DEBUG] http: Request GET /v1/acl/tokens (1.58037ms) from=127.0.0.1:45772
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/PUT_/v1/acl/tokens
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/06 06:01:58.687798 [ERR] http: Request PUT /v1/acl/tokens, error: method PUT not allowed from=127.0.0.1:45774
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/06 06:01:58.688611 [DEBUG] http: Request PUT /v1/acl/tokens (812.685µs) from=127.0.0.1:45774
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/POST_/v1/acl/tokens
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/06 06:01:58.692386 [ERR] http: Request POST /v1/acl/tokens, error: method POST not allowed from=127.0.0.1:45776
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/06 06:01:58.693131 [DEBUG] http: Request POST /v1/acl/tokens (793.685µs) from=127.0.0.1:45776
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/DELETE_/v1/acl/tokens
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/06 06:01:58.696921 [ERR] http: Request DELETE /v1/acl/tokens, error: method DELETE not allowed from=127.0.0.1:45778
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/06 06:01:58.697778 [DEBUG] http: Request DELETE /v1/acl/tokens (854.02µs) from=127.0.0.1:45778
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/HEAD_/v1/acl/tokens
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/06 06:01:58.701741 [ERR] http: Request HEAD /v1/acl/tokens, error: method HEAD not allowed from=127.0.0.1:45780
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/06 06:01:58.702019 [DEBUG] http: Request HEAD /v1/acl/tokens (313.341µs) from=127.0.0.1:45780
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/OPTIONS_/v1/acl/tokens
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/06 06:01:58.703895 [DEBUG] http: Request OPTIONS /v1/acl/tokens (30.334µs) from=127.0.0.1:45780
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/GET_/v1/agent/maintenance
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/06 06:01:58.706264 [ERR] http: Request GET /v1/agent/maintenance, error: method GET not allowed from=127.0.0.1:45780
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/06 06:01:58.707009 [DEBUG] http: Request GET /v1/agent/maintenance (746.351µs) from=127.0.0.1:45780
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/PUT_/v1/agent/maintenance
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/06 06:01:58.711260 [DEBUG] http: Request PUT /v1/agent/maintenance (537.346µs) from=127.0.0.1:45782
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/POST_/v1/agent/maintenance
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/06 06:01:58.715391 [ERR] http: Request POST /v1/agent/maintenance, error: method POST not allowed from=127.0.0.1:45784
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/06 06:01:58.716150 [DEBUG] http: Request POST /v1/agent/maintenance (754.351µs) from=127.0.0.1:45784
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/DELETE_/v1/agent/maintenance
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/06 06:01:58.722943 [ERR] http: Request DELETE /v1/agent/maintenance, error: method DELETE not allowed from=127.0.0.1:45786
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/06 06:01:58.723705 [DEBUG] http: Request DELETE /v1/agent/maintenance (754.351µs) from=127.0.0.1:45786
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/HEAD_/v1/agent/maintenance
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/06 06:01:58.727576 [ERR] http: Request HEAD /v1/agent/maintenance, error: method HEAD not allowed from=127.0.0.1:45788
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/06 06:01:58.727861 [DEBUG] http: Request HEAD /v1/agent/maintenance (304.007µs) from=127.0.0.1:45788
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/OPTIONS_/v1/agent/maintenance
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/06 06:01:58.729725 [DEBUG] http: Request OPTIONS /v1/agent/maintenance (17µs) from=127.0.0.1:45788
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/GET_/v1/agent/check/register
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/06 06:01:58.731717 [ERR] http: Request GET /v1/agent/check/register, error: method GET not allowed from=127.0.0.1:45788
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/06 06:01:58.732466 [DEBUG] http: Request GET /v1/agent/check/register (747.684µs) from=127.0.0.1:45788
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/PUT_/v1/agent/check/register
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/06 06:01:58.736887 [DEBUG] http: Request PUT /v1/agent/check/register (535.345µs) from=127.0.0.1:45790
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/POST_/v1/agent/check/register
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/06 06:01:58.740835 [ERR] http: Request POST /v1/agent/check/register, error: method POST not allowed from=127.0.0.1:45792
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/06 06:01:58.741950 [DEBUG] http: Request POST /v1/agent/check/register (1.121693ms) from=127.0.0.1:45792
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/DELETE_/v1/agent/check/register
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/06 06:01:58.745542 [ERR] http: Request DELETE /v1/agent/check/register, error: method DELETE not allowed from=127.0.0.1:45794
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/06 06:01:58.746416 [DEBUG] http: Request DELETE /v1/agent/check/register (886.354µs) from=127.0.0.1:45794
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/HEAD_/v1/agent/check/register
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/06 06:01:58.750244 [ERR] http: Request HEAD /v1/agent/check/register, error: method HEAD not allowed from=127.0.0.1:45796
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/06 06:01:58.750582 [DEBUG] http: Request HEAD /v1/agent/check/register (369.675µs) from=127.0.0.1:45796
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/OPTIONS_/v1/agent/check/register
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/06 06:01:58.752662 [DEBUG] http: Request OPTIONS /v1/agent/check/register (16.667µs) from=127.0.0.1:45796
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/GET_/v1/config/
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/06 06:01:58.755607 [DEBUG] http: Request GET /v1/config/ (1.113025ms) from=127.0.0.1:45796
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/PUT_/v1/config/
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/06 06:01:58.759073 [ERR] http: Request PUT /v1/config/, error: method PUT not allowed from=127.0.0.1:45798
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/06 06:01:58.759915 [DEBUG] http: Request PUT /v1/config/ (840.687µs) from=127.0.0.1:45798
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/POST_/v1/config/
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/06 06:01:58.763183 [ERR] http: Request POST /v1/config/, error: method POST not allowed from=127.0.0.1:45800
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/06 06:01:58.763774 [DEBUG] http: Request POST /v1/config/ (600.014µs) from=127.0.0.1:45800
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/DELETE_/v1/config/
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/06 06:01:58.772970 [DEBUG] http: Request DELETE /v1/config/ (537.012µs) from=127.0.0.1:45802
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/HEAD_/v1/config/
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/06 06:01:58.776920 [ERR] http: Request HEAD /v1/config/, error: method HEAD not allowed from=127.0.0.1:45804
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/06 06:01:58.777078 [DEBUG] http: Request HEAD /v1/config/ (174.671µs) from=127.0.0.1:45804
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/OPTIONS_/v1/config/
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/06 06:01:58.778843 [DEBUG] http: Request OPTIONS /v1/config/ (19.667µs) from=127.0.0.1:45804
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/GET_/v1/connect/intentions/
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/06 06:01:58.781062 [ERR] http: Request GET /v1/connect/intentions/, error: Bad request: failed intention lookup: index error: UUID must be 36 characters from=127.0.0.1:45804
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/06 06:01:58.781783 [DEBUG] http: Request GET /v1/connect/intentions/ (1.150693ms) from=127.0.0.1:45804
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/PUT_/v1/connect/intentions/
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/06 06:01:58.786550 [DEBUG] http: Request PUT /v1/connect/intentions/ (905.021µs) from=127.0.0.1:45806
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/POST_/v1/connect/intentions/
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/06 06:01:58.789990 [ERR] http: Request POST /v1/connect/intentions/, error: method POST not allowed from=127.0.0.1:45808
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/06 06:01:58.790821 [DEBUG] http: Request POST /v1/connect/intentions/ (826.352µs) from=127.0.0.1:45808
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/DELETE_/v1/connect/intentions/
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/06 06:01:58.797585 [ERR] http: Request DELETE /v1/connect/intentions/, error: Intention lookup failed: failed intention lookup: index error: UUID must be 36 characters from=127.0.0.1:45810
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/06 06:01:58.798398 [DEBUG] http: Request DELETE /v1/connect/intentions/ (1.325031ms) from=127.0.0.1:45810
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/HEAD_/v1/connect/intentions/
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/06 06:01:58.802200 [ERR] http: Request HEAD /v1/connect/intentions/, error: method HEAD not allowed from=127.0.0.1:45812
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/06 06:01:58.802514 [DEBUG] http: Request HEAD /v1/connect/intentions/ (343.342µs) from=127.0.0.1:45812
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/OPTIONS_/v1/connect/intentions/
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/06 06:01:58.804447 [DEBUG] http: Request OPTIONS /v1/connect/intentions/ (16.001µs) from=127.0.0.1:45812
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/GET_/v1/acl/role/
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/06 06:01:58.806586 [ERR] http: Request GET /v1/acl/role/, error: Bad request: Missing role ID from=127.0.0.1:45812
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/06 06:01:58.807364 [DEBUG] http: Request GET /v1/acl/role/ (789.018µs) from=127.0.0.1:45812
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/PUT_/v1/acl/role/
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/06 06:01:58.810981 [ERR] http: Request PUT /v1/acl/role/, error: Bad request: Role decoding failed: EOF from=127.0.0.1:45814
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/06 06:01:58.811917 [DEBUG] http: Request PUT /v1/acl/role/ (1.04369ms) from=127.0.0.1:45814
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/POST_/v1/acl/role/
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/06 06:01:58.816706 [ERR] http: Request POST /v1/acl/role/, error: method POST not allowed from=127.0.0.1:45816
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/06 06:01:58.817623 [DEBUG] http: Request POST /v1/acl/role/ (913.021µs) from=127.0.0.1:45816
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/DELETE_/v1/acl/role/
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/06 06:01:58.822355 [ERR] http: Request DELETE /v1/acl/role/, error: Bad request: Missing role ID from=127.0.0.1:45818
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/06 06:01:58.823258 [DEBUG] http: Request DELETE /v1/acl/role/ (895.02µs) from=127.0.0.1:45818
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/HEAD_/v1/acl/role/
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/06 06:01:58.828252 [ERR] http: Request HEAD /v1/acl/role/, error: method HEAD not allowed from=127.0.0.1:45820
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/06 06:01:58.828479 [DEBUG] http: Request HEAD /v1/acl/role/ (240.339µs) from=127.0.0.1:45820
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/OPTIONS_/v1/acl/role/
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/06 06:01:58.830986 [DEBUG] http: Request OPTIONS /v1/acl/role/ (21.667µs) from=127.0.0.1:45820
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/GET_/v1/acl/auth-methods
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/06 06:01:58.833683 [ERR] http: Request GET /v1/acl/auth-methods, error: Permission denied from=127.0.0.1:45820
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/06 06:01:58.834507 [DEBUG] http: Request GET /v1/acl/auth-methods (1.371365ms) from=127.0.0.1:45820
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/PUT_/v1/acl/auth-methods
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/06 06:01:58.837996 [ERR] http: Request PUT /v1/acl/auth-methods, error: method PUT not allowed from=127.0.0.1:45822
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/06 06:01:58.838923 [DEBUG] http: Request PUT /v1/acl/auth-methods (937.022µs) from=127.0.0.1:45822
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/POST_/v1/acl/auth-methods
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/06 06:01:58.843174 [ERR] http: Request POST /v1/acl/auth-methods, error: method POST not allowed from=127.0.0.1:45824
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/06 06:01:58.843886 [DEBUG] http: Request POST /v1/acl/auth-methods (698.349µs) from=127.0.0.1:45824
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/DELETE_/v1/acl/auth-methods
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/06 06:01:58.847831 [ERR] http: Request DELETE /v1/acl/auth-methods, error: method DELETE not allowed from=127.0.0.1:45826
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/06 06:01:58.848492 [DEBUG] http: Request DELETE /v1/acl/auth-methods (664.682µs) from=127.0.0.1:45826
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/HEAD_/v1/acl/auth-methods
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/06 06:01:58.851744 [ERR] http: Request HEAD /v1/acl/auth-methods, error: method HEAD not allowed from=127.0.0.1:45828
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/06 06:01:58.851890 [DEBUG] http: Request HEAD /v1/acl/auth-methods (173.338µs) from=127.0.0.1:45828
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/OPTIONS_/v1/acl/auth-methods
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/06 06:01:58.853540 [DEBUG] http: Request OPTIONS /v1/acl/auth-methods (17.667µs) from=127.0.0.1:45828
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/GET_/v1/acl/token
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/06 06:01:58.855189 [ERR] http: Request GET /v1/acl/token, error: method GET not allowed from=127.0.0.1:45828
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/06 06:01:58.855728 [DEBUG] http: Request GET /v1/acl/token (547.012µs) from=127.0.0.1:45828
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/PUT_/v1/acl/token
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/06 06:01:58.858694 [ERR] http: Request PUT /v1/acl/token, error: Bad request: Token decoding failed: EOF from=127.0.0.1:45830
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/06 06:01:58.859351 [DEBUG] http: Request PUT /v1/acl/token (677.015µs) from=127.0.0.1:45830
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/POST_/v1/acl/token
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/06 06:01:58.863275 [ERR] http: Request POST /v1/acl/token, error: method POST not allowed from=127.0.0.1:45832
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/06 06:01:58.864072 [DEBUG] http: Request POST /v1/acl/token (854.686µs) from=127.0.0.1:45832
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/DELETE_/v1/acl/token
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/06 06:01:58.870116 [ERR] http: Request DELETE /v1/acl/token, error: method DELETE not allowed from=127.0.0.1:45834
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/06 06:01:58.871239 [DEBUG] http: Request DELETE /v1/acl/token (1.15336ms) from=127.0.0.1:45834
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/HEAD_/v1/acl/token
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/06 06:01:58.876940 [ERR] http: Request HEAD /v1/acl/token, error: method HEAD not allowed from=127.0.0.1:45836
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/06 06:01:58.877103 [DEBUG] http: Request HEAD /v1/acl/token (185.337µs) from=127.0.0.1:45836
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/OPTIONS_/v1/acl/token
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/06 06:01:58.878882 [DEBUG] http: Request OPTIONS /v1/acl/token (15.334µs) from=127.0.0.1:45836
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/GET_/v1/agent/connect/proxy/
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/06 06:01:58.881147 [DEBUG] http: Request GET /v1/agent/connect/proxy/ (536.346µs) from=127.0.0.1:45836
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/PUT_/v1/agent/connect/proxy/
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/06 06:01:58.885830 [ERR] http: Request PUT /v1/agent/connect/proxy/, error: method PUT not allowed from=127.0.0.1:45838
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/06 06:01:58.886857 [DEBUG] http: Request PUT /v1/agent/connect/proxy/ (1.00269ms) from=127.0.0.1:45838
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/POST_/v1/agent/connect/proxy/
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/06 06:01:58.890294 [ERR] http: Request POST /v1/agent/connect/proxy/, error: method POST not allowed from=127.0.0.1:45840
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/06 06:01:58.891080 [DEBUG] http: Request POST /v1/agent/connect/proxy/ (777.351µs) from=127.0.0.1:45840
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/DELETE_/v1/agent/connect/proxy/
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/06 06:01:58.894663 [ERR] http: Request DELETE /v1/agent/connect/proxy/, error: method DELETE not allowed from=127.0.0.1:45842
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/06 06:01:58.895424 [DEBUG] http: Request DELETE /v1/agent/connect/proxy/ (758.018µs) from=127.0.0.1:45842
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/HEAD_/v1/agent/connect/proxy/
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/06 06:01:58.898633 [ERR] http: Request HEAD /v1/agent/connect/proxy/, error: method HEAD not allowed from=127.0.0.1:45844
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/06 06:01:58.898847 [DEBUG] http: Request HEAD /v1/agent/connect/proxy/ (230.339µs) from=127.0.0.1:45844
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/OPTIONS_/v1/agent/connect/proxy/
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/06 06:01:58.900622 [DEBUG] http: Request OPTIONS /v1/agent/connect/proxy/ (19.001µs) from=127.0.0.1:45844
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/GET_/v1/operator/raft/configuration
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/06 06:01:58.902733 [ERR] http: Request GET /v1/operator/raft/configuration, error: Permission denied from=127.0.0.1:45844
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/06 06:01:58.903502 [DEBUG] http: Request GET /v1/operator/raft/configuration (1.209695ms) from=127.0.0.1:45844
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/PUT_/v1/operator/raft/configuration
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/06 06:01:58.907007 [ERR] http: Request PUT /v1/operator/raft/configuration, error: method PUT not allowed from=127.0.0.1:45846
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/06 06:01:58.907727 [DEBUG] http: Request PUT /v1/operator/raft/configuration (719.35µs) from=127.0.0.1:45846
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/POST_/v1/operator/raft/configuration
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/06 06:01:58.911086 [ERR] http: Request POST /v1/operator/raft/configuration, error: method POST not allowed from=127.0.0.1:45848
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/06 06:01:58.911791 [DEBUG] http: Request POST /v1/operator/raft/configuration (705.017µs) from=127.0.0.1:45848
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/DELETE_/v1/operator/raft/configuration
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/06 06:01:58.916006 [ERR] http: Request DELETE /v1/operator/raft/configuration, error: method DELETE not allowed from=127.0.0.1:45850
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/06 06:01:58.916717 [DEBUG] http: Request DELETE /v1/operator/raft/configuration (708.016µs) from=127.0.0.1:45850
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/HEAD_/v1/operator/raft/configuration
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/06 06:01:58.920114 [ERR] http: Request HEAD /v1/operator/raft/configuration, error: method HEAD not allowed from=127.0.0.1:45852
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/06 06:01:58.920277 [DEBUG] http: Request HEAD /v1/operator/raft/configuration (174.671µs) from=127.0.0.1:45852
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/OPTIONS_/v1/operator/raft/configuration
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/06 06:01:58.921848 [DEBUG] http: Request OPTIONS /v1/operator/raft/configuration (16.667µs) from=127.0.0.1:45852
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/GET_/v1/operator/keyring
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/06 06:01:58.924482 [ERR] http: Request GET /v1/operator/keyring, error: Reading keyring denied by ACLs from=127.0.0.1:45852
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/06 06:01:58.925610 [DEBUG] http: Request GET /v1/operator/keyring (1.819042ms) from=127.0.0.1:45852
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/PUT_/v1/operator/keyring
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/06 06:01:58.929707 [DEBUG] http: Request PUT /v1/operator/keyring (673.016µs) from=127.0.0.1:45854
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/POST_/v1/operator/keyring
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/06 06:01:58.934743 [DEBUG] http: Request POST /v1/operator/keyring (632.681µs) from=127.0.0.1:45856
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/DELETE_/v1/operator/keyring
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/06 06:01:58.940004 [DEBUG] http: Request DELETE /v1/operator/keyring (613.347µs) from=127.0.0.1:45858
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/HEAD_/v1/operator/keyring
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/06 06:01:58.943434 [ERR] http: Request HEAD /v1/operator/keyring, error: method HEAD not allowed from=127.0.0.1:45860
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/06 06:01:58.943686 [DEBUG] http: Request HEAD /v1/operator/keyring (277.34µs) from=127.0.0.1:45860
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/OPTIONS_/v1/operator/keyring
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/06 06:01:58.947689 [DEBUG] http: Request OPTIONS /v1/operator/keyring (24.334µs) from=127.0.0.1:45860
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/GET_/v1/acl/auth-method
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/06 06:01:58.949647 [ERR] http: Request GET /v1/acl/auth-method, error: method GET not allowed from=127.0.0.1:45860
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/06 06:01:58.950320 [DEBUG] http: Request GET /v1/acl/auth-method (671.016µs) from=127.0.0.1:45860
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/PUT_/v1/acl/auth-method
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/06 06:01:58.954318 [ERR] http: Request PUT /v1/acl/auth-method, error: Bad request: AuthMethod decoding failed: EOF from=127.0.0.1:45862
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/06 06:01:58.955120 [DEBUG] http: Request PUT /v1/acl/auth-method (964.022µs) from=127.0.0.1:45862
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/POST_/v1/acl/auth-method
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/06 06:01:58.959493 [ERR] http: Request POST /v1/acl/auth-method, error: method POST not allowed from=127.0.0.1:45864
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/06 06:01:58.960170 [DEBUG] http: Request POST /v1/acl/auth-method (725.351µs) from=127.0.0.1:45864
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/DELETE_/v1/acl/auth-method
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/06 06:01:58.963722 [ERR] http: Request DELETE /v1/acl/auth-method, error: method DELETE not allowed from=127.0.0.1:45866
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/06 06:01:58.964793 [DEBUG] http: Request DELETE /v1/acl/auth-method (1.065692ms) from=127.0.0.1:45866
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/HEAD_/v1/acl/auth-method
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/06 06:01:58.968535 [ERR] http: Request HEAD /v1/acl/auth-method, error: method HEAD not allowed from=127.0.0.1:45868
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/06 06:01:58.968748 [DEBUG] http: Request HEAD /v1/acl/auth-method (179.671µs) from=127.0.0.1:45868
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/OPTIONS_/v1/acl/auth-method
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/06 06:01:58.970579 [DEBUG] http: Request OPTIONS /v1/acl/auth-method (18.334µs) from=127.0.0.1:45868
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/GET_/v1/acl/token/self
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/06 06:01:58.972586 [ERR] http: Request GET /v1/acl/token/self, error: ACL not found from=127.0.0.1:45868
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/06 06:01:58.973327 [DEBUG] http: Request GET /v1/acl/token/self (1.161693ms) from=127.0.0.1:45868
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/PUT_/v1/acl/token/self
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/06 06:01:58.977365 [ERR] http: Request PUT /v1/acl/token/self, error: method PUT not allowed from=127.0.0.1:45870
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/06 06:01:58.978051 [DEBUG] http: Request PUT /v1/acl/token/self (695.016µs) from=127.0.0.1:45870
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/POST_/v1/acl/token/self
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/06 06:01:58.981637 [ERR] http: Request POST /v1/acl/token/self, error: method POST not allowed from=127.0.0.1:45872
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/06 06:01:58.982329 [DEBUG] http: Request POST /v1/acl/token/self (692.683µs) from=127.0.0.1:45872
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/DELETE_/v1/acl/token/self
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/06 06:01:58.986229 [ERR] http: Request DELETE /v1/acl/token/self, error: method DELETE not allowed from=127.0.0.1:45874
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/06 06:01:58.986902 [DEBUG] http: Request DELETE /v1/acl/token/self (679.015µs) from=127.0.0.1:45874
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/HEAD_/v1/acl/token/self
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/06 06:01:58.990546 [ERR] http: Request HEAD /v1/acl/token/self, error: method HEAD not allowed from=127.0.0.1:45876
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/06 06:01:58.990798 [DEBUG] http: Request HEAD /v1/acl/token/self (381.009µs) from=127.0.0.1:45876
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/OPTIONS_/v1/acl/token/self
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/06 06:01:58.992497 [DEBUG] http: Request OPTIONS /v1/acl/token/self (19.667µs) from=127.0.0.1:45876
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/GET_/v1/coordinate/datacenters
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/06 06:01:58.995964 [DEBUG] http: Request GET /v1/coordinate/datacenters (1.805709ms) from=127.0.0.1:45876
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/PUT_/v1/coordinate/datacenters
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/06 06:01:58.999568 [ERR] http: Request PUT /v1/coordinate/datacenters, error: method PUT not allowed from=127.0.0.1:45878
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/06 06:01:59.000313 [DEBUG] http: Request PUT /v1/coordinate/datacenters (729.351µs) from=127.0.0.1:45878
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/POST_/v1/coordinate/datacenters
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/06 06:01:59.003996 [ERR] http: Request POST /v1/coordinate/datacenters, error: method POST not allowed from=127.0.0.1:45880
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/06 06:01:59.004832 [DEBUG] http: Request POST /v1/coordinate/datacenters (798.685µs) from=127.0.0.1:45880
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/DELETE_/v1/coordinate/datacenters
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/06 06:01:59.008627 [ERR] http: Request DELETE /v1/coordinate/datacenters, error: method DELETE not allowed from=127.0.0.1:45882
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/06 06:01:59.009422 [DEBUG] http: Request DELETE /v1/coordinate/datacenters (782.351µs) from=127.0.0.1:45882
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/HEAD_/v1/coordinate/datacenters
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/06 06:01:59.014344 [ERR] http: Request HEAD /v1/coordinate/datacenters, error: method HEAD not allowed from=127.0.0.1:45884
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/06 06:01:59.014556 [DEBUG] http: Request HEAD /v1/coordinate/datacenters (233.339µs) from=127.0.0.1:45884
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/OPTIONS_/v1/coordinate/datacenters
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/06 06:01:59.017259 [DEBUG] http: Request OPTIONS /v1/coordinate/datacenters (21.334µs) from=127.0.0.1:45884
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/GET_/v1/coordinate/update
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/06 06:01:59.019698 [ERR] http: Request GET /v1/coordinate/update, error: method GET not allowed from=127.0.0.1:45884
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/06 06:01:59.020558 [DEBUG] http: Request GET /v1/coordinate/update (839.019µs) from=127.0.0.1:45884
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/PUT_/v1/coordinate/update
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/06 06:01:59.024868 [DEBUG] http: Request PUT /v1/coordinate/update (806.352µs) from=127.0.0.1:45886
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/POST_/v1/coordinate/update
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/06 06:01:59.028426 [ERR] http: Request POST /v1/coordinate/update, error: method POST not allowed from=127.0.0.1:45888
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/06 06:01:59.029130 [DEBUG] http: Request POST /v1/coordinate/update (712.683µs) from=127.0.0.1:45888
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/DELETE_/v1/coordinate/update
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/06 06:01:59.033145 [ERR] http: Request DELETE /v1/coordinate/update, error: method DELETE not allowed from=127.0.0.1:45890
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/06 06:01:59.033837 [DEBUG] http: Request DELETE /v1/coordinate/update (695.683µs) from=127.0.0.1:45890
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/HEAD_/v1/coordinate/update
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/06 06:01:59.039349 [ERR] http: Request HEAD /v1/coordinate/update, error: method HEAD not allowed from=127.0.0.1:45892
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/06 06:01:59.039530 [DEBUG] http: Request HEAD /v1/coordinate/update (196.005µs) from=127.0.0.1:45892
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/OPTIONS_/v1/coordinate/update
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/06 06:01:59.041348 [DEBUG] http: Request OPTIONS /v1/coordinate/update (15.334µs) from=127.0.0.1:45892
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/GET_/v1/health/state/
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/06 06:01:59.043996 [DEBUG] http: Request GET /v1/health/state/ (528.678µs) from=127.0.0.1:45892
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/PUT_/v1/health/state/
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/06 06:01:59.047597 [ERR] http: Request PUT /v1/health/state/, error: method PUT not allowed from=127.0.0.1:45894
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/06 06:01:59.048460 [DEBUG] http: Request PUT /v1/health/state/ (749.684µs) from=127.0.0.1:45894
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/POST_/v1/health/state/
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/06 06:01:59.052476 [ERR] http: Request POST /v1/health/state/, error: method POST not allowed from=127.0.0.1:45896
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/06 06:01:59.053243 [DEBUG] http: Request POST /v1/health/state/ (771.351µs) from=127.0.0.1:45896
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/DELETE_/v1/health/state/
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/06 06:01:59.057322 [ERR] http: Request DELETE /v1/health/state/, error: method DELETE not allowed from=127.0.0.1:45898
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/06 06:01:59.058114 [DEBUG] http: Request DELETE /v1/health/state/ (786.685µs) from=127.0.0.1:45898
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/HEAD_/v1/health/state/
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/06 06:01:59.061824 [ERR] http: Request HEAD /v1/health/state/, error: method HEAD not allowed from=127.0.0.1:45900
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/06 06:01:59.061974 [DEBUG] http: Request HEAD /v1/health/state/ (179.671µs) from=127.0.0.1:45900
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/OPTIONS_/v1/health/state/
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/06 06:01:59.063967 [DEBUG] http: Request OPTIONS /v1/health/state/ (17.667µs) from=127.0.0.1:45900
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/GET_/v1/catalog/deregister
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/06 06:01:59.068375 [ERR] http: Request GET /v1/catalog/deregister, error: method GET not allowed from=127.0.0.1:45900
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/06 06:01:59.069287 [DEBUG] http: Request GET /v1/catalog/deregister (814.019µs) from=127.0.0.1:45900
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/PUT_/v1/catalog/deregister
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/06 06:01:59.073579 [DEBUG] http: Request PUT /v1/catalog/deregister (607.014µs) from=127.0.0.1:45902
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/POST_/v1/catalog/deregister
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/06 06:01:59.077754 [ERR] http: Request POST /v1/catalog/deregister, error: method POST not allowed from=127.0.0.1:45904
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/06 06:01:59.078518 [DEBUG] http: Request POST /v1/catalog/deregister (754.351µs) from=127.0.0.1:45904
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/DELETE_/v1/catalog/deregister
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/06 06:01:59.082514 [ERR] http: Request DELETE /v1/catalog/deregister, error: method DELETE not allowed from=127.0.0.1:45906
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/06 06:01:59.083212 [DEBUG] http: Request DELETE /v1/catalog/deregister (759.684µs) from=127.0.0.1:45906
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/HEAD_/v1/catalog/deregister
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/06 06:01:59.087047 [ERR] http: Request HEAD /v1/catalog/deregister, error: method HEAD not allowed from=127.0.0.1:45908
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/06 06:01:59.087387 [DEBUG] http: Request HEAD /v1/catalog/deregister (248.672µs) from=127.0.0.1:45908
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/OPTIONS_/v1/catalog/deregister
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/06 06:01:59.091360 [DEBUG] http: Request OPTIONS /v1/catalog/deregister (13.667µs) from=127.0.0.1:45908
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/GET_/v1/acl/roles
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/06 06:01:59.097189 [ERR] http: Request GET /v1/acl/roles, error: Permission denied from=127.0.0.1:45908
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/06 06:01:59.097764 [DEBUG] http: Request GET /v1/acl/roles (1.282696ms) from=127.0.0.1:45908
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/PUT_/v1/acl/roles
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/06 06:01:59.101737 [ERR] http: Request PUT /v1/acl/roles, error: method PUT not allowed from=127.0.0.1:45910
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/06 06:01:59.102449 [DEBUG] http: Request PUT /v1/acl/roles (711.683µs) from=127.0.0.1:45910
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/POST_/v1/acl/roles
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/06 06:01:59.105841 [ERR] http: Request POST /v1/acl/roles, error: method POST not allowed from=127.0.0.1:45912
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/06 06:01:59.106650 [DEBUG] http: Request POST /v1/acl/roles (815.352µs) from=127.0.0.1:45912
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/DELETE_/v1/acl/roles
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/06 06:01:59.109967 [ERR] http: Request DELETE /v1/acl/roles, error: method DELETE not allowed from=127.0.0.1:45914
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/06 06:01:59.110537 [DEBUG] http: Request DELETE /v1/acl/roles (573.68µs) from=127.0.0.1:45914
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/HEAD_/v1/acl/roles
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/06 06:01:59.114661 [ERR] http: Request HEAD /v1/acl/roles, error: method HEAD not allowed from=127.0.0.1:45916
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/06 06:01:59.114929 [DEBUG] http: Request HEAD /v1/acl/roles (257.673µs) from=127.0.0.1:45916
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/OPTIONS_/v1/acl/roles
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/06 06:01:59.118066 [DEBUG] http: Request OPTIONS /v1/acl/roles (17.333µs) from=127.0.0.1:45916
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/GET_/v1/agent/check/fail/
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/06 06:01:59.122533 [ERR] http: Request GET /v1/agent/check/fail/, error: method GET not allowed from=127.0.0.1:45916
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/06 06:01:59.123348 [DEBUG] http: Request GET /v1/agent/check/fail/ (819.352µs) from=127.0.0.1:45916
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/PUT_/v1/agent/check/fail/
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/06 06:01:59.126916 [ERR] http: Request PUT /v1/agent/check/fail/, error: Unknown check "" from=127.0.0.1:45918
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/06 06:01:59.127612 [DEBUG] http: Request PUT /v1/agent/check/fail/ (834.686µs) from=127.0.0.1:45918
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/POST_/v1/agent/check/fail/
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/06 06:01:59.132137 [ERR] http: Request POST /v1/agent/check/fail/, error: method POST not allowed from=127.0.0.1:45920
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/06 06:01:59.132979 [DEBUG] http: Request POST /v1/agent/check/fail/ (836.019µs) from=127.0.0.1:45920
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/DELETE_/v1/agent/check/fail/
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/06 06:01:59.136599 [ERR] http: Request DELETE /v1/agent/check/fail/, error: method DELETE not allowed from=127.0.0.1:45922
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/06 06:01:59.137444 [DEBUG] http: Request DELETE /v1/agent/check/fail/ (843.686µs) from=127.0.0.1:45922
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/HEAD_/v1/agent/check/fail/
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/06 06:01:59.140950 [ERR] http: Request HEAD /v1/agent/check/fail/, error: method HEAD not allowed from=127.0.0.1:45924
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/06 06:01:59.141168 [DEBUG] http: Request HEAD /v1/agent/check/fail/ (236.673µs) from=127.0.0.1:45924
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/OPTIONS_/v1/agent/check/fail/
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/06 06:01:59.143006 [DEBUG] http: Request OPTIONS /v1/agent/check/fail/ (21.334µs) from=127.0.0.1:45924
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/GET_/v1/session/info/
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/06 06:01:59.147291 [DEBUG] http: Request GET /v1/session/info/ (577.347µs) from=127.0.0.1:45924
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/PUT_/v1/session/info/
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/06 06:01:59.150944 [ERR] http: Request PUT /v1/session/info/, error: method PUT not allowed from=127.0.0.1:45926
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/06 06:01:59.151658 [DEBUG] http: Request PUT /v1/session/info/ (723.017µs) from=127.0.0.1:45926
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/POST_/v1/session/info/
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/06 06:01:59.154916 [ERR] http: Request POST /v1/session/info/, error: method POST not allowed from=127.0.0.1:45928
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/06 06:01:59.155599 [DEBUG] http: Request POST /v1/session/info/ (693.016µs) from=127.0.0.1:45928
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/DELETE_/v1/session/info/
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/06 06:01:59.158957 [ERR] http: Request DELETE /v1/session/info/, error: method DELETE not allowed from=127.0.0.1:45930
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/06 06:01:59.159758 [DEBUG] http: Request DELETE /v1/session/info/ (819.019µs) from=127.0.0.1:45930
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/HEAD_/v1/session/info/
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/06 06:01:59.162728 [ERR] http: Request HEAD /v1/session/info/, error: method HEAD not allowed from=127.0.0.1:45932
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/06 06:01:59.162870 [DEBUG] http: Request HEAD /v1/session/info/ (158.67µs) from=127.0.0.1:45932
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/OPTIONS_/v1/session/info/
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/06 06:01:59.164426 [DEBUG] http: Request OPTIONS /v1/session/info/ (14.667µs) from=127.0.0.1:45932
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/GET_/v1/acl/role
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/06 06:01:59.165881 [ERR] http: Request GET /v1/acl/role, error: method GET not allowed from=127.0.0.1:45932
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/06 06:01:59.166394 [DEBUG] http: Request GET /v1/acl/role (521.012µs) from=127.0.0.1:45932
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/PUT_/v1/acl/role
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/06 06:01:59.169151 [ERR] http: Request PUT /v1/acl/role, error: Bad request: Role decoding failed: EOF from=127.0.0.1:45934
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/06 06:01:59.171880 [DEBUG] http: Request PUT /v1/acl/role (2.74673ms) from=127.0.0.1:45934
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/POST_/v1/acl/role
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/06 06:01:59.176397 [ERR] http: Request POST /v1/acl/role, error: method POST not allowed from=127.0.0.1:45936
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/06 06:01:59.176973 [DEBUG] http: Request POST /v1/acl/role (583.347µs) from=127.0.0.1:45936
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/DELETE_/v1/acl/role
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/06 06:01:59.181029 [ERR] http: Request DELETE /v1/acl/role, error: method DELETE not allowed from=127.0.0.1:45938
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/06 06:01:59.181536 [DEBUG] http: Request DELETE /v1/acl/role (509.679µs) from=127.0.0.1:45938
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/HEAD_/v1/acl/role
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/06 06:01:59.187766 [ERR] http: Request HEAD /v1/acl/role, error: method HEAD not allowed from=127.0.0.1:45940
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/06 06:01:59.187927 [DEBUG] http: Request HEAD /v1/acl/role (173.337µs) from=127.0.0.1:45940
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/OPTIONS_/v1/acl/role
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/06 06:01:59.190103 [DEBUG] http: Request OPTIONS /v1/acl/role (18.334µs) from=127.0.0.1:45940
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/GET_/v1/acl/role/name/
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/06 06:01:59.192625 [ERR] http: Request GET /v1/acl/role/name/, error: Bad request: Missing role Name from=127.0.0.1:45940
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/06 06:01:59.193281 [DEBUG] http: Request GET /v1/acl/role/name/ (667.349µs) from=127.0.0.1:45940
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/PUT_/v1/acl/role/name/
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/06 06:01:59.196921 [ERR] http: Request PUT /v1/acl/role/name/, error: method PUT not allowed from=127.0.0.1:45942
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/06 06:01:59.197508 [DEBUG] http: Request PUT /v1/acl/role/name/ (599.347µs) from=127.0.0.1:45942
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/POST_/v1/acl/role/name/
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/06 06:01:59.200483 [ERR] http: Request POST /v1/acl/role/name/, error: method POST not allowed from=127.0.0.1:45944
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/06 06:01:59.201169 [DEBUG] http: Request POST /v1/acl/role/name/ (682.349µs) from=127.0.0.1:45944
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/DELETE_/v1/acl/role/name/
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/06 06:01:59.204495 [ERR] http: Request DELETE /v1/acl/role/name/, error: method DELETE not allowed from=127.0.0.1:45946
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/06 06:01:59.206430 [DEBUG] http: Request DELETE /v1/acl/role/name/ (1.928378ms) from=127.0.0.1:45946
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/HEAD_/v1/acl/role/name/
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/06 06:01:59.212737 [ERR] http: Request HEAD /v1/acl/role/name/, error: method HEAD not allowed from=127.0.0.1:45948
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/06 06:01:59.212890 [DEBUG] http: Request HEAD /v1/acl/role/name/ (167.003µs) from=127.0.0.1:45948
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/OPTIONS_/v1/acl/role/name/
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/06 06:01:59.214619 [DEBUG] http: Request OPTIONS /v1/acl/role/name/ (15.334µs) from=127.0.0.1:45948
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/GET_/v1/acl/rules/translate/
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/06 06:01:59.216346 [ERR] http: Request GET /v1/acl/rules/translate/, error: Bad request: Missing token ID from=127.0.0.1:45948
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/06 06:01:59.216869 [DEBUG] http: Request GET /v1/acl/rules/translate/ (529.679µs) from=127.0.0.1:45948
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/PUT_/v1/acl/rules/translate/
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/06 06:01:59.220088 [ERR] http: Request PUT /v1/acl/rules/translate/, error: method PUT not allowed from=127.0.0.1:45950
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/06 06:01:59.220628 [DEBUG] http: Request PUT /v1/acl/rules/translate/ (549.346µs) from=127.0.0.1:45950
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/POST_/v1/acl/rules/translate/
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/06 06:01:59.224771 [ERR] http: Request POST /v1/acl/rules/translate/, error: method POST not allowed from=127.0.0.1:45952
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/06 06:01:59.225344 [DEBUG] http: Request POST /v1/acl/rules/translate/ (582.014µs) from=127.0.0.1:45952
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/DELETE_/v1/acl/rules/translate/
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/06 06:01:59.229417 [ERR] http: Request DELETE /v1/acl/rules/translate/, error: method DELETE not allowed from=127.0.0.1:45954
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/06 06:01:59.230186 [DEBUG] http: Request DELETE /v1/acl/rules/translate/ (773.018µs) from=127.0.0.1:45954
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/HEAD_/v1/acl/rules/translate/
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/06 06:01:59.234434 [ERR] http: Request HEAD /v1/acl/rules/translate/, error: method HEAD not allowed from=127.0.0.1:45956
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/06 06:01:59.234589 [DEBUG] http: Request HEAD /v1/acl/rules/translate/ (173.338µs) from=127.0.0.1:45956
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/OPTIONS_/v1/acl/rules/translate/
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/06 06:01:59.236053 [DEBUG] http: Request OPTIONS /v1/acl/rules/translate/ (16.001µs) from=127.0.0.1:45956
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/GET_/v1/connect/ca/roots
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/06 06:01:59.239508 [DEBUG] http: Request GET /v1/connect/ca/roots (1.928378ms) from=127.0.0.1:45956
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/PUT_/v1/connect/ca/roots
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/06 06:01:59.244168 [ERR] http: Request PUT /v1/connect/ca/roots, error: method PUT not allowed from=127.0.0.1:45958
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/06 06:01:59.244787 [DEBUG] http: Request PUT /v1/connect/ca/roots (619.348µs) from=127.0.0.1:45958
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/POST_/v1/connect/ca/roots
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/06 06:01:59.247798 [ERR] http: Request POST /v1/connect/ca/roots, error: method POST not allowed from=127.0.0.1:45960
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/06 06:01:59.248416 [DEBUG] http: Request POST /v1/connect/ca/roots (617.014µs) from=127.0.0.1:45960
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/DELETE_/v1/connect/ca/roots
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/06 06:01:59.251731 [ERR] http: Request DELETE /v1/connect/ca/roots, error: method DELETE not allowed from=127.0.0.1:45962
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/06 06:01:59.252980 [DEBUG] http: Request DELETE /v1/connect/ca/roots (1.240362ms) from=127.0.0.1:45962
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/HEAD_/v1/connect/ca/roots
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/06 06:01:59.256749 [ERR] http: Request HEAD /v1/connect/ca/roots, error: method HEAD not allowed from=127.0.0.1:45964
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/06 06:01:59.257008 [DEBUG] http: Request HEAD /v1/connect/ca/roots (272.007µs) from=127.0.0.1:45964
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/OPTIONS_/v1/connect/ca/roots
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/06 06:01:59.258954 [DEBUG] http: Request OPTIONS /v1/connect/ca/roots (17.668µs) from=127.0.0.1:45964
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/GET_/v1/txn
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/06 06:01:59.265742 [ERR] http: Request GET /v1/txn, error: method GET not allowed from=127.0.0.1:45964
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/06 06:01:59.266440 [DEBUG] http: Request GET /v1/txn (707.683µs) from=127.0.0.1:45964
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/PUT_/v1/txn
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/06 06:01:59.270338 [DEBUG] http: Request PUT /v1/txn (565.346µs) from=127.0.0.1:45966
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/POST_/v1/txn
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/06 06:01:59.273549 [ERR] http: Request POST /v1/txn, error: method POST not allowed from=127.0.0.1:45968
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/06 06:01:59.274515 [DEBUG] http: Request POST /v1/txn (956.689µs) from=127.0.0.1:45968
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/DELETE_/v1/txn
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/06 06:01:59.278402 [ERR] http: Request DELETE /v1/txn, error: method DELETE not allowed from=127.0.0.1:45970
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/06 06:01:59.279041 [DEBUG] http: Request DELETE /v1/txn (639.681µs) from=127.0.0.1:45970
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/HEAD_/v1/txn
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/06 06:01:59.282892 [ERR] http: Request HEAD /v1/txn, error: method HEAD not allowed from=127.0.0.1:45972
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/06 06:01:59.283056 [DEBUG] http: Request HEAD /v1/txn (185.67µs) from=127.0.0.1:45972
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/OPTIONS_/v1/txn
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/06 06:01:59.284857 [DEBUG] http: Request OPTIONS /v1/txn (17.334µs) from=127.0.0.1:45972
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/GET_/v1/agent/force-leave/
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/06 06:01:59.287549 [ERR] http: Request GET /v1/agent/force-leave/, error: method GET not allowed from=127.0.0.1:45972
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/06 06:01:59.288118 [DEBUG] http: Request GET /v1/agent/force-leave/ (570.347µs) from=127.0.0.1:45972
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/PUT_/v1/agent/force-leave/
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/06 06:01:59.293058 [ERR] http: Request PUT /v1/agent/force-leave/, error: Permission denied from=127.0.0.1:45974
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/06 06:01:59.293599 [DEBUG] http: Request PUT /v1/agent/force-leave/ (671.683µs) from=127.0.0.1:45974
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/POST_/v1/agent/force-leave/
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/06 06:01:59.297925 [ERR] http: Request POST /v1/agent/force-leave/, error: method POST not allowed from=127.0.0.1:45976
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/06 06:01:59.298573 [DEBUG] http: Request POST /v1/agent/force-leave/ (662.015µs) from=127.0.0.1:45976
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/DELETE_/v1/agent/force-leave/
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/06 06:01:59.303825 [ERR] http: Request DELETE /v1/agent/force-leave/, error: method DELETE not allowed from=127.0.0.1:45978
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/06 06:01:59.304545 [DEBUG] http: Request DELETE /v1/agent/force-leave/ (713.016µs) from=127.0.0.1:45978
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/HEAD_/v1/agent/force-leave/
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/06 06:01:59.307831 [ERR] http: Request HEAD /v1/agent/force-leave/, error: method HEAD not allowed from=127.0.0.1:45980
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/06 06:01:59.308046 [DEBUG] http: Request HEAD /v1/agent/force-leave/ (241.005µs) from=127.0.0.1:45980
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/OPTIONS_/v1/agent/force-leave/
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/06 06:01:59.309732 [DEBUG] http: Request OPTIONS /v1/agent/force-leave/ (21µs) from=127.0.0.1:45980
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/GET_/v1/agent/check/update/
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/06 06:01:59.311292 [ERR] http: Request GET /v1/agent/check/update/, error: method GET not allowed from=127.0.0.1:45980
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/06 06:01:59.311902 [DEBUG] http: Request GET /v1/agent/check/update/ (610.014µs) from=127.0.0.1:45980
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/PUT_/v1/agent/check/update/
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/06 06:01:59.316018 [DEBUG] http: Request PUT /v1/agent/check/update/ (521.678µs) from=127.0.0.1:45982
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/POST_/v1/agent/check/update/
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/06 06:01:59.319567 [ERR] http: Request POST /v1/agent/check/update/, error: method POST not allowed from=127.0.0.1:45984
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/06 06:01:59.320230 [DEBUG] http: Request POST /v1/agent/check/update/ (660.348µs) from=127.0.0.1:45984
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/DELETE_/v1/agent/check/update/
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/06 06:01:59.323516 [ERR] http: Request DELETE /v1/agent/check/update/, error: method DELETE not allowed from=127.0.0.1:45986
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/06 06:01:59.324375 [DEBUG] http: Request DELETE /v1/agent/check/update/ (857.354µs) from=127.0.0.1:45986
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/HEAD_/v1/agent/check/update/
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/06 06:01:59.327659 [ERR] http: Request HEAD /v1/agent/check/update/, error: method HEAD not allowed from=127.0.0.1:45988
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/06 06:01:59.327945 [DEBUG] http: Request HEAD /v1/agent/check/update/ (303.34µs) from=127.0.0.1:45988
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/OPTIONS_/v1/agent/check/update/
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/06 06:01:59.330451 [DEBUG] http: Request OPTIONS /v1/agent/check/update/ (82.335µs) from=127.0.0.1:45988
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/GET_/v1/kv/
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/06 06:01:59.332666 [DEBUG] http: Request GET /v1/kv/ (498.678µs) from=127.0.0.1:45988
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/PUT_/v1/kv/
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/06 06:01:59.336731 [DEBUG] http: Request PUT /v1/kv/ (501.678µs) from=127.0.0.1:45990
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/POST_/v1/kv/
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/06 06:01:59.340186 [ERR] http: Request POST /v1/kv/, error: method POST not allowed from=127.0.0.1:45992
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/06 06:01:59.340936 [DEBUG] http: Request POST /v1/kv/ (744.684µs) from=127.0.0.1:45992
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/DELETE_/v1/kv/
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/06 06:01:59.344676 [DEBUG] http: Request DELETE /v1/kv/ (614.015µs) from=127.0.0.1:45994
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/HEAD_/v1/kv/
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/06 06:01:59.348220 [ERR] http: Request HEAD /v1/kv/, error: method HEAD not allowed from=127.0.0.1:45996
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/06 06:01:59.348364 [DEBUG] http: Request HEAD /v1/kv/ (161.337µs) from=127.0.0.1:45996
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/OPTIONS_/v1/kv/
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/06 06:01:59.350229 [DEBUG] http: Request OPTIONS /v1/kv/ (15.334µs) from=127.0.0.1:45996
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/GET_/v1/operator/raft/peer
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/06 06:01:59.352081 [ERR] http: Request GET /v1/operator/raft/peer, error: method GET not allowed from=127.0.0.1:45996
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/06 06:01:59.352709 [DEBUG] http: Request GET /v1/operator/raft/peer (628.348µs) from=127.0.0.1:45996
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/PUT_/v1/operator/raft/peer
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/06 06:01:59.356591 [ERR] http: Request PUT /v1/operator/raft/peer, error: method PUT not allowed from=127.0.0.1:45998
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/06 06:01:59.357396 [DEBUG] http: Request PUT /v1/operator/raft/peer (812.353µs) from=127.0.0.1:45998
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/POST_/v1/operator/raft/peer
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/06 06:01:59.360732 [ERR] http: Request POST /v1/operator/raft/peer, error: method POST not allowed from=127.0.0.1:46000
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/06 06:01:59.361497 [DEBUG] http: Request POST /v1/operator/raft/peer (768.684µs) from=127.0.0.1:46000
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/DELETE_/v1/operator/raft/peer
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/06 06:01:59.365298 [DEBUG] http: Request DELETE /v1/operator/raft/peer (420.01µs) from=127.0.0.1:46002
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/HEAD_/v1/operator/raft/peer
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/06 06:01:59.372463 [ERR] http: Request HEAD /v1/operator/raft/peer, error: method HEAD not allowed from=127.0.0.1:46004
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/06 06:01:59.372823 [DEBUG] http: Request HEAD /v1/operator/raft/peer (373.008µs) from=127.0.0.1:46004
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/OPTIONS_/v1/operator/raft/peer
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/06 06:01:59.374920 [DEBUG] http: Request OPTIONS /v1/operator/raft/peer (16µs) from=127.0.0.1:46004
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/GET_/v1/acl/policies
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/06 06:01:59.377298 [ERR] http: Request GET /v1/acl/policies, error: Permission denied from=127.0.0.1:46004
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/06 06:01:59.378014 [DEBUG] http: Request GET /v1/acl/policies (1.189027ms) from=127.0.0.1:46004
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/PUT_/v1/acl/policies
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/06 06:01:59.381857 [ERR] http: Request PUT /v1/acl/policies, error: method PUT not allowed from=127.0.0.1:46006
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/06 06:01:59.382658 [DEBUG] http: Request PUT /v1/acl/policies (906.355µs) from=127.0.0.1:46006
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/POST_/v1/acl/policies
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/06 06:01:59.385942 [ERR] http: Request POST /v1/acl/policies, error: method POST not allowed from=127.0.0.1:46008
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/06 06:01:59.386691 [DEBUG] http: Request POST /v1/acl/policies (742.017µs) from=127.0.0.1:46008
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/DELETE_/v1/acl/policies
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/06 06:01:59.389901 [ERR] http: Request DELETE /v1/acl/policies, error: method DELETE not allowed from=127.0.0.1:46010
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/06 06:01:59.390562 [DEBUG] http: Request DELETE /v1/acl/policies (661.682µs) from=127.0.0.1:46010
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/HEAD_/v1/acl/policies
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/06 06:01:59.394106 [ERR] http: Request HEAD /v1/acl/policies, error: method HEAD not allowed from=127.0.0.1:46012
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/06 06:01:59.394456 [DEBUG] http: Request HEAD /v1/acl/policies (409.676µs) from=127.0.0.1:46012
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/OPTIONS_/v1/acl/policies
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/06 06:01:59.396469 [DEBUG] http: Request OPTIONS /v1/acl/policies (17.001µs) from=127.0.0.1:46012
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/GET_/v1/coordinate/node/
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/06 06:01:59.398983 [ERR] http: Request GET /v1/coordinate/node/, error: Permission denied from=127.0.0.1:46012
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/06 06:01:59.399804 [DEBUG] http: Request GET /v1/coordinate/node/ (1.311031ms) from=127.0.0.1:46012
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/PUT_/v1/coordinate/node/
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/06 06:01:59.403607 [ERR] http: Request PUT /v1/coordinate/node/, error: method PUT not allowed from=127.0.0.1:46014
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/06 06:01:59.404501 [DEBUG] http: Request PUT /v1/coordinate/node/ (950.022µs) from=127.0.0.1:46014
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/POST_/v1/coordinate/node/
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/06 06:01:59.408813 [ERR] http: Request POST /v1/coordinate/node/, error: method POST not allowed from=127.0.0.1:46016
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/06 06:01:59.409711 [DEBUG] http: Request POST /v1/coordinate/node/ (900.687µs) from=127.0.0.1:46016
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/DELETE_/v1/coordinate/node/
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/06 06:01:59.413371 [ERR] http: Request DELETE /v1/coordinate/node/, error: method DELETE not allowed from=127.0.0.1:46018
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/06 06:01:59.413978 [DEBUG] http: Request DELETE /v1/coordinate/node/ (627.014µs) from=127.0.0.1:46018
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/HEAD_/v1/coordinate/node/
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/06 06:01:59.418484 [ERR] http: Request HEAD /v1/coordinate/node/, error: method HEAD not allowed from=127.0.0.1:46020
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/06 06:01:59.418632 [DEBUG] http: Request HEAD /v1/coordinate/node/ (173.004µs) from=127.0.0.1:46020
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/OPTIONS_/v1/coordinate/node/
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/06 06:01:59.420889 [DEBUG] http: Request OPTIONS /v1/coordinate/node/ (19.001µs) from=127.0.0.1:46020
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/GET_/v1/operator/autopilot/configuration
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/06 06:01:59.423100 [ERR] http: Request GET /v1/operator/autopilot/configuration, error: Permission denied from=127.0.0.1:46020
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/06 06:01:59.423865 [DEBUG] http: Request GET /v1/operator/autopilot/configuration (1.227695ms) from=127.0.0.1:46020
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/PUT_/v1/operator/autopilot/configuration
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/06 06:01:59.434804 [DEBUG] http: Request PUT /v1/operator/autopilot/configuration (775.685µs) from=127.0.0.1:46022
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/POST_/v1/operator/autopilot/configuration
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/06 06:01:59.439150 [ERR] http: Request POST /v1/operator/autopilot/configuration, error: method POST not allowed from=127.0.0.1:46024
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/06 06:01:59.439985 [DEBUG] http: Request POST /v1/operator/autopilot/configuration (845.686µs) from=127.0.0.1:46024
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/DELETE_/v1/operator/autopilot/configuration
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/06 06:01:59.443605 [ERR] http: Request DELETE /v1/operator/autopilot/configuration, error: method DELETE not allowed from=127.0.0.1:46026
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/06 06:01:59.444494 [DEBUG] http: Request DELETE /v1/operator/autopilot/configuration (885.02µs) from=127.0.0.1:46026
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/HEAD_/v1/operator/autopilot/configuration
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/06 06:01:59.448669 [ERR] http: Request HEAD /v1/operator/autopilot/configuration, error: method HEAD not allowed from=127.0.0.1:46028
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/06 06:01:59.448841 [DEBUG] http: Request HEAD /v1/operator/autopilot/configuration (198.338µs) from=127.0.0.1:46028
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/OPTIONS_/v1/operator/autopilot/configuration
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/06 06:01:59.451071 [DEBUG] http: Request OPTIONS /v1/operator/autopilot/configuration (19µs) from=127.0.0.1:46028
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/GET_/v1/acl/auth-method/
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/06 06:01:59.452741 [ERR] http: Request GET /v1/acl/auth-method/, error: Bad request: Missing auth method name from=127.0.0.1:46028
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/06 06:01:59.453390 [DEBUG] http: Request GET /v1/acl/auth-method/ (641.015µs) from=127.0.0.1:46028
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/PUT_/v1/acl/auth-method/
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/06 06:01:59.457449 [ERR] http: Request PUT /v1/acl/auth-method/, error: Bad request: AuthMethod decoding failed: EOF from=127.0.0.1:46030
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/06 06:01:59.458357 [DEBUG] http: Request PUT /v1/acl/auth-method/ (951.022µs) from=127.0.0.1:46030
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/POST_/v1/acl/auth-method/
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/06 06:01:59.463589 [ERR] http: Request POST /v1/acl/auth-method/, error: method POST not allowed from=127.0.0.1:46032
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/06 06:01:59.464466 [DEBUG] http: Request POST /v1/acl/auth-method/ (877.354µs) from=127.0.0.1:46032
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/DELETE_/v1/acl/auth-method/
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/06 06:01:59.468799 [ERR] http: Request DELETE /v1/acl/auth-method/, error: Bad request: Missing auth method name from=127.0.0.1:46034
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/06 06:01:59.469728 [DEBUG] http: Request DELETE /v1/acl/auth-method/ (821.685µs) from=127.0.0.1:46034
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/HEAD_/v1/acl/auth-method/
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/06 06:01:59.473383 [ERR] http: Request HEAD /v1/acl/auth-method/, error: method HEAD not allowed from=127.0.0.1:46036
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/06 06:01:59.473699 [DEBUG] http: Request HEAD /v1/acl/auth-method/ (336.341µs) from=127.0.0.1:46036
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/OPTIONS_/v1/acl/auth-method/
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/06 06:01:59.475848 [DEBUG] http: Request OPTIONS /v1/acl/auth-method/ (18.334µs) from=127.0.0.1:46036
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/GET_/v1/agent/host
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/06 06:01:59.477876 [ERR] http: Request GET /v1/agent/host, error: Permission denied from=127.0.0.1:46036
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/06 06:01:59.478781 [DEBUG] http: Request GET /v1/agent/host (1.040024ms) from=127.0.0.1:46036
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/PUT_/v1/agent/host
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/06 06:01:59.483455 [ERR] http: Request PUT /v1/agent/host, error: method PUT not allowed from=127.0.0.1:46038
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/06 06:01:59.484345 [DEBUG] http: Request PUT /v1/agent/host (895.688µs) from=127.0.0.1:46038
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/POST_/v1/agent/host
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/06 06:01:59.488909 [ERR] http: Request POST /v1/agent/host, error: method POST not allowed from=127.0.0.1:46040
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/06 06:01:59.489728 [DEBUG] http: Request POST /v1/agent/host (922.354µs) from=127.0.0.1:46040
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/DELETE_/v1/agent/host
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/06 06:01:59.494267 [ERR] http: Request DELETE /v1/agent/host, error: method DELETE not allowed from=127.0.0.1:46042
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/06 06:01:59.494925 [DEBUG] http: Request DELETE /v1/agent/host (718.683µs) from=127.0.0.1:46042
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/HEAD_/v1/agent/host
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/06 06:01:59.498908 [ERR] http: Request HEAD /v1/agent/host, error: method HEAD not allowed from=127.0.0.1:46044
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/06 06:01:59.499246 [DEBUG] http: Request HEAD /v1/agent/host (289.673µs) from=127.0.0.1:46044
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/OPTIONS_/v1/agent/host
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/06 06:01:59.501519 [DEBUG] http: Request OPTIONS /v1/agent/host (17.001µs) from=127.0.0.1:46044
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/GET_/v1/agent/checks
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/06 06:01:59.504632 [DEBUG] http: Request GET /v1/agent/checks (1.549369ms) from=127.0.0.1:46044
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/PUT_/v1/agent/checks
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/06 06:01:59.511948 [ERR] http: Request PUT /v1/agent/checks, error: method PUT not allowed from=127.0.0.1:46046
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/06 06:01:59.512896 [DEBUG] http: Request PUT /v1/agent/checks (949.355µs) from=127.0.0.1:46046
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/POST_/v1/agent/checks
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/06 06:01:59.516702 [ERR] http: Request POST /v1/agent/checks, error: method POST not allowed from=127.0.0.1:46048
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/06 06:01:59.517571 [DEBUG] http: Request POST /v1/agent/checks (857.02µs) from=127.0.0.1:46048
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/DELETE_/v1/agent/checks
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/06 06:01:59.521464 [ERR] http: Request DELETE /v1/agent/checks, error: method DELETE not allowed from=127.0.0.1:46050
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/06 06:01:59.522256 [DEBUG] http: Request DELETE /v1/agent/checks (791.018µs) from=127.0.0.1:46050
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/HEAD_/v1/agent/checks
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/06 06:01:59.526296 [ERR] http: Request HEAD /v1/agent/checks, error: method HEAD not allowed from=127.0.0.1:46052
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/06 06:01:59.526524 [DEBUG] http: Request HEAD /v1/agent/checks (246.006µs) from=127.0.0.1:46052
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/OPTIONS_/v1/agent/checks
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/06 06:01:59.528930 [DEBUG] http: Request OPTIONS /v1/agent/checks (17.001µs) from=127.0.0.1:46052
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/GET_/v1/agent/connect/ca/leaf/
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/06 06:01:59.531210 [ERR] http: Request GET /v1/agent/connect/ca/leaf/, error: Permission denied from=127.0.0.1:46052
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/06 06:01:59.532150 [DEBUG] http: Request GET /v1/agent/connect/ca/leaf/ (1.073358ms) from=127.0.0.1:46052
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/PUT_/v1/agent/connect/ca/leaf/
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/06 06:01:59.535982 [ERR] http: Request PUT /v1/agent/connect/ca/leaf/, error: method PUT not allowed from=127.0.0.1:46054
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/06 06:01:59.537443 [DEBUG] http: Request PUT /v1/agent/connect/ca/leaf/ (1.4337ms) from=127.0.0.1:46054
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/POST_/v1/agent/connect/ca/leaf/
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/06 06:01:59.541851 [ERR] http: Request POST /v1/agent/connect/ca/leaf/, error: method POST not allowed from=127.0.0.1:46056
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/06 06:01:59.542851 [DEBUG] http: Request POST /v1/agent/connect/ca/leaf/ (991.356µs) from=127.0.0.1:46056
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/DELETE_/v1/agent/connect/ca/leaf/
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/06 06:01:59.546912 [ERR] http: Request DELETE /v1/agent/connect/ca/leaf/, error: method DELETE not allowed from=127.0.0.1:46058
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/06 06:01:59.547705 [DEBUG] http: Request DELETE /v1/agent/connect/ca/leaf/ (777.684µs) from=127.0.0.1:46058
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/HEAD_/v1/agent/connect/ca/leaf/
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/06 06:01:59.551613 [ERR] http: Request HEAD /v1/agent/connect/ca/leaf/, error: method HEAD not allowed from=127.0.0.1:46060
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/06 06:01:59.551880 [DEBUG] http: Request HEAD /v1/agent/connect/ca/leaf/ (288.673µs) from=127.0.0.1:46060
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/OPTIONS_/v1/agent/connect/ca/leaf/
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/06 06:01:59.553890 [DEBUG] http: Request OPTIONS /v1/agent/connect/ca/leaf/ (21.333µs) from=127.0.0.1:46060
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/GET_/v1/agent/members
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/06 06:01:59.555886 [DEBUG] agent: dropping node "Node 3d434417-75ce-e300-2e9e-c74634b48698" from result due to ACLs
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/06 06:01:59.557106 [DEBUG] http: Request GET /v1/agent/members (1.366365ms) from=127.0.0.1:46060
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/PUT_/v1/agent/members
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/06 06:01:59.561003 [ERR] http: Request PUT /v1/agent/members, error: method PUT not allowed from=127.0.0.1:46062
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/06 06:01:59.561747 [DEBUG] http: Request PUT /v1/agent/members (749.351µs) from=127.0.0.1:46062
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/POST_/v1/agent/members
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/06 06:01:59.565848 [ERR] http: Request POST /v1/agent/members, error: method POST not allowed from=127.0.0.1:46064
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/06 06:01:59.566598 [DEBUG] http: Request POST /v1/agent/members (753.018µs) from=127.0.0.1:46064
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/DELETE_/v1/agent/members
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/06 06:01:59.570357 [ERR] http: Request DELETE /v1/agent/members, error: method DELETE not allowed from=127.0.0.1:46066
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/06 06:01:59.570993 [DEBUG] http: Request DELETE /v1/agent/members (624.348µs) from=127.0.0.1:46066
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/HEAD_/v1/agent/members
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/06 06:01:59.574570 [ERR] http: Request HEAD /v1/agent/members, error: method HEAD not allowed from=127.0.0.1:46068
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/06 06:01:59.574726 [DEBUG] http: Request HEAD /v1/agent/members (170.337µs) from=127.0.0.1:46068
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/OPTIONS_/v1/agent/members
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/06 06:01:59.576672 [DEBUG] http: Request OPTIONS /v1/agent/members (15.333µs) from=127.0.0.1:46068
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/GET_/v1/agent/check/pass/
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/06 06:01:59.578474 [ERR] http: Request GET /v1/agent/check/pass/, error: method GET not allowed from=127.0.0.1:46068
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/06 06:01:59.579424 [DEBUG] http: Request GET /v1/agent/check/pass/ (938.688µs) from=127.0.0.1:46068
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/PUT_/v1/agent/check/pass/
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/06 06:01:59.583532 [ERR] http: Request PUT /v1/agent/check/pass/, error: Unknown check "" from=127.0.0.1:46070
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/06 06:01:59.584316 [DEBUG] http: Request PUT /v1/agent/check/pass/ (899.688µs) from=127.0.0.1:46070
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/POST_/v1/agent/check/pass/
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/06 06:01:59.591352 [ERR] http: Request POST /v1/agent/check/pass/, error: method POST not allowed from=127.0.0.1:46072
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/06 06:01:59.594461 [DEBUG] http: Request POST /v1/agent/check/pass/ (876.687µs) from=127.0.0.1:46072
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/DELETE_/v1/agent/check/pass/
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/06 06:01:59.598422 [ERR] http: Request DELETE /v1/agent/check/pass/, error: method DELETE not allowed from=127.0.0.1:46074
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/06 06:01:59.599312 [DEBUG] http: Request DELETE /v1/agent/check/pass/ (803.685µs) from=127.0.0.1:46074
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/HEAD_/v1/agent/check/pass/
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/06 06:01:59.603620 [ERR] http: Request HEAD /v1/agent/check/pass/, error: method HEAD not allowed from=127.0.0.1:46076
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/06 06:01:59.603771 [DEBUG] http: Request HEAD /v1/agent/check/pass/ (170.337µs) from=127.0.0.1:46076
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/OPTIONS_/v1/agent/check/pass/
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/06 06:01:59.606330 [DEBUG] http: Request OPTIONS /v1/agent/check/pass/ (15µs) from=127.0.0.1:46076
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/GET_/v1/agent/connect/authorize
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/06 06:01:59.608517 [ERR] http: Request GET /v1/agent/connect/authorize, error: method GET not allowed from=127.0.0.1:46076
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/06 06:01:59.609147 [DEBUG] http: Request GET /v1/agent/connect/authorize (625.681µs) from=127.0.0.1:46076
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/PUT_/v1/agent/connect/authorize
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/06 06:01:59.613052 [ERR] http: Request PUT /v1/agent/connect/authorize, error: method PUT not allowed from=127.0.0.1:46078
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/06 06:01:59.613817 [DEBUG] http: Request PUT /v1/agent/connect/authorize (813.019µs) from=127.0.0.1:46078
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/POST_/v1/agent/connect/authorize
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/06 06:01:59.618513 [ERR] http: Request POST /v1/agent/connect/authorize, error: Bad request: Request decode failed: EOF from=127.0.0.1:46080
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/06 06:01:59.619286 [DEBUG] http: Request POST /v1/agent/connect/authorize (711.35µs) from=127.0.0.1:46080
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/DELETE_/v1/agent/connect/authorize
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/06 06:01:59.623338 [ERR] http: Request DELETE /v1/agent/connect/authorize, error: method DELETE not allowed from=127.0.0.1:46082
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/06 06:01:59.624126 [DEBUG] http: Request DELETE /v1/agent/connect/authorize (782.685µs) from=127.0.0.1:46082
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/HEAD_/v1/agent/connect/authorize
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/06 06:01:59.627708 [ERR] http: Request HEAD /v1/agent/connect/authorize, error: method HEAD not allowed from=127.0.0.1:46084
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/06 06:01:59.627943 [DEBUG] http: Request HEAD /v1/agent/connect/authorize (304.34µs) from=127.0.0.1:46084
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/OPTIONS_/v1/agent/connect/authorize
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/06 06:01:59.630018 [DEBUG] http: Request OPTIONS /v1/agent/connect/authorize (21µs) from=127.0.0.1:46084
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/GET_/v1/acl/bootstrap
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/06 06:01:59.631911 [ERR] http: Request GET /v1/acl/bootstrap, error: method GET not allowed from=127.0.0.1:46084
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/06 06:01:59.632678 [DEBUG] http: Request GET /v1/acl/bootstrap (833.686µs) from=127.0.0.1:46084
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/PUT_/v1/acl/bootstrap
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/06 06:01:59.637815 [DEBUG] http: Request PUT /v1/acl/bootstrap (861.687µs) from=127.0.0.1:46086
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/POST_/v1/acl/bootstrap
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/06 06:01:59.641926 [ERR] http: Request POST /v1/acl/bootstrap, error: method POST not allowed from=127.0.0.1:46088
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/06 06:01:59.642681 [DEBUG] http: Request POST /v1/acl/bootstrap (763.018µs) from=127.0.0.1:46088
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/DELETE_/v1/acl/bootstrap
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/06 06:01:59.646703 [ERR] http: Request DELETE /v1/acl/bootstrap, error: method DELETE not allowed from=127.0.0.1:46090
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/06 06:01:59.647540 [DEBUG] http: Request DELETE /v1/acl/bootstrap (837.353µs) from=127.0.0.1:46090
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/HEAD_/v1/acl/bootstrap
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/06 06:01:59.651067 [ERR] http: Request HEAD /v1/acl/bootstrap, error: method HEAD not allowed from=127.0.0.1:46092
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/06 06:01:59.651348 [DEBUG] http: Request HEAD /v1/acl/bootstrap (308.007µs) from=127.0.0.1:46092
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/OPTIONS_/v1/acl/bootstrap
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/06 06:01:59.653290 [DEBUG] http: Request OPTIONS /v1/acl/bootstrap (19µs) from=127.0.0.1:46092
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/GET_/v1/acl/login
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/06 06:01:59.655208 [ERR] http: Request GET /v1/acl/login, error: method GET not allowed from=127.0.0.1:46092
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/06 06:01:59.655940 [DEBUG] http: Request GET /v1/acl/login (730.351µs) from=127.0.0.1:46092
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/PUT_/v1/acl/login
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/06 06:01:59.660950 [ERR] http: Request PUT /v1/acl/login, error: method PUT not allowed from=127.0.0.1:46094
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/06 06:01:59.661742 [DEBUG] http: Request PUT /v1/acl/login (794.018µs) from=127.0.0.1:46094
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/POST_/v1/acl/login
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/06 06:01:59.665752 [ERR] http: Request POST /v1/acl/login, error: Bad request: Failed to decode request body:: EOF from=127.0.0.1:46096
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/06 06:01:59.666538 [DEBUG] http: Request POST /v1/acl/login (812.019µs) from=127.0.0.1:46096
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/DELETE_/v1/acl/login
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/06 06:01:59.671163 [ERR] http: Request DELETE /v1/acl/login, error: method DELETE not allowed from=127.0.0.1:46098
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/06 06:01:59.671949 [DEBUG] http: Request DELETE /v1/acl/login (773.018µs) from=127.0.0.1:46098
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/HEAD_/v1/acl/login
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/06 06:01:59.675929 [ERR] http: Request HEAD /v1/acl/login, error: method HEAD not allowed from=127.0.0.1:46100
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/06 06:01:59.676084 [DEBUG] http: Request HEAD /v1/acl/login (169.67µs) from=127.0.0.1:46100
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/OPTIONS_/v1/acl/login
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/06 06:01:59.677701 [DEBUG] http: Request OPTIONS /v1/acl/login (16µs) from=127.0.0.1:46100
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/GET_/v1/agent/join/
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/06 06:01:59.679466 [ERR] http: Request GET /v1/agent/join/, error: method GET not allowed from=127.0.0.1:46100
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/06 06:01:59.680252 [DEBUG] http: Request GET /v1/agent/join/ (1.048358ms) from=127.0.0.1:46100
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/PUT_/v1/agent/join/
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/06 06:01:59.686759 [ERR] http: Request PUT /v1/agent/join/, error: Permission denied from=127.0.0.1:46102
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/06 06:01:59.687911 [DEBUG] http: Request PUT /v1/agent/join/ (1.388699ms) from=127.0.0.1:46102
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/POST_/v1/agent/join/
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/06 06:01:59.692439 [ERR] http: Request POST /v1/agent/join/, error: method POST not allowed from=127.0.0.1:46104
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/06 06:01:59.693128 [DEBUG] http: Request POST /v1/agent/join/ (703.683µs) from=127.0.0.1:46104
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/DELETE_/v1/agent/join/
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/06 06:01:59.697184 [ERR] http: Request DELETE /v1/agent/join/, error: method DELETE not allowed from=127.0.0.1:46106
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/06 06:01:59.697996 [DEBUG] http: Request DELETE /v1/agent/join/ (815.352µs) from=127.0.0.1:46106
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/HEAD_/v1/agent/join/
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/06 06:01:59.701712 [ERR] http: Request HEAD /v1/agent/join/, error: method HEAD not allowed from=127.0.0.1:46108
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/06 06:01:59.701964 [DEBUG] http: Request HEAD /v1/agent/join/ (274.673µs) from=127.0.0.1:46108
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/OPTIONS_/v1/agent/join/
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/06 06:01:59.703906 [DEBUG] http: Request OPTIONS /v1/agent/join/ (31.334µs) from=127.0.0.1:46108
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/GET_/v1/agent/connect/ca/roots
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/06 06:01:59.707967 [DEBUG] http: Request GET /v1/agent/connect/ca/roots (1.981712ms) from=127.0.0.1:46108
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/PUT_/v1/agent/connect/ca/roots
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/06 06:01:59.713332 [ERR] http: Request PUT /v1/agent/connect/ca/roots, error: method PUT not allowed from=127.0.0.1:46110
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/06 06:01:59.714099 [DEBUG] http: Request PUT /v1/agent/connect/ca/roots (835.352µs) from=127.0.0.1:46110
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/POST_/v1/agent/connect/ca/roots
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/06 06:01:59.718790 [ERR] http: Request POST /v1/agent/connect/ca/roots, error: method POST not allowed from=127.0.0.1:46112
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/06 06:01:59.719583 [DEBUG] http: Request POST /v1/agent/connect/ca/roots (794.685µs) from=127.0.0.1:46112
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/DELETE_/v1/agent/connect/ca/roots
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/06 06:01:59.724026 [ERR] http: Request DELETE /v1/agent/connect/ca/roots, error: method DELETE not allowed from=127.0.0.1:46114
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/06 06:01:59.724907 [DEBUG] http: Request DELETE /v1/agent/connect/ca/roots (899.021µs) from=127.0.0.1:46114
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/HEAD_/v1/agent/connect/ca/roots
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/06 06:01:59.729366 [ERR] http: Request HEAD /v1/agent/connect/ca/roots, error: method HEAD not allowed from=127.0.0.1:46116
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/06 06:01:59.729592 [DEBUG] http: Request HEAD /v1/agent/connect/ca/roots (250.006µs) from=127.0.0.1:46116
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/OPTIONS_/v1/agent/connect/ca/roots
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/06 06:01:59.731950 [DEBUG] http: Request OPTIONS /v1/agent/connect/ca/roots (20µs) from=127.0.0.1:46116
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/GET_/v1/coordinate/nodes
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/06 06:01:59.735810 [DEBUG] http: Request GET /v1/coordinate/nodes (1.702706ms) from=127.0.0.1:46116
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/PUT_/v1/coordinate/nodes
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/06 06:01:59.742197 [ERR] http: Request PUT /v1/coordinate/nodes, error: method PUT not allowed from=127.0.0.1:46118
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/06 06:01:59.742939 [DEBUG] http: Request PUT /v1/coordinate/nodes (761.018µs) from=127.0.0.1:46118
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/POST_/v1/coordinate/nodes
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/06 06:01:59.746852 [ERR] http: Request POST /v1/coordinate/nodes, error: method POST not allowed from=127.0.0.1:46120
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/06 06:01:59.748033 [DEBUG] http: Request POST /v1/coordinate/nodes (1.185694ms) from=127.0.0.1:46120
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/DELETE_/v1/coordinate/nodes
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/06 06:01:59.752081 [ERR] http: Request DELETE /v1/coordinate/nodes, error: method DELETE not allowed from=127.0.0.1:46122
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/06 06:01:59.752657 [DEBUG] http: Request DELETE /v1/coordinate/nodes (584.013µs) from=127.0.0.1:46122
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/HEAD_/v1/coordinate/nodes
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/06 06:01:59.757098 [ERR] http: Request HEAD /v1/coordinate/nodes, error: method HEAD not allowed from=127.0.0.1:46124
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/06 06:01:59.757287 [DEBUG] http: Request HEAD /v1/coordinate/nodes (169.337µs) from=127.0.0.1:46124
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/OPTIONS_/v1/coordinate/nodes
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/06 06:01:59.764882 [DEBUG] http: Request OPTIONS /v1/coordinate/nodes (19.667µs) from=127.0.0.1:46124
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/GET_/v1/event/list
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/06 06:01:59.768122 [DEBUG] http: Request GET /v1/event/list (1.126026ms) from=127.0.0.1:46124
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/PUT_/v1/event/list
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/06 06:01:59.772010 [ERR] http: Request PUT /v1/event/list, error: method PUT not allowed from=127.0.0.1:46126
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/06 06:01:59.773474 [DEBUG] http: Request PUT /v1/event/list (1.404699ms) from=127.0.0.1:46126
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/POST_/v1/event/list
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/06 06:01:59.777907 [ERR] http: Request POST /v1/event/list, error: method POST not allowed from=127.0.0.1:46128
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/06 06:01:59.778713 [DEBUG] http: Request POST /v1/event/list (844.02µs) from=127.0.0.1:46128
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/DELETE_/v1/event/list
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/06 06:01:59.782811 [ERR] http: Request DELETE /v1/event/list, error: method DELETE not allowed from=127.0.0.1:46130
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/06 06:01:59.783551 [DEBUG] http: Request DELETE /v1/event/list (756.351µs) from=127.0.0.1:46130
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/HEAD_/v1/event/list
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/06 06:01:59.787884 [ERR] http: Request HEAD /v1/event/list, error: method HEAD not allowed from=127.0.0.1:46132
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/06 06:01:59.788190 [DEBUG] http: Request HEAD /v1/event/list (330.341µs) from=127.0.0.1:46132
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/OPTIONS_/v1/event/list
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/06 06:01:59.790346 [DEBUG] http: Request OPTIONS /v1/event/list (18.334µs) from=127.0.0.1:46132
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/GET_/v1/acl/clone/
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/06 06:01:59.792197 [ERR] http: Request GET /v1/acl/clone/, error: method GET not allowed from=127.0.0.1:46132
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/06 06:01:59.792957 [DEBUG] http: Request GET /v1/acl/clone/ (756.684µs) from=127.0.0.1:46132
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/PUT_/v1/acl/clone/
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/06 06:01:59.796994 [DEBUG] http: Request PUT /v1/acl/clone/ (502.012µs) from=127.0.0.1:46134
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/POST_/v1/acl/clone/
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/06 06:01:59.801359 [ERR] http: Request POST /v1/acl/clone/, error: method POST not allowed from=127.0.0.1:46136
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/06 06:01:59.802251 [DEBUG] http: Request POST /v1/acl/clone/ (880.354µs) from=127.0.0.1:46136
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/DELETE_/v1/acl/clone/
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/06 06:01:59.806269 [ERR] http: Request DELETE /v1/acl/clone/, error: method DELETE not allowed from=127.0.0.1:46138
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/06 06:01:59.807159 [DEBUG] http: Request DELETE /v1/acl/clone/ (814.686µs) from=127.0.0.1:46138
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/HEAD_/v1/acl/clone/
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/06 06:01:59.811886 [ERR] http: Request HEAD /v1/acl/clone/, error: method HEAD not allowed from=127.0.0.1:46140
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/06 06:01:59.812176 [DEBUG] http: Request HEAD /v1/acl/clone/ (312.007µs) from=127.0.0.1:46140
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/OPTIONS_/v1/acl/clone/
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/06 06:01:59.814310 [DEBUG] http: Request OPTIONS /v1/acl/clone/ (21.668µs) from=127.0.0.1:46140
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/GET_/v1/agent/health/service/id/
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/06 06:01:59.817037 [ERR] http: Request GET /v1/agent/health/service/id/, error: Bad request: Missing serviceID from=127.0.0.1:46140
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/06 06:01:59.817815 [DEBUG] http: Request GET /v1/agent/health/service/id/ (779.685µs) from=127.0.0.1:46140
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/PUT_/v1/agent/health/service/id/
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/06 06:01:59.821403 [ERR] http: Request PUT /v1/agent/health/service/id/, error: method PUT not allowed from=127.0.0.1:46142
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/06 06:01:59.822218 [DEBUG] http: Request PUT /v1/agent/health/service/id/ (868.353µs) from=127.0.0.1:46142
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/POST_/v1/agent/health/service/id/
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/06 06:01:59.870242 [ERR] http: Request POST /v1/agent/health/service/id/, error: method POST not allowed from=127.0.0.1:46144
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/06 06:01:59.871222 [DEBUG] http: Request POST /v1/agent/health/service/id/ (993.69µs) from=127.0.0.1:46144
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/DELETE_/v1/agent/health/service/id/
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/06 06:01:59.879531 [ERR] http: Request DELETE /v1/agent/health/service/id/, error: method DELETE not allowed from=127.0.0.1:46146
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/06 06:01:59.883614 [DEBUG] http: Request DELETE /v1/agent/health/service/id/ (4.078428ms) from=127.0.0.1:46146
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/HEAD_/v1/agent/health/service/id/
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/06 06:01:59.895442 [ERR] http: Request HEAD /v1/agent/health/service/id/, error: method HEAD not allowed from=127.0.0.1:46148
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/06 06:01:59.895775 [DEBUG] http: Request HEAD /v1/agent/health/service/id/ (346.675µs) from=127.0.0.1:46148
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/OPTIONS_/v1/agent/health/service/id/
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/06 06:01:59.897952 [DEBUG] http: Request OPTIONS /v1/agent/health/service/id/ (18.667µs) from=127.0.0.1:46148
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/GET_/v1/event/fire/
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/06 06:01:59.900436 [ERR] http: Request GET /v1/event/fire/, error: method GET not allowed from=127.0.0.1:46148
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/06 06:01:59.901258 [DEBUG] http: Request GET /v1/event/fire/ (834.686µs) from=127.0.0.1:46148
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/PUT_/v1/event/fire/
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/06 06:01:59.905653 [DEBUG] http: Request PUT /v1/event/fire/ (708.683µs) from=127.0.0.1:46150
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/POST_/v1/event/fire/
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/06 06:01:59.909873 [ERR] http: Request POST /v1/event/fire/, error: method POST not allowed from=127.0.0.1:46152
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/06 06:01:59.910899 [DEBUG] http: Request POST /v1/event/fire/ (1.026357ms) from=127.0.0.1:46152
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/DELETE_/v1/event/fire/
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/06 06:01:59.914721 [ERR] http: Request DELETE /v1/event/fire/, error: method DELETE not allowed from=127.0.0.1:46154
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/06 06:01:59.915665 [DEBUG] http: Request DELETE /v1/event/fire/ (955.022µs) from=127.0.0.1:46154
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/HEAD_/v1/event/fire/
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/06 06:01:59.919165 [ERR] http: Request HEAD /v1/event/fire/, error: method HEAD not allowed from=127.0.0.1:46156
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/06 06:01:59.919517 [DEBUG] http: Request HEAD /v1/event/fire/ (374.342µs) from=127.0.0.1:46156
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/OPTIONS_/v1/event/fire/
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/06 06:01:59.921376 [DEBUG] http: Request OPTIONS /v1/event/fire/ (18.334µs) from=127.0.0.1:46156
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/GET_/v1/session/list
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/06 06:01:59.924665 [DEBUG] http: Request GET /v1/session/list (1.509702ms) from=127.0.0.1:46156
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/PUT_/v1/session/list
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/06 06:01:59.937020 [ERR] http: Request PUT /v1/session/list, error: method PUT not allowed from=127.0.0.1:46158
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/06 06:01:59.938387 [DEBUG] http: Request PUT /v1/session/list (1.220361ms) from=127.0.0.1:46158
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/POST_/v1/session/list
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/06 06:01:59.942493 [ERR] http: Request POST /v1/session/list, error: method POST not allowed from=127.0.0.1:46160
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/06 06:01:59.943216 [DEBUG] http: Request POST /v1/session/list (729.684µs) from=127.0.0.1:46160
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/DELETE_/v1/session/list
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/06 06:01:59.946763 [ERR] http: Request DELETE /v1/session/list, error: method DELETE not allowed from=127.0.0.1:46162
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/06 06:01:59.947558 [DEBUG] http: Request DELETE /v1/session/list (800.018µs) from=127.0.0.1:46162
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/HEAD_/v1/session/list
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/06 06:01:59.950967 [ERR] http: Request HEAD /v1/session/list, error: method HEAD not allowed from=127.0.0.1:46164
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/06 06:01:59.951125 [DEBUG] http: Request HEAD /v1/session/list (175.671µs) from=127.0.0.1:46164
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/OPTIONS_/v1/session/list
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/06 06:01:59.952859 [DEBUG] http: Request OPTIONS /v1/session/list (18.667µs) from=127.0.0.1:46164
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/GET_/v1/acl/binding-rules
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/06 06:01:59.955107 [ERR] http: Request GET /v1/acl/binding-rules, error: Permission denied from=127.0.0.1:46164
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/06 06:01:59.955876 [DEBUG] http: Request GET /v1/acl/binding-rules (1.237696ms) from=127.0.0.1:46164
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/PUT_/v1/acl/binding-rules
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/06 06:01:59.959409 [ERR] http: Request PUT /v1/acl/binding-rules, error: method PUT not allowed from=127.0.0.1:46166
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/06 06:01:59.960849 [DEBUG] http: Request PUT /v1/acl/binding-rules (1.462701ms) from=127.0.0.1:46166
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/POST_/v1/acl/binding-rules
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/06 06:01:59.963914 [ERR] http: Request POST /v1/acl/binding-rules, error: method POST not allowed from=127.0.0.1:46168
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/06 06:01:59.964921 [DEBUG] http: Request POST /v1/acl/binding-rules (1.02169ms) from=127.0.0.1:46168
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/DELETE_/v1/acl/binding-rules
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/06 06:01:59.968775 [ERR] http: Request DELETE /v1/acl/binding-rules, error: method DELETE not allowed from=127.0.0.1:46170
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/06 06:01:59.969575 [DEBUG] http: Request DELETE /v1/acl/binding-rules (793.019µs) from=127.0.0.1:46170
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/HEAD_/v1/acl/binding-rules
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/06 06:01:59.972760 [ERR] http: Request HEAD /v1/acl/binding-rules, error: method HEAD not allowed from=127.0.0.1:46172
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/06 06:01:59.973104 [DEBUG] http: Request HEAD /v1/acl/binding-rules (385.342µs) from=127.0.0.1:46172
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/OPTIONS_/v1/acl/binding-rules
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/06 06:01:59.975130 [DEBUG] http: Request OPTIONS /v1/acl/binding-rules (18.333µs) from=127.0.0.1:46172
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/GET_/v1/agent/service/maintenance/
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/06 06:01:59.976987 [ERR] http: Request GET /v1/agent/service/maintenance/, error: method GET not allowed from=127.0.0.1:46172
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/06 06:01:59.977875 [DEBUG] http: Request GET /v1/agent/service/maintenance/ (881.354µs) from=127.0.0.1:46172
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/PUT_/v1/agent/service/maintenance/
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/06 06:01:59.982024 [DEBUG] http: Request PUT /v1/agent/service/maintenance/ (538.346µs) from=127.0.0.1:46174
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/POST_/v1/agent/service/maintenance/
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/06 06:01:59.987473 [ERR] http: Request POST /v1/agent/service/maintenance/, error: method POST not allowed from=127.0.0.1:46176
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/06 06:01:59.988201 [DEBUG] http: Request POST /v1/agent/service/maintenance/ (747.684µs) from=127.0.0.1:46176
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/DELETE_/v1/agent/service/maintenance/
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/06 06:01:59.991561 [ERR] http: Request DELETE /v1/agent/service/maintenance/, error: method DELETE not allowed from=127.0.0.1:46178
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/06 06:01:59.992349 [DEBUG] http: Request DELETE /v1/agent/service/maintenance/ (808.019µs) from=127.0.0.1:46178
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/HEAD_/v1/agent/service/maintenance/
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/06 06:01:59.995907 [ERR] http: Request HEAD /v1/agent/service/maintenance/, error: method HEAD not allowed from=127.0.0.1:46180
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/06 06:01:59.996063 [DEBUG] http: Request HEAD /v1/agent/service/maintenance/ (228.005µs) from=127.0.0.1:46180
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/OPTIONS_/v1/agent/service/maintenance/
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/06 06:01:59.998263 [DEBUG] http: Request OPTIONS /v1/agent/service/maintenance/ (20.334µs) from=127.0.0.1:46180
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/GET_/v1/health/checks/
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/06 06:02:00.000827 [DEBUG] http: Request GET /v1/health/checks/ (638.015µs) from=127.0.0.1:46180
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/PUT_/v1/health/checks/
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/06 06:02:00.005024 [ERR] http: Request PUT /v1/health/checks/, error: method PUT not allowed from=127.0.0.1:46182
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/06 06:02:00.005686 [DEBUG] http: Request PUT /v1/health/checks/ (689.016µs) from=127.0.0.1:46182
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/POST_/v1/health/checks/
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/06 06:02:00.008976 [ERR] http: Request POST /v1/health/checks/, error: method POST not allowed from=127.0.0.1:46184
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/06 06:02:00.009921 [DEBUG] http: Request POST /v1/health/checks/ (961.356µs) from=127.0.0.1:46184
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/DELETE_/v1/health/checks/
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/06 06:02:00.013992 [ERR] http: Request DELETE /v1/health/checks/, error: method DELETE not allowed from=127.0.0.1:46186
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/06 06:02:00.014943 [DEBUG] http: Request DELETE /v1/health/checks/ (954.022µs) from=127.0.0.1:46186
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/HEAD_/v1/health/checks/
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/06 06:02:00.018462 [ERR] http: Request HEAD /v1/health/checks/, error: method HEAD not allowed from=127.0.0.1:46188
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/06 06:02:00.018813 [DEBUG] http: Request HEAD /v1/health/checks/ (370.675µs) from=127.0.0.1:46188
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/OPTIONS_/v1/health/checks/
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/06 06:02:00.020584 [DEBUG] http: Request OPTIONS /v1/health/checks/ (17.001µs) from=127.0.0.1:46188
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/GET_/v1/internal/ui/node/
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/06 06:02:00.022752 [DEBUG] http: Request GET /v1/internal/ui/node/ (578.68µs) from=127.0.0.1:46188
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/PUT_/v1/internal/ui/node/
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/06 06:02:00.029745 [ERR] http: Request PUT /v1/internal/ui/node/, error: method PUT not allowed from=127.0.0.1:46190
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/06 06:02:00.031040 [DEBUG] http: Request PUT /v1/internal/ui/node/ (1.240696ms) from=127.0.0.1:46190
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/POST_/v1/internal/ui/node/
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/06 06:02:00.034680 [ERR] http: Request POST /v1/internal/ui/node/, error: method POST not allowed from=127.0.0.1:46192
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/06 06:02:00.035583 [DEBUG] http: Request POST /v1/internal/ui/node/ (925.021µs) from=127.0.0.1:46192
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/DELETE_/v1/internal/ui/node/
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/06 06:02:00.040629 [ERR] http: Request DELETE /v1/internal/ui/node/, error: method DELETE not allowed from=127.0.0.1:46194
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/06 06:02:00.041357 [DEBUG] http: Request DELETE /v1/internal/ui/node/ (724.35µs) from=127.0.0.1:46194
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/HEAD_/v1/internal/ui/node/
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/06 06:02:00.044841 [ERR] http: Request HEAD /v1/internal/ui/node/, error: method HEAD not allowed from=127.0.0.1:46196
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/06 06:02:00.044993 [DEBUG] http: Request HEAD /v1/internal/ui/node/ (172.004µs) from=127.0.0.1:46196
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/OPTIONS_/v1/internal/ui/node/
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/06 06:02:00.047443 [DEBUG] http: Request OPTIONS /v1/internal/ui/node/ (19.667µs) from=127.0.0.1:46196
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/GET_/v1/acl/info/
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/06 06:02:00.049952 [DEBUG] http: Request GET /v1/acl/info/ (831.352µs) from=127.0.0.1:46196
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/PUT_/v1/acl/info/
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/06 06:02:00.053320 [ERR] http: Request PUT /v1/acl/info/, error: method PUT not allowed from=127.0.0.1:46198
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/06 06:02:00.054017 [DEBUG] http: Request PUT /v1/acl/info/ (757.017µs) from=127.0.0.1:46198
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/POST_/v1/acl/info/
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/06 06:02:00.057827 [ERR] http: Request POST /v1/acl/info/, error: method POST not allowed from=127.0.0.1:46200
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/06 06:02:00.058879 [DEBUG] http: Request POST /v1/acl/info/ (1.043024ms) from=127.0.0.1:46200
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/DELETE_/v1/acl/info/
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/06 06:02:00.063982 [ERR] http: Request DELETE /v1/acl/info/, error: method DELETE not allowed from=127.0.0.1:46202
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/06 06:02:00.064727 [DEBUG] http: Request DELETE /v1/acl/info/ (736.684µs) from=127.0.0.1:46202
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/HEAD_/v1/acl/info/
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/06 06:02:00.068530 [ERR] http: Request HEAD /v1/acl/info/, error: method HEAD not allowed from=127.0.0.1:46204
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/06 06:02:00.068894 [DEBUG] http: Request HEAD /v1/acl/info/ (387.676µs) from=127.0.0.1:46204
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/OPTIONS_/v1/acl/info/
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/06 06:02:00.070869 [DEBUG] http: Request OPTIONS /v1/acl/info/ (15.001µs) from=127.0.0.1:46204
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/GET_/v1/catalog/service/
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/06 06:02:00.073376 [DEBUG] http: Request GET /v1/catalog/service/ (586.68µs) from=127.0.0.1:46204
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/PUT_/v1/catalog/service/
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/06 06:02:00.077224 [ERR] http: Request PUT /v1/catalog/service/, error: method PUT not allowed from=127.0.0.1:46206
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/06 06:02:00.078048 [DEBUG] http: Request PUT /v1/catalog/service/ (825.352µs) from=127.0.0.1:46206
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/POST_/v1/catalog/service/
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/06 06:02:00.081996 [ERR] http: Request POST /v1/catalog/service/, error: method POST not allowed from=127.0.0.1:46208
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/06 06:02:00.082841 [DEBUG] http: Request POST /v1/catalog/service/ (850.019µs) from=127.0.0.1:46208
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/DELETE_/v1/catalog/service/
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/06 06:02:00.086902 [ERR] http: Request DELETE /v1/catalog/service/, error: method DELETE not allowed from=127.0.0.1:46210
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/06 06:02:00.087896 [DEBUG] http: Request DELETE /v1/catalog/service/ (992.023µs) from=127.0.0.1:46210
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/HEAD_/v1/catalog/service/
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/06 06:02:00.091938 [ERR] http: Request HEAD /v1/catalog/service/, error: method HEAD not allowed from=127.0.0.1:46212
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/06 06:02:00.092244 [DEBUG] http: Request HEAD /v1/catalog/service/ (341.008µs) from=127.0.0.1:46212
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/OPTIONS_/v1/catalog/service/
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/06 06:02:00.094056 [DEBUG] http: Request OPTIONS /v1/catalog/service/ (16.667µs) from=127.0.0.1:46212
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/GET_/v1/operator/autopilot/health
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/06 06:02:00.096398 [ERR] http: Request GET /v1/operator/autopilot/health, error: Permission denied from=127.0.0.1:46212
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/06 06:02:00.097278 [DEBUG] http: Request GET /v1/operator/autopilot/health (1.349032ms) from=127.0.0.1:46212
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/PUT_/v1/operator/autopilot/health
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/06 06:02:00.103183 [ERR] http: Request PUT /v1/operator/autopilot/health, error: method PUT not allowed from=127.0.0.1:46214
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/06 06:02:00.103979 [DEBUG] http: Request PUT /v1/operator/autopilot/health (799.686µs) from=127.0.0.1:46214
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/POST_/v1/operator/autopilot/health
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/06 06:02:00.107880 [ERR] http: Request POST /v1/operator/autopilot/health, error: method POST not allowed from=127.0.0.1:46216
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/06 06:02:00.108901 [DEBUG] http: Request POST /v1/operator/autopilot/health (1.01669ms) from=127.0.0.1:46216
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/DELETE_/v1/operator/autopilot/health
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/06 06:02:00.112917 [ERR] http: Request DELETE /v1/operator/autopilot/health, error: method DELETE not allowed from=127.0.0.1:46218
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/06 06:02:00.113761 [DEBUG] http: Request DELETE /v1/operator/autopilot/health (828.686µs) from=127.0.0.1:46218
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/HEAD_/v1/operator/autopilot/health
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/06 06:02:00.117580 [ERR] http: Request HEAD /v1/operator/autopilot/health, error: method HEAD not allowed from=127.0.0.1:46220
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/06 06:02:00.117740 [DEBUG] http: Request HEAD /v1/operator/autopilot/health (182.671µs) from=127.0.0.1:46220
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/OPTIONS_/v1/operator/autopilot/health
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/06 06:02:00.119745 [DEBUG] http: Request OPTIONS /v1/operator/autopilot/health (19µs) from=127.0.0.1:46220
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/GET_/v1/session/destroy/
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/06 06:02:00.121694 [ERR] http: Request GET /v1/session/destroy/, error: method GET not allowed from=127.0.0.1:46220
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/06 06:02:00.122509 [DEBUG] http: Request GET /v1/session/destroy/ (828.019µs) from=127.0.0.1:46220
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/PUT_/v1/session/destroy/
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/06 06:02:00.127056 [DEBUG] http: Request PUT /v1/session/destroy/ (620.348µs) from=127.0.0.1:46222
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/POST_/v1/session/destroy/
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/06 06:02:00.130659 [ERR] http: Request POST /v1/session/destroy/, error: method POST not allowed from=127.0.0.1:46224
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/06 06:02:00.131463 [DEBUG] http: Request POST /v1/session/destroy/ (823.019µs) from=127.0.0.1:46224
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/DELETE_/v1/session/destroy/
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/06 06:02:00.135691 [ERR] http: Request DELETE /v1/session/destroy/, error: method DELETE not allowed from=127.0.0.1:46226
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/06 06:02:00.136541 [DEBUG] http: Request DELETE /v1/session/destroy/ (860.687µs) from=127.0.0.1:46226
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/HEAD_/v1/session/destroy/
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/06 06:02:00.140935 [ERR] http: Request HEAD /v1/session/destroy/, error: method HEAD not allowed from=127.0.0.1:46228
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/06 06:02:00.141303 [DEBUG] http: Request HEAD /v1/session/destroy/ (544.346µs) from=127.0.0.1:46228
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/OPTIONS_/v1/session/destroy/
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/06 06:02:00.143647 [DEBUG] http: Request OPTIONS /v1/session/destroy/ (19.334µs) from=127.0.0.1:46228
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/GET_/v1/session/node/
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/06 06:02:00.146038 [DEBUG] http: Request GET /v1/session/node/ (572.014µs) from=127.0.0.1:46228
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/PUT_/v1/session/node/
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/06 06:02:00.149927 [ERR] http: Request PUT /v1/session/node/, error: method PUT not allowed from=127.0.0.1:46230
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/06 06:02:00.150839 [DEBUG] http: Request PUT /v1/session/node/ (904.354µs) from=127.0.0.1:46230
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/POST_/v1/session/node/
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/06 06:02:00.154251 [ERR] http: Request POST /v1/session/node/, error: method POST not allowed from=127.0.0.1:46232
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/06 06:02:00.155059 [DEBUG] http: Request POST /v1/session/node/ (866.687µs) from=127.0.0.1:46232
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/DELETE_/v1/session/node/
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/06 06:02:00.159338 [ERR] http: Request DELETE /v1/session/node/, error: method DELETE not allowed from=127.0.0.1:46234
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/06 06:02:00.160308 [DEBUG] http: Request DELETE /v1/session/node/ (970.356µs) from=127.0.0.1:46234
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/HEAD_/v1/session/node/
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/06 06:02:00.164537 [ERR] http: Request HEAD /v1/session/node/, error: method HEAD not allowed from=127.0.0.1:46236
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/06 06:02:00.164823 [DEBUG] http: Request HEAD /v1/session/node/ (359.341µs) from=127.0.0.1:46236
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/OPTIONS_/v1/session/node/
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/06 06:02:00.166575 [DEBUG] http: Request OPTIONS /v1/session/node/ (89.002µs) from=127.0.0.1:46236
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/GET_/v1/acl/replication
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/06 06:02:00.169924 [DEBUG] http: Request GET /v1/acl/replication (1.290363ms) from=127.0.0.1:46236
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/PUT_/v1/acl/replication
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/06 06:02:00.173336 [ERR] http: Request PUT /v1/acl/replication, error: method PUT not allowed from=127.0.0.1:46238
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/06 06:02:00.174017 [DEBUG] http: Request PUT /v1/acl/replication (700.683µs) from=127.0.0.1:46238
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/POST_/v1/acl/replication
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/06 06:02:00.177681 [ERR] http: Request POST /v1/acl/replication, error: method POST not allowed from=127.0.0.1:46240
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/06 06:02:00.178379 [DEBUG] http: Request POST /v1/acl/replication (716.684µs) from=127.0.0.1:46240
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/DELETE_/v1/acl/replication
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/06 06:02:00.181852 [ERR] http: Request DELETE /v1/acl/replication, error: method DELETE not allowed from=127.0.0.1:46242
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/06 06:02:00.182582 [DEBUG] http: Request DELETE /v1/acl/replication (727.017µs) from=127.0.0.1:46242
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/HEAD_/v1/acl/replication
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/06 06:02:00.190639 [ERR] http: Request HEAD /v1/acl/replication, error: method HEAD not allowed from=127.0.0.1:46244
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/06 06:02:00.190861 [DEBUG] http: Request HEAD /v1/acl/replication (248.006µs) from=127.0.0.1:46244
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/OPTIONS_/v1/acl/replication
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/06 06:02:00.193248 [DEBUG] http: Request OPTIONS /v1/acl/replication (20.667µs) from=127.0.0.1:46244
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/GET_/v1/internal/ui/services
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/06 06:02:00.196855 [DEBUG] consul: dropping node "Node 3d434417-75ce-e300-2e9e-c74634b48698" from result due to ACLs
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/06 06:02:00.198305 [DEBUG] http: Request GET /v1/internal/ui/services (2.241718ms) from=127.0.0.1:46244
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/PUT_/v1/internal/ui/services
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/06 06:02:00.202817 [ERR] http: Request PUT /v1/internal/ui/services, error: method PUT not allowed from=127.0.0.1:46246
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/06 06:02:00.204714 [DEBUG] http: Request PUT /v1/internal/ui/services (1.873377ms) from=127.0.0.1:46246
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/POST_/v1/internal/ui/services
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/06 06:02:00.209158 [ERR] http: Request POST /v1/internal/ui/services, error: method POST not allowed from=127.0.0.1:46248
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/06 06:02:00.209889 [DEBUG] http: Request POST /v1/internal/ui/services (743.017µs) from=127.0.0.1:46248
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/DELETE_/v1/internal/ui/services
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/06 06:02:00.213402 [ERR] http: Request DELETE /v1/internal/ui/services, error: method DELETE not allowed from=127.0.0.1:46250
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/06 06:02:00.213964 [DEBUG] http: Request DELETE /v1/internal/ui/services (578.68µs) from=127.0.0.1:46250
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/HEAD_/v1/internal/ui/services
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/06 06:02:00.217654 [ERR] http: Request HEAD /v1/internal/ui/services, error: method HEAD not allowed from=127.0.0.1:46252
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/06 06:02:00.217800 [DEBUG] http: Request HEAD /v1/internal/ui/services (159.004µs) from=127.0.0.1:46252
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/OPTIONS_/v1/internal/ui/services
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/06 06:02:00.220281 [DEBUG] http: Request OPTIONS /v1/internal/ui/services (15.334µs) from=127.0.0.1:46252
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/GET_/v1/catalog/connect/
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/06 06:02:00.222527 [DEBUG] http: Request GET /v1/catalog/connect/ (466.677µs) from=127.0.0.1:46252
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/PUT_/v1/catalog/connect/
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/06 06:02:00.225515 [ERR] http: Request PUT /v1/catalog/connect/, error: method PUT not allowed from=127.0.0.1:46254
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/06 06:02:00.226084 [DEBUG] http: Request PUT /v1/catalog/connect/ (583.347µs) from=127.0.0.1:46254
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/POST_/v1/catalog/connect/
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/06 06:02:00.229864 [ERR] http: Request POST /v1/catalog/connect/, error: method POST not allowed from=127.0.0.1:46256
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/06 06:02:00.230512 [DEBUG] http: Request POST /v1/catalog/connect/ (644.015µs) from=127.0.0.1:46256
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/DELETE_/v1/catalog/connect/
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/06 06:02:00.233606 [ERR] http: Request DELETE /v1/catalog/connect/, error: method DELETE not allowed from=127.0.0.1:46258
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/06 06:02:00.234106 [DEBUG] http: Request DELETE /v1/catalog/connect/ (507.679µs) from=127.0.0.1:46258
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/HEAD_/v1/catalog/connect/
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/06 06:02:00.237146 [ERR] http: Request HEAD /v1/catalog/connect/, error: method HEAD not allowed from=127.0.0.1:46260
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/06 06:02:00.237415 [DEBUG] http: Request HEAD /v1/catalog/connect/ (285.007µs) from=127.0.0.1:46260
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/OPTIONS_/v1/catalog/connect/
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/06 06:02:00.238957 [DEBUG] http: Request OPTIONS /v1/catalog/connect/ (16.667µs) from=127.0.0.1:46260
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/GET_/v1/acl/policy
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/06 06:02:00.240598 [ERR] http: Request GET /v1/acl/policy, error: method GET not allowed from=127.0.0.1:46260
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/06 06:02:00.241320 [DEBUG] http: Request GET /v1/acl/policy (724.35µs) from=127.0.0.1:46260
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/PUT_/v1/acl/policy
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/06 06:02:00.244703 [ERR] http: Request PUT /v1/acl/policy, error: Bad request: Policy decoding failed: EOF from=127.0.0.1:46262
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/06 06:02:00.245912 [DEBUG] http: Request PUT /v1/acl/policy (1.27403ms) from=127.0.0.1:46262
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/POST_/v1/acl/policy
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/06 06:02:00.249328 [ERR] http: Request POST /v1/acl/policy, error: method POST not allowed from=127.0.0.1:46264
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/06 06:02:00.249975 [DEBUG] http: Request POST /v1/acl/policy (693.016µs) from=127.0.0.1:46264
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/DELETE_/v1/acl/policy
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/06 06:02:00.253411 [ERR] http: Request DELETE /v1/acl/policy, error: method DELETE not allowed from=127.0.0.1:46266
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/06 06:02:00.254099 [DEBUG] http: Request DELETE /v1/acl/policy (683.683µs) from=127.0.0.1:46266
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/HEAD_/v1/acl/policy
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/06 06:02:00.257637 [ERR] http: Request HEAD /v1/acl/policy, error: method HEAD not allowed from=127.0.0.1:46268
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/06 06:02:00.257841 [DEBUG] http: Request HEAD /v1/acl/policy (278.673µs) from=127.0.0.1:46268
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/OPTIONS_/v1/acl/policy
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/06 06:02:00.259773 [DEBUG] http: Request OPTIONS /v1/acl/policy (17.334µs) from=127.0.0.1:46268
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/GET_/v1/acl/token/
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/06 06:02:00.261603 [ERR] http: Request GET /v1/acl/token/, error: Bad request: Missing token ID from=127.0.0.1:46268
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/06 06:02:00.262144 [DEBUG] http: Request GET /v1/acl/token/ (559.346µs) from=127.0.0.1:46268
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/PUT_/v1/acl/token/
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/06 06:02:00.265810 [ERR] http: Request PUT /v1/acl/token/, error: Bad request: Token decoding failed: EOF from=127.0.0.1:46270
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/06 06:02:00.266507 [DEBUG] http: Request PUT /v1/acl/token/ (754.351µs) from=127.0.0.1:46270
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/POST_/v1/acl/token/
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/06 06:02:00.271273 [ERR] http: Request POST /v1/acl/token/, error: method POST not allowed from=127.0.0.1:46272
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/06 06:02:00.271996 [DEBUG] http: Request POST /v1/acl/token/ (717.35µs) from=127.0.0.1:46272
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/DELETE_/v1/acl/token/
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/06 06:02:00.275424 [ERR] http: Request DELETE /v1/acl/token/, error: Bad request: Missing token ID from=127.0.0.1:46274
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/06 06:02:00.275952 [DEBUG] http: Request DELETE /v1/acl/token/ (539.012µs) from=127.0.0.1:46274
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/HEAD_/v1/acl/token/
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/06 06:02:00.278810 [ERR] http: Request HEAD /v1/acl/token/, error: method HEAD not allowed from=127.0.0.1:46276
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/06 06:02:00.278953 [DEBUG] http: Request HEAD /v1/acl/token/ (158.337µs) from=127.0.0.1:46276
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/OPTIONS_/v1/acl/token/
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/06 06:02:00.280456 [DEBUG] http: Request OPTIONS /v1/acl/token/ (17.334µs) from=127.0.0.1:46276
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/GET_/v1/agent/check/warn/
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/06 06:02:00.282062 [ERR] http: Request GET /v1/agent/check/warn/, error: method GET not allowed from=127.0.0.1:46276
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/06 06:02:00.282732 [DEBUG] http: Request GET /v1/agent/check/warn/ (665.682µs) from=127.0.0.1:46276
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/PUT_/v1/agent/check/warn/
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/06 06:02:00.286074 [ERR] http: Request PUT /v1/agent/check/warn/, error: Unknown check "" from=127.0.0.1:46278
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/06 06:02:00.286562 [DEBUG] http: Request PUT /v1/agent/check/warn/ (629.015µs) from=127.0.0.1:46278
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/POST_/v1/agent/check/warn/
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/06 06:02:00.289837 [ERR] http: Request POST /v1/agent/check/warn/, error: method POST not allowed from=127.0.0.1:46280
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/06 06:02:00.290514 [DEBUG] http: Request POST /v1/agent/check/warn/ (677.349µs) from=127.0.0.1:46280
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/DELETE_/v1/agent/check/warn/
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/06 06:02:00.293500 [ERR] http: Request DELETE /v1/agent/check/warn/, error: method DELETE not allowed from=127.0.0.1:46282
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/06 06:02:00.294166 [DEBUG] http: Request DELETE /v1/agent/check/warn/ (670.682µs) from=127.0.0.1:46282
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/HEAD_/v1/agent/check/warn/
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/06 06:02:00.297977 [ERR] http: Request HEAD /v1/agent/check/warn/, error: method HEAD not allowed from=127.0.0.1:46284
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/06 06:02:00.298199 [DEBUG] http: Request HEAD /v1/agent/check/warn/ (238.005µs) from=127.0.0.1:46284
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/OPTIONS_/v1/agent/check/warn/
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/06 06:02:00.300017 [DEBUG] http: Request OPTIONS /v1/agent/check/warn/ (14µs) from=127.0.0.1:46284
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/GET_/v1/agent/service/deregister/
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/06 06:02:00.301832 [ERR] http: Request GET /v1/agent/service/deregister/, error: method GET not allowed from=127.0.0.1:46284
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/06 06:02:00.302486 [DEBUG] http: Request GET /v1/agent/service/deregister/ (646.014µs) from=127.0.0.1:46284
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/PUT_/v1/agent/service/deregister/
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/06 06:02:00.305752 [ERR] http: Request PUT /v1/agent/service/deregister/, error: Unknown service "" from=127.0.0.1:46286
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/06 06:02:00.306268 [DEBUG] http: Request PUT /v1/agent/service/deregister/ (669.682µs) from=127.0.0.1:46286
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/POST_/v1/agent/service/deregister/
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/06 06:02:00.309815 [ERR] http: Request POST /v1/agent/service/deregister/, error: method POST not allowed from=127.0.0.1:46288
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/06 06:02:00.310343 [DEBUG] http: Request POST /v1/agent/service/deregister/ (533.346µs) from=127.0.0.1:46288
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/DELETE_/v1/agent/service/deregister/
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/06 06:02:00.333056 [ERR] http: Request DELETE /v1/agent/service/deregister/, error: method DELETE not allowed from=127.0.0.1:46290
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/06 06:02:00.333826 [DEBUG] http: Request DELETE /v1/agent/service/deregister/ (760.685µs) from=127.0.0.1:46290
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/HEAD_/v1/agent/service/deregister/
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/06 06:02:00.337214 [ERR] http: Request HEAD /v1/agent/service/deregister/, error: method HEAD not allowed from=127.0.0.1:46292
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/06 06:02:00.337374 [DEBUG] http: Request HEAD /v1/agent/service/deregister/ (181.004µs) from=127.0.0.1:46292
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/OPTIONS_/v1/agent/service/deregister/
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/06 06:02:00.338967 [DEBUG] http: Request OPTIONS /v1/agent/service/deregister/ (16.334µs) from=127.0.0.1:46292
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/GET_/v1/catalog/register
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/06 06:02:00.340607 [ERR] http: Request GET /v1/catalog/register, error: method GET not allowed from=127.0.0.1:46292
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/06 06:02:00.341162 [DEBUG] http: Request GET /v1/catalog/register (555.013µs) from=127.0.0.1:46292
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/PUT_/v1/catalog/register
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/06 06:02:00.346456 [DEBUG] http: Request PUT /v1/catalog/register (479.344µs) from=127.0.0.1:46294
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/POST_/v1/catalog/register
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/06 06:02:00.350778 [ERR] http: Request POST /v1/catalog/register, error: method POST not allowed from=127.0.0.1:46296
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/06 06:02:00.351371 [DEBUG] http: Request POST /v1/catalog/register (598.68µs) from=127.0.0.1:46296
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/DELETE_/v1/catalog/register
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/06 06:02:00.354823 [ERR] http: Request DELETE /v1/catalog/register, error: method DELETE not allowed from=127.0.0.1:46298
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/06 06:02:00.355376 [DEBUG] http: Request DELETE /v1/catalog/register (561.68µs) from=127.0.0.1:46298
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/HEAD_/v1/catalog/register
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/06 06:02:00.358469 [ERR] http: Request HEAD /v1/catalog/register, error: method HEAD not allowed from=127.0.0.1:46300
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/06 06:02:00.358614 [DEBUG] http: Request HEAD /v1/catalog/register (155.004µs) from=127.0.0.1:46300
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/OPTIONS_/v1/catalog/register
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/06 06:02:00.360388 [DEBUG] http: Request OPTIONS /v1/catalog/register (16µs) from=127.0.0.1:46300
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/GET_/v1/health/connect/
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/06 06:02:00.362215 [DEBUG] http: Request GET /v1/health/connect/ (470.677µs) from=127.0.0.1:46300
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/PUT_/v1/health/connect/
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/06 06:02:00.365467 [ERR] http: Request PUT /v1/health/connect/, error: method PUT not allowed from=127.0.0.1:46302
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/06 06:02:00.366177 [DEBUG] http: Request PUT /v1/health/connect/ (711.016µs) from=127.0.0.1:46302
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/POST_/v1/health/connect/
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/06 06:02:00.369462 [ERR] http: Request POST /v1/health/connect/, error: method POST not allowed from=127.0.0.1:46304
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/06 06:02:00.369990 [DEBUG] http: Request POST /v1/health/connect/ (540.012µs) from=127.0.0.1:46304
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/DELETE_/v1/health/connect/
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/06 06:02:00.372833 [ERR] http: Request DELETE /v1/health/connect/, error: method DELETE not allowed from=127.0.0.1:46306
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/06 06:02:00.373574 [DEBUG] http: Request DELETE /v1/health/connect/ (740.017µs) from=127.0.0.1:46306
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/HEAD_/v1/health/connect/
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/06 06:02:00.376979 [ERR] http: Request HEAD /v1/health/connect/, error: method HEAD not allowed from=127.0.0.1:46308
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/06 06:02:00.377119 [DEBUG] http: Request HEAD /v1/health/connect/ (158.003µs) from=127.0.0.1:46308
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/OPTIONS_/v1/health/connect/
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/06 06:02:00.378858 [DEBUG] http: Request OPTIONS /v1/health/connect/ (16µs) from=127.0.0.1:46308
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/GET_/v1/internal/ui/nodes
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/06 06:02:00.380895 [DEBUG] consul: dropping node "Node 3d434417-75ce-e300-2e9e-c74634b48698" from result due to ACLs
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/06 06:02:00.384718 [DEBUG] http: Request GET /v1/internal/ui/nodes (4.326767ms) from=127.0.0.1:46308
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/PUT_/v1/internal/ui/nodes
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/06 06:02:00.389303 [ERR] http: Request PUT /v1/internal/ui/nodes, error: method PUT not allowed from=127.0.0.1:46310
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/06 06:02:00.390045 [DEBUG] http: Request PUT /v1/internal/ui/nodes (748.684µs) from=127.0.0.1:46310
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/POST_/v1/internal/ui/nodes
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/06 06:02:00.394031 [ERR] http: Request POST /v1/internal/ui/nodes, error: method POST not allowed from=127.0.0.1:46312
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/06 06:02:00.394793 [DEBUG] http: Request POST /v1/internal/ui/nodes (746.017µs) from=127.0.0.1:46312
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/DELETE_/v1/internal/ui/nodes
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/06 06:02:00.398866 [ERR] http: Request DELETE /v1/internal/ui/nodes, error: method DELETE not allowed from=127.0.0.1:46314
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/06 06:02:00.399709 [DEBUG] http: Request DELETE /v1/internal/ui/nodes (838.686µs) from=127.0.0.1:46314
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/HEAD_/v1/internal/ui/nodes
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/06 06:02:00.403477 [ERR] http: Request HEAD /v1/internal/ui/nodes, error: method HEAD not allowed from=127.0.0.1:46316
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/06 06:02:00.403621 [DEBUG] http: Request HEAD /v1/internal/ui/nodes (165.003µs) from=127.0.0.1:46316
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/OPTIONS_/v1/internal/ui/nodes
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/06 06:02:00.405649 [DEBUG] http: Request OPTIONS /v1/internal/ui/nodes (16µs) from=127.0.0.1:46316
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/GET_/v1/session/create
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/06 06:02:00.407305 [ERR] http: Request GET /v1/session/create, error: method GET not allowed from=127.0.0.1:46316
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/06 06:02:00.407861 [DEBUG] http: Request GET /v1/session/create (562.013µs) from=127.0.0.1:46316
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/PUT_/v1/session/create
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/06 06:02:00.413348 [ERR] http: Request PUT /v1/session/create, error: Permission denied from=127.0.0.1:46318
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/06 06:02:00.413910 [DEBUG] http: Request PUT /v1/session/create (991.689µs) from=127.0.0.1:46318
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/POST_/v1/session/create
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/06 06:02:00.416837 [ERR] http: Request POST /v1/session/create, error: method POST not allowed from=127.0.0.1:46320
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/06 06:02:00.417457 [DEBUG] http: Request POST /v1/session/create (667.682µs) from=127.0.0.1:46320
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/DELETE_/v1/session/create
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/06 06:02:00.420363 [ERR] http: Request DELETE /v1/session/create, error: method DELETE not allowed from=127.0.0.1:46322
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/06 06:02:00.420986 [DEBUG] http: Request DELETE /v1/session/create (634.015µs) from=127.0.0.1:46322
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/HEAD_/v1/session/create
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/06 06:02:00.424506 [ERR] http: Request HEAD /v1/session/create, error: method HEAD not allowed from=127.0.0.1:46324
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/06 06:02:00.424660 [DEBUG] http: Request HEAD /v1/session/create (169.67µs) from=127.0.0.1:46324
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/OPTIONS_/v1/session/create
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/06 06:02:00.426254 [DEBUG] http: Request OPTIONS /v1/session/create (17µs) from=127.0.0.1:46324
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/GET_/v1/agent/check/deregister/
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/06 06:02:00.427675 [ERR] http: Request GET /v1/agent/check/deregister/, error: method GET not allowed from=127.0.0.1:46324
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/06 06:02:00.428275 [DEBUG] http: Request GET /v1/agent/check/deregister/ (607.681µs) from=127.0.0.1:46324
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/PUT_/v1/agent/check/deregister/
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/06 06:02:00.432384 [ERR] http: Request PUT /v1/agent/check/deregister/, error: Unknown check "" from=127.0.0.1:46326
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/06 06:02:00.433002 [DEBUG] http: Request PUT /v1/agent/check/deregister/ (781.351µs) from=127.0.0.1:46326
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/POST_/v1/agent/check/deregister/
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/06 06:02:00.436902 [ERR] http: Request POST /v1/agent/check/deregister/, error: method POST not allowed from=127.0.0.1:46328
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/06 06:02:00.437487 [DEBUG] http: Request POST /v1/agent/check/deregister/ (587.68µs) from=127.0.0.1:46328
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/DELETE_/v1/agent/check/deregister/
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/06 06:02:00.440465 [ERR] http: Request DELETE /v1/agent/check/deregister/, error: method DELETE not allowed from=127.0.0.1:46330
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/06 06:02:00.441083 [DEBUG] http: Request DELETE /v1/agent/check/deregister/ (628.347µs) from=127.0.0.1:46330
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/HEAD_/v1/agent/check/deregister/
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/06 06:02:00.448079 [ERR] http: Request HEAD /v1/agent/check/deregister/, error: method HEAD not allowed from=127.0.0.1:46332
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/06 06:02:00.448280 [DEBUG] http: Request HEAD /v1/agent/check/deregister/ (229.339µs) from=127.0.0.1:46332
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/OPTIONS_/v1/agent/check/deregister/
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/06 06:02:00.450737 [DEBUG] http: Request OPTIONS /v1/agent/check/deregister/ (20.334µs) from=127.0.0.1:46332
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/GET_/v1/catalog/services
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/06 06:02:00.453422 [DEBUG] consul: dropping service "consul" from result due to ACLs
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/06 06:02:00.455176 [DEBUG] http: Request GET /v1/catalog/services (2.442724ms) from=127.0.0.1:46332
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/PUT_/v1/catalog/services
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/06 06:02:00.463451 [ERR] http: Request PUT /v1/catalog/services, error: method PUT not allowed from=127.0.0.1:46334
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/06 06:02:00.466261 [DEBUG] http: Request PUT /v1/catalog/services (2.810065ms) from=127.0.0.1:46334
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/POST_/v1/catalog/services
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/06 06:02:00.470513 [ERR] http: Request POST /v1/catalog/services, error: method POST not allowed from=127.0.0.1:46336
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/06 06:02:00.472106 [DEBUG] http: Request POST /v1/catalog/services (1.536369ms) from=127.0.0.1:46336
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/DELETE_/v1/catalog/services
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/06 06:02:00.477403 [ERR] http: Request DELETE /v1/catalog/services, error: method DELETE not allowed from=127.0.0.1:46338
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/06 06:02:00.479101 [DEBUG] http: Request DELETE /v1/catalog/services (1.776708ms) from=127.0.0.1:46338
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/HEAD_/v1/catalog/services
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/06 06:02:00.483775 [ERR] http: Request HEAD /v1/catalog/services, error: method HEAD not allowed from=127.0.0.1:46340
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/06 06:02:00.484168 [DEBUG] http: Request HEAD /v1/catalog/services (501.345µs) from=127.0.0.1:46340
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/OPTIONS_/v1/catalog/services
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/06 06:02:00.487090 [DEBUG] http: Request OPTIONS /v1/catalog/services (20.001µs) from=127.0.0.1:46340
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/GET_/v1/session/renew/
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/06 06:02:00.489250 [ERR] http: Request GET /v1/session/renew/, error: method GET not allowed from=127.0.0.1:46340
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/06 06:02:00.490075 [DEBUG] http: Request GET /v1/session/renew/ (814.353µs) from=127.0.0.1:46340
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/PUT_/v1/session/renew/
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/06 06:02:00.498037 [DEBUG] http: Request PUT /v1/session/renew/ (1.350698ms) from=127.0.0.1:46342
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/POST_/v1/session/renew/
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/06 06:02:00.502426 [ERR] http: Request POST /v1/session/renew/, error: method POST not allowed from=127.0.0.1:46344
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/06 06:02:00.504071 [DEBUG] http: Request POST /v1/session/renew/ (1.654705ms) from=127.0.0.1:46344
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/DELETE_/v1/session/renew/
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/06 06:02:00.508624 [ERR] http: Request DELETE /v1/session/renew/, error: method DELETE not allowed from=127.0.0.1:46346
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/06 06:02:00.510700 [DEBUG] http: Request DELETE /v1/session/renew/ (2.075381ms) from=127.0.0.1:46346
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/HEAD_/v1/session/renew/
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/06 06:02:00.515692 [ERR] http: Request HEAD /v1/session/renew/, error: method HEAD not allowed from=127.0.0.1:46348
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/06 06:02:00.515882 [DEBUG] http: Request HEAD /v1/session/renew/ (216.338µs) from=127.0.0.1:46348
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/OPTIONS_/v1/session/renew/
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/06 06:02:00.519285 [DEBUG] http: Request OPTIONS /v1/session/renew/ (21.334µs) from=127.0.0.1:46348
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/GET_/v1/acl/update
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/06 06:02:00.520920 [ERR] http: Request GET /v1/acl/update, error: method GET not allowed from=127.0.0.1:46348
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/06 06:02:00.521616 [DEBUG] http: Request GET /v1/acl/update (695.016µs) from=127.0.0.1:46348
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/PUT_/v1/acl/update
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/06 06:02:00.526624 [DEBUG] http: Request PUT /v1/acl/update (561.68µs) from=127.0.0.1:46350
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/POST_/v1/acl/update
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/06 06:02:00.534896 [ERR] http: Request POST /v1/acl/update, error: method POST not allowed from=127.0.0.1:46352
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/06 06:02:00.535408 [DEBUG] http: Request POST /v1/acl/update (514.678µs) from=127.0.0.1:46352
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/DELETE_/v1/acl/update
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/06 06:02:00.539505 [ERR] http: Request DELETE /v1/acl/update, error: method DELETE not allowed from=127.0.0.1:46354
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/06 06:02:00.540330 [DEBUG] http: Request DELETE /v1/acl/update (799.685µs) from=127.0.0.1:46354
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/HEAD_/v1/acl/update
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/06 06:02:00.543485 [ERR] http: Request HEAD /v1/acl/update, error: method HEAD not allowed from=127.0.0.1:46356
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/06 06:02:00.543650 [DEBUG] http: Request HEAD /v1/acl/update (186.671µs) from=127.0.0.1:46356
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/OPTIONS_/v1/acl/update
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/06 06:02:00.545057 [DEBUG] http: Request OPTIONS /v1/acl/update (15.334µs) from=127.0.0.1:46356
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/GET_/v1/agent/self
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/06 06:02:00.546593 [ERR] http: Request GET /v1/agent/self, error: Permission denied from=127.0.0.1:46356
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/06 06:02:00.547014 [DEBUG] http: Request GET /v1/agent/self (545.013µs) from=127.0.0.1:46356
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/PUT_/v1/agent/self
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/06 06:02:00.549800 [ERR] http: Request PUT /v1/agent/self, error: method PUT not allowed from=127.0.0.1:46358
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/06 06:02:00.550251 [DEBUG] http: Request PUT /v1/agent/self (459.677µs) from=127.0.0.1:46358
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/POST_/v1/agent/self
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/06 06:02:00.553130 [ERR] http: Request POST /v1/agent/self, error: method POST not allowed from=127.0.0.1:46360
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/06 06:02:00.553714 [DEBUG] http: Request POST /v1/agent/self (590.68µs) from=127.0.0.1:46360
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/DELETE_/v1/agent/self
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/06 06:02:00.556484 [ERR] http: Request DELETE /v1/agent/self, error: method DELETE not allowed from=127.0.0.1:46362
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/06 06:02:00.557020 [DEBUG] http: Request DELETE /v1/agent/self (542.346µs) from=127.0.0.1:46362
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/HEAD_/v1/agent/self
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/06 06:02:00.559711 [ERR] http: Request HEAD /v1/agent/self, error: method HEAD not allowed from=127.0.0.1:46364
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/06 06:02:00.559850 [DEBUG] http: Request HEAD /v1/agent/self (159.337µs) from=127.0.0.1:46364
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/OPTIONS_/v1/agent/self
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/06 06:02:00.561239 [DEBUG] http: Request OPTIONS /v1/agent/self (13.333µs) from=127.0.0.1:46364
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/GET_/v1/agent/leave
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/06 06:02:00.562422 [ERR] http: Request GET /v1/agent/leave, error: method GET not allowed from=127.0.0.1:46364
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/06 06:02:00.562972 [DEBUG] http: Request GET /v1/agent/leave (550.012µs) from=127.0.0.1:46364
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/PUT_/v1/agent/leave
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/06 06:02:00.566444 [ERR] http: Request PUT /v1/agent/leave, error: Permission denied from=127.0.0.1:46366
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/06 06:02:00.566979 [DEBUG] http: Request PUT /v1/agent/leave (698.016µs) from=127.0.0.1:46366
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/POST_/v1/agent/leave
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/06 06:02:00.570837 [ERR] http: Request POST /v1/agent/leave, error: method POST not allowed from=127.0.0.1:46368
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/06 06:02:00.571413 [DEBUG] http: Request POST /v1/agent/leave (587.347µs) from=127.0.0.1:46368
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/DELETE_/v1/agent/leave
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/06 06:02:00.574621 [ERR] http: Request DELETE /v1/agent/leave, error: method DELETE not allowed from=127.0.0.1:46370
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/06 06:02:00.575288 [DEBUG] http: Request DELETE /v1/agent/leave (673.016µs) from=127.0.0.1:46370
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/HEAD_/v1/agent/leave
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/06 06:02:00.579049 [ERR] http: Request HEAD /v1/agent/leave, error: method HEAD not allowed from=127.0.0.1:46372
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/06 06:02:00.579237 [DEBUG] http: Request HEAD /v1/agent/leave (160.004µs) from=127.0.0.1:46372
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/OPTIONS_/v1/agent/leave
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/06 06:02:00.580934 [DEBUG] http: Request OPTIONS /v1/agent/leave (15.333µs) from=127.0.0.1:46372
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/GET_/v1/connect/intentions
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/06 06:02:00.583744 [DEBUG] http: Request GET /v1/connect/intentions (1.378698ms) from=127.0.0.1:46372
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/PUT_/v1/connect/intentions
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/06 06:02:00.587362 [ERR] http: Request PUT /v1/connect/intentions, error: method PUT not allowed from=127.0.0.1:46374
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/06 06:02:00.588006 [DEBUG] http: Request PUT /v1/connect/intentions (656.015µs) from=127.0.0.1:46374
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/POST_/v1/connect/intentions
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/06 06:02:00.593958 [ERR] http: Request POST /v1/connect/intentions, error: Failed to decode request body: EOF from=127.0.0.1:46376
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/06 06:02:00.594591 [DEBUG] http: Request POST /v1/connect/intentions (669.016µs) from=127.0.0.1:46376
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/DELETE_/v1/connect/intentions
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/06 06:02:00.599361 [ERR] http: Request DELETE /v1/connect/intentions, error: method DELETE not allowed from=127.0.0.1:46378
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/06 06:02:00.600010 [DEBUG] http: Request DELETE /v1/connect/intentions (655.015µs) from=127.0.0.1:46378
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/HEAD_/v1/connect/intentions
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/06 06:02:00.603024 [ERR] http: Request HEAD /v1/connect/intentions, error: method HEAD not allowed from=127.0.0.1:46380
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/06 06:02:00.603168 [DEBUG] http: Request HEAD /v1/connect/intentions (217.672µs) from=127.0.0.1:46380
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/OPTIONS_/v1/connect/intentions
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/06 06:02:00.604928 [DEBUG] http: Request OPTIONS /v1/connect/intentions (15.334µs) from=127.0.0.1:46380
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/GET_/v1/health/node/
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/06 06:02:00.607420 [DEBUG] http: Request GET /v1/health/node/ (429.677µs) from=127.0.0.1:46380
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/PUT_/v1/health/node/
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/06 06:02:00.611206 [ERR] http: Request PUT /v1/health/node/, error: method PUT not allowed from=127.0.0.1:46382
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/06 06:02:00.611890 [DEBUG] http: Request PUT /v1/health/node/ (585.68µs) from=127.0.0.1:46382
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/POST_/v1/health/node/
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/06 06:02:00.617134 [ERR] http: Request POST /v1/health/node/, error: method POST not allowed from=127.0.0.1:46384
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/06 06:02:00.617793 [DEBUG] http: Request POST /v1/health/node/ (655.682µs) from=127.0.0.1:46384
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/DELETE_/v1/health/node/
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/06 06:02:00.622484 [ERR] http: Request DELETE /v1/health/node/, error: method DELETE not allowed from=127.0.0.1:46386
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/06 06:02:00.623044 [DEBUG] http: Request DELETE /v1/health/node/ (563.346µs) from=127.0.0.1:46386
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/HEAD_/v1/health/node/
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/06 06:02:00.626147 [ERR] http: Request HEAD /v1/health/node/, error: method HEAD not allowed from=127.0.0.1:46388
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/06 06:02:00.626473 [DEBUG] http: Request HEAD /v1/health/node/ (349.675µs) from=127.0.0.1:46388
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/OPTIONS_/v1/health/node/
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/06 06:02:00.628262 [DEBUG] http: Request OPTIONS /v1/health/node/ (15.001µs) from=127.0.0.1:46388
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/GET_/v1/health/service/
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/06 06:02:00.630210 [DEBUG] http: Request GET /v1/health/service/ (421.677µs) from=127.0.0.1:46388
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/PUT_/v1/health/service/
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/06 06:02:00.633509 [ERR] http: Request PUT /v1/health/service/, error: method PUT not allowed from=127.0.0.1:46390
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/06 06:02:00.634075 [DEBUG] http: Request PUT /v1/health/service/ (567.68µs) from=127.0.0.1:46390
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/POST_/v1/health/service/
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/06 06:02:00.637550 [ERR] http: Request POST /v1/health/service/, error: method POST not allowed from=127.0.0.1:46392
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/06 06:02:00.638253 [DEBUG] http: Request POST /v1/health/service/ (700.683µs) from=127.0.0.1:46392
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/DELETE_/v1/health/service/
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/06 06:02:00.641308 [ERR] http: Request DELETE /v1/health/service/, error: method DELETE not allowed from=127.0.0.1:46394
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/06 06:02:00.641890 [DEBUG] http: Request DELETE /v1/health/service/ (580.013µs) from=127.0.0.1:46394
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/HEAD_/v1/health/service/
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/06 06:02:00.644956 [ERR] http: Request HEAD /v1/health/service/, error: method HEAD not allowed from=127.0.0.1:46396
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/06 06:02:00.645095 [DEBUG] http: Request HEAD /v1/health/service/ (155.004µs) from=127.0.0.1:46396
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/OPTIONS_/v1/health/service/
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/06 06:02:00.647444 [DEBUG] http: Request OPTIONS /v1/health/service/ (17.667µs) from=127.0.0.1:46396
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/GET_/v1/agent/token/
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/06 06:02:00.648987 [ERR] http: Request GET /v1/agent/token/, error: method GET not allowed from=127.0.0.1:46396
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/06 06:02:00.649615 [DEBUG] http: Request GET /v1/agent/token/ (606.681µs) from=127.0.0.1:46396
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/PUT_/v1/agent/token/
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/06 06:02:00.655634 [ERR] http: Request PUT /v1/agent/token/, error: Permission denied from=127.0.0.1:46398
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/06 06:02:00.656384 [DEBUG] http: Request PUT /v1/agent/token/ (885.354µs) from=127.0.0.1:46398
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/POST_/v1/agent/token/
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/06 06:02:00.659746 [ERR] http: Request POST /v1/agent/token/, error: method POST not allowed from=127.0.0.1:46400
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/06 06:02:00.660475 [DEBUG] http: Request POST /v1/agent/token/ (711.35µs) from=127.0.0.1:46400
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/DELETE_/v1/agent/token/
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/06 06:02:00.663714 [ERR] http: Request DELETE /v1/agent/token/, error: method DELETE not allowed from=127.0.0.1:46402
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/06 06:02:00.664596 [DEBUG] http: Request DELETE /v1/agent/token/ (894.354µs) from=127.0.0.1:46402
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/HEAD_/v1/agent/token/
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/06 06:02:00.692736 [ERR] http: Request HEAD /v1/agent/token/, error: method HEAD not allowed from=127.0.0.1:46404
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/06 06:02:00.692887 [DEBUG] http: Request HEAD /v1/agent/token/ (169.338µs) from=127.0.0.1:46404
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/OPTIONS_/v1/agent/token/
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/06 06:02:00.695394 [DEBUG] http: Request OPTIONS /v1/agent/token/ (14µs) from=127.0.0.1:46404
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/GET_/v1/config
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/06 06:02:00.697287 [ERR] http: Request GET /v1/config, error: method GET not allowed from=127.0.0.1:46404
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/06 06:02:00.697927 [DEBUG] http: Request GET /v1/config (643.348µs) from=127.0.0.1:46404
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/PUT_/v1/config
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/06 06:02:00.701113 [ERR] http: Request PUT /v1/config, error: Bad request: Request decoding failed: EOF from=127.0.0.1:46406
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/06 06:02:00.701693 [DEBUG] http: Request PUT /v1/config (611.347µs) from=127.0.0.1:46406
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/POST_/v1/config
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/06 06:02:00.704819 [ERR] http: Request POST /v1/config, error: method POST not allowed from=127.0.0.1:46408
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/06 06:02:00.705374 [DEBUG] http: Request POST /v1/config (563.013µs) from=127.0.0.1:46408
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/DELETE_/v1/config
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/06 06:02:00.708369 [ERR] http: Request DELETE /v1/config, error: method DELETE not allowed from=127.0.0.1:46410
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/06 06:02:00.708859 [DEBUG] http: Request DELETE /v1/config (508.012µs) from=127.0.0.1:46410
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/HEAD_/v1/config
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/06 06:02:00.712927 [ERR] http: Request HEAD /v1/config, error: method HEAD not allowed from=127.0.0.1:46412
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/06 06:02:00.713076 [DEBUG] http: Request HEAD /v1/config (162.671µs) from=127.0.0.1:46412
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/OPTIONS_/v1/config
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/06 06:02:00.714685 [DEBUG] http: Request OPTIONS /v1/config (13.667µs) from=127.0.0.1:46412
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/GET_/v1/connect/ca/configuration
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/06 06:02:00.718574 [ERR] http: Request GET /v1/connect/ca/configuration, error: Permission denied from=127.0.0.1:46412
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/06 06:02:00.719257 [DEBUG] http: Request GET /v1/connect/ca/configuration (1.056691ms) from=127.0.0.1:46412
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/PUT_/v1/connect/ca/configuration
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/06 06:02:00.723667 [DEBUG] http: Request PUT /v1/connect/ca/configuration (454.677µs) from=127.0.0.1:46414
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/POST_/v1/connect/ca/configuration
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/06 06:02:00.726872 [ERR] http: Request POST /v1/connect/ca/configuration, error: method POST not allowed from=127.0.0.1:46416
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/06 06:02:00.727797 [DEBUG] http: Request POST /v1/connect/ca/configuration (921.021µs) from=127.0.0.1:46416
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/DELETE_/v1/connect/ca/configuration
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/06 06:02:00.731025 [ERR] http: Request DELETE /v1/connect/ca/configuration, error: method DELETE not allowed from=127.0.0.1:46418
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/06 06:02:00.731896 [DEBUG] http: Request DELETE /v1/connect/ca/configuration (876.687µs) from=127.0.0.1:46418
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/HEAD_/v1/connect/ca/configuration
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/06 06:02:00.734961 [ERR] http: Request HEAD /v1/connect/ca/configuration, error: method HEAD not allowed from=127.0.0.1:46420
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/06 06:02:00.735106 [DEBUG] http: Request HEAD /v1/connect/ca/configuration (161.671µs) from=127.0.0.1:46420
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/OPTIONS_/v1/connect/ca/configuration
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/06 06:02:00.736488 [DEBUG] http: Request OPTIONS /v1/connect/ca/configuration (12.667µs) from=127.0.0.1:46420
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/06 06:02:00.737015 [INFO] agent: Requesting shutdown
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/06 06:02:00.737072 [INFO] consul: shutting down server
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/06 06:02:00.737118 [WARN] serf: Shutdown without a Leave
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/06 06:02:01.066736 [WARN] serf: Shutdown without a Leave
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/06 06:02:01.166929 [INFO] manager: shutting down
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/06 06:02:01.167697 [INFO] agent: consul server down
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/06 06:02:01.167751 [INFO] agent: shutdown complete
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/06 06:02:01.167818 [INFO] agent: Stopping DNS server 127.0.0.1:34133 (tcp)
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/06 06:02:01.167948 [INFO] agent: Stopping DNS server 127.0.0.1:34133 (udp)
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/06 06:02:01.168095 [INFO] agent: Stopping HTTP server 127.0.0.1:34134 (tcp)
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/06 06:02:01.168510 [INFO] agent: Waiting for endpoints to shut down
TestHTTPAPI_MethodNotAllowed_OSS - 2019/12/06 06:02:01.168634 [INFO] agent: Endpoints down
--- PASS: TestHTTPAPI_MethodNotAllowed_OSS (8.16s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/GET_/v1/query (0.03s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/PUT_/v1/query (0.02s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/POST_/v1/query (0.02s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/DELETE_/v1/query (0.03s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/HEAD_/v1/query (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/OPTIONS_/v1/query (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/GET_/v1/query/ (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/PUT_/v1/query/ (0.01s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/POST_/v1/query/ (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/DELETE_/v1/query/ (0.01s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/HEAD_/v1/query/ (0.01s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/OPTIONS_/v1/query/ (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/GET_/v1/query/xxx/execute (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/PUT_/v1/query/xxx/execute (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/POST_/v1/query/xxx/execute (0.01s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/DELETE_/v1/query/xxx/execute (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/HEAD_/v1/query/xxx/execute (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/OPTIONS_/v1/query/xxx/execute (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/GET_/v1/query/xxx/explain (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/PUT_/v1/query/xxx/explain (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/POST_/v1/query/xxx/explain (0.01s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/DELETE_/v1/query/xxx/explain (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/HEAD_/v1/query/xxx/explain (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/OPTIONS_/v1/query/xxx/explain (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/GET_/v1/agent/services (0.01s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/PUT_/v1/agent/services (0.01s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/POST_/v1/agent/services (0.01s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/DELETE_/v1/agent/services (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/HEAD_/v1/agent/services (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/OPTIONS_/v1/agent/services (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/GET_/v1/agent/health/service/name/ (0.01s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/PUT_/v1/agent/health/service/name/ (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/POST_/v1/agent/health/service/name/ (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/DELETE_/v1/agent/health/service/name/ (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/HEAD_/v1/agent/health/service/name/ (0.01s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/OPTIONS_/v1/agent/health/service/name/ (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/GET_/v1/status/leader (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/PUT_/v1/status/leader (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/POST_/v1/status/leader (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/DELETE_/v1/status/leader (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/HEAD_/v1/status/leader (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/OPTIONS_/v1/status/leader (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/GET_/v1/catalog/datacenters (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/PUT_/v1/catalog/datacenters (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/POST_/v1/catalog/datacenters (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/DELETE_/v1/catalog/datacenters (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/HEAD_/v1/catalog/datacenters (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/OPTIONS_/v1/catalog/datacenters (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/GET_/v1/connect/intentions/match (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/PUT_/v1/connect/intentions/match (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/POST_/v1/connect/intentions/match (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/DELETE_/v1/connect/intentions/match (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/HEAD_/v1/connect/intentions/match (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/OPTIONS_/v1/connect/intentions/match (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/GET_/v1/acl/destroy/ (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/PUT_/v1/acl/destroy/ (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/POST_/v1/acl/destroy/ (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/DELETE_/v1/acl/destroy/ (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/HEAD_/v1/acl/destroy/ (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/OPTIONS_/v1/acl/destroy/ (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/GET_/v1/acl/list (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/PUT_/v1/acl/list (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/POST_/v1/acl/list (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/DELETE_/v1/acl/list (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/HEAD_/v1/acl/list (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/OPTIONS_/v1/acl/list (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/GET_/v1/acl/rules/translate (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/PUT_/v1/acl/rules/translate (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/POST_/v1/acl/rules/translate (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/DELETE_/v1/acl/rules/translate (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/HEAD_/v1/acl/rules/translate (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/OPTIONS_/v1/acl/rules/translate (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/GET_/v1/agent/metrics (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/PUT_/v1/agent/metrics (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/POST_/v1/agent/metrics (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/DELETE_/v1/agent/metrics (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/HEAD_/v1/agent/metrics (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/OPTIONS_/v1/agent/metrics (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/GET_/v1/agent/service/ (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/PUT_/v1/agent/service/ (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/POST_/v1/agent/service/ (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/DELETE_/v1/agent/service/ (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/HEAD_/v1/agent/service/ (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/OPTIONS_/v1/agent/service/ (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/GET_/v1/agent/service/register (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/PUT_/v1/agent/service/register (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/POST_/v1/agent/service/register (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/DELETE_/v1/agent/service/register (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/HEAD_/v1/agent/service/register (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/OPTIONS_/v1/agent/service/register (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/GET_/v1/snapshot (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/PUT_/v1/snapshot (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/POST_/v1/snapshot (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/DELETE_/v1/snapshot (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/HEAD_/v1/snapshot (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/OPTIONS_/v1/snapshot (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/GET_/v1/acl/logout (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/PUT_/v1/acl/logout (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/POST_/v1/acl/logout (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/DELETE_/v1/acl/logout (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/HEAD_/v1/acl/logout (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/OPTIONS_/v1/acl/logout (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/GET_/v1/acl/create (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/PUT_/v1/acl/create (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/POST_/v1/acl/create (0.01s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/DELETE_/v1/acl/create (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/HEAD_/v1/acl/create (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/OPTIONS_/v1/acl/create (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/GET_/v1/acl/binding-rule (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/PUT_/v1/acl/binding-rule (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/POST_/v1/acl/binding-rule (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/DELETE_/v1/acl/binding-rule (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/HEAD_/v1/acl/binding-rule (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/OPTIONS_/v1/acl/binding-rule (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/GET_/v1/connect/intentions/check (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/PUT_/v1/connect/intentions/check (0.01s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/POST_/v1/connect/intentions/check (0.01s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/DELETE_/v1/connect/intentions/check (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/HEAD_/v1/connect/intentions/check (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/OPTIONS_/v1/connect/intentions/check (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/GET_/v1/acl/policy/ (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/PUT_/v1/acl/policy/ (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/POST_/v1/acl/policy/ (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/DELETE_/v1/acl/policy/ (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/HEAD_/v1/acl/policy/ (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/OPTIONS_/v1/acl/policy/ (0.01s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/GET_/v1/acl/binding-rule/ (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/PUT_/v1/acl/binding-rule/ (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/POST_/v1/acl/binding-rule/ (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/DELETE_/v1/acl/binding-rule/ (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/HEAD_/v1/acl/binding-rule/ (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/OPTIONS_/v1/acl/binding-rule/ (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/GET_/v1/catalog/nodes (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/PUT_/v1/catalog/nodes (0.01s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/POST_/v1/catalog/nodes (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/DELETE_/v1/catalog/nodes (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/HEAD_/v1/catalog/nodes (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/OPTIONS_/v1/catalog/nodes (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/GET_/v1/catalog/node/ (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/PUT_/v1/catalog/node/ (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/POST_/v1/catalog/node/ (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/DELETE_/v1/catalog/node/ (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/HEAD_/v1/catalog/node/ (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/OPTIONS_/v1/catalog/node/ (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/GET_/v1/acl/tokens (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/PUT_/v1/acl/tokens (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/POST_/v1/acl/tokens (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/DELETE_/v1/acl/tokens (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/HEAD_/v1/acl/tokens (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/OPTIONS_/v1/acl/tokens (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/GET_/v1/agent/maintenance (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/PUT_/v1/agent/maintenance (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/POST_/v1/agent/maintenance (0.01s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/DELETE_/v1/agent/maintenance (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/HEAD_/v1/agent/maintenance (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/OPTIONS_/v1/agent/maintenance (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/GET_/v1/agent/check/register (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/PUT_/v1/agent/check/register (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/POST_/v1/agent/check/register (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/DELETE_/v1/agent/check/register (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/HEAD_/v1/agent/check/register (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/OPTIONS_/v1/agent/check/register (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/GET_/v1/config/ (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/PUT_/v1/config/ (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/POST_/v1/config/ (0.01s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/DELETE_/v1/config/ (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/HEAD_/v1/config/ (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/OPTIONS_/v1/config/ (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/GET_/v1/connect/intentions/ (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/PUT_/v1/connect/intentions/ (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/POST_/v1/connect/intentions/ (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/DELETE_/v1/connect/intentions/ (0.01s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/HEAD_/v1/connect/intentions/ (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/OPTIONS_/v1/connect/intentions/ (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/GET_/v1/acl/role/ (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/PUT_/v1/acl/role/ (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/POST_/v1/acl/role/ (0.01s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/DELETE_/v1/acl/role/ (0.01s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/HEAD_/v1/acl/role/ (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/OPTIONS_/v1/acl/role/ (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/GET_/v1/acl/auth-methods (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/PUT_/v1/acl/auth-methods (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/POST_/v1/acl/auth-methods (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/DELETE_/v1/acl/auth-methods (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/HEAD_/v1/acl/auth-methods (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/OPTIONS_/v1/acl/auth-methods (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/GET_/v1/acl/token (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/PUT_/v1/acl/token (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/POST_/v1/acl/token (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/DELETE_/v1/acl/token (0.01s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/HEAD_/v1/acl/token (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/OPTIONS_/v1/acl/token (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/GET_/v1/agent/connect/proxy/ (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/PUT_/v1/agent/connect/proxy/ (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/POST_/v1/agent/connect/proxy/ (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/DELETE_/v1/agent/connect/proxy/ (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/HEAD_/v1/agent/connect/proxy/ (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/OPTIONS_/v1/agent/connect/proxy/ (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/GET_/v1/operator/raft/configuration (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/PUT_/v1/operator/raft/configuration (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/POST_/v1/operator/raft/configuration (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/DELETE_/v1/operator/raft/configuration (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/HEAD_/v1/operator/raft/configuration (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/OPTIONS_/v1/operator/raft/configuration (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/GET_/v1/operator/keyring (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/PUT_/v1/operator/keyring (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/POST_/v1/operator/keyring (0.01s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/DELETE_/v1/operator/keyring (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/HEAD_/v1/operator/keyring (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/OPTIONS_/v1/operator/keyring (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/GET_/v1/acl/auth-method (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/PUT_/v1/acl/auth-method (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/POST_/v1/acl/auth-method (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/DELETE_/v1/acl/auth-method (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/HEAD_/v1/acl/auth-method (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/OPTIONS_/v1/acl/auth-method (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/GET_/v1/acl/token/self (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/PUT_/v1/acl/token/self (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/POST_/v1/acl/token/self (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/DELETE_/v1/acl/token/self (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/HEAD_/v1/acl/token/self (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/OPTIONS_/v1/acl/token/self (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/GET_/v1/coordinate/datacenters (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/PUT_/v1/coordinate/datacenters (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/POST_/v1/coordinate/datacenters (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/DELETE_/v1/coordinate/datacenters (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/HEAD_/v1/coordinate/datacenters (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/OPTIONS_/v1/coordinate/datacenters (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/GET_/v1/coordinate/update (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/PUT_/v1/coordinate/update (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/POST_/v1/coordinate/update (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/DELETE_/v1/coordinate/update (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/HEAD_/v1/coordinate/update (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/OPTIONS_/v1/coordinate/update (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/GET_/v1/health/state/ (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/PUT_/v1/health/state/ (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/POST_/v1/health/state/ (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/DELETE_/v1/health/state/ (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/HEAD_/v1/health/state/ (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/OPTIONS_/v1/health/state/ (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/GET_/v1/catalog/deregister (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/PUT_/v1/catalog/deregister (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/POST_/v1/catalog/deregister (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/DELETE_/v1/catalog/deregister (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/HEAD_/v1/catalog/deregister (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/OPTIONS_/v1/catalog/deregister (0.01s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/GET_/v1/acl/roles (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/PUT_/v1/acl/roles (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/POST_/v1/acl/roles (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/DELETE_/v1/acl/roles (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/HEAD_/v1/acl/roles (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/OPTIONS_/v1/acl/roles (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/GET_/v1/agent/check/fail/ (0.01s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/PUT_/v1/agent/check/fail/ (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/POST_/v1/agent/check/fail/ (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/DELETE_/v1/agent/check/fail/ (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/HEAD_/v1/agent/check/fail/ (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/OPTIONS_/v1/agent/check/fail/ (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/GET_/v1/session/info/ (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/PUT_/v1/session/info/ (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/POST_/v1/session/info/ (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/DELETE_/v1/session/info/ (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/HEAD_/v1/session/info/ (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/OPTIONS_/v1/session/info/ (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/GET_/v1/acl/role (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/PUT_/v1/acl/role (0.01s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/POST_/v1/acl/role (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/DELETE_/v1/acl/role (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/HEAD_/v1/acl/role (0.01s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/OPTIONS_/v1/acl/role (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/GET_/v1/acl/role/name/ (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/PUT_/v1/acl/role/name/ (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/POST_/v1/acl/role/name/ (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/DELETE_/v1/acl/role/name/ (0.01s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/HEAD_/v1/acl/role/name/ (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/OPTIONS_/v1/acl/role/name/ (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/GET_/v1/acl/rules/translate/ (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/PUT_/v1/acl/rules/translate/ (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/POST_/v1/acl/rules/translate/ (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/DELETE_/v1/acl/rules/translate/ (0.01s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/HEAD_/v1/acl/rules/translate/ (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/OPTIONS_/v1/acl/rules/translate/ (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/GET_/v1/connect/ca/roots (0.01s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/PUT_/v1/connect/ca/roots (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/POST_/v1/connect/ca/roots (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/DELETE_/v1/connect/ca/roots (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/HEAD_/v1/connect/ca/roots (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/OPTIONS_/v1/connect/ca/roots (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/GET_/v1/txn (0.01s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/PUT_/v1/txn (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/POST_/v1/txn (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/DELETE_/v1/txn (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/HEAD_/v1/txn (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/OPTIONS_/v1/txn (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/GET_/v1/agent/force-leave/ (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/PUT_/v1/agent/force-leave/ (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/POST_/v1/agent/force-leave/ (0.01s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/DELETE_/v1/agent/force-leave/ (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/HEAD_/v1/agent/force-leave/ (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/OPTIONS_/v1/agent/force-leave/ (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/GET_/v1/agent/check/update/ (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/PUT_/v1/agent/check/update/ (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/POST_/v1/agent/check/update/ (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/DELETE_/v1/agent/check/update/ (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/HEAD_/v1/agent/check/update/ (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/OPTIONS_/v1/agent/check/update/ (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/GET_/v1/kv/ (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/PUT_/v1/kv/ (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/POST_/v1/kv/ (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/DELETE_/v1/kv/ (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/HEAD_/v1/kv/ (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/OPTIONS_/v1/kv/ (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/GET_/v1/operator/raft/peer (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/PUT_/v1/operator/raft/peer (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/POST_/v1/operator/raft/peer (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/DELETE_/v1/operator/raft/peer (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/HEAD_/v1/operator/raft/peer (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/OPTIONS_/v1/operator/raft/peer (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/GET_/v1/acl/policies (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/PUT_/v1/acl/policies (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/POST_/v1/acl/policies (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/DELETE_/v1/acl/policies (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/HEAD_/v1/acl/policies (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/OPTIONS_/v1/acl/policies (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/GET_/v1/coordinate/node/ (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/PUT_/v1/coordinate/node/ (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/POST_/v1/coordinate/node/ (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/DELETE_/v1/coordinate/node/ (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/HEAD_/v1/coordinate/node/ (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/OPTIONS_/v1/coordinate/node/ (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/GET_/v1/operator/autopilot/configuration (0.01s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/PUT_/v1/operator/autopilot/configuration (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/POST_/v1/operator/autopilot/configuration (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/DELETE_/v1/operator/autopilot/configuration (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/HEAD_/v1/operator/autopilot/configuration (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/OPTIONS_/v1/operator/autopilot/configuration (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/GET_/v1/acl/auth-method/ (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/PUT_/v1/acl/auth-method/ (0.01s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/POST_/v1/acl/auth-method/ (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/DELETE_/v1/acl/auth-method/ (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/HEAD_/v1/acl/auth-method/ (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/OPTIONS_/v1/acl/auth-method/ (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/GET_/v1/agent/host (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/PUT_/v1/agent/host (0.01s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/POST_/v1/agent/host (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/DELETE_/v1/agent/host (0.01s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/HEAD_/v1/agent/host (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/OPTIONS_/v1/agent/host (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/GET_/v1/agent/checks (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/PUT_/v1/agent/checks (0.01s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/POST_/v1/agent/checks (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/DELETE_/v1/agent/checks (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/HEAD_/v1/agent/checks (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/OPTIONS_/v1/agent/checks (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/GET_/v1/agent/connect/ca/leaf/ (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/PUT_/v1/agent/connect/ca/leaf/ (0.01s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/POST_/v1/agent/connect/ca/leaf/ (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/DELETE_/v1/agent/connect/ca/leaf/ (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/HEAD_/v1/agent/connect/ca/leaf/ (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/OPTIONS_/v1/agent/connect/ca/leaf/ (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/GET_/v1/agent/members (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/PUT_/v1/agent/members (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/POST_/v1/agent/members (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/DELETE_/v1/agent/members (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/HEAD_/v1/agent/members (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/OPTIONS_/v1/agent/members (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/GET_/v1/agent/check/pass/ (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/PUT_/v1/agent/check/pass/ (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/POST_/v1/agent/check/pass/ (0.01s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/DELETE_/v1/agent/check/pass/ (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/HEAD_/v1/agent/check/pass/ (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/OPTIONS_/v1/agent/check/pass/ (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/GET_/v1/agent/connect/authorize (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/PUT_/v1/agent/connect/authorize (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/POST_/v1/agent/connect/authorize (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/DELETE_/v1/agent/connect/authorize (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/HEAD_/v1/agent/connect/authorize (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/OPTIONS_/v1/agent/connect/authorize (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/GET_/v1/acl/bootstrap (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/PUT_/v1/acl/bootstrap (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/POST_/v1/acl/bootstrap (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/DELETE_/v1/acl/bootstrap (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/HEAD_/v1/acl/bootstrap (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/OPTIONS_/v1/acl/bootstrap (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/GET_/v1/acl/login (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/PUT_/v1/acl/login (0.01s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/POST_/v1/acl/login (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/DELETE_/v1/acl/login (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/HEAD_/v1/acl/login (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/OPTIONS_/v1/acl/login (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/GET_/v1/agent/join/ (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/PUT_/v1/agent/join/ (0.01s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/POST_/v1/agent/join/ (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/DELETE_/v1/agent/join/ (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/HEAD_/v1/agent/join/ (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/OPTIONS_/v1/agent/join/ (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/GET_/v1/agent/connect/ca/roots (0.01s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/PUT_/v1/agent/connect/ca/roots (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/POST_/v1/agent/connect/ca/roots (0.01s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/DELETE_/v1/agent/connect/ca/roots (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/HEAD_/v1/agent/connect/ca/roots (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/OPTIONS_/v1/agent/connect/ca/roots (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/GET_/v1/coordinate/nodes (0.01s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/PUT_/v1/coordinate/nodes (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/POST_/v1/coordinate/nodes (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/DELETE_/v1/coordinate/nodes (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/HEAD_/v1/coordinate/nodes (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/OPTIONS_/v1/coordinate/nodes (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/GET_/v1/event/list (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/PUT_/v1/event/list (0.01s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/POST_/v1/event/list (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/DELETE_/v1/event/list (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/HEAD_/v1/event/list (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/OPTIONS_/v1/event/list (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/GET_/v1/acl/clone/ (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/PUT_/v1/acl/clone/ (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/POST_/v1/acl/clone/ (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/DELETE_/v1/acl/clone/ (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/HEAD_/v1/acl/clone/ (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/OPTIONS_/v1/acl/clone/ (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/GET_/v1/agent/health/service/id/ (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/PUT_/v1/agent/health/service/id/ (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/POST_/v1/agent/health/service/id/ (0.05s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/DELETE_/v1/agent/health/service/id/ (0.01s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/HEAD_/v1/agent/health/service/id/ (0.01s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/OPTIONS_/v1/agent/health/service/id/ (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/GET_/v1/event/fire/ (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/PUT_/v1/event/fire/ (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/POST_/v1/event/fire/ (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/DELETE_/v1/event/fire/ (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/HEAD_/v1/event/fire/ (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/OPTIONS_/v1/event/fire/ (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/GET_/v1/session/list (0.01s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/PUT_/v1/session/list (0.01s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/POST_/v1/session/list (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/DELETE_/v1/session/list (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/HEAD_/v1/session/list (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/OPTIONS_/v1/session/list (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/GET_/v1/acl/binding-rules (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/PUT_/v1/acl/binding-rules (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/POST_/v1/acl/binding-rules (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/DELETE_/v1/acl/binding-rules (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/HEAD_/v1/acl/binding-rules (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/OPTIONS_/v1/acl/binding-rules (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/GET_/v1/agent/service/maintenance/ (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/PUT_/v1/agent/service/maintenance/ (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/POST_/v1/agent/service/maintenance/ (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/DELETE_/v1/agent/service/maintenance/ (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/HEAD_/v1/agent/service/maintenance/ (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/OPTIONS_/v1/agent/service/maintenance/ (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/GET_/v1/health/checks/ (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/PUT_/v1/health/checks/ (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/POST_/v1/health/checks/ (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/DELETE_/v1/health/checks/ (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/HEAD_/v1/health/checks/ (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/OPTIONS_/v1/health/checks/ (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/GET_/v1/internal/ui/node/ (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/PUT_/v1/internal/ui/node/ (0.01s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/POST_/v1/internal/ui/node/ (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/DELETE_/v1/internal/ui/node/ (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/HEAD_/v1/internal/ui/node/ (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/OPTIONS_/v1/internal/ui/node/ (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/GET_/v1/acl/info/ (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/PUT_/v1/acl/info/ (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/POST_/v1/acl/info/ (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/DELETE_/v1/acl/info/ (0.01s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/HEAD_/v1/acl/info/ (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/OPTIONS_/v1/acl/info/ (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/GET_/v1/catalog/service/ (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/PUT_/v1/catalog/service/ (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/POST_/v1/catalog/service/ (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/DELETE_/v1/catalog/service/ (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/HEAD_/v1/catalog/service/ (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/OPTIONS_/v1/catalog/service/ (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/GET_/v1/operator/autopilot/health (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/PUT_/v1/operator/autopilot/health (0.01s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/POST_/v1/operator/autopilot/health (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/DELETE_/v1/operator/autopilot/health (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/HEAD_/v1/operator/autopilot/health (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/OPTIONS_/v1/operator/autopilot/health (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/GET_/v1/session/destroy/ (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/PUT_/v1/session/destroy/ (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/POST_/v1/session/destroy/ (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/DELETE_/v1/session/destroy/ (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/HEAD_/v1/session/destroy/ (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/OPTIONS_/v1/session/destroy/ (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/GET_/v1/session/node/ (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/PUT_/v1/session/node/ (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/POST_/v1/session/node/ (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/DELETE_/v1/session/node/ (0.01s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/HEAD_/v1/session/node/ (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/OPTIONS_/v1/session/node/ (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/GET_/v1/acl/replication (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/PUT_/v1/acl/replication (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/POST_/v1/acl/replication (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/DELETE_/v1/acl/replication (0.01s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/HEAD_/v1/acl/replication (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/OPTIONS_/v1/acl/replication (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/GET_/v1/internal/ui/services (0.01s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/PUT_/v1/internal/ui/services (0.01s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/POST_/v1/internal/ui/services (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/DELETE_/v1/internal/ui/services (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/HEAD_/v1/internal/ui/services (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/OPTIONS_/v1/internal/ui/services (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/GET_/v1/catalog/connect/ (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/PUT_/v1/catalog/connect/ (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/POST_/v1/catalog/connect/ (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/DELETE_/v1/catalog/connect/ (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/HEAD_/v1/catalog/connect/ (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/OPTIONS_/v1/catalog/connect/ (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/GET_/v1/acl/policy (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/PUT_/v1/acl/policy (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/POST_/v1/acl/policy (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/DELETE_/v1/acl/policy (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/HEAD_/v1/acl/policy (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/OPTIONS_/v1/acl/policy (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/GET_/v1/acl/token/ (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/PUT_/v1/acl/token/ (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/POST_/v1/acl/token/ (0.01s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/DELETE_/v1/acl/token/ (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/HEAD_/v1/acl/token/ (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/OPTIONS_/v1/acl/token/ (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/GET_/v1/agent/check/warn/ (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/PUT_/v1/agent/check/warn/ (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/POST_/v1/agent/check/warn/ (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/DELETE_/v1/agent/check/warn/ (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/HEAD_/v1/agent/check/warn/ (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/OPTIONS_/v1/agent/check/warn/ (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/GET_/v1/agent/service/deregister/ (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/PUT_/v1/agent/service/deregister/ (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/POST_/v1/agent/service/deregister/ (0.02s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/DELETE_/v1/agent/service/deregister/ (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/HEAD_/v1/agent/service/deregister/ (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/OPTIONS_/v1/agent/service/deregister/ (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/GET_/v1/catalog/register (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/PUT_/v1/catalog/register (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/POST_/v1/catalog/register (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/DELETE_/v1/catalog/register (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/HEAD_/v1/catalog/register (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/OPTIONS_/v1/catalog/register (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/GET_/v1/health/connect/ (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/PUT_/v1/health/connect/ (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/POST_/v1/health/connect/ (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/DELETE_/v1/health/connect/ (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/HEAD_/v1/health/connect/ (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/OPTIONS_/v1/health/connect/ (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/GET_/v1/internal/ui/nodes (0.01s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/PUT_/v1/internal/ui/nodes (0.01s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/POST_/v1/internal/ui/nodes (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/DELETE_/v1/internal/ui/nodes (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/HEAD_/v1/internal/ui/nodes (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/OPTIONS_/v1/internal/ui/nodes (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/GET_/v1/session/create (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/PUT_/v1/session/create (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/POST_/v1/session/create (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/DELETE_/v1/session/create (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/HEAD_/v1/session/create (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/OPTIONS_/v1/session/create (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/GET_/v1/agent/check/deregister/ (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/PUT_/v1/agent/check/deregister/ (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/POST_/v1/agent/check/deregister/ (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/DELETE_/v1/agent/check/deregister/ (0.01s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/HEAD_/v1/agent/check/deregister/ (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/OPTIONS_/v1/agent/check/deregister/ (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/GET_/v1/catalog/services (0.01s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/PUT_/v1/catalog/services (0.01s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/POST_/v1/catalog/services (0.01s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/DELETE_/v1/catalog/services (0.01s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/HEAD_/v1/catalog/services (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/OPTIONS_/v1/catalog/services (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/GET_/v1/session/renew/ (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/PUT_/v1/session/renew/ (0.01s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/POST_/v1/session/renew/ (0.01s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/DELETE_/v1/session/renew/ (0.01s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/HEAD_/v1/session/renew/ (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/OPTIONS_/v1/session/renew/ (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/GET_/v1/acl/update (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/PUT_/v1/acl/update (0.01s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/POST_/v1/acl/update (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/DELETE_/v1/acl/update (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/HEAD_/v1/acl/update (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/OPTIONS_/v1/acl/update (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/GET_/v1/agent/self (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/PUT_/v1/agent/self (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/POST_/v1/agent/self (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/DELETE_/v1/agent/self (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/HEAD_/v1/agent/self (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/OPTIONS_/v1/agent/self (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/GET_/v1/agent/leave (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/PUT_/v1/agent/leave (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/POST_/v1/agent/leave (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/DELETE_/v1/agent/leave (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/HEAD_/v1/agent/leave (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/OPTIONS_/v1/agent/leave (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/GET_/v1/connect/intentions (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/PUT_/v1/connect/intentions (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/POST_/v1/connect/intentions (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/DELETE_/v1/connect/intentions (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/HEAD_/v1/connect/intentions (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/OPTIONS_/v1/connect/intentions (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/GET_/v1/health/node/ (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/PUT_/v1/health/node/ (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/POST_/v1/health/node/ (0.01s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/DELETE_/v1/health/node/ (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/HEAD_/v1/health/node/ (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/OPTIONS_/v1/health/node/ (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/GET_/v1/health/service/ (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/PUT_/v1/health/service/ (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/POST_/v1/health/service/ (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/DELETE_/v1/health/service/ (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/HEAD_/v1/health/service/ (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/OPTIONS_/v1/health/service/ (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/GET_/v1/agent/token/ (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/PUT_/v1/agent/token/ (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/POST_/v1/agent/token/ (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/DELETE_/v1/agent/token/ (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/HEAD_/v1/agent/token/ (0.03s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/OPTIONS_/v1/agent/token/ (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/GET_/v1/config (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/PUT_/v1/config (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/POST_/v1/config (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/DELETE_/v1/config (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/HEAD_/v1/config (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/OPTIONS_/v1/config (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/GET_/v1/connect/ca/configuration (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/PUT_/v1/connect/ca/configuration (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/POST_/v1/connect/ca/configuration (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/DELETE_/v1/connect/ca/configuration (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/HEAD_/v1/connect/ca/configuration (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/OPTIONS_/v1/connect/ca/configuration (0.00s)
=== RUN   TestHTTPAPI_OptionMethod_OSS
WARNING: The 'acl_datacenter' field is deprecated. Use the 'primary_datacenter' field instead.
WARNING: bootstrap = true: do not enable unless necessary
TestHTTPAPI_OptionMethod_OSS - 2019/12/06 06:02:01.285466 [WARN] agent: Node name "Node e6e18433-2443-4db8-af11-8142a0422cd6" will not be discoverable via DNS due to invalid characters. Valid characters include all alpha-numerics and dashes.
TestHTTPAPI_OptionMethod_OSS - 2019/12/06 06:02:01.286029 [DEBUG] tlsutil: Update with version 1
TestHTTPAPI_OptionMethod_OSS - 2019/12/06 06:02:01.288530 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
2019/12/06 06:02:02 [INFO]  raft: Initial configuration (index=1): [{Suffrage:Voter ID:e6e18433-2443-4db8-af11-8142a0422cd6 Address:127.0.0.1:34144}]
2019/12/06 06:02:02 [INFO]  raft: Node at 127.0.0.1:34144 [Follower] entering Follower state (Leader: "")
TestHTTPAPI_OptionMethod_OSS - 2019/12/06 06:02:02.230505 [INFO] serf: EventMemberJoin: Node e6e18433-2443-4db8-af11-8142a0422cd6.dc1 127.0.0.1
TestHTTPAPI_OptionMethod_OSS - 2019/12/06 06:02:02.240927 [INFO] serf: EventMemberJoin: Node e6e18433-2443-4db8-af11-8142a0422cd6 127.0.0.1
TestHTTPAPI_OptionMethod_OSS - 2019/12/06 06:02:02.245001 [INFO] consul: Adding LAN server Node e6e18433-2443-4db8-af11-8142a0422cd6 (Addr: tcp/127.0.0.1:34144) (DC: dc1)
TestHTTPAPI_OptionMethod_OSS - 2019/12/06 06:02:02.245472 [INFO] consul: Handled member-join event for server "Node e6e18433-2443-4db8-af11-8142a0422cd6.dc1" in area "wan"
TestHTTPAPI_OptionMethod_OSS - 2019/12/06 06:02:02.246861 [INFO] agent: Started DNS server 127.0.0.1:34139 (tcp)
TestHTTPAPI_OptionMethod_OSS - 2019/12/06 06:02:02.247328 [INFO] agent: Started DNS server 127.0.0.1:34139 (udp)
TestHTTPAPI_OptionMethod_OSS - 2019/12/06 06:02:02.249670 [INFO] agent: Started HTTP server on 127.0.0.1:34140 (tcp)
TestHTTPAPI_OptionMethod_OSS - 2019/12/06 06:02:02.249759 [INFO] agent: started state syncer
2019/12/06 06:02:02 [WARN]  raft: Heartbeat timeout from "" reached, starting election
2019/12/06 06:02:02 [INFO]  raft: Node at 127.0.0.1:34144 [Candidate] entering Candidate state in term 2
2019/12/06 06:02:02 [INFO]  raft: Election won. Tally: 1
2019/12/06 06:02:02 [INFO]  raft: Node at 127.0.0.1:34144 [Leader] entering Leader state
TestHTTPAPI_OptionMethod_OSS - 2019/12/06 06:02:02.708874 [INFO] consul: cluster leadership acquired
TestHTTPAPI_OptionMethod_OSS - 2019/12/06 06:02:02.709384 [INFO] consul: New leader elected: Node e6e18433-2443-4db8-af11-8142a0422cd6
TestHTTPAPI_OptionMethod_OSS - 2019/12/06 06:02:02.876235 [ERR] agent: failed to sync remote state: ACL not found
TestHTTPAPI_OptionMethod_OSS - 2019/12/06 06:02:02.997045 [INFO] acl: initializing acls
TestHTTPAPI_OptionMethod_OSS - 2019/12/06 06:02:03.043025 [INFO] acl: initializing acls
TestHTTPAPI_OptionMethod_OSS - 2019/12/06 06:02:03.200960 [INFO] consul: Created ACL 'global-management' policy
TestHTTPAPI_OptionMethod_OSS - 2019/12/06 06:02:03.361618 [INFO] consul: Created ACL 'global-management' policy
TestHTTPAPI_OptionMethod_OSS - 2019/12/06 06:02:03.577063 [INFO] consul: Created ACL anonymous token from configuration
TestHTTPAPI_OptionMethod_OSS - 2019/12/06 06:02:03.577179 [DEBUG] acl: transitioning out of legacy ACL mode
TestHTTPAPI_OptionMethod_OSS - 2019/12/06 06:02:03.578032 [INFO] serf: EventMemberUpdate: Node e6e18433-2443-4db8-af11-8142a0422cd6
TestHTTPAPI_OptionMethod_OSS - 2019/12/06 06:02:03.578737 [INFO] serf: EventMemberUpdate: Node e6e18433-2443-4db8-af11-8142a0422cd6.dc1
TestHTTPAPI_OptionMethod_OSS - 2019/12/06 06:02:03.734642 [INFO] consul: Created ACL anonymous token from configuration
TestHTTPAPI_OptionMethod_OSS - 2019/12/06 06:02:03.735487 [INFO] serf: EventMemberUpdate: Node e6e18433-2443-4db8-af11-8142a0422cd6
TestHTTPAPI_OptionMethod_OSS - 2019/12/06 06:02:03.736125 [INFO] serf: EventMemberUpdate: Node e6e18433-2443-4db8-af11-8142a0422cd6.dc1
TestHTTPAPI_OptionMethod_OSS - 2019/12/06 06:02:04.768166 [INFO] agent: Synced node info
TestHTTPAPI_OptionMethod_OSS - 2019/12/06 06:02:04.768303 [DEBUG] agent: Node info in sync
TestHTTPAPI_OptionMethod_OSS - 2019/12/06 06:02:04.768399 [INFO] connect: initialized primary datacenter CA with provider "consul"
TestHTTPAPI_OptionMethod_OSS - 2019/12/06 06:02:04.768815 [DEBUG] consul: Skipping self join check for "Node e6e18433-2443-4db8-af11-8142a0422cd6" since the cluster is too small
TestHTTPAPI_OptionMethod_OSS - 2019/12/06 06:02:04.768971 [INFO] consul: member 'Node e6e18433-2443-4db8-af11-8142a0422cd6' joined, marking health alive
TestHTTPAPI_OptionMethod_OSS - 2019/12/06 06:02:04.936350 [DEBUG] consul: Skipping self join check for "Node e6e18433-2443-4db8-af11-8142a0422cd6" since the cluster is too small
TestHTTPAPI_OptionMethod_OSS - 2019/12/06 06:02:04.936832 [DEBUG] consul: Skipping self join check for "Node e6e18433-2443-4db8-af11-8142a0422cd6" since the cluster is too small
=== RUN   TestHTTPAPI_OptionMethod_OSS/OPTIONS_/v1/query
TestHTTPAPI_OptionMethod_OSS - 2019/12/06 06:02:04.960376 [DEBUG] http: Request OPTIONS http://127.0.0.1:34140/v1/query (19.333µs) from=
=== RUN   TestHTTPAPI_OptionMethod_OSS/OPTIONS_/v1/query/
TestHTTPAPI_OptionMethod_OSS - 2019/12/06 06:02:04.961417 [DEBUG] http: Request OPTIONS http://127.0.0.1:34140/v1/query/ (564.347µs) from=
=== RUN   TestHTTPAPI_OptionMethod_OSS/OPTIONS_/v1/query/xxx/execute
TestHTTPAPI_OptionMethod_OSS - 2019/12/06 06:02:04.962048 [DEBUG] http: Request OPTIONS http://127.0.0.1:34140/v1/query/xxx/execute (96.003µs) from=
=== RUN   TestHTTPAPI_OptionMethod_OSS/OPTIONS_/v1/query/xxx/explain
TestHTTPAPI_OptionMethod_OSS - 2019/12/06 06:02:04.962608 [DEBUG] http: Request OPTIONS http://127.0.0.1:34140/v1/query/xxx/explain (89.002µs) from=
=== RUN   TestHTTPAPI_OptionMethod_OSS/OPTIONS_/v1/acl/tokens
TestHTTPAPI_OptionMethod_OSS - 2019/12/06 06:02:04.963077 [DEBUG] http: Request OPTIONS http://127.0.0.1:34140/v1/acl/tokens (14.001µs) from=
=== RUN   TestHTTPAPI_OptionMethod_OSS/OPTIONS_/v1/agent/maintenance
TestHTTPAPI_OptionMethod_OSS - 2019/12/06 06:02:04.963538 [DEBUG] http: Request OPTIONS http://127.0.0.1:34140/v1/agent/maintenance (11.667µs) from=
=== RUN   TestHTTPAPI_OptionMethod_OSS/OPTIONS_/v1/agent/check/register
TestHTTPAPI_OptionMethod_OSS - 2019/12/06 06:02:04.963963 [DEBUG] http: Request OPTIONS http://127.0.0.1:34140/v1/agent/check/register (12.334µs) from=
=== RUN   TestHTTPAPI_OptionMethod_OSS/OPTIONS_/v1/config/
TestHTTPAPI_OptionMethod_OSS - 2019/12/06 06:02:04.964439 [DEBUG] http: Request OPTIONS http://127.0.0.1:34140/v1/config/ (15.334µs) from=
=== RUN   TestHTTPAPI_OptionMethod_OSS/OPTIONS_/v1/connect/intentions/
TestHTTPAPI_OptionMethod_OSS - 2019/12/06 06:02:04.964868 [DEBUG] http: Request OPTIONS http://127.0.0.1:34140/v1/connect/intentions/ (12.333µs) from=
=== RUN   TestHTTPAPI_OptionMethod_OSS/OPTIONS_/v1/operator/keyring
TestHTTPAPI_OptionMethod_OSS - 2019/12/06 06:02:04.965332 [DEBUG] http: Request OPTIONS http://127.0.0.1:34140/v1/operator/keyring (13µs) from=
=== RUN   TestHTTPAPI_OptionMethod_OSS/OPTIONS_/v1/acl/role/
TestHTTPAPI_OptionMethod_OSS - 2019/12/06 06:02:04.965750 [DEBUG] http: Request OPTIONS http://127.0.0.1:34140/v1/acl/role/ (15.334µs) from=
=== RUN   TestHTTPAPI_OptionMethod_OSS/OPTIONS_/v1/acl/auth-methods
TestHTTPAPI_OptionMethod_OSS - 2019/12/06 06:02:04.966195 [DEBUG] http: Request OPTIONS http://127.0.0.1:34140/v1/acl/auth-methods (13.667µs) from=
=== RUN   TestHTTPAPI_OptionMethod_OSS/OPTIONS_/v1/acl/token
TestHTTPAPI_OptionMethod_OSS - 2019/12/06 06:02:04.966663 [DEBUG] http: Request OPTIONS http://127.0.0.1:34140/v1/acl/token (14.667µs) from=
=== RUN   TestHTTPAPI_OptionMethod_OSS/OPTIONS_/v1/agent/connect/proxy/
TestHTTPAPI_OptionMethod_OSS - 2019/12/06 06:02:04.967155 [DEBUG] http: Request OPTIONS http://127.0.0.1:34140/v1/agent/connect/proxy/ (15.667µs) from=
=== RUN   TestHTTPAPI_OptionMethod_OSS/OPTIONS_/v1/operator/raft/configuration
TestHTTPAPI_OptionMethod_OSS - 2019/12/06 06:02:04.967614 [DEBUG] http: Request OPTIONS http://127.0.0.1:34140/v1/operator/raft/configuration (23.001µs) from=
=== RUN   TestHTTPAPI_OptionMethod_OSS/OPTIONS_/v1/acl/auth-method
TestHTTPAPI_OptionMethod_OSS - 2019/12/06 06:02:04.968052 [DEBUG] http: Request OPTIONS http://127.0.0.1:34140/v1/acl/auth-method (13.667µs) from=
=== RUN   TestHTTPAPI_OptionMethod_OSS/OPTIONS_/v1/acl/token/self
TestHTTPAPI_OptionMethod_OSS - 2019/12/06 06:02:04.968480 [DEBUG] http: Request OPTIONS http://127.0.0.1:34140/v1/acl/token/self (12.334µs) from=
=== RUN   TestHTTPAPI_OptionMethod_OSS/OPTIONS_/v1/coordinate/datacenters
TestHTTPAPI_OptionMethod_OSS - 2019/12/06 06:02:04.968948 [DEBUG] http: Request OPTIONS http://127.0.0.1:34140/v1/coordinate/datacenters (15µs) from=
=== RUN   TestHTTPAPI_OptionMethod_OSS/OPTIONS_/v1/coordinate/update
TestHTTPAPI_OptionMethod_OSS - 2019/12/06 06:02:04.969710 [DEBUG] http: Request OPTIONS http://127.0.0.1:34140/v1/coordinate/update (16.668µs) from=
=== RUN   TestHTTPAPI_OptionMethod_OSS/OPTIONS_/v1/health/state/
TestHTTPAPI_OptionMethod_OSS - 2019/12/06 06:02:04.970740 [DEBUG] http: Request OPTIONS http://127.0.0.1:34140/v1/health/state/ (14.667µs) from=
=== RUN   TestHTTPAPI_OptionMethod_OSS/OPTIONS_/v1/catalog/deregister
TestHTTPAPI_OptionMethod_OSS - 2019/12/06 06:02:04.971386 [DEBUG] http: Request OPTIONS http://127.0.0.1:34140/v1/catalog/deregister (13.333µs) from=
=== RUN   TestHTTPAPI_OptionMethod_OSS/OPTIONS_/v1/acl/roles
TestHTTPAPI_OptionMethod_OSS - 2019/12/06 06:02:04.971815 [DEBUG] http: Request OPTIONS http://127.0.0.1:34140/v1/acl/roles (13.667µs) from=
=== RUN   TestHTTPAPI_OptionMethod_OSS/OPTIONS_/v1/agent/check/fail/
TestHTTPAPI_OptionMethod_OSS - 2019/12/06 06:02:04.972243 [DEBUG] http: Request OPTIONS http://127.0.0.1:34140/v1/agent/check/fail/ (16.334µs) from=
=== RUN   TestHTTPAPI_OptionMethod_OSS/OPTIONS_/v1/session/info/
TestHTTPAPI_OptionMethod_OSS - 2019/12/06 06:02:04.972740 [DEBUG] http: Request OPTIONS http://127.0.0.1:34140/v1/session/info/ (12.333µs) from=
=== RUN   TestHTTPAPI_OptionMethod_OSS/OPTIONS_/v1/acl/role
TestHTTPAPI_OptionMethod_OSS - 2019/12/06 06:02:04.973193 [DEBUG] http: Request OPTIONS http://127.0.0.1:34140/v1/acl/role (13.667µs) from=
=== RUN   TestHTTPAPI_OptionMethod_OSS/OPTIONS_/v1/acl/role/name/
TestHTTPAPI_OptionMethod_OSS - 2019/12/06 06:02:04.973618 [DEBUG] http: Request OPTIONS http://127.0.0.1:34140/v1/acl/role/name/ (15.334µs) from=
=== RUN   TestHTTPAPI_OptionMethod_OSS/OPTIONS_/v1/acl/rules/translate/
TestHTTPAPI_OptionMethod_OSS - 2019/12/06 06:02:04.974043 [DEBUG] http: Request OPTIONS http://127.0.0.1:34140/v1/acl/rules/translate/ (14.334µs) from=
=== RUN   TestHTTPAPI_OptionMethod_OSS/OPTIONS_/v1/connect/ca/roots
TestHTTPAPI_OptionMethod_OSS - 2019/12/06 06:02:04.974533 [DEBUG] http: Request OPTIONS http://127.0.0.1:34140/v1/connect/ca/roots (15µs) from=
=== RUN   TestHTTPAPI_OptionMethod_OSS/OPTIONS_/v1/txn
TestHTTPAPI_OptionMethod_OSS - 2019/12/06 06:02:04.975221 [DEBUG] http: Request OPTIONS http://127.0.0.1:34140/v1/txn (16.667µs) from=
=== RUN   TestHTTPAPI_OptionMethod_OSS/OPTIONS_/v1/agent/force-leave/
TestHTTPAPI_OptionMethod_OSS - 2019/12/06 06:02:04.975976 [DEBUG] http: Request OPTIONS http://127.0.0.1:34140/v1/agent/force-leave/ (47.001µs) from=
=== RUN   TestHTTPAPI_OptionMethod_OSS/OPTIONS_/v1/agent/check/update/
TestHTTPAPI_OptionMethod_OSS - 2019/12/06 06:02:04.976799 [DEBUG] http: Request OPTIONS http://127.0.0.1:34140/v1/agent/check/update/ (15.334µs) from=
=== RUN   TestHTTPAPI_OptionMethod_OSS/OPTIONS_/v1/kv/
TestHTTPAPI_OptionMethod_OSS - 2019/12/06 06:02:04.977569 [DEBUG] http: Request OPTIONS http://127.0.0.1:34140/v1/kv/ (16µs) from=
=== RUN   TestHTTPAPI_OptionMethod_OSS/OPTIONS_/v1/operator/raft/peer
TestHTTPAPI_OptionMethod_OSS - 2019/12/06 06:02:04.978258 [DEBUG] http: Request OPTIONS http://127.0.0.1:34140/v1/operator/raft/peer (15µs) from=
=== RUN   TestHTTPAPI_OptionMethod_OSS/OPTIONS_/v1/acl/policies
TestHTTPAPI_OptionMethod_OSS - 2019/12/06 06:02:04.978736 [DEBUG] http: Request OPTIONS http://127.0.0.1:34140/v1/acl/policies (12.667µs) from=
=== RUN   TestHTTPAPI_OptionMethod_OSS/OPTIONS_/v1/coordinate/node/
TestHTTPAPI_OptionMethod_OSS - 2019/12/06 06:02:04.979279 [DEBUG] http: Request OPTIONS http://127.0.0.1:34140/v1/coordinate/node/ (13.333µs) from=
=== RUN   TestHTTPAPI_OptionMethod_OSS/OPTIONS_/v1/operator/autopilot/configuration
TestHTTPAPI_OptionMethod_OSS - 2019/12/06 06:02:04.982104 [DEBUG] http: Request OPTIONS http://127.0.0.1:34140/v1/operator/autopilot/configuration (16.667µs) from=
=== RUN   TestHTTPAPI_OptionMethod_OSS/OPTIONS_/v1/acl/auth-method/
TestHTTPAPI_OptionMethod_OSS - 2019/12/06 06:02:04.982588 [DEBUG] http: Request OPTIONS http://127.0.0.1:34140/v1/acl/auth-method/ (16.001µs) from=
=== RUN   TestHTTPAPI_OptionMethod_OSS/OPTIONS_/v1/agent/host
TestHTTPAPI_OptionMethod_OSS - 2019/12/06 06:02:04.983046 [DEBUG] http: Request OPTIONS http://127.0.0.1:34140/v1/agent/host (13.333µs) from=
=== RUN   TestHTTPAPI_OptionMethod_OSS/OPTIONS_/v1/agent/checks
TestHTTPAPI_OptionMethod_OSS - 2019/12/06 06:02:04.983517 [DEBUG] http: Request OPTIONS http://127.0.0.1:34140/v1/agent/checks (12.667µs) from=
=== RUN   TestHTTPAPI_OptionMethod_OSS/OPTIONS_/v1/agent/connect/ca/leaf/
TestHTTPAPI_OptionMethod_OSS - 2019/12/06 06:02:04.983963 [DEBUG] http: Request OPTIONS http://127.0.0.1:34140/v1/agent/connect/ca/leaf/ (12.667µs) from=
=== RUN   TestHTTPAPI_OptionMethod_OSS/OPTIONS_/v1/agent/members
TestHTTPAPI_OptionMethod_OSS - 2019/12/06 06:02:04.984586 [DEBUG] http: Request OPTIONS http://127.0.0.1:34140/v1/agent/members (12.667µs) from=
=== RUN   TestHTTPAPI_OptionMethod_OSS/OPTIONS_/v1/agent/check/pass/
TestHTTPAPI_OptionMethod_OSS - 2019/12/06 06:02:04.985018 [DEBUG] http: Request OPTIONS http://127.0.0.1:34140/v1/agent/check/pass/ (11.334µs) from=
=== RUN   TestHTTPAPI_OptionMethod_OSS/OPTIONS_/v1/agent/connect/authorize
TestHTTPAPI_OptionMethod_OSS - 2019/12/06 06:02:04.985417 [DEBUG] http: Request OPTIONS http://127.0.0.1:34140/v1/agent/connect/authorize (12.334µs) from=
=== RUN   TestHTTPAPI_OptionMethod_OSS/OPTIONS_/v1/acl/bootstrap
TestHTTPAPI_OptionMethod_OSS - 2019/12/06 06:02:04.986058 [DEBUG] http: Request OPTIONS http://127.0.0.1:34140/v1/acl/bootstrap (17.001µs) from=
=== RUN   TestHTTPAPI_OptionMethod_OSS/OPTIONS_/v1/acl/login
TestHTTPAPI_OptionMethod_OSS - 2019/12/06 06:02:04.986498 [DEBUG] http: Request OPTIONS http://127.0.0.1:34140/v1/acl/login (14µs) from=
=== RUN   TestHTTPAPI_OptionMethod_OSS/OPTIONS_/v1/agent/join/
TestHTTPAPI_OptionMethod_OSS - 2019/12/06 06:02:04.986899 [DEBUG] http: Request OPTIONS http://127.0.0.1:34140/v1/agent/join/ (11.666µs) from=
=== RUN   TestHTTPAPI_OptionMethod_OSS/OPTIONS_/v1/agent/connect/ca/roots
TestHTTPAPI_OptionMethod_OSS - 2019/12/06 06:02:04.987340 [DEBUG] http: Request OPTIONS http://127.0.0.1:34140/v1/agent/connect/ca/roots (13.334µs) from=
=== RUN   TestHTTPAPI_OptionMethod_OSS/OPTIONS_/v1/coordinate/nodes
TestHTTPAPI_OptionMethod_OSS - 2019/12/06 06:02:04.987763 [DEBUG] http: Request OPTIONS http://127.0.0.1:34140/v1/coordinate/nodes (11.334µs) from=
=== RUN   TestHTTPAPI_OptionMethod_OSS/OPTIONS_/v1/event/list
TestHTTPAPI_OptionMethod_OSS - 2019/12/06 06:02:04.988159 [DEBUG] http: Request OPTIONS http://127.0.0.1:34140/v1/event/list (11.334µs) from=
=== RUN   TestHTTPAPI_OptionMethod_OSS/OPTIONS_/v1/acl/clone/
TestHTTPAPI_OptionMethod_OSS - 2019/12/06 06:02:04.988585 [DEBUG] http: Request OPTIONS http://127.0.0.1:34140/v1/acl/clone/ (12.667µs) from=
=== RUN   TestHTTPAPI_OptionMethod_OSS/OPTIONS_/v1/agent/health/service/id/
TestHTTPAPI_OptionMethod_OSS - 2019/12/06 06:02:04.988982 [DEBUG] http: Request OPTIONS http://127.0.0.1:34140/v1/agent/health/service/id/ (12.667µs) from=
=== RUN   TestHTTPAPI_OptionMethod_OSS/OPTIONS_/v1/event/fire/
TestHTTPAPI_OptionMethod_OSS - 2019/12/06 06:02:04.989509 [DEBUG] http: Request OPTIONS http://127.0.0.1:34140/v1/event/fire/ (13.334µs) from=
=== RUN   TestHTTPAPI_OptionMethod_OSS/OPTIONS_/v1/session/list
TestHTTPAPI_OptionMethod_OSS - 2019/12/06 06:02:04.990236 [DEBUG] http: Request OPTIONS http://127.0.0.1:34140/v1/session/list (12.667µs) from=
=== RUN   TestHTTPAPI_OptionMethod_OSS/OPTIONS_/v1/acl/binding-rules
TestHTTPAPI_OptionMethod_OSS - 2019/12/06 06:02:04.990866 [DEBUG] http: Request OPTIONS http://127.0.0.1:34140/v1/acl/binding-rules (13.667µs) from=
=== RUN   TestHTTPAPI_OptionMethod_OSS/OPTIONS_/v1/agent/service/maintenance/
TestHTTPAPI_OptionMethod_OSS - 2019/12/06 06:02:04.991455 [DEBUG] http: Request OPTIONS http://127.0.0.1:34140/v1/agent/service/maintenance/ (12.001µs) from=
=== RUN   TestHTTPAPI_OptionMethod_OSS/OPTIONS_/v1/health/checks/
TestHTTPAPI_OptionMethod_OSS - 2019/12/06 06:02:04.992020 [DEBUG] http: Request OPTIONS http://127.0.0.1:34140/v1/health/checks/ (12.333µs) from=
=== RUN   TestHTTPAPI_OptionMethod_OSS/OPTIONS_/v1/internal/ui/node/
TestHTTPAPI_OptionMethod_OSS - 2019/12/06 06:02:04.992601 [DEBUG] http: Request OPTIONS http://127.0.0.1:34140/v1/internal/ui/node/ (14.667µs) from=
=== RUN   TestHTTPAPI_OptionMethod_OSS/OPTIONS_/v1/operator/autopilot/health
TestHTTPAPI_OptionMethod_OSS - 2019/12/06 06:02:04.993198 [DEBUG] http: Request OPTIONS http://127.0.0.1:34140/v1/operator/autopilot/health (11.333µs) from=
=== RUN   TestHTTPAPI_OptionMethod_OSS/OPTIONS_/v1/acl/info/
TestHTTPAPI_OptionMethod_OSS - 2019/12/06 06:02:04.993830 [DEBUG] http: Request OPTIONS http://127.0.0.1:34140/v1/acl/info/ (15.667µs) from=
=== RUN   TestHTTPAPI_OptionMethod_OSS/OPTIONS_/v1/catalog/service/
TestHTTPAPI_OptionMethod_OSS - 2019/12/06 06:02:04.994456 [DEBUG] http: Request OPTIONS http://127.0.0.1:34140/v1/catalog/service/ (15µs) from=
=== RUN   TestHTTPAPI_OptionMethod_OSS/OPTIONS_/v1/session/destroy/
TestHTTPAPI_OptionMethod_OSS - 2019/12/06 06:02:04.995161 [DEBUG] http: Request OPTIONS http://127.0.0.1:34140/v1/session/destroy/ (64.002µs) from=
=== RUN   TestHTTPAPI_OptionMethod_OSS/OPTIONS_/v1/session/node/
TestHTTPAPI_OptionMethod_OSS - 2019/12/06 06:02:04.995610 [DEBUG] http: Request OPTIONS http://127.0.0.1:34140/v1/session/node/ (14µs) from=
=== RUN   TestHTTPAPI_OptionMethod_OSS/OPTIONS_/v1/acl/replication
TestHTTPAPI_OptionMethod_OSS - 2019/12/06 06:02:04.996063 [DEBUG] http: Request OPTIONS http://127.0.0.1:34140/v1/acl/replication (11.334µs) from=
=== RUN   TestHTTPAPI_OptionMethod_OSS/OPTIONS_/v1/internal/ui/services
TestHTTPAPI_OptionMethod_OSS - 2019/12/06 06:02:04.996495 [DEBUG] http: Request OPTIONS http://127.0.0.1:34140/v1/internal/ui/services (12µs) from=
=== RUN   TestHTTPAPI_OptionMethod_OSS/OPTIONS_/v1/catalog/connect/
TestHTTPAPI_OptionMethod_OSS - 2019/12/06 06:02:04.996908 [DEBUG] http: Request OPTIONS http://127.0.0.1:34140/v1/catalog/connect/ (12µs) from=
=== RUN   TestHTTPAPI_OptionMethod_OSS/OPTIONS_/v1/acl/policy
TestHTTPAPI_OptionMethod_OSS - 2019/12/06 06:02:04.997342 [DEBUG] http: Request OPTIONS http://127.0.0.1:34140/v1/acl/policy (11.667µs) from=
=== RUN   TestHTTPAPI_OptionMethod_OSS/OPTIONS_/v1/acl/token/
TestHTTPAPI_OptionMethod_OSS - 2019/12/06 06:02:04.997753 [DEBUG] http: Request OPTIONS http://127.0.0.1:34140/v1/acl/token/ (14.334µs) from=
=== RUN   TestHTTPAPI_OptionMethod_OSS/OPTIONS_/v1/agent/check/warn/
TestHTTPAPI_OptionMethod_OSS - 2019/12/06 06:02:04.998159 [DEBUG] http: Request OPTIONS http://127.0.0.1:34140/v1/agent/check/warn/ (11.667µs) from=
=== RUN   TestHTTPAPI_OptionMethod_OSS/OPTIONS_/v1/agent/service/deregister/
TestHTTPAPI_OptionMethod_OSS - 2019/12/06 06:02:04.998679 [DEBUG] http: Request OPTIONS http://127.0.0.1:34140/v1/agent/service/deregister/ (15.667µs) from=
=== RUN   TestHTTPAPI_OptionMethod_OSS/OPTIONS_/v1/catalog/register
TestHTTPAPI_OptionMethod_OSS - 2019/12/06 06:02:04.999422 [DEBUG] http: Request OPTIONS http://127.0.0.1:34140/v1/catalog/register (16.333µs) from=
=== RUN   TestHTTPAPI_OptionMethod_OSS/OPTIONS_/v1/health/connect/
TestHTTPAPI_OptionMethod_OSS - 2019/12/06 06:02:05.000004 [DEBUG] http: Request OPTIONS http://127.0.0.1:34140/v1/health/connect/ (12.667µs) from=
=== RUN   TestHTTPAPI_OptionMethod_OSS/OPTIONS_/v1/internal/ui/nodes
TestHTTPAPI_OptionMethod_OSS - 2019/12/06 06:02:05.000443 [DEBUG] http: Request OPTIONS http://127.0.0.1:34140/v1/internal/ui/nodes (14.333µs) from=
=== RUN   TestHTTPAPI_OptionMethod_OSS/OPTIONS_/v1/session/create
TestHTTPAPI_OptionMethod_OSS - 2019/12/06 06:02:05.000906 [DEBUG] http: Request OPTIONS http://127.0.0.1:34140/v1/session/create (13.667µs) from=
=== RUN   TestHTTPAPI_OptionMethod_OSS/OPTIONS_/v1/agent/check/deregister/
TestHTTPAPI_OptionMethod_OSS - 2019/12/06 06:02:05.001312 [DEBUG] http: Request OPTIONS http://127.0.0.1:34140/v1/agent/check/deregister/ (11.667µs) from=
=== RUN   TestHTTPAPI_OptionMethod_OSS/OPTIONS_/v1/catalog/services
TestHTTPAPI_OptionMethod_OSS - 2019/12/06 06:02:05.001707 [DEBUG] http: Request OPTIONS http://127.0.0.1:34140/v1/catalog/services (11µs) from=
=== RUN   TestHTTPAPI_OptionMethod_OSS/OPTIONS_/v1/session/renew/
TestHTTPAPI_OptionMethod_OSS - 2019/12/06 06:02:05.002112 [DEBUG] http: Request OPTIONS http://127.0.0.1:34140/v1/session/renew/ (12.667µs) from=
=== RUN   TestHTTPAPI_OptionMethod_OSS/OPTIONS_/v1/health/service/
TestHTTPAPI_OptionMethod_OSS - 2019/12/06 06:02:05.002543 [DEBUG] http: Request OPTIONS http://127.0.0.1:34140/v1/health/service/ (13.333µs) from=
=== RUN   TestHTTPAPI_OptionMethod_OSS/OPTIONS_/v1/acl/update
TestHTTPAPI_OptionMethod_OSS - 2019/12/06 06:02:05.003033 [DEBUG] http: Request OPTIONS http://127.0.0.1:34140/v1/acl/update (12.334µs) from=
=== RUN   TestHTTPAPI_OptionMethod_OSS/OPTIONS_/v1/agent/self
TestHTTPAPI_OptionMethod_OSS - 2019/12/06 06:02:05.003491 [DEBUG] http: Request OPTIONS http://127.0.0.1:34140/v1/agent/self (14.667µs) from=
=== RUN   TestHTTPAPI_OptionMethod_OSS/OPTIONS_/v1/agent/leave
TestHTTPAPI_OptionMethod_OSS - 2019/12/06 06:02:05.003920 [DEBUG] http: Request OPTIONS http://127.0.0.1:34140/v1/agent/leave (14µs) from=
=== RUN   TestHTTPAPI_OptionMethod_OSS/OPTIONS_/v1/connect/intentions
TestHTTPAPI_OptionMethod_OSS - 2019/12/06 06:02:05.004377 [DEBUG] http: Request OPTIONS http://127.0.0.1:34140/v1/connect/intentions (14.334µs) from=
=== RUN   TestHTTPAPI_OptionMethod_OSS/OPTIONS_/v1/health/node/
TestHTTPAPI_OptionMethod_OSS - 2019/12/06 06:02:05.004793 [DEBUG] http: Request OPTIONS http://127.0.0.1:34140/v1/health/node/ (11.667µs) from=
=== RUN   TestHTTPAPI_OptionMethod_OSS/OPTIONS_/v1/config
TestHTTPAPI_OptionMethod_OSS - 2019/12/06 06:02:05.005186 [DEBUG] http: Request OPTIONS http://127.0.0.1:34140/v1/config (11.001µs) from=
=== RUN   TestHTTPAPI_OptionMethod_OSS/OPTIONS_/v1/agent/token/
TestHTTPAPI_OptionMethod_OSS - 2019/12/06 06:02:05.005590 [DEBUG] http: Request OPTIONS http://127.0.0.1:34140/v1/agent/token/ (12.334µs) from=
=== RUN   TestHTTPAPI_OptionMethod_OSS/OPTIONS_/v1/connect/ca/configuration
TestHTTPAPI_OptionMethod_OSS - 2019/12/06 06:02:05.006016 [DEBUG] http: Request OPTIONS http://127.0.0.1:34140/v1/connect/ca/configuration (12.667µs) from=
=== RUN   TestHTTPAPI_OptionMethod_OSS/OPTIONS_/v1/agent/services
TestHTTPAPI_OptionMethod_OSS - 2019/12/06 06:02:05.008671 [DEBUG] http: Request OPTIONS http://127.0.0.1:34140/v1/agent/services (16.334µs) from=
=== RUN   TestHTTPAPI_OptionMethod_OSS/OPTIONS_/v1/agent/health/service/name/
TestHTTPAPI_OptionMethod_OSS - 2019/12/06 06:02:05.012586 [DEBUG] http: Request OPTIONS http://127.0.0.1:34140/v1/agent/health/service/name/ (15.667µs) from=
=== RUN   TestHTTPAPI_OptionMethod_OSS/OPTIONS_/v1/status/leader
TestHTTPAPI_OptionMethod_OSS - 2019/12/06 06:02:05.013085 [DEBUG] http: Request OPTIONS http://127.0.0.1:34140/v1/status/leader (13.334µs) from=
=== RUN   TestHTTPAPI_OptionMethod_OSS/OPTIONS_/v1/catalog/datacenters
TestHTTPAPI_OptionMethod_OSS - 2019/12/06 06:02:05.013560 [DEBUG] http: Request OPTIONS http://127.0.0.1:34140/v1/catalog/datacenters (13.667µs) from=
=== RUN   TestHTTPAPI_OptionMethod_OSS/OPTIONS_/v1/connect/intentions/match
TestHTTPAPI_OptionMethod_OSS - 2019/12/06 06:02:05.014014 [DEBUG] http: Request OPTIONS http://127.0.0.1:34140/v1/connect/intentions/match (14µs) from=
=== RUN   TestHTTPAPI_OptionMethod_OSS/OPTIONS_/v1/acl/destroy/
TestHTTPAPI_OptionMethod_OSS - 2019/12/06 06:02:05.014486 [DEBUG] http: Request OPTIONS http://127.0.0.1:34140/v1/acl/destroy/ (13.001µs) from=
=== RUN   TestHTTPAPI_OptionMethod_OSS/OPTIONS_/v1/acl/list
TestHTTPAPI_OptionMethod_OSS - 2019/12/06 06:02:05.014896 [DEBUG] http: Request OPTIONS http://127.0.0.1:34140/v1/acl/list (12.001µs) from=
=== RUN   TestHTTPAPI_OptionMethod_OSS/OPTIONS_/v1/acl/rules/translate
TestHTTPAPI_OptionMethod_OSS - 2019/12/06 06:02:05.015562 [DEBUG] http: Request OPTIONS http://127.0.0.1:34140/v1/acl/rules/translate (13µs) from=
=== RUN   TestHTTPAPI_OptionMethod_OSS/OPTIONS_/v1/agent/metrics
TestHTTPAPI_OptionMethod_OSS - 2019/12/06 06:02:05.016004 [DEBUG] http: Request OPTIONS http://127.0.0.1:34140/v1/agent/metrics (13.001µs) from=
=== RUN   TestHTTPAPI_OptionMethod_OSS/OPTIONS_/v1/agent/service/
TestHTTPAPI_OptionMethod_OSS - 2019/12/06 06:02:05.016448 [DEBUG] http: Request OPTIONS http://127.0.0.1:34140/v1/agent/service/ (13.001µs) from=
=== RUN   TestHTTPAPI_OptionMethod_OSS/OPTIONS_/v1/agent/service/register
TestHTTPAPI_OptionMethod_OSS - 2019/12/06 06:02:05.016878 [DEBUG] http: Request OPTIONS http://127.0.0.1:34140/v1/agent/service/register (11.667µs) from=
=== RUN   TestHTTPAPI_OptionMethod_OSS/OPTIONS_/v1/snapshot
TestHTTPAPI_OptionMethod_OSS - 2019/12/06 06:02:05.017276 [DEBUG] http: Request OPTIONS http://127.0.0.1:34140/v1/snapshot (11.667µs) from=
=== RUN   TestHTTPAPI_OptionMethod_OSS/OPTIONS_/v1/acl/logout
TestHTTPAPI_OptionMethod_OSS - 2019/12/06 06:02:05.017674 [DEBUG] http: Request OPTIONS http://127.0.0.1:34140/v1/acl/logout (11.667µs) from=
=== RUN   TestHTTPAPI_OptionMethod_OSS/OPTIONS_/v1/acl/create
TestHTTPAPI_OptionMethod_OSS - 2019/12/06 06:02:05.018103 [DEBUG] http: Request OPTIONS http://127.0.0.1:34140/v1/acl/create (12.667µs) from=
=== RUN   TestHTTPAPI_OptionMethod_OSS/OPTIONS_/v1/acl/binding-rule
TestHTTPAPI_OptionMethod_OSS - 2019/12/06 06:02:05.018523 [DEBUG] http: Request OPTIONS http://127.0.0.1:34140/v1/acl/binding-rule (12µs) from=
=== RUN   TestHTTPAPI_OptionMethod_OSS/OPTIONS_/v1/connect/intentions/check
TestHTTPAPI_OptionMethod_OSS - 2019/12/06 06:02:05.019017 [DEBUG] http: Request OPTIONS http://127.0.0.1:34140/v1/connect/intentions/check (13µs) from=
=== RUN   TestHTTPAPI_OptionMethod_OSS/OPTIONS_/v1/acl/policy/
TestHTTPAPI_OptionMethod_OSS - 2019/12/06 06:02:05.019504 [DEBUG] http: Request OPTIONS http://127.0.0.1:34140/v1/acl/policy/ (13µs) from=
=== RUN   TestHTTPAPI_OptionMethod_OSS/OPTIONS_/v1/acl/binding-rule/
TestHTTPAPI_OptionMethod_OSS - 2019/12/06 06:02:05.019933 [DEBUG] http: Request OPTIONS http://127.0.0.1:34140/v1/acl/binding-rule/ (12.334µs) from=
=== RUN   TestHTTPAPI_OptionMethod_OSS/OPTIONS_/v1/catalog/nodes
TestHTTPAPI_OptionMethod_OSS - 2019/12/06 06:02:05.020334 [DEBUG] http: Request OPTIONS http://127.0.0.1:34140/v1/catalog/nodes (12.334µs) from=
=== RUN   TestHTTPAPI_OptionMethod_OSS/OPTIONS_/v1/catalog/node/
TestHTTPAPI_OptionMethod_OSS - 2019/12/06 06:02:05.020735 [DEBUG] http: Request OPTIONS http://127.0.0.1:34140/v1/catalog/node/ (13µs) from=
TestHTTPAPI_OptionMethod_OSS - 2019/12/06 06:02:05.020870 [INFO] agent: Requesting shutdown
TestHTTPAPI_OptionMethod_OSS - 2019/12/06 06:02:05.020931 [INFO] consul: shutting down server
TestHTTPAPI_OptionMethod_OSS - 2019/12/06 06:02:05.020974 [WARN] serf: Shutdown without a Leave
TestHTTPAPI_OptionMethod_OSS - 2019/12/06 06:02:05.075155 [WARN] serf: Shutdown without a Leave
TestHTTPAPI_OptionMethod_OSS - 2019/12/06 06:02:05.125251 [INFO] manager: shutting down
TestHTTPAPI_OptionMethod_OSS - 2019/12/06 06:02:05.126193 [INFO] agent: consul server down
TestHTTPAPI_OptionMethod_OSS - 2019/12/06 06:02:05.126335 [INFO] agent: shutdown complete
TestHTTPAPI_OptionMethod_OSS - 2019/12/06 06:02:05.126434 [INFO] agent: Stopping DNS server 127.0.0.1:34139 (tcp)
TestHTTPAPI_OptionMethod_OSS - 2019/12/06 06:02:05.126825 [INFO] agent: Stopping DNS server 127.0.0.1:34139 (udp)
TestHTTPAPI_OptionMethod_OSS - 2019/12/06 06:02:05.126993 [INFO] agent: Stopping HTTP server 127.0.0.1:34140 (tcp)
TestHTTPAPI_OptionMethod_OSS - 2019/12/06 06:02:05.127222 [INFO] agent: Waiting for endpoints to shut down
TestHTTPAPI_OptionMethod_OSS - 2019/12/06 06:02:05.127293 [INFO] agent: Endpoints down
--- PASS: TestHTTPAPI_OptionMethod_OSS (3.96s)
    --- PASS: TestHTTPAPI_OptionMethod_OSS/OPTIONS_/v1/query (0.00s)
    --- PASS: TestHTTPAPI_OptionMethod_OSS/OPTIONS_/v1/query/ (0.00s)
    --- PASS: TestHTTPAPI_OptionMethod_OSS/OPTIONS_/v1/query/xxx/execute (0.00s)
    --- PASS: TestHTTPAPI_OptionMethod_OSS/OPTIONS_/v1/query/xxx/explain (0.00s)
    --- PASS: TestHTTPAPI_OptionMethod_OSS/OPTIONS_/v1/acl/tokens (0.00s)
    --- PASS: TestHTTPAPI_OptionMethod_OSS/OPTIONS_/v1/agent/maintenance (0.00s)
    --- PASS: TestHTTPAPI_OptionMethod_OSS/OPTIONS_/v1/agent/check/register (0.00s)
    --- PASS: TestHTTPAPI_OptionMethod_OSS/OPTIONS_/v1/config/ (0.00s)
    --- PASS: TestHTTPAPI_OptionMethod_OSS/OPTIONS_/v1/connect/intentions/ (0.00s)
    --- PASS: TestHTTPAPI_OptionMethod_OSS/OPTIONS_/v1/operator/keyring (0.00s)
    --- PASS: TestHTTPAPI_OptionMethod_OSS/OPTIONS_/v1/acl/role/ (0.00s)
    --- PASS: TestHTTPAPI_OptionMethod_OSS/OPTIONS_/v1/acl/auth-methods (0.00s)
    --- PASS: TestHTTPAPI_OptionMethod_OSS/OPTIONS_/v1/acl/token (0.00s)
    --- PASS: TestHTTPAPI_OptionMethod_OSS/OPTIONS_/v1/agent/connect/proxy/ (0.00s)
    --- PASS: TestHTTPAPI_OptionMethod_OSS/OPTIONS_/v1/operator/raft/configuration (0.00s)
    --- PASS: TestHTTPAPI_OptionMethod_OSS/OPTIONS_/v1/acl/auth-method (0.00s)
    --- PASS: TestHTTPAPI_OptionMethod_OSS/OPTIONS_/v1/acl/token/self (0.00s)
    --- PASS: TestHTTPAPI_OptionMethod_OSS/OPTIONS_/v1/coordinate/datacenters (0.00s)
    --- PASS: TestHTTPAPI_OptionMethod_OSS/OPTIONS_/v1/coordinate/update (0.00s)
    --- PASS: TestHTTPAPI_OptionMethod_OSS/OPTIONS_/v1/health/state/ (0.00s)
    --- PASS: TestHTTPAPI_OptionMethod_OSS/OPTIONS_/v1/catalog/deregister (0.00s)
    --- PASS: TestHTTPAPI_OptionMethod_OSS/OPTIONS_/v1/acl/roles (0.00s)
    --- PASS: TestHTTPAPI_OptionMethod_OSS/OPTIONS_/v1/agent/check/fail/ (0.00s)
    --- PASS: TestHTTPAPI_OptionMethod_OSS/OPTIONS_/v1/session/info/ (0.00s)
    --- PASS: TestHTTPAPI_OptionMethod_OSS/OPTIONS_/v1/acl/role (0.00s)
    --- PASS: TestHTTPAPI_OptionMethod_OSS/OPTIONS_/v1/acl/role/name/ (0.00s)
    --- PASS: TestHTTPAPI_OptionMethod_OSS/OPTIONS_/v1/acl/rules/translate/ (0.00s)
    --- PASS: TestHTTPAPI_OptionMethod_OSS/OPTIONS_/v1/connect/ca/roots (0.00s)
    --- PASS: TestHTTPAPI_OptionMethod_OSS/OPTIONS_/v1/txn (0.00s)
    --- PASS: TestHTTPAPI_OptionMethod_OSS/OPTIONS_/v1/agent/force-leave/ (0.00s)
    --- PASS: TestHTTPAPI_OptionMethod_OSS/OPTIONS_/v1/agent/check/update/ (0.00s)
    --- PASS: TestHTTPAPI_OptionMethod_OSS/OPTIONS_/v1/kv/ (0.00s)
    --- PASS: TestHTTPAPI_OptionMethod_OSS/OPTIONS_/v1/operator/raft/peer (0.00s)
    --- PASS: TestHTTPAPI_OptionMethod_OSS/OPTIONS_/v1/acl/policies (0.00s)
    --- PASS: TestHTTPAPI_OptionMethod_OSS/OPTIONS_/v1/coordinate/node/ (0.00s)
    --- PASS: TestHTTPAPI_OptionMethod_OSS/OPTIONS_/v1/operator/autopilot/configuration (0.00s)
    --- PASS: TestHTTPAPI_OptionMethod_OSS/OPTIONS_/v1/acl/auth-method/ (0.00s)
    --- PASS: TestHTTPAPI_OptionMethod_OSS/OPTIONS_/v1/agent/host (0.00s)
    --- PASS: TestHTTPAPI_OptionMethod_OSS/OPTIONS_/v1/agent/checks (0.00s)
    --- PASS: TestHTTPAPI_OptionMethod_OSS/OPTIONS_/v1/agent/connect/ca/leaf/ (0.00s)
    --- PASS: TestHTTPAPI_OptionMethod_OSS/OPTIONS_/v1/agent/members (0.00s)
    --- PASS: TestHTTPAPI_OptionMethod_OSS/OPTIONS_/v1/agent/check/pass/ (0.00s)
    --- PASS: TestHTTPAPI_OptionMethod_OSS/OPTIONS_/v1/agent/connect/authorize (0.00s)
    --- PASS: TestHTTPAPI_OptionMethod_OSS/OPTIONS_/v1/acl/bootstrap (0.00s)
    --- PASS: TestHTTPAPI_OptionMethod_OSS/OPTIONS_/v1/acl/login (0.00s)
    --- PASS: TestHTTPAPI_OptionMethod_OSS/OPTIONS_/v1/agent/join/ (0.00s)
    --- PASS: TestHTTPAPI_OptionMethod_OSS/OPTIONS_/v1/agent/connect/ca/roots (0.00s)
    --- PASS: TestHTTPAPI_OptionMethod_OSS/OPTIONS_/v1/coordinate/nodes (0.00s)
    --- PASS: TestHTTPAPI_OptionMethod_OSS/OPTIONS_/v1/event/list (0.00s)
    --- PASS: TestHTTPAPI_OptionMethod_OSS/OPTIONS_/v1/acl/clone/ (0.00s)
    --- PASS: TestHTTPAPI_OptionMethod_OSS/OPTIONS_/v1/agent/health/service/id/ (0.00s)
    --- PASS: TestHTTPAPI_OptionMethod_OSS/OPTIONS_/v1/event/fire/ (0.00s)
    --- PASS: TestHTTPAPI_OptionMethod_OSS/OPTIONS_/v1/session/list (0.00s)
    --- PASS: TestHTTPAPI_OptionMethod_OSS/OPTIONS_/v1/acl/binding-rules (0.00s)
    --- PASS: TestHTTPAPI_OptionMethod_OSS/OPTIONS_/v1/agent/service/maintenance/ (0.00s)
    --- PASS: TestHTTPAPI_OptionMethod_OSS/OPTIONS_/v1/health/checks/ (0.00s)
    --- PASS: TestHTTPAPI_OptionMethod_OSS/OPTIONS_/v1/internal/ui/node/ (0.00s)
    --- PASS: TestHTTPAPI_OptionMethod_OSS/OPTIONS_/v1/operator/autopilot/health (0.00s)
    --- PASS: TestHTTPAPI_OptionMethod_OSS/OPTIONS_/v1/acl/info/ (0.00s)
    --- PASS: TestHTTPAPI_OptionMethod_OSS/OPTIONS_/v1/catalog/service/ (0.00s)
    --- PASS: TestHTTPAPI_OptionMethod_OSS/OPTIONS_/v1/session/destroy/ (0.00s)
    --- PASS: TestHTTPAPI_OptionMethod_OSS/OPTIONS_/v1/session/node/ (0.00s)
    --- PASS: TestHTTPAPI_OptionMethod_OSS/OPTIONS_/v1/acl/replication (0.00s)
    --- PASS: TestHTTPAPI_OptionMethod_OSS/OPTIONS_/v1/internal/ui/services (0.00s)
    --- PASS: TestHTTPAPI_OptionMethod_OSS/OPTIONS_/v1/catalog/connect/ (0.00s)
    --- PASS: TestHTTPAPI_OptionMethod_OSS/OPTIONS_/v1/acl/policy (0.00s)
    --- PASS: TestHTTPAPI_OptionMethod_OSS/OPTIONS_/v1/acl/token/ (0.00s)
    --- PASS: TestHTTPAPI_OptionMethod_OSS/OPTIONS_/v1/agent/check/warn/ (0.00s)
    --- PASS: TestHTTPAPI_OptionMethod_OSS/OPTIONS_/v1/agent/service/deregister/ (0.00s)
    --- PASS: TestHTTPAPI_OptionMethod_OSS/OPTIONS_/v1/catalog/register (0.00s)
    --- PASS: TestHTTPAPI_OptionMethod_OSS/OPTIONS_/v1/health/connect/ (0.00s)
    --- PASS: TestHTTPAPI_OptionMethod_OSS/OPTIONS_/v1/internal/ui/nodes (0.00s)
    --- PASS: TestHTTPAPI_OptionMethod_OSS/OPTIONS_/v1/session/create (0.00s)
    --- PASS: TestHTTPAPI_OptionMethod_OSS/OPTIONS_/v1/agent/check/deregister/ (0.00s)
    --- PASS: TestHTTPAPI_OptionMethod_OSS/OPTIONS_/v1/catalog/services (0.00s)
    --- PASS: TestHTTPAPI_OptionMethod_OSS/OPTIONS_/v1/session/renew/ (0.00s)
    --- PASS: TestHTTPAPI_OptionMethod_OSS/OPTIONS_/v1/health/service/ (0.00s)
    --- PASS: TestHTTPAPI_OptionMethod_OSS/OPTIONS_/v1/acl/update (0.00s)
    --- PASS: TestHTTPAPI_OptionMethod_OSS/OPTIONS_/v1/agent/self (0.00s)
    --- PASS: TestHTTPAPI_OptionMethod_OSS/OPTIONS_/v1/agent/leave (0.00s)
    --- PASS: TestHTTPAPI_OptionMethod_OSS/OPTIONS_/v1/connect/intentions (0.00s)
    --- PASS: TestHTTPAPI_OptionMethod_OSS/OPTIONS_/v1/health/node/ (0.00s)
    --- PASS: TestHTTPAPI_OptionMethod_OSS/OPTIONS_/v1/config (0.00s)
    --- PASS: TestHTTPAPI_OptionMethod_OSS/OPTIONS_/v1/agent/token/ (0.00s)
    --- PASS: TestHTTPAPI_OptionMethod_OSS/OPTIONS_/v1/connect/ca/configuration (0.00s)
    --- PASS: TestHTTPAPI_OptionMethod_OSS/OPTIONS_/v1/agent/services (0.00s)
    --- PASS: TestHTTPAPI_OptionMethod_OSS/OPTIONS_/v1/agent/health/service/name/ (0.00s)
    --- PASS: TestHTTPAPI_OptionMethod_OSS/OPTIONS_/v1/status/leader (0.00s)
    --- PASS: TestHTTPAPI_OptionMethod_OSS/OPTIONS_/v1/catalog/datacenters (0.00s)
    --- PASS: TestHTTPAPI_OptionMethod_OSS/OPTIONS_/v1/connect/intentions/match (0.00s)
    --- PASS: TestHTTPAPI_OptionMethod_OSS/OPTIONS_/v1/acl/destroy/ (0.00s)
    --- PASS: TestHTTPAPI_OptionMethod_OSS/OPTIONS_/v1/acl/list (0.00s)
    --- PASS: TestHTTPAPI_OptionMethod_OSS/OPTIONS_/v1/acl/rules/translate (0.00s)
    --- PASS: TestHTTPAPI_OptionMethod_OSS/OPTIONS_/v1/agent/metrics (0.00s)
    --- PASS: TestHTTPAPI_OptionMethod_OSS/OPTIONS_/v1/agent/service/ (0.00s)
    --- PASS: TestHTTPAPI_OptionMethod_OSS/OPTIONS_/v1/agent/service/register (0.00s)
    --- PASS: TestHTTPAPI_OptionMethod_OSS/OPTIONS_/v1/snapshot (0.00s)
    --- PASS: TestHTTPAPI_OptionMethod_OSS/OPTIONS_/v1/acl/logout (0.00s)
    --- PASS: TestHTTPAPI_OptionMethod_OSS/OPTIONS_/v1/acl/create (0.00s)
    --- PASS: TestHTTPAPI_OptionMethod_OSS/OPTIONS_/v1/acl/binding-rule (0.00s)
    --- PASS: TestHTTPAPI_OptionMethod_OSS/OPTIONS_/v1/connect/intentions/check (0.00s)
    --- PASS: TestHTTPAPI_OptionMethod_OSS/OPTIONS_/v1/acl/policy/ (0.00s)
    --- PASS: TestHTTPAPI_OptionMethod_OSS/OPTIONS_/v1/acl/binding-rule/ (0.00s)
    --- PASS: TestHTTPAPI_OptionMethod_OSS/OPTIONS_/v1/catalog/nodes (0.00s)
    --- PASS: TestHTTPAPI_OptionMethod_OSS/OPTIONS_/v1/catalog/node/ (0.00s)
=== RUN   TestHTTPAPI_AllowedNets_OSS
WARNING: The 'acl_datacenter' field is deprecated. Use the 'primary_datacenter' field instead.
WARNING: bootstrap = true: do not enable unless necessary
TestHTTPAPI_AllowedNets_OSS - 2019/12/06 06:02:05.191618 [WARN] agent: Node name "Node d28ab080-32d9-8b6e-016d-0444173796bc" will not be discoverable via DNS due to invalid characters. Valid characters include all alpha-numerics and dashes.
TestHTTPAPI_AllowedNets_OSS - 2019/12/06 06:02:05.192233 [DEBUG] tlsutil: Update with version 1
TestHTTPAPI_AllowedNets_OSS - 2019/12/06 06:02:05.194635 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
2019/12/06 06:02:05 [INFO]  raft: Initial configuration (index=1): [{Suffrage:Voter ID:d28ab080-32d9-8b6e-016d-0444173796bc Address:127.0.0.1:34150}]
2019/12/06 06:02:05 [INFO]  raft: Node at 127.0.0.1:34150 [Follower] entering Follower state (Leader: "")
TestHTTPAPI_AllowedNets_OSS - 2019/12/06 06:02:05.870662 [INFO] serf: EventMemberJoin: Node d28ab080-32d9-8b6e-016d-0444173796bc.dc1 127.0.0.1
TestHTTPAPI_AllowedNets_OSS - 2019/12/06 06:02:05.873792 [INFO] serf: EventMemberJoin: Node d28ab080-32d9-8b6e-016d-0444173796bc 127.0.0.1
TestHTTPAPI_AllowedNets_OSS - 2019/12/06 06:02:05.875303 [INFO] consul: Handled member-join event for server "Node d28ab080-32d9-8b6e-016d-0444173796bc.dc1" in area "wan"
TestHTTPAPI_AllowedNets_OSS - 2019/12/06 06:02:05.875400 [INFO] consul: Adding LAN server Node d28ab080-32d9-8b6e-016d-0444173796bc (Addr: tcp/127.0.0.1:34150) (DC: dc1)
TestHTTPAPI_AllowedNets_OSS - 2019/12/06 06:02:05.877174 [INFO] agent: Started DNS server 127.0.0.1:34145 (tcp)
TestHTTPAPI_AllowedNets_OSS - 2019/12/06 06:02:05.877763 [INFO] agent: Started DNS server 127.0.0.1:34145 (udp)
TestHTTPAPI_AllowedNets_OSS - 2019/12/06 06:02:05.880421 [INFO] agent: Started HTTP server on 127.0.0.1:34146 (tcp)
TestHTTPAPI_AllowedNets_OSS - 2019/12/06 06:02:05.880521 [INFO] agent: started state syncer
2019/12/06 06:02:05 [WARN]  raft: Heartbeat timeout from "" reached, starting election
2019/12/06 06:02:05 [INFO]  raft: Node at 127.0.0.1:34150 [Candidate] entering Candidate state in term 2
2019/12/06 06:02:06 [INFO]  raft: Election won. Tally: 1
2019/12/06 06:02:06 [INFO]  raft: Node at 127.0.0.1:34150 [Leader] entering Leader state
TestHTTPAPI_AllowedNets_OSS - 2019/12/06 06:02:06.375567 [INFO] consul: cluster leadership acquired
TestHTTPAPI_AllowedNets_OSS - 2019/12/06 06:02:06.376015 [INFO] consul: New leader elected: Node d28ab080-32d9-8b6e-016d-0444173796bc
TestHTTPAPI_AllowedNets_OSS - 2019/12/06 06:02:06.627613 [INFO] acl: initializing acls
TestHTTPAPI_AllowedNets_OSS - 2019/12/06 06:02:06.652986 [ERR] agent: failed to sync remote state: ACL not found
TestHTTPAPI_AllowedNets_OSS - 2019/12/06 06:02:06.858918 [INFO] acl: initializing acls
TestHTTPAPI_AllowedNets_OSS - 2019/12/06 06:02:06.859450 [INFO] consul: Created ACL 'global-management' policy
TestHTTPAPI_AllowedNets_OSS - 2019/12/06 06:02:07.251077 [INFO] consul: Created ACL 'global-management' policy
TestHTTPAPI_AllowedNets_OSS - 2019/12/06 06:02:07.251865 [INFO] consul: Created ACL anonymous token from configuration
TestHTTPAPI_AllowedNets_OSS - 2019/12/06 06:02:07.251980 [DEBUG] acl: transitioning out of legacy ACL mode
TestHTTPAPI_AllowedNets_OSS - 2019/12/06 06:02:07.252982 [INFO] serf: EventMemberUpdate: Node d28ab080-32d9-8b6e-016d-0444173796bc
TestHTTPAPI_AllowedNets_OSS - 2019/12/06 06:02:07.253782 [INFO] serf: EventMemberUpdate: Node d28ab080-32d9-8b6e-016d-0444173796bc.dc1
TestHTTPAPI_AllowedNets_OSS - 2019/12/06 06:02:07.409683 [INFO] consul: Created ACL anonymous token from configuration
TestHTTPAPI_AllowedNets_OSS - 2019/12/06 06:02:07.410709 [INFO] serf: EventMemberUpdate: Node d28ab080-32d9-8b6e-016d-0444173796bc
TestHTTPAPI_AllowedNets_OSS - 2019/12/06 06:02:07.411424 [INFO] serf: EventMemberUpdate: Node d28ab080-32d9-8b6e-016d-0444173796bc.dc1
TestHTTPAPI_AllowedNets_OSS - 2019/12/06 06:02:08.326106 [INFO] connect: initialized primary datacenter CA with provider "consul"
TestHTTPAPI_AllowedNets_OSS - 2019/12/06 06:02:08.326569 [DEBUG] consul: Skipping self join check for "Node d28ab080-32d9-8b6e-016d-0444173796bc" since the cluster is too small
TestHTTPAPI_AllowedNets_OSS - 2019/12/06 06:02:08.326663 [INFO] consul: member 'Node d28ab080-32d9-8b6e-016d-0444173796bc' joined, marking health alive
TestHTTPAPI_AllowedNets_OSS - 2019/12/06 06:02:08.602692 [DEBUG] consul: Skipping self join check for "Node d28ab080-32d9-8b6e-016d-0444173796bc" since the cluster is too small
TestHTTPAPI_AllowedNets_OSS - 2019/12/06 06:02:08.603214 [DEBUG] consul: Skipping self join check for "Node d28ab080-32d9-8b6e-016d-0444173796bc" since the cluster is too small
=== RUN   TestHTTPAPI_AllowedNets_OSS/PUT_/v1/session/destroy/
TestHTTPAPI_AllowedNets_OSS - 2019/12/06 06:02:08.625518 [ERR] http: Request PUT http://127.0.0.1:34146/v1/session/destroy/, error: Access is restricted from=192.168.1.2:5555
TestHTTPAPI_AllowedNets_OSS - 2019/12/06 06:02:08.625645 [DEBUG] http: Request PUT http://127.0.0.1:34146/v1/session/destroy/ (143.336µs) from=192.168.1.2:5555
=== RUN   TestHTTPAPI_AllowedNets_OSS/PUT_/v1/acl/policy
TestHTTPAPI_AllowedNets_OSS - 2019/12/06 06:02:08.626183 [ERR] http: Request PUT http://127.0.0.1:34146/v1/acl/policy, error: Access is restricted from=192.168.1.2:5555
TestHTTPAPI_AllowedNets_OSS - 2019/12/06 06:02:08.626272 [DEBUG] http: Request PUT http://127.0.0.1:34146/v1/acl/policy (93.003µs) from=192.168.1.2:5555
=== RUN   TestHTTPAPI_AllowedNets_OSS/PUT_/v1/acl/token/
TestHTTPAPI_AllowedNets_OSS - 2019/12/06 06:02:08.626781 [ERR] http: Request PUT http://127.0.0.1:34146/v1/acl/token/, error: Access is restricted from=192.168.1.2:5555
TestHTTPAPI_AllowedNets_OSS - 2019/12/06 06:02:08.626872 [DEBUG] http: Request PUT http://127.0.0.1:34146/v1/acl/token/ (96.002µs) from=192.168.1.2:5555
=== RUN   TestHTTPAPI_AllowedNets_OSS/DELETE_/v1/acl/token/
TestHTTPAPI_AllowedNets_OSS - 2019/12/06 06:02:08.627429 [ERR] http: Request DELETE http://127.0.0.1:34146/v1/acl/token/, error: Access is restricted from=192.168.1.2:5555
TestHTTPAPI_AllowedNets_OSS - 2019/12/06 06:02:08.627517 [DEBUG] http: Request DELETE http://127.0.0.1:34146/v1/acl/token/ (97.335µs) from=192.168.1.2:5555
=== RUN   TestHTTPAPI_AllowedNets_OSS/PUT_/v1/agent/check/warn/
TestHTTPAPI_AllowedNets_OSS - 2019/12/06 06:02:08.628004 [ERR] http: Request PUT http://127.0.0.1:34146/v1/agent/check/warn/, error: Access is restricted from=192.168.1.2:5555
TestHTTPAPI_AllowedNets_OSS - 2019/12/06 06:02:08.628090 [DEBUG] http: Request PUT http://127.0.0.1:34146/v1/agent/check/warn/ (90.668µs) from=192.168.1.2:5555
=== RUN   TestHTTPAPI_AllowedNets_OSS/PUT_/v1/agent/service/deregister/
TestHTTPAPI_AllowedNets_OSS - 2019/12/06 06:02:08.628618 [ERR] http: Request PUT http://127.0.0.1:34146/v1/agent/service/deregister/, error: Access is restricted from=192.168.1.2:5555
TestHTTPAPI_AllowedNets_OSS - 2019/12/06 06:02:08.628706 [DEBUG] http: Request PUT http://127.0.0.1:34146/v1/agent/service/deregister/ (91.003µs) from=192.168.1.2:5555
=== RUN   TestHTTPAPI_AllowedNets_OSS/PUT_/v1/catalog/register
TestHTTPAPI_AllowedNets_OSS - 2019/12/06 06:02:08.629429 [ERR] http: Request PUT http://127.0.0.1:34146/v1/catalog/register, error: Access is restricted from=192.168.1.2:5555
TestHTTPAPI_AllowedNets_OSS - 2019/12/06 06:02:08.629604 [DEBUG] http: Request PUT http://127.0.0.1:34146/v1/catalog/register (419.009µs) from=192.168.1.2:5555
=== RUN   TestHTTPAPI_AllowedNets_OSS/PUT_/v1/session/create
TestHTTPAPI_AllowedNets_OSS - 2019/12/06 06:02:08.630481 [ERR] http: Request PUT http://127.0.0.1:34146/v1/session/create, error: Access is restricted from=192.168.1.2:5555
TestHTTPAPI_AllowedNets_OSS - 2019/12/06 06:02:08.630577 [DEBUG] http: Request PUT http://127.0.0.1:34146/v1/session/create (100.002µs) from=192.168.1.2:5555
=== RUN   TestHTTPAPI_AllowedNets_OSS/PUT_/v1/session/renew/
TestHTTPAPI_AllowedNets_OSS - 2019/12/06 06:02:08.631072 [ERR] http: Request PUT http://127.0.0.1:34146/v1/session/renew/, error: Access is restricted from=192.168.1.2:5555
TestHTTPAPI_AllowedNets_OSS - 2019/12/06 06:02:08.631218 [DEBUG] http: Request PUT http://127.0.0.1:34146/v1/session/renew/ (149.337µs) from=192.168.1.2:5555
=== RUN   TestHTTPAPI_AllowedNets_OSS/PUT_/v1/agent/check/deregister/
TestHTTPAPI_AllowedNets_OSS - 2019/12/06 06:02:08.631755 [ERR] http: Request PUT http://127.0.0.1:34146/v1/agent/check/deregister/, error: Access is restricted from=192.168.1.2:5555
TestHTTPAPI_AllowedNets_OSS - 2019/12/06 06:02:08.631841 [DEBUG] http: Request PUT http://127.0.0.1:34146/v1/agent/check/deregister/ (92.669µs) from=192.168.1.2:5555
=== RUN   TestHTTPAPI_AllowedNets_OSS/PUT_/v1/acl/update
TestHTTPAPI_AllowedNets_OSS - 2019/12/06 06:02:08.632327 [ERR] http: Request PUT http://127.0.0.1:34146/v1/acl/update, error: Access is restricted from=192.168.1.2:5555
TestHTTPAPI_AllowedNets_OSS - 2019/12/06 06:02:08.632414 [DEBUG] http: Request PUT http://127.0.0.1:34146/v1/acl/update (91.002µs) from=192.168.1.2:5555
=== RUN   TestHTTPAPI_AllowedNets_OSS/PUT_/v1/agent/leave
TestHTTPAPI_AllowedNets_OSS - 2019/12/06 06:02:08.632883 [ERR] http: Request PUT http://127.0.0.1:34146/v1/agent/leave, error: Access is restricted from=192.168.1.2:5555
TestHTTPAPI_AllowedNets_OSS - 2019/12/06 06:02:08.632968 [DEBUG] http: Request PUT http://127.0.0.1:34146/v1/agent/leave (89.002µs) from=192.168.1.2:5555
=== RUN   TestHTTPAPI_AllowedNets_OSS/POST_/v1/connect/intentions
TestHTTPAPI_AllowedNets_OSS - 2019/12/06 06:02:08.633434 [ERR] http: Request POST http://127.0.0.1:34146/v1/connect/intentions, error: Access is restricted from=192.168.1.2:5555
TestHTTPAPI_AllowedNets_OSS - 2019/12/06 06:02:08.633570 [DEBUG] http: Request POST http://127.0.0.1:34146/v1/connect/intentions (91.002µs) from=192.168.1.2:5555
=== RUN   TestHTTPAPI_AllowedNets_OSS/PUT_/v1/config
TestHTTPAPI_AllowedNets_OSS - 2019/12/06 06:02:08.634090 [ERR] http: Request PUT http://127.0.0.1:34146/v1/config, error: Access is restricted from=192.168.1.2:5555
TestHTTPAPI_AllowedNets_OSS - 2019/12/06 06:02:08.634273 [DEBUG] http: Request PUT http://127.0.0.1:34146/v1/config (92.336µs) from=192.168.1.2:5555
=== RUN   TestHTTPAPI_AllowedNets_OSS/PUT_/v1/agent/token/
TestHTTPAPI_AllowedNets_OSS - 2019/12/06 06:02:08.634823 [ERR] http: Request PUT http://127.0.0.1:34146/v1/agent/token/, error: Access is restricted from=192.168.1.2:5555
TestHTTPAPI_AllowedNets_OSS - 2019/12/06 06:02:08.634918 [DEBUG] http: Request PUT http://127.0.0.1:34146/v1/agent/token/ (104.003µs) from=192.168.1.2:5555
=== RUN   TestHTTPAPI_AllowedNets_OSS/PUT_/v1/connect/ca/configuration
TestHTTPAPI_AllowedNets_OSS - 2019/12/06 06:02:08.635409 [ERR] http: Request PUT http://127.0.0.1:34146/v1/connect/ca/configuration, error: Access is restricted from=192.168.1.2:5555
TestHTTPAPI_AllowedNets_OSS - 2019/12/06 06:02:08.635493 [DEBUG] http: Request PUT http://127.0.0.1:34146/v1/connect/ca/configuration (92.002µs) from=192.168.1.2:5555
=== RUN   TestHTTPAPI_AllowedNets_OSS/PUT_/v1/agent/service/register
TestHTTPAPI_AllowedNets_OSS - 2019/12/06 06:02:08.635991 [ERR] http: Request PUT http://127.0.0.1:34146/v1/agent/service/register, error: Access is restricted from=192.168.1.2:5555
TestHTTPAPI_AllowedNets_OSS - 2019/12/06 06:02:08.636074 [DEBUG] http: Request PUT http://127.0.0.1:34146/v1/agent/service/register (91.335µs) from=192.168.1.2:5555
=== RUN   TestHTTPAPI_AllowedNets_OSS/PUT_/v1/snapshot
TestHTTPAPI_AllowedNets_OSS - 2019/12/06 06:02:08.641166 [ERR] http: Request PUT http://127.0.0.1:34146/v1/snapshot, error: Access is restricted from=192.168.1.2:5555
TestHTTPAPI_AllowedNets_OSS - 2019/12/06 06:02:08.641291 [DEBUG] http: Request PUT http://127.0.0.1:34146/v1/snapshot (132.003µs) from=192.168.1.2:5555
=== RUN   TestHTTPAPI_AllowedNets_OSS/PUT_/v1/acl/destroy/
TestHTTPAPI_AllowedNets_OSS - 2019/12/06 06:02:08.641937 [ERR] http: Request PUT http://127.0.0.1:34146/v1/acl/destroy/, error: Access is restricted from=192.168.1.2:5555
TestHTTPAPI_AllowedNets_OSS - 2019/12/06 06:02:08.642026 [DEBUG] http: Request PUT http://127.0.0.1:34146/v1/acl/destroy/ (95.002µs) from=192.168.1.2:5555
=== RUN   TestHTTPAPI_AllowedNets_OSS/POST_/v1/acl/rules/translate
TestHTTPAPI_AllowedNets_OSS - 2019/12/06 06:02:08.642513 [ERR] http: Request POST http://127.0.0.1:34146/v1/acl/rules/translate, error: Access is restricted from=192.168.1.2:5555
TestHTTPAPI_AllowedNets_OSS - 2019/12/06 06:02:08.642599 [DEBUG] http: Request POST http://127.0.0.1:34146/v1/acl/rules/translate (90.336µs) from=192.168.1.2:5555
=== RUN   TestHTTPAPI_AllowedNets_OSS/POST_/v1/acl/logout
TestHTTPAPI_AllowedNets_OSS - 2019/12/06 06:02:08.643116 [ERR] http: Request POST http://127.0.0.1:34146/v1/acl/logout, error: Access is restricted from=192.168.1.2:5555
TestHTTPAPI_AllowedNets_OSS - 2019/12/06 06:02:08.643206 [DEBUG] http: Request POST http://127.0.0.1:34146/v1/acl/logout (92.336µs) from=192.168.1.2:5555
=== RUN   TestHTTPAPI_AllowedNets_OSS/PUT_/v1/acl/create
TestHTTPAPI_AllowedNets_OSS - 2019/12/06 06:02:08.643698 [ERR] http: Request PUT http://127.0.0.1:34146/v1/acl/create, error: Access is restricted from=192.168.1.2:5555
TestHTTPAPI_AllowedNets_OSS - 2019/12/06 06:02:08.643783 [DEBUG] http: Request PUT http://127.0.0.1:34146/v1/acl/create (88.002µs) from=192.168.1.2:5555
=== RUN   TestHTTPAPI_AllowedNets_OSS/PUT_/v1/acl/binding-rule
TestHTTPAPI_AllowedNets_OSS - 2019/12/06 06:02:08.644363 [ERR] http: Request PUT http://127.0.0.1:34146/v1/acl/binding-rule, error: Access is restricted from=192.168.1.2:5555
TestHTTPAPI_AllowedNets_OSS - 2019/12/06 06:02:08.644458 [DEBUG] http: Request PUT http://127.0.0.1:34146/v1/acl/binding-rule (98.002µs) from=192.168.1.2:5555
=== RUN   TestHTTPAPI_AllowedNets_OSS/PUT_/v1/acl/policy/
TestHTTPAPI_AllowedNets_OSS - 2019/12/06 06:02:08.644956 [ERR] http: Request PUT http://127.0.0.1:34146/v1/acl/policy/, error: Access is restricted from=192.168.1.2:5555
TestHTTPAPI_AllowedNets_OSS - 2019/12/06 06:02:08.645049 [DEBUG] http: Request PUT http://127.0.0.1:34146/v1/acl/policy/ (101.336µs) from=192.168.1.2:5555
=== RUN   TestHTTPAPI_AllowedNets_OSS/DELETE_/v1/acl/policy/
TestHTTPAPI_AllowedNets_OSS - 2019/12/06 06:02:08.645515 [ERR] http: Request DELETE http://127.0.0.1:34146/v1/acl/policy/, error: Access is restricted from=192.168.1.2:5555
TestHTTPAPI_AllowedNets_OSS - 2019/12/06 06:02:08.645603 [DEBUG] http: Request DELETE http://127.0.0.1:34146/v1/acl/policy/ (91.336µs) from=192.168.1.2:5555
=== RUN   TestHTTPAPI_AllowedNets_OSS/PUT_/v1/acl/binding-rule/
TestHTTPAPI_AllowedNets_OSS - 2019/12/06 06:02:08.646156 [ERR] http: Request PUT http://127.0.0.1:34146/v1/acl/binding-rule/, error: Access is restricted from=192.168.1.2:5555
TestHTTPAPI_AllowedNets_OSS - 2019/12/06 06:02:08.646244 [DEBUG] http: Request PUT http://127.0.0.1:34146/v1/acl/binding-rule/ (93.669µs) from=192.168.1.2:5555
=== RUN   TestHTTPAPI_AllowedNets_OSS/DELETE_/v1/acl/binding-rule/
TestHTTPAPI_AllowedNets_OSS - 2019/12/06 06:02:08.646721 [ERR] http: Request DELETE http://127.0.0.1:34146/v1/acl/binding-rule/, error: Access is restricted from=192.168.1.2:5555
TestHTTPAPI_AllowedNets_OSS - 2019/12/06 06:02:08.646824 [DEBUG] http: Request DELETE http://127.0.0.1:34146/v1/acl/binding-rule/ (108.336µs) from=192.168.1.2:5555
=== RUN   TestHTTPAPI_AllowedNets_OSS/DELETE_/v1/config/
TestHTTPAPI_AllowedNets_OSS - 2019/12/06 06:02:08.661323 [ERR] http: Request DELETE http://127.0.0.1:34146/v1/config/, error: Access is restricted from=192.168.1.2:5555
TestHTTPAPI_AllowedNets_OSS - 2019/12/06 06:02:08.661445 [DEBUG] http: Request DELETE http://127.0.0.1:34146/v1/config/ (129.67µs) from=192.168.1.2:5555
=== RUN   TestHTTPAPI_AllowedNets_OSS/PUT_/v1/connect/intentions/
TestHTTPAPI_AllowedNets_OSS - 2019/12/06 06:02:08.662044 [ERR] http: Request PUT http://127.0.0.1:34146/v1/connect/intentions/, error: Access is restricted from=192.168.1.2:5555
TestHTTPAPI_AllowedNets_OSS - 2019/12/06 06:02:08.662137 [DEBUG] http: Request PUT http://127.0.0.1:34146/v1/connect/intentions/ (99.669µs) from=192.168.1.2:5555
=== RUN   TestHTTPAPI_AllowedNets_OSS/DELETE_/v1/connect/intentions/
TestHTTPAPI_AllowedNets_OSS - 2019/12/06 06:02:08.662664 [ERR] http: Request DELETE http://127.0.0.1:34146/v1/connect/intentions/, error: Access is restricted from=192.168.1.2:5555
TestHTTPAPI_AllowedNets_OSS - 2019/12/06 06:02:08.662751 [DEBUG] http: Request DELETE http://127.0.0.1:34146/v1/connect/intentions/ (93.669µs) from=192.168.1.2:5555
=== RUN   TestHTTPAPI_AllowedNets_OSS/PUT_/v1/agent/maintenance
TestHTTPAPI_AllowedNets_OSS - 2019/12/06 06:02:08.663221 [ERR] http: Request PUT http://127.0.0.1:34146/v1/agent/maintenance, error: Access is restricted from=192.168.1.2:5555
TestHTTPAPI_AllowedNets_OSS - 2019/12/06 06:02:08.663306 [DEBUG] http: Request PUT http://127.0.0.1:34146/v1/agent/maintenance (88.335µs) from=192.168.1.2:5555
=== RUN   TestHTTPAPI_AllowedNets_OSS/PUT_/v1/agent/check/register
TestHTTPAPI_AllowedNets_OSS - 2019/12/06 06:02:08.663765 [ERR] http: Request PUT http://127.0.0.1:34146/v1/agent/check/register, error: Access is restricted from=192.168.1.2:5555
TestHTTPAPI_AllowedNets_OSS - 2019/12/06 06:02:08.663899 [DEBUG] http: Request PUT http://127.0.0.1:34146/v1/agent/check/register (135.003µs) from=192.168.1.2:5555
=== RUN   TestHTTPAPI_AllowedNets_OSS/POST_/v1/operator/keyring
TestHTTPAPI_AllowedNets_OSS - 2019/12/06 06:02:08.664530 [ERR] http: Request POST http://127.0.0.1:34146/v1/operator/keyring, error: Access is restricted from=192.168.1.2:5555
TestHTTPAPI_AllowedNets_OSS - 2019/12/06 06:02:08.664620 [DEBUG] http: Request POST http://127.0.0.1:34146/v1/operator/keyring (95.002µs) from=192.168.1.2:5555
=== RUN   TestHTTPAPI_AllowedNets_OSS/PUT_/v1/operator/keyring
TestHTTPAPI_AllowedNets_OSS - 2019/12/06 06:02:08.665136 [ERR] http: Request PUT http://127.0.0.1:34146/v1/operator/keyring, error: Access is restricted from=192.168.1.2:5555
TestHTTPAPI_AllowedNets_OSS - 2019/12/06 06:02:08.665225 [DEBUG] http: Request PUT http://127.0.0.1:34146/v1/operator/keyring (91.335µs) from=192.168.1.2:5555
=== RUN   TestHTTPAPI_AllowedNets_OSS/DELETE_/v1/operator/keyring
TestHTTPAPI_AllowedNets_OSS - 2019/12/06 06:02:08.665708 [ERR] http: Request DELETE http://127.0.0.1:34146/v1/operator/keyring, error: Access is restricted from=192.168.1.2:5555
TestHTTPAPI_AllowedNets_OSS - 2019/12/06 06:02:08.665791 [DEBUG] http: Request DELETE http://127.0.0.1:34146/v1/operator/keyring (87.336µs) from=192.168.1.2:5555
=== RUN   TestHTTPAPI_AllowedNets_OSS/PUT_/v1/acl/role/
TestHTTPAPI_AllowedNets_OSS - 2019/12/06 06:02:08.666269 [ERR] http: Request PUT http://127.0.0.1:34146/v1/acl/role/, error: Access is restricted from=192.168.1.2:5555
TestHTTPAPI_AllowedNets_OSS - 2019/12/06 06:02:08.666354 [DEBUG] http: Request PUT http://127.0.0.1:34146/v1/acl/role/ (88.669µs) from=192.168.1.2:5555
=== RUN   TestHTTPAPI_AllowedNets_OSS/DELETE_/v1/acl/role/
TestHTTPAPI_AllowedNets_OSS - 2019/12/06 06:02:08.666807 [ERR] http: Request DELETE http://127.0.0.1:34146/v1/acl/role/, error: Access is restricted from=192.168.1.2:5555
TestHTTPAPI_AllowedNets_OSS - 2019/12/06 06:02:08.666891 [DEBUG] http: Request DELETE http://127.0.0.1:34146/v1/acl/role/ (86.002µs) from=192.168.1.2:5555
=== RUN   TestHTTPAPI_AllowedNets_OSS/PUT_/v1/acl/token
TestHTTPAPI_AllowedNets_OSS - 2019/12/06 06:02:08.667337 [ERR] http: Request PUT http://127.0.0.1:34146/v1/acl/token, error: Access is restricted from=192.168.1.2:5555
TestHTTPAPI_AllowedNets_OSS - 2019/12/06 06:02:08.667426 [DEBUG] http: Request PUT http://127.0.0.1:34146/v1/acl/token (94.002µs) from=192.168.1.2:5555
=== RUN   TestHTTPAPI_AllowedNets_OSS/PUT_/v1/acl/auth-method
TestHTTPAPI_AllowedNets_OSS - 2019/12/06 06:02:08.667921 [ERR] http: Request PUT http://127.0.0.1:34146/v1/acl/auth-method, error: Access is restricted from=192.168.1.2:5555
TestHTTPAPI_AllowedNets_OSS - 2019/12/06 06:02:08.668004 [DEBUG] http: Request PUT http://127.0.0.1:34146/v1/acl/auth-method (87.336µs) from=192.168.1.2:5555
=== RUN   TestHTTPAPI_AllowedNets_OSS/PUT_/v1/coordinate/update
TestHTTPAPI_AllowedNets_OSS - 2019/12/06 06:02:08.668460 [ERR] http: Request PUT http://127.0.0.1:34146/v1/coordinate/update, error: Access is restricted from=192.168.1.2:5555
TestHTTPAPI_AllowedNets_OSS - 2019/12/06 06:02:08.668544 [DEBUG] http: Request PUT http://127.0.0.1:34146/v1/coordinate/update (87.669µs) from=192.168.1.2:5555
=== RUN   TestHTTPAPI_AllowedNets_OSS/PUT_/v1/catalog/deregister
TestHTTPAPI_AllowedNets_OSS - 2019/12/06 06:02:08.669050 [ERR] http: Request PUT http://127.0.0.1:34146/v1/catalog/deregister, error: Access is restricted from=192.168.1.2:5555
TestHTTPAPI_AllowedNets_OSS - 2019/12/06 06:02:08.669139 [DEBUG] http: Request PUT http://127.0.0.1:34146/v1/catalog/deregister (88.668µs) from=192.168.1.2:5555
=== RUN   TestHTTPAPI_AllowedNets_OSS/PUT_/v1/agent/check/fail/
TestHTTPAPI_AllowedNets_OSS - 2019/12/06 06:02:08.679971 [ERR] http: Request PUT http://127.0.0.1:34146/v1/agent/check/fail/, error: Access is restricted from=192.168.1.2:5555
TestHTTPAPI_AllowedNets_OSS - 2019/12/06 06:02:08.680077 [DEBUG] http: Request PUT http://127.0.0.1:34146/v1/agent/check/fail/ (113.003µs) from=192.168.1.2:5555
=== RUN   TestHTTPAPI_AllowedNets_OSS/PUT_/v1/acl/role
TestHTTPAPI_AllowedNets_OSS - 2019/12/06 06:02:08.680727 [ERR] http: Request PUT http://127.0.0.1:34146/v1/acl/role, error: Access is restricted from=192.168.1.2:5555
TestHTTPAPI_AllowedNets_OSS - 2019/12/06 06:02:08.680819 [DEBUG] http: Request PUT http://127.0.0.1:34146/v1/acl/role (95.002µs) from=192.168.1.2:5555
=== RUN   TestHTTPAPI_AllowedNets_OSS/PUT_/v1/txn
TestHTTPAPI_AllowedNets_OSS - 2019/12/06 06:02:08.689515 [ERR] http: Request PUT http://127.0.0.1:34146/v1/txn, error: Access is restricted from=192.168.1.2:5555
TestHTTPAPI_AllowedNets_OSS - 2019/12/06 06:02:08.689649 [DEBUG] http: Request PUT http://127.0.0.1:34146/v1/txn (141.67µs) from=192.168.1.2:5555
=== RUN   TestHTTPAPI_AllowedNets_OSS/DELETE_/v1/operator/raft/peer
TestHTTPAPI_AllowedNets_OSS - 2019/12/06 06:02:08.690303 [ERR] http: Request DELETE http://127.0.0.1:34146/v1/operator/raft/peer, error: Access is restricted from=192.168.1.2:5555
TestHTTPAPI_AllowedNets_OSS - 2019/12/06 06:02:08.690399 [DEBUG] http: Request DELETE http://127.0.0.1:34146/v1/operator/raft/peer (96.669µs) from=192.168.1.2:5555
=== RUN   TestHTTPAPI_AllowedNets_OSS/PUT_/v1/agent/force-leave/
TestHTTPAPI_AllowedNets_OSS - 2019/12/06 06:02:08.690914 [ERR] http: Request PUT http://127.0.0.1:34146/v1/agent/force-leave/, error: Access is restricted from=192.168.1.2:5555
TestHTTPAPI_AllowedNets_OSS - 2019/12/06 06:02:08.691003 [DEBUG] http: Request PUT http://127.0.0.1:34146/v1/agent/force-leave/ (90.336µs) from=192.168.1.2:5555
=== RUN   TestHTTPAPI_AllowedNets_OSS/PUT_/v1/agent/check/update/
TestHTTPAPI_AllowedNets_OSS - 2019/12/06 06:02:08.691475 [ERR] http: Request PUT http://127.0.0.1:34146/v1/agent/check/update/, error: Access is restricted from=192.168.1.2:5555
TestHTTPAPI_AllowedNets_OSS - 2019/12/06 06:02:08.691560 [DEBUG] http: Request PUT http://127.0.0.1:34146/v1/agent/check/update/ (85.669µs) from=192.168.1.2:5555
=== RUN   TestHTTPAPI_AllowedNets_OSS/PUT_/v1/kv/
TestHTTPAPI_AllowedNets_OSS - 2019/12/06 06:02:08.692106 [ERR] http: Request PUT http://127.0.0.1:34146/v1/kv/, error: Access is restricted from=192.168.1.2:5555
TestHTTPAPI_AllowedNets_OSS - 2019/12/06 06:02:08.692208 [DEBUG] http: Request PUT http://127.0.0.1:34146/v1/kv/ (108.67µs) from=192.168.1.2:5555
=== RUN   TestHTTPAPI_AllowedNets_OSS/DELETE_/v1/kv/
TestHTTPAPI_AllowedNets_OSS - 2019/12/06 06:02:08.693152 [ERR] http: Request DELETE http://127.0.0.1:34146/v1/kv/, error: Access is restricted from=192.168.1.2:5555
TestHTTPAPI_AllowedNets_OSS - 2019/12/06 06:02:08.693244 [DEBUG] http: Request DELETE http://127.0.0.1:34146/v1/kv/ (99.669µs) from=192.168.1.2:5555
=== RUN   TestHTTPAPI_AllowedNets_OSS/PUT_/v1/operator/autopilot/configuration
TestHTTPAPI_AllowedNets_OSS - 2019/12/06 06:02:08.693739 [ERR] http: Request PUT http://127.0.0.1:34146/v1/operator/autopilot/configuration, error: Access is restricted from=192.168.1.2:5555
TestHTTPAPI_AllowedNets_OSS - 2019/12/06 06:02:08.693825 [DEBUG] http: Request PUT http://127.0.0.1:34146/v1/operator/autopilot/configuration (92.669µs) from=192.168.1.2:5555
=== RUN   TestHTTPAPI_AllowedNets_OSS/PUT_/v1/acl/auth-method/
TestHTTPAPI_AllowedNets_OSS - 2019/12/06 06:02:08.694521 [ERR] http: Request PUT http://127.0.0.1:34146/v1/acl/auth-method/, error: Access is restricted from=192.168.1.2:5555
TestHTTPAPI_AllowedNets_OSS - 2019/12/06 06:02:08.694607 [DEBUG] http: Request PUT http://127.0.0.1:34146/v1/acl/auth-method/ (93.335µs) from=192.168.1.2:5555
=== RUN   TestHTTPAPI_AllowedNets_OSS/DELETE_/v1/acl/auth-method/
TestHTTPAPI_AllowedNets_OSS - 2019/12/06 06:02:08.695053 [ERR] http: Request DELETE http://127.0.0.1:34146/v1/acl/auth-method/, error: Access is restricted from=192.168.1.2:5555
TestHTTPAPI_AllowedNets_OSS - 2019/12/06 06:02:08.695139 [DEBUG] http: Request DELETE http://127.0.0.1:34146/v1/acl/auth-method/ (91.002µs) from=192.168.1.2:5555
=== RUN   TestHTTPAPI_AllowedNets_OSS/PUT_/v1/agent/check/pass/
TestHTTPAPI_AllowedNets_OSS - 2019/12/06 06:02:08.695656 [ERR] http: Request PUT http://127.0.0.1:34146/v1/agent/check/pass/, error: Access is restricted from=192.168.1.2:5555
TestHTTPAPI_AllowedNets_OSS - 2019/12/06 06:02:08.695736 [DEBUG] http: Request PUT http://127.0.0.1:34146/v1/agent/check/pass/ (84.669µs) from=192.168.1.2:5555
=== RUN   TestHTTPAPI_AllowedNets_OSS/POST_/v1/agent/connect/authorize
TestHTTPAPI_AllowedNets_OSS - 2019/12/06 06:02:08.696214 [ERR] http: Request POST http://127.0.0.1:34146/v1/agent/connect/authorize, error: Access is restricted from=192.168.1.2:5555
TestHTTPAPI_AllowedNets_OSS - 2019/12/06 06:02:08.696298 [DEBUG] http: Request POST http://127.0.0.1:34146/v1/agent/connect/authorize (88.335µs) from=192.168.1.2:5555
=== RUN   TestHTTPAPI_AllowedNets_OSS/PUT_/v1/acl/bootstrap
TestHTTPAPI_AllowedNets_OSS - 2019/12/06 06:02:08.696734 [ERR] http: Request PUT http://127.0.0.1:34146/v1/acl/bootstrap, error: Access is restricted from=192.168.1.2:5555
TestHTTPAPI_AllowedNets_OSS - 2019/12/06 06:02:08.696814 [DEBUG] http: Request PUT http://127.0.0.1:34146/v1/acl/bootstrap (84.668µs) from=192.168.1.2:5555
=== RUN   TestHTTPAPI_AllowedNets_OSS/POST_/v1/acl/login
TestHTTPAPI_AllowedNets_OSS - 2019/12/06 06:02:08.697247 [ERR] http: Request POST http://127.0.0.1:34146/v1/acl/login, error: Access is restricted from=192.168.1.2:5555
TestHTTPAPI_AllowedNets_OSS - 2019/12/06 06:02:08.697328 [DEBUG] http: Request POST http://127.0.0.1:34146/v1/acl/login (84.002µs) from=192.168.1.2:5555
=== RUN   TestHTTPAPI_AllowedNets_OSS/PUT_/v1/agent/join/
TestHTTPAPI_AllowedNets_OSS - 2019/12/06 06:02:08.697869 [ERR] http: Request PUT http://127.0.0.1:34146/v1/agent/join/, error: Access is restricted from=192.168.1.2:5555
TestHTTPAPI_AllowedNets_OSS - 2019/12/06 06:02:08.697984 [DEBUG] http: Request PUT http://127.0.0.1:34146/v1/agent/join/ (89.335µs) from=192.168.1.2:5555
=== RUN   TestHTTPAPI_AllowedNets_OSS/PUT_/v1/acl/clone/
TestHTTPAPI_AllowedNets_OSS - 2019/12/06 06:02:08.698438 [ERR] http: Request PUT http://127.0.0.1:34146/v1/acl/clone/, error: Access is restricted from=192.168.1.2:5555
TestHTTPAPI_AllowedNets_OSS - 2019/12/06 06:02:08.698525 [DEBUG] http: Request PUT http://127.0.0.1:34146/v1/acl/clone/ (90.669µs) from=192.168.1.2:5555
=== RUN   TestHTTPAPI_AllowedNets_OSS/PUT_/v1/event/fire/
TestHTTPAPI_AllowedNets_OSS - 2019/12/06 06:02:08.699010 [ERR] http: Request PUT http://127.0.0.1:34146/v1/event/fire/, error: Access is restricted from=192.168.1.2:5555
TestHTTPAPI_AllowedNets_OSS - 2019/12/06 06:02:08.699093 [DEBUG] http: Request PUT http://127.0.0.1:34146/v1/event/fire/ (90.669µs) from=192.168.1.2:5555
=== RUN   TestHTTPAPI_AllowedNets_OSS/PUT_/v1/agent/service/maintenance/
TestHTTPAPI_AllowedNets_OSS - 2019/12/06 06:02:08.703624 [ERR] http: Request PUT http://127.0.0.1:34146/v1/agent/service/maintenance/, error: Access is restricted from=192.168.1.2:5555
TestHTTPAPI_AllowedNets_OSS - 2019/12/06 06:02:08.703721 [DEBUG] http: Request PUT http://127.0.0.1:34146/v1/agent/service/maintenance/ (108.669µs) from=192.168.1.2:5555
TestHTTPAPI_AllowedNets_OSS - 2019/12/06 06:02:08.703925 [INFO] agent: Requesting shutdown
TestHTTPAPI_AllowedNets_OSS - 2019/12/06 06:02:08.703991 [INFO] consul: shutting down server
TestHTTPAPI_AllowedNets_OSS - 2019/12/06 06:02:08.704038 [WARN] serf: Shutdown without a Leave
TestHTTPAPI_AllowedNets_OSS - 2019/12/06 06:02:08.804936 [DEBUG] agent: Skipping remote check "serfHealth" since it is managed automatically
TestHTTPAPI_AllowedNets_OSS - 2019/12/06 06:02:08.908578 [WARN] serf: Shutdown without a Leave
TestHTTPAPI_AllowedNets_OSS - 2019/12/06 06:02:09.083678 [INFO] manager: shutting down
TestHTTPAPI_AllowedNets_OSS - 2019/12/06 06:02:09.084978 [INFO] agent: consul server down
TestHTTPAPI_AllowedNets_OSS - 2019/12/06 06:02:09.085059 [INFO] agent: shutdown complete
TestHTTPAPI_AllowedNets_OSS - 2019/12/06 06:02:09.085148 [INFO] agent: Stopping DNS server 127.0.0.1:34145 (tcp)
TestHTTPAPI_AllowedNets_OSS - 2019/12/06 06:02:09.085344 [INFO] agent: Stopping DNS server 127.0.0.1:34145 (udp)
TestHTTPAPI_AllowedNets_OSS - 2019/12/06 06:02:09.085623 [INFO] agent: Synced node info
TestHTTPAPI_AllowedNets_OSS - 2019/12/06 06:02:09.085626 [INFO] agent: Stopping HTTP server 127.0.0.1:34146 (tcp)
TestHTTPAPI_AllowedNets_OSS - 2019/12/06 06:02:09.085996 [INFO] agent: Waiting for endpoints to shut down
TestHTTPAPI_AllowedNets_OSS - 2019/12/06 06:02:09.086072 [INFO] agent: Endpoints down
--- PASS: TestHTTPAPI_AllowedNets_OSS (3.96s)
    --- PASS: TestHTTPAPI_AllowedNets_OSS/PUT_/v1/session/destroy/ (0.00s)
    --- PASS: TestHTTPAPI_AllowedNets_OSS/PUT_/v1/acl/policy (0.00s)
    --- PASS: TestHTTPAPI_AllowedNets_OSS/PUT_/v1/acl/token/ (0.00s)
    --- PASS: TestHTTPAPI_AllowedNets_OSS/DELETE_/v1/acl/token/ (0.00s)
    --- PASS: TestHTTPAPI_AllowedNets_OSS/PUT_/v1/agent/check/warn/ (0.00s)
    --- PASS: TestHTTPAPI_AllowedNets_OSS/PUT_/v1/agent/service/deregister/ (0.00s)
    --- PASS: TestHTTPAPI_AllowedNets_OSS/PUT_/v1/catalog/register (0.00s)
    --- PASS: TestHTTPAPI_AllowedNets_OSS/PUT_/v1/session/create (0.00s)
    --- PASS: TestHTTPAPI_AllowedNets_OSS/PUT_/v1/session/renew/ (0.00s)
    --- PASS: TestHTTPAPI_AllowedNets_OSS/PUT_/v1/agent/check/deregister/ (0.00s)
    --- PASS: TestHTTPAPI_AllowedNets_OSS/PUT_/v1/acl/update (0.00s)
    --- PASS: TestHTTPAPI_AllowedNets_OSS/PUT_/v1/agent/leave (0.00s)
    --- PASS: TestHTTPAPI_AllowedNets_OSS/POST_/v1/connect/intentions (0.00s)
    --- PASS: TestHTTPAPI_AllowedNets_OSS/PUT_/v1/config (0.00s)
    --- PASS: TestHTTPAPI_AllowedNets_OSS/PUT_/v1/agent/token/ (0.00s)
    --- PASS: TestHTTPAPI_AllowedNets_OSS/PUT_/v1/connect/ca/configuration (0.00s)
    --- PASS: TestHTTPAPI_AllowedNets_OSS/PUT_/v1/agent/service/register (0.00s)
    --- PASS: TestHTTPAPI_AllowedNets_OSS/PUT_/v1/snapshot (0.00s)
    --- PASS: TestHTTPAPI_AllowedNets_OSS/PUT_/v1/acl/destroy/ (0.00s)
    --- PASS: TestHTTPAPI_AllowedNets_OSS/POST_/v1/acl/rules/translate (0.00s)
    --- PASS: TestHTTPAPI_AllowedNets_OSS/POST_/v1/acl/logout (0.00s)
    --- PASS: TestHTTPAPI_AllowedNets_OSS/PUT_/v1/acl/create (0.00s)
    --- PASS: TestHTTPAPI_AllowedNets_OSS/PUT_/v1/acl/binding-rule (0.00s)
    --- PASS: TestHTTPAPI_AllowedNets_OSS/PUT_/v1/acl/policy/ (0.00s)
    --- PASS: TestHTTPAPI_AllowedNets_OSS/DELETE_/v1/acl/policy/ (0.00s)
    --- PASS: TestHTTPAPI_AllowedNets_OSS/PUT_/v1/acl/binding-rule/ (0.00s)
    --- PASS: TestHTTPAPI_AllowedNets_OSS/DELETE_/v1/acl/binding-rule/ (0.01s)
    --- PASS: TestHTTPAPI_AllowedNets_OSS/DELETE_/v1/config/ (0.00s)
    --- PASS: TestHTTPAPI_AllowedNets_OSS/PUT_/v1/connect/intentions/ (0.00s)
    --- PASS: TestHTTPAPI_AllowedNets_OSS/DELETE_/v1/connect/intentions/ (0.00s)
    --- PASS: TestHTTPAPI_AllowedNets_OSS/PUT_/v1/agent/maintenance (0.00s)
    --- PASS: TestHTTPAPI_AllowedNets_OSS/PUT_/v1/agent/check/register (0.00s)
    --- PASS: TestHTTPAPI_AllowedNets_OSS/POST_/v1/operator/keyring (0.00s)
    --- PASS: TestHTTPAPI_AllowedNets_OSS/PUT_/v1/operator/keyring (0.00s)
    --- PASS: TestHTTPAPI_AllowedNets_OSS/DELETE_/v1/operator/keyring (0.00s)
    --- PASS: TestHTTPAPI_AllowedNets_OSS/PUT_/v1/acl/role/ (0.00s)
    --- PASS: TestHTTPAPI_AllowedNets_OSS/DELETE_/v1/acl/role/ (0.00s)
    --- PASS: TestHTTPAPI_AllowedNets_OSS/PUT_/v1/acl/token (0.00s)
    --- PASS: TestHTTPAPI_AllowedNets_OSS/PUT_/v1/acl/auth-method (0.00s)
    --- PASS: TestHTTPAPI_AllowedNets_OSS/PUT_/v1/coordinate/update (0.00s)
    --- PASS: TestHTTPAPI_AllowedNets_OSS/PUT_/v1/catalog/deregister (0.01s)
    --- PASS: TestHTTPAPI_AllowedNets_OSS/PUT_/v1/agent/check/fail/ (0.00s)
    --- PASS: TestHTTPAPI_AllowedNets_OSS/PUT_/v1/acl/role (0.00s)
    --- PASS: TestHTTPAPI_AllowedNets_OSS/PUT_/v1/txn (0.01s)
    --- PASS: TestHTTPAPI_AllowedNets_OSS/DELETE_/v1/operator/raft/peer (0.00s)
    --- PASS: TestHTTPAPI_AllowedNets_OSS/PUT_/v1/agent/force-leave/ (0.00s)
    --- PASS: TestHTTPAPI_AllowedNets_OSS/PUT_/v1/agent/check/update/ (0.00s)
    --- PASS: TestHTTPAPI_AllowedNets_OSS/PUT_/v1/kv/ (0.00s)
    --- PASS: TestHTTPAPI_AllowedNets_OSS/DELETE_/v1/kv/ (0.00s)
    --- PASS: TestHTTPAPI_AllowedNets_OSS/PUT_/v1/operator/autopilot/configuration (0.00s)
    --- PASS: TestHTTPAPI_AllowedNets_OSS/PUT_/v1/acl/auth-method/ (0.00s)
    --- PASS: TestHTTPAPI_AllowedNets_OSS/DELETE_/v1/acl/auth-method/ (0.00s)
    --- PASS: TestHTTPAPI_AllowedNets_OSS/PUT_/v1/agent/check/pass/ (0.00s)
    --- PASS: TestHTTPAPI_AllowedNets_OSS/POST_/v1/agent/connect/authorize (0.00s)
    --- PASS: TestHTTPAPI_AllowedNets_OSS/PUT_/v1/acl/bootstrap (0.00s)
    --- PASS: TestHTTPAPI_AllowedNets_OSS/POST_/v1/acl/login (0.00s)
    --- PASS: TestHTTPAPI_AllowedNets_OSS/PUT_/v1/agent/join/ (0.00s)
    --- PASS: TestHTTPAPI_AllowedNets_OSS/PUT_/v1/acl/clone/ (0.00s)
    --- PASS: TestHTTPAPI_AllowedNets_OSS/PUT_/v1/event/fire/ (0.00s)
    --- PASS: TestHTTPAPI_AllowedNets_OSS/PUT_/v1/agent/service/maintenance/ (0.00s)
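
The TestHTTPAPI_AllowedNets_OSS run above shows every write endpoint rejecting a request from 192.168.1.2:5555 with "Access is restricted" while the agent itself listens on 127.0.0.1. A minimal sketch of that kind of CIDR allow-list check, assuming a hypothetical requestAllowed helper rather than Consul's own implementation:

package main

import (
    "fmt"
    "net"
)

// requestAllowed reports whether remoteAddr falls inside any of the
// allowed networks; this mirrors what the AllowedNets subtests assert
// (a request from 192.168.1.2:5555 is rejected when only the loopback
// network is allowed). Illustrative sketch only.
func requestAllowed(remoteAddr string, allowed []*net.IPNet) bool {
    host, _, err := net.SplitHostPort(remoteAddr)
    if err != nil {
        return false
    }
    ip := net.ParseIP(host)
    if ip == nil {
        return false
    }
    for _, n := range allowed {
        if n.Contains(ip) {
            return true
        }
    }
    return false
}

func main() {
    _, loopback, _ := net.ParseCIDR("127.0.0.0/8")
    fmt.Println(requestAllowed("192.168.1.2:5555", []*net.IPNet{loopback})) // false
    fmt.Println(requestAllowed("127.0.0.1:34146", []*net.IPNet{loopback}))  // true
}
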
=== RUN   TestHTTPServer_UnixSocket
=== PAUSE TestHTTPServer_UnixSocket
=== RUN   TestHTTPServer_UnixSocket_FileExists
=== PAUSE TestHTTPServer_UnixSocket_FileExists
=== RUN   TestHTTPServer_H2
--- SKIP: TestHTTPServer_H2 (0.00s)
    http_test.go:131: DM-skipped
=== RUN   TestSetIndex
=== PAUSE TestSetIndex
=== RUN   TestSetKnownLeader
=== PAUSE TestSetKnownLeader
=== RUN   TestSetLastContact
=== PAUSE TestSetLastContact
=== RUN   TestSetMeta
=== PAUSE TestSetMeta
=== RUN   TestHTTPAPI_BlockEndpoints
=== PAUSE TestHTTPAPI_BlockEndpoints
=== RUN   TestHTTPAPI_Ban_Nonprintable_Characters
--- SKIP: TestHTTPAPI_Ban_Nonprintable_Characters (0.00s)
    http_test.go:324: DM-skipped
=== RUN   TestHTTPAPI_Allow_Nonprintable_Characters_With_Flag
--- SKIP: TestHTTPAPI_Allow_Nonprintable_Characters_With_Flag (0.00s)
    http_test.go:344: DM-skipped
=== RUN   TestHTTPAPI_TranslateAddrHeader
=== PAUSE TestHTTPAPI_TranslateAddrHeader
=== RUN   TestHTTPAPIResponseHeaders
=== PAUSE TestHTTPAPIResponseHeaders
=== RUN   TestContentTypeIsJSON
=== PAUSE TestContentTypeIsJSON
=== RUN   TestHTTP_wrap_obfuscateLog
=== PAUSE TestHTTP_wrap_obfuscateLog
=== RUN   TestPrettyPrint
=== PAUSE TestPrettyPrint
=== RUN   TestPrettyPrintBare
=== PAUSE TestPrettyPrintBare
=== RUN   TestParseSource
=== PAUSE TestParseSource
=== RUN   TestParseCacheControl
=== RUN   TestParseCacheControl/empty_header
=== RUN   TestParseCacheControl/simple_max-age
=== RUN   TestParseCacheControl/zero_max-age
=== RUN   TestParseCacheControl/must-revalidate
=== RUN   TestParseCacheControl/mixes_age,_must-revalidate
=== RUN   TestParseCacheControl/quoted_max-age
=== RUN   TestParseCacheControl/mixed_case_max-age
=== RUN   TestParseCacheControl/simple_stale-if-error
=== RUN   TestParseCacheControl/combined_with_space
=== RUN   TestParseCacheControl/combined_no_space
=== RUN   TestParseCacheControl/unsupported_directive
=== RUN   TestParseCacheControl/mixed_unsupported_directive
=== RUN   TestParseCacheControl/garbage_value
=== RUN   TestParseCacheControl/garbage_value_with_quotes
--- PASS: TestParseCacheControl (0.01s)
    --- PASS: TestParseCacheControl/empty_header (0.00s)
    --- PASS: TestParseCacheControl/simple_max-age (0.00s)
    --- PASS: TestParseCacheControl/zero_max-age (0.00s)
    --- PASS: TestParseCacheControl/must-revalidate (0.00s)
    --- PASS: TestParseCacheControl/mixes_age,_must-revalidate (0.00s)
    --- PASS: TestParseCacheControl/quoted_max-age (0.00s)
    --- PASS: TestParseCacheControl/mixed_case_max-age (0.00s)
    --- PASS: TestParseCacheControl/simple_stale-if-error (0.00s)
    --- PASS: TestParseCacheControl/combined_with_space (0.00s)
    --- PASS: TestParseCacheControl/combined_no_space (0.00s)
    --- PASS: TestParseCacheControl/unsupported_directive (0.00s)
    --- PASS: TestParseCacheControl/mixed_unsupported_directive (0.00s)
    --- PASS: TestParseCacheControl/garbage_value (0.00s)
    --- PASS: TestParseCacheControl/garbage_value_with_quotes (0.00s)
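
The TestParseCacheControl subtests cover empty headers, quoted and mixed-case max-age, must-revalidate, stale-if-error, and garbage values. A self-contained sketch of max-age extraction under those cases, using a hypothetical parseMaxAge helper rather than the agent's parser:

package main

import (
    "fmt"
    "strconv"
    "strings"
)

// parseMaxAge extracts a max-age value (in seconds) from a Cache-Control
// header, tolerating quoting and mixed case, and reports whether a valid
// value was found. Garbage values are rejected. Illustrative only.
func parseMaxAge(header string) (int, bool) {
    for _, part := range strings.Split(header, ",") {
        kv := strings.SplitN(strings.TrimSpace(part), "=", 2)
        if len(kv) != 2 || !strings.EqualFold(kv[0], "max-age") {
            continue
        }
        secs, err := strconv.Atoi(strings.Trim(kv[1], `"`))
        if err != nil || secs < 0 {
            return 0, false
        }
        return secs, true
    }
    return 0, false
}

func main() {
    fmt.Println(parseMaxAge(`MAX-AGE="30"`))    // 30 true
    fmt.Println(parseMaxAge("must-revalidate")) // 0 false
    fmt.Println(parseMaxAge("max-age=banana"))  // 0 false
}
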
=== RUN   TestParseWait
=== PAUSE TestParseWait
=== RUN   TestPProfHandlers_EnableDebug
=== PAUSE TestPProfHandlers_EnableDebug
=== RUN   TestPProfHandlers_DisableDebugNoACLs
--- SKIP: TestPProfHandlers_DisableDebugNoACLs (0.00s)
    http_test.go:761: DM-skipped
=== RUN   TestPProfHandlers_ACLs
=== PAUSE TestPProfHandlers_ACLs
=== RUN   TestParseWait_InvalidTime
=== PAUSE TestParseWait_InvalidTime
=== RUN   TestParseWait_InvalidIndex
=== PAUSE TestParseWait_InvalidIndex
=== RUN   TestParseConsistency
=== PAUSE TestParseConsistency
=== RUN   TestParseConsistencyAndMaxStale
WARNING: bootstrap = true: do not enable unless necessary
TestParseConsistencyAndMaxStale - 2019/12/06 06:02:09.162881 [WARN] agent: Node name "Node 951d7006-b718-bb35-a090-e66b3f9dac2b" will not be discoverable via DNS due to invalid characters. Valid characters include all alpha-numerics and dashes.
TestParseConsistencyAndMaxStale - 2019/12/06 06:02:09.163394 [DEBUG] tlsutil: Update with version 1
TestParseConsistencyAndMaxStale - 2019/12/06 06:02:09.165824 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
2019/12/06 06:02:10 [INFO]  raft: Initial configuration (index=1): [{Suffrage:Voter ID:951d7006-b718-bb35-a090-e66b3f9dac2b Address:127.0.0.1:34156}]
2019/12/06 06:02:10 [INFO]  raft: Node at 127.0.0.1:34156 [Follower] entering Follower state (Leader: "")
TestParseConsistencyAndMaxStale - 2019/12/06 06:02:10.124129 [INFO] serf: EventMemberJoin: Node 951d7006-b718-bb35-a090-e66b3f9dac2b.dc1 127.0.0.1
TestParseConsistencyAndMaxStale - 2019/12/06 06:02:10.136241 [INFO] serf: EventMemberJoin: Node 951d7006-b718-bb35-a090-e66b3f9dac2b 127.0.0.1
TestParseConsistencyAndMaxStale - 2019/12/06 06:02:10.139947 [INFO] consul: Adding LAN server Node 951d7006-b718-bb35-a090-e66b3f9dac2b (Addr: tcp/127.0.0.1:34156) (DC: dc1)
TestParseConsistencyAndMaxStale - 2019/12/06 06:02:10.140496 [INFO] consul: Handled member-join event for server "Node 951d7006-b718-bb35-a090-e66b3f9dac2b.dc1" in area "wan"
TestParseConsistencyAndMaxStale - 2019/12/06 06:02:10.140616 [INFO] agent: Started DNS server 127.0.0.1:34151 (udp)
TestParseConsistencyAndMaxStale - 2019/12/06 06:02:10.141011 [INFO] agent: Started DNS server 127.0.0.1:34151 (tcp)
TestParseConsistencyAndMaxStale - 2019/12/06 06:02:10.145376 [INFO] agent: Started HTTP server on 127.0.0.1:34152 (tcp)
TestParseConsistencyAndMaxStale - 2019/12/06 06:02:10.145470 [INFO] agent: started state syncer
2019/12/06 06:02:10 [WARN]  raft: Heartbeat timeout from "" reached, starting election
2019/12/06 06:02:10 [INFO]  raft: Node at 127.0.0.1:34156 [Candidate] entering Candidate state in term 2
2019/12/06 06:02:10 [INFO]  raft: Election won. Tally: 1
2019/12/06 06:02:10 [INFO]  raft: Node at 127.0.0.1:34156 [Leader] entering Leader state
TestParseConsistencyAndMaxStale - 2019/12/06 06:02:10.751960 [INFO] consul: cluster leadership acquired
TestParseConsistencyAndMaxStale - 2019/12/06 06:02:10.752430 [INFO] consul: New leader elected: Node 951d7006-b718-bb35-a090-e66b3f9dac2b
TestParseConsistencyAndMaxStale - 2019/12/06 06:02:10.915999 [INFO] agent: Requesting shutdown
TestParseConsistencyAndMaxStale - 2019/12/06 06:02:10.916121 [INFO] consul: shutting down server
TestParseConsistencyAndMaxStale - 2019/12/06 06:02:10.916201 [WARN] serf: Shutdown without a Leave
TestParseConsistencyAndMaxStale - 2019/12/06 06:02:11.025299 [WARN] serf: Shutdown without a Leave
TestParseConsistencyAndMaxStale - 2019/12/06 06:02:11.108800 [INFO] manager: shutting down
TestParseConsistencyAndMaxStale - 2019/12/06 06:02:11.186087 [INFO] agent: consul server down
TestParseConsistencyAndMaxStale - 2019/12/06 06:02:11.186170 [INFO] agent: shutdown complete
TestParseConsistencyAndMaxStale - 2019/12/06 06:02:11.186415 [ERR] consul: failed to wait for barrier: leadership lost while committing log
TestParseConsistencyAndMaxStale - 2019/12/06 06:02:11.186463 [WARN] agent: Syncing node info failed. leadership lost while committing log
TestParseConsistencyAndMaxStale - 2019/12/06 06:02:11.186518 [ERR] agent: failed to sync remote state: leadership lost while committing log
TestParseConsistencyAndMaxStale - 2019/12/06 06:02:11.187449 [INFO] agent: Stopping DNS server 127.0.0.1:34151 (tcp)
TestParseConsistencyAndMaxStale - 2019/12/06 06:02:11.188012 [INFO] agent: Stopping DNS server 127.0.0.1:34151 (udp)
TestParseConsistencyAndMaxStale - 2019/12/06 06:02:11.188264 [INFO] agent: Stopping HTTP server 127.0.0.1:34152 (tcp)
TestParseConsistencyAndMaxStale - 2019/12/06 06:02:11.188525 [INFO] agent: Waiting for endpoints to shut down
TestParseConsistencyAndMaxStale - 2019/12/06 06:02:11.188891 [INFO] agent: Endpoints down
--- PASS: TestParseConsistencyAndMaxStale (2.08s)
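
TestParseConsistencyAndMaxStale boots a full agent to verify how the ?stale, ?consistent and max_stale query parameters are read off an HTTP request. A rough sketch of that style of parsing, with hypothetical names (queryOptions, parseConsistency) standing in for the agent's real structures:

package main

import (
    "fmt"
    "net/url"
    "time"
)

// queryOptions is a stand-in for the subset of options the test is
// concerned with: stale reads, a staleness bound, and strong consistency.
type queryOptions struct {
    AllowStale        bool
    MaxStale          time.Duration
    RequireConsistent bool
}

// parseConsistency reads ?stale, ?consistent and ?max_stale from a raw
// query string. Sketch only; names and error handling differ from the
// agent's real code.
func parseConsistency(rawQuery string) (queryOptions, error) {
    var opts queryOptions
    q, err := url.ParseQuery(rawQuery)
    if err != nil {
        return opts, err
    }
    if _, ok := q["stale"]; ok {
        opts.AllowStale = true
    }
    if _, ok := q["consistent"]; ok {
        opts.RequireConsistent = true
    }
    if v := q.Get("max_stale"); v != "" {
        d, err := time.ParseDuration(v)
        if err != nil {
            return opts, err
        }
        opts.MaxStale = d
    }
    return opts, nil
}

func main() {
    opts, _ := parseConsistency("stale&max_stale=300s")
    fmt.Printf("%+v\n", opts)
}
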
=== RUN   TestParseConsistency_Invalid
=== PAUSE TestParseConsistency_Invalid
=== RUN   TestACLResolution
=== PAUSE TestACLResolution
=== RUN   TestEnableWebUI
=== PAUSE TestEnableWebUI
=== RUN   TestParseToken_ProxyTokenResolve
=== PAUSE TestParseToken_ProxyTokenResolve
=== RUN   TestAllowedNets
--- SKIP: TestAllowedNets (0.00s)
    http_test.go:1227: DM-skipped
=== RUN   TestIntentionsList_empty
=== PAUSE TestIntentionsList_empty
=== RUN   TestIntentionsList_values
=== PAUSE TestIntentionsList_values
=== RUN   TestIntentionsMatch_basic
=== PAUSE TestIntentionsMatch_basic
=== RUN   TestIntentionsMatch_noBy
=== PAUSE TestIntentionsMatch_noBy
=== RUN   TestIntentionsMatch_byInvalid
=== PAUSE TestIntentionsMatch_byInvalid
=== RUN   TestIntentionsMatch_noName
=== PAUSE TestIntentionsMatch_noName
=== RUN   TestIntentionsCheck_basic
=== PAUSE TestIntentionsCheck_basic
=== RUN   TestIntentionsCheck_noSource
=== PAUSE TestIntentionsCheck_noSource
=== RUN   TestIntentionsCheck_noDestination
=== PAUSE TestIntentionsCheck_noDestination
=== RUN   TestIntentionsCreate_good
=== PAUSE TestIntentionsCreate_good
=== RUN   TestIntentionsCreate_noBody
=== PAUSE TestIntentionsCreate_noBody
=== RUN   TestIntentionsSpecificGet_good
=== PAUSE TestIntentionsSpecificGet_good
=== RUN   TestIntentionsSpecificGet_invalidId
=== PAUSE TestIntentionsSpecificGet_invalidId
=== RUN   TestIntentionsSpecificUpdate_good
=== PAUSE TestIntentionsSpecificUpdate_good
=== RUN   TestIntentionsSpecificDelete_good
=== PAUSE TestIntentionsSpecificDelete_good
=== RUN   TestParseIntentionMatchEntry
=== RUN   TestParseIntentionMatchEntry/foo
=== RUN   TestParseIntentionMatchEntry/foo/bar
=== RUN   TestParseIntentionMatchEntry/foo/bar/baz
--- PASS: TestParseIntentionMatchEntry (0.00s)
    --- PASS: TestParseIntentionMatchEntry/foo (0.00s)
    --- PASS: TestParseIntentionMatchEntry/foo/bar (0.00s)
    --- PASS: TestParseIntentionMatchEntry/foo/bar/baz (0.00s)
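
The TestParseIntentionMatchEntry subtests feed the entries foo, foo/bar and foo/bar/baz, which suggests parsing of "name" and "namespace/name" forms with deeper paths rejected. A sketch under that assumption (matchEntry and parseEntry are hypothetical, not the agent's types):

package main

import (
    "fmt"
    "strings"
)

// matchEntry is a hypothetical parsed form of an intention match entry.
type matchEntry struct {
    Namespace string
    Name      string
}

// parseEntry splits an entry into at most two segments and rejects
// anything deeper. Assumption-level sketch, not the agent's code.
func parseEntry(entry, defaultNamespace string) (matchEntry, error) {
    parts := strings.Split(entry, "/")
    switch len(parts) {
    case 1:
        return matchEntry{Namespace: defaultNamespace, Name: parts[0]}, nil
    case 2:
        return matchEntry{Namespace: parts[0], Name: parts[1]}, nil
    default:
        return matchEntry{}, fmt.Errorf("invalid match entry %q", entry)
    }
}

func main() {
    fmt.Println(parseEntry("foo", "default"))
    fmt.Println(parseEntry("foo/bar", "default"))
    fmt.Println(parseEntry("foo/bar/baz", "default"))
}
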
=== RUN   TestAgent_LoadKeyrings
=== PAUSE TestAgent_LoadKeyrings
=== RUN   TestAgent_InmemKeyrings
=== PAUSE TestAgent_InmemKeyrings
=== RUN   TestAgent_InitKeyring
=== PAUSE TestAgent_InitKeyring
=== RUN   TestAgentKeyring_ACL
=== PAUSE TestAgentKeyring_ACL
=== RUN   TestKVSEndpoint_PUT_GET_DELETE
=== PAUSE TestKVSEndpoint_PUT_GET_DELETE
=== RUN   TestKVSEndpoint_Recurse
=== PAUSE TestKVSEndpoint_Recurse
=== RUN   TestKVSEndpoint_DELETE_CAS
=== PAUSE TestKVSEndpoint_DELETE_CAS
=== RUN   TestKVSEndpoint_CAS
=== PAUSE TestKVSEndpoint_CAS
=== RUN   TestKVSEndpoint_ListKeys
=== PAUSE TestKVSEndpoint_ListKeys
=== RUN   TestKVSEndpoint_AcquireRelease
=== PAUSE TestKVSEndpoint_AcquireRelease
=== RUN   TestKVSEndpoint_GET_Raw
=== PAUSE TestKVSEndpoint_GET_Raw
=== RUN   TestKVSEndpoint_PUT_ConflictingFlags
=== PAUSE TestKVSEndpoint_PUT_ConflictingFlags
=== RUN   TestKVSEndpoint_DELETE_ConflictingFlags
=== PAUSE TestKVSEndpoint_DELETE_ConflictingFlags
=== RUN   TestNotifyGroup
--- PASS: TestNotifyGroup (0.00s)
=== RUN   TestNotifyGroup_Clear
--- PASS: TestNotifyGroup_Clear (0.00s)
=== RUN   TestOperator_RaftConfiguration
=== PAUSE TestOperator_RaftConfiguration
=== RUN   TestOperator_RaftPeer
=== PAUSE TestOperator_RaftPeer
=== RUN   TestOperator_KeyringInstall
=== PAUSE TestOperator_KeyringInstall
=== RUN   TestOperator_KeyringList
=== PAUSE TestOperator_KeyringList
=== RUN   TestOperator_KeyringRemove
=== PAUSE TestOperator_KeyringRemove
=== RUN   TestOperator_KeyringUse
=== PAUSE TestOperator_KeyringUse
=== RUN   TestOperator_Keyring_InvalidRelayFactor
=== PAUSE TestOperator_Keyring_InvalidRelayFactor
=== RUN   TestOperator_AutopilotGetConfiguration
=== PAUSE TestOperator_AutopilotGetConfiguration
=== RUN   TestOperator_AutopilotSetConfiguration
--- SKIP: TestOperator_AutopilotSetConfiguration (0.00s)
    operator_endpoint_test.go:318: DM-skipped
=== RUN   TestOperator_AutopilotCASConfiguration
=== PAUSE TestOperator_AutopilotCASConfiguration
=== RUN   TestOperator_ServerHealth
=== PAUSE TestOperator_ServerHealth
=== RUN   TestOperator_ServerHealth_Unhealthy
=== PAUSE TestOperator_ServerHealth_Unhealthy
=== RUN   TestPreparedQuery_Create
=== PAUSE TestPreparedQuery_Create
=== RUN   TestPreparedQuery_List
=== PAUSE TestPreparedQuery_List
=== RUN   TestPreparedQuery_Execute
=== PAUSE TestPreparedQuery_Execute
=== RUN   TestPreparedQuery_ExecuteCached
=== PAUSE TestPreparedQuery_ExecuteCached
=== RUN   TestPreparedQuery_Explain
=== PAUSE TestPreparedQuery_Explain
=== RUN   TestPreparedQuery_Get
=== PAUSE TestPreparedQuery_Get
=== RUN   TestPreparedQuery_Update
=== PAUSE TestPreparedQuery_Update
=== RUN   TestPreparedQuery_Delete
=== PAUSE TestPreparedQuery_Delete
=== RUN   TestPreparedQuery_parseLimit
=== PAUSE TestPreparedQuery_parseLimit
=== RUN   TestPreparedQuery_Integration
--- SKIP: TestPreparedQuery_Integration (0.00s)
    prepared_query_endpoint_test.go:990: DM-skipped
=== RUN   TestRexecWriter
--- SKIP: TestRexecWriter (0.00s)
    remote_exec_test.go:28: DM-skipped
=== RUN   TestRemoteExecGetSpec
=== PAUSE TestRemoteExecGetSpec
=== RUN   TestRemoteExecGetSpec_ACLToken
=== PAUSE TestRemoteExecGetSpec_ACLToken
=== RUN   TestRemoteExecGetSpec_ACLAgentToken
=== PAUSE TestRemoteExecGetSpec_ACLAgentToken
=== RUN   TestRemoteExecGetSpec_ACLDeny
=== PAUSE TestRemoteExecGetSpec_ACLDeny
=== RUN   TestRemoteExecWrites
=== PAUSE TestRemoteExecWrites
=== RUN   TestRemoteExecWrites_ACLToken
=== PAUSE TestRemoteExecWrites_ACLToken
=== RUN   TestRemoteExecWrites_ACLAgentToken
=== PAUSE TestRemoteExecWrites_ACLAgentToken
=== RUN   TestRemoteExecWrites_ACLDeny
=== PAUSE TestRemoteExecWrites_ACLDeny
=== RUN   TestHandleRemoteExec
=== PAUSE TestHandleRemoteExec
=== RUN   TestHandleRemoteExecFailed
=== PAUSE TestHandleRemoteExecFailed
=== RUN   TestServiceManager_RegisterService
WARNING: bootstrap = true: do not enable unless necessary
TestServiceManager_RegisterService - 2019/12/06 06:02:11.268885 [WARN] agent: Node name "Node ecee2813-8760-2ead-c529-560d147ce016" will not be discoverable via DNS due to invalid characters. Valid characters include all alpha-numerics and dashes.
TestServiceManager_RegisterService - 2019/12/06 06:02:11.269529 [DEBUG] tlsutil: Update with version 1
TestServiceManager_RegisterService - 2019/12/06 06:02:11.271959 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
2019/12/06 06:02:12 [INFO]  raft: Initial configuration (index=1): [{Suffrage:Voter ID:ecee2813-8760-2ead-c529-560d147ce016 Address:127.0.0.1:34162}]
2019/12/06 06:02:12 [INFO]  raft: Node at 127.0.0.1:34162 [Follower] entering Follower state (Leader: "")
TestServiceManager_RegisterService - 2019/12/06 06:02:12.338807 [INFO] serf: EventMemberJoin: Node ecee2813-8760-2ead-c529-560d147ce016.dc1 127.0.0.1
TestServiceManager_RegisterService - 2019/12/06 06:02:12.346196 [INFO] serf: EventMemberJoin: Node ecee2813-8760-2ead-c529-560d147ce016 127.0.0.1
TestServiceManager_RegisterService - 2019/12/06 06:02:12.347281 [INFO] consul: Adding LAN server Node ecee2813-8760-2ead-c529-560d147ce016 (Addr: tcp/127.0.0.1:34162) (DC: dc1)
TestServiceManager_RegisterService - 2019/12/06 06:02:12.347331 [INFO] consul: Handled member-join event for server "Node ecee2813-8760-2ead-c529-560d147ce016.dc1" in area "wan"
TestServiceManager_RegisterService - 2019/12/06 06:02:12.348002 [INFO] agent: Started DNS server 127.0.0.1:34157 (udp)
TestServiceManager_RegisterService - 2019/12/06 06:02:12.348079 [INFO] agent: Started DNS server 127.0.0.1:34157 (tcp)
TestServiceManager_RegisterService - 2019/12/06 06:02:12.350469 [INFO] agent: Started HTTP server on 127.0.0.1:34158 (tcp)
TestServiceManager_RegisterService - 2019/12/06 06:02:12.350574 [INFO] agent: started state syncer
2019/12/06 06:02:12 [WARN]  raft: Heartbeat timeout from "" reached, starting election
2019/12/06 06:02:12 [INFO]  raft: Node at 127.0.0.1:34162 [Candidate] entering Candidate state in term 2
2019/12/06 06:02:13 [INFO]  raft: Election won. Tally: 1
2019/12/06 06:02:13 [INFO]  raft: Node at 127.0.0.1:34162 [Leader] entering Leader state
TestServiceManager_RegisterService - 2019/12/06 06:02:13.044403 [INFO] consul: cluster leadership acquired
TestServiceManager_RegisterService - 2019/12/06 06:02:13.044791 [INFO] consul: New leader elected: Node ecee2813-8760-2ead-c529-560d147ce016
TestServiceManager_RegisterService - 2019/12/06 06:02:13.459746 [INFO] agent: Synced node info
TestServiceManager_RegisterService - 2019/12/06 06:02:14.552417 [INFO] agent: Requesting shutdown
TestServiceManager_RegisterService - 2019/12/06 06:02:14.552546 [INFO] consul: shutting down server
TestServiceManager_RegisterService - 2019/12/06 06:02:14.552594 [WARN] serf: Shutdown without a Leave
TestServiceManager_RegisterService - 2019/12/06 06:02:14.716995 [WARN] serf: Shutdown without a Leave
TestServiceManager_RegisterService - 2019/12/06 06:02:15.583839 [INFO] manager: shutting down
TestServiceManager_RegisterService - 2019/12/06 06:02:15.600126 [ERR] consul: failed to establish leadership: error generating CA root certificate: raft is already shutdown
TestServiceManager_RegisterService - 2019/12/06 06:02:15.600499 [ERR] consul: failed to transfer leadership attempt 0/3: raft is already shutdown
TestServiceManager_RegisterService - 2019/12/06 06:02:15.733790 [WARN] agent: Syncing service "redis" failed. leadership lost while committing log
TestServiceManager_RegisterService - 2019/12/06 06:02:15.733878 [ERR] agent: failed to sync remote state: leadership lost while committing log
TestServiceManager_RegisterService - 2019/12/06 06:02:15.733991 [INFO] agent: consul server down
TestServiceManager_RegisterService - 2019/12/06 06:02:15.734034 [INFO] agent: shutdown complete
TestServiceManager_RegisterService - 2019/12/06 06:02:15.734082 [INFO] agent: Stopping DNS server 127.0.0.1:34157 (tcp)
TestServiceManager_RegisterService - 2019/12/06 06:02:15.734284 [INFO] agent: Stopping DNS server 127.0.0.1:34157 (udp)
TestServiceManager_RegisterService - 2019/12/06 06:02:15.734462 [INFO] agent: Stopping HTTP server 127.0.0.1:34158 (tcp)
TestServiceManager_RegisterService - 2019/12/06 06:02:15.734671 [INFO] agent: Waiting for endpoints to shut down
TestServiceManager_RegisterService - 2019/12/06 06:02:15.734740 [INFO] agent: Endpoints down
--- PASS: TestServiceManager_RegisterService (4.53s)
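
TestServiceManager_RegisterService registers a "redis" service against the test agent and syncs it before shutdown. Outside the test harness the same registration is typically done through the github.com/hashicorp/consul/api client; a minimal sketch, assuming a local agent on the default address and an arbitrary port:

package main

import (
    "log"

    "github.com/hashicorp/consul/api"
)

func main() {
    // Connect to a local agent with the default settings
    // (http://127.0.0.1:8500 unless overridden by environment).
    client, err := api.NewClient(api.DefaultConfig())
    if err != nil {
        log.Fatal(err)
    }

    // Register a "redis" service, the same service name the test
    // syncs above. The port here is made up for illustration.
    reg := &api.AgentServiceRegistration{
        Name: "redis",
        Port: 6379,
    }
    if err := client.Agent().ServiceRegister(reg); err != nil {
        log.Fatal(err)
    }
}
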
=== RUN   TestServiceManager_RegisterSidecar
WARNING: bootstrap = true: do not enable unless necessary
TestServiceManager_RegisterSidecar - 2019/12/06 06:02:15.795427 [WARN] agent: Node name "Node 169c1301-f6bb-024e-2e18-515c1893fb23" will not be discoverable via DNS due to invalid characters. Valid characters include all alpha-numerics and dashes.
TestServiceManager_RegisterSidecar - 2019/12/06 06:02:15.795875 [DEBUG] tlsutil: Update with version 1
TestServiceManager_RegisterSidecar - 2019/12/06 06:02:15.798063 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
2019/12/06 06:02:16 [INFO]  raft: Initial configuration (index=1): [{Suffrage:Voter ID:169c1301-f6bb-024e-2e18-515c1893fb23 Address:127.0.0.1:34168}]
2019/12/06 06:02:16 [INFO]  raft: Node at 127.0.0.1:34168 [Follower] entering Follower state (Leader: "")
TestServiceManager_RegisterSidecar - 2019/12/06 06:02:16.546161 [INFO] serf: EventMemberJoin: Node 169c1301-f6bb-024e-2e18-515c1893fb23.dc1 127.0.0.1
TestServiceManager_RegisterSidecar - 2019/12/06 06:02:16.549471 [INFO] serf: EventMemberJoin: Node 169c1301-f6bb-024e-2e18-515c1893fb23 127.0.0.1
TestServiceManager_RegisterSidecar - 2019/12/06 06:02:16.550394 [INFO] consul: Adding LAN server Node 169c1301-f6bb-024e-2e18-515c1893fb23 (Addr: tcp/127.0.0.1:34168) (DC: dc1)
TestServiceManager_RegisterSidecar - 2019/12/06 06:02:16.550715 [INFO] consul: Handled member-join event for server "Node 169c1301-f6bb-024e-2e18-515c1893fb23.dc1" in area "wan"
TestServiceManager_RegisterSidecar - 2019/12/06 06:02:16.550792 [INFO] agent: Started DNS server 127.0.0.1:34163 (udp)
TestServiceManager_RegisterSidecar - 2019/12/06 06:02:16.551136 [INFO] agent: Started DNS server 127.0.0.1:34163 (tcp)
TestServiceManager_RegisterSidecar - 2019/12/06 06:02:16.553466 [INFO] agent: Started HTTP server on 127.0.0.1:34164 (tcp)
TestServiceManager_RegisterSidecar - 2019/12/06 06:02:16.553585 [INFO] agent: started state syncer
2019/12/06 06:02:16 [WARN]  raft: Heartbeat timeout from "" reached, starting election
2019/12/06 06:02:16 [INFO]  raft: Node at 127.0.0.1:34168 [Candidate] entering Candidate state in term 2
2019/12/06 06:02:17 [INFO]  raft: Election won. Tally: 1
2019/12/06 06:02:17 [INFO]  raft: Node at 127.0.0.1:34168 [Leader] entering Leader state
TestServiceManager_RegisterSidecar - 2019/12/06 06:02:17.034110 [INFO] consul: cluster leadership acquired
TestServiceManager_RegisterSidecar - 2019/12/06 06:02:17.034598 [INFO] consul: New leader elected: Node 169c1301-f6bb-024e-2e18-515c1893fb23
TestServiceManager_RegisterSidecar - 2019/12/06 06:02:17.351309 [INFO] agent: Synced node info
TestServiceManager_RegisterSidecar - 2019/12/06 06:02:18.720607 [DEBUG] agent.manager: added local registration for service "web-sidecar-proxy"
TestServiceManager_RegisterSidecar - 2019/12/06 06:02:18.721044 [INFO] agent: Requesting shutdown
TestServiceManager_RegisterSidecar - 2019/12/06 06:02:18.721290 [INFO] consul: shutting down server
TestServiceManager_RegisterSidecar - 2019/12/06 06:02:18.721362 [WARN] serf: Shutdown without a Leave
TestServiceManager_RegisterSidecar - 2019/12/06 06:02:18.817099 [WARN] serf: Shutdown without a Leave
TestServiceManager_RegisterSidecar - 2019/12/06 06:02:18.900432 [INFO] manager: shutting down
TestServiceManager_RegisterSidecar - 2019/12/06 06:02:18.901084 [INFO] agent: consul server down
TestServiceManager_RegisterSidecar - 2019/12/06 06:02:18.901142 [INFO] agent: shutdown complete
TestServiceManager_RegisterSidecar - 2019/12/06 06:02:18.901205 [INFO] agent: Stopping DNS server 127.0.0.1:34163 (tcp)
TestServiceManager_RegisterSidecar - 2019/12/06 06:02:18.901369 [INFO] agent: Stopping DNS server 127.0.0.1:34163 (udp)
TestServiceManager_RegisterSidecar - 2019/12/06 06:02:18.901540 [INFO] agent: Stopping HTTP server 127.0.0.1:34164 (tcp)
TestServiceManager_RegisterSidecar - 2019/12/06 06:02:18.901756 [INFO] agent: Waiting for endpoints to shut down
TestServiceManager_RegisterSidecar - 2019/12/06 06:02:18.901832 [INFO] agent: Endpoints down
--- PASS: TestServiceManager_RegisterSidecar (3.17s)
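
TestServiceManager_RegisterSidecar ends with a local registration for "web-sidecar-proxy", the derived sidecar of a "web" service. A short sketch of how such a registration looks through the github.com/hashicorp/consul/api client, assuming a local agent and an arbitrary port:

package main

import (
    "log"

    "github.com/hashicorp/consul/api"
)

func main() {
    client, err := api.NewClient(api.DefaultConfig())
    if err != nil {
        log.Fatal(err)
    }

    // An empty SidecarService block asks the agent to derive a
    // "<name>-sidecar-proxy" registration for the service, which is
    // what the "web-sidecar-proxy" entry above corresponds to.
    reg := &api.AgentServiceRegistration{
        Name: "web",
        Port: 8080,
        Connect: &api.AgentServiceConnect{
            SidecarService: &api.AgentServiceRegistration{},
        },
    }
    if err := client.Agent().ServiceRegister(reg); err != nil {
        log.Fatal(err)
    }
}
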
=== RUN   TestServiceManager_Disabled
TestServiceManager_RegisterSidecar - 2019/12/06 06:02:18.902526 [ERR] connect: Apply failed leadership lost while committing log
TestServiceManager_RegisterSidecar - 2019/12/06 06:02:18.902612 [ERR] consul: failed to establish leadership: leadership lost while committing log
TestServiceManager_RegisterSidecar - 2019/12/06 06:02:18.902845 [ERR] consul: failed to transfer leadership attempt 0/3: raft is already shutdown
TestServiceManager_RegisterSidecar - 2019/12/06 06:02:18.902919 [ERR] consul: failed to transfer leadership attempt 1/3: raft is already shutdown
WARNING: bootstrap = true: do not enable unless necessary
TestServiceManager_Disabled - 2019/12/06 06:02:18.967786 [WARN] agent: Node name "Node d177381f-73a4-2df8-e7d9-3ca29004ba96" will not be discoverable via DNS due to invalid characters. Valid characters include all alpha-numerics and dashes.
TestServiceManager_Disabled - 2019/12/06 06:02:18.968656 [DEBUG] tlsutil: Update with version 1
TestServiceManager_Disabled - 2019/12/06 06:02:18.971246 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
2019/12/06 06:02:20 [INFO]  raft: Initial configuration (index=1): [{Suffrage:Voter ID:d177381f-73a4-2df8-e7d9-3ca29004ba96 Address:127.0.0.1:34174}]
2019/12/06 06:02:20 [INFO]  raft: Node at 127.0.0.1:34174 [Follower] entering Follower state (Leader: "")
TestServiceManager_Disabled - 2019/12/06 06:02:20.282802 [INFO] serf: EventMemberJoin: Node d177381f-73a4-2df8-e7d9-3ca29004ba96.dc1 127.0.0.1
TestServiceManager_Disabled - 2019/12/06 06:02:20.287407 [INFO] serf: EventMemberJoin: Node d177381f-73a4-2df8-e7d9-3ca29004ba96 127.0.0.1
TestServiceManager_Disabled - 2019/12/06 06:02:20.288907 [INFO] consul: Adding LAN server Node d177381f-73a4-2df8-e7d9-3ca29004ba96 (Addr: tcp/127.0.0.1:34174) (DC: dc1)
TestServiceManager_Disabled - 2019/12/06 06:02:20.289497 [INFO] consul: Handled member-join event for server "Node d177381f-73a4-2df8-e7d9-3ca29004ba96.dc1" in area "wan"
TestServiceManager_Disabled - 2019/12/06 06:02:20.291082 [INFO] agent: Started DNS server 127.0.0.1:34169 (tcp)
TestServiceManager_Disabled - 2019/12/06 06:02:20.291584 [INFO] agent: Started DNS server 127.0.0.1:34169 (udp)
TestServiceManager_Disabled - 2019/12/06 06:02:20.294151 [INFO] agent: Started HTTP server on 127.0.0.1:34170 (tcp)
TestServiceManager_Disabled - 2019/12/06 06:02:20.294334 [INFO] agent: started state syncer
2019/12/06 06:02:20 [WARN]  raft: Heartbeat timeout from "" reached, starting election
2019/12/06 06:02:20 [INFO]  raft: Node at 127.0.0.1:34174 [Candidate] entering Candidate state in term 2
2019/12/06 06:02:20 [INFO]  raft: Election won. Tally: 1
2019/12/06 06:02:20 [INFO]  raft: Node at 127.0.0.1:34174 [Leader] entering Leader state
TestServiceManager_Disabled - 2019/12/06 06:02:20.793568 [INFO] consul: cluster leadership acquired
TestServiceManager_Disabled - 2019/12/06 06:02:20.794037 [INFO] consul: New leader elected: Node d177381f-73a4-2df8-e7d9-3ca29004ba96
TestServiceManager_Disabled - 2019/12/06 06:02:21.092929 [INFO] agent: Synced node info
TestServiceManager_Disabled - 2019/12/06 06:02:21.093059 [DEBUG] agent: Node info in sync
TestServiceManager_Disabled - 2019/12/06 06:02:21.783805 [DEBUG] agent: Node info in sync
TestServiceManager_Disabled - 2019/12/06 06:02:22.211925 [INFO] agent: Requesting shutdown
TestServiceManager_Disabled - 2019/12/06 06:02:22.212045 [INFO] consul: shutting down server
TestServiceManager_Disabled - 2019/12/06 06:02:22.212105 [WARN] serf: Shutdown without a Leave
TestServiceManager_Disabled - 2019/12/06 06:02:22.671629 [WARN] serf: Shutdown without a Leave
TestServiceManager_Disabled - 2019/12/06 06:02:22.858831 [INFO] manager: shutting down
TestServiceManager_Disabled - 2019/12/06 06:02:23.401785 [ERR] autopilot: Error updating cluster health: error getting Raft configuration raft is already shutdown
TestServiceManager_Disabled - 2019/12/06 06:02:23.546822 [ERR] consul: failed to establish leadership: error generating CA root certificate: leadership lost while committing log
TestServiceManager_Disabled - 2019/12/06 06:02:23.546895 [WARN] agent: Syncing service "web-sidecar-proxy" failed. leadership lost while committing log
TestServiceManager_Disabled - 2019/12/06 06:02:23.546961 [ERR] agent: failed to sync changes: leadership lost while committing log
TestServiceManager_Disabled - 2019/12/06 06:02:23.547214 [WARN] agent: Syncing service "web-sidecar-proxy" failed. No cluster leader
TestServiceManager_Disabled - 2019/12/06 06:02:23.547272 [ERR] agent: failed to sync changes: No cluster leader
TestServiceManager_Disabled - 2019/12/06 06:02:23.547237 [INFO] agent: consul server down
TestServiceManager_Disabled - 2019/12/06 06:02:23.547487 [INFO] agent: shutdown complete
TestServiceManager_Disabled - 2019/12/06 06:02:23.547625 [INFO] agent: Stopping DNS server 127.0.0.1:34169 (tcp)
TestServiceManager_Disabled - 2019/12/06 06:02:23.547837 [INFO] agent: Stopping DNS server 127.0.0.1:34169 (udp)
TestServiceManager_Disabled - 2019/12/06 06:02:23.548079 [INFO] agent: Stopping HTTP server 127.0.0.1:34170 (tcp)
TestServiceManager_Disabled - 2019/12/06 06:02:23.548352 [INFO] agent: Waiting for endpoints to shut down
TestServiceManager_Disabled - 2019/12/06 06:02:23.548505 [INFO] agent: Endpoints down
--- PASS: TestServiceManager_Disabled (4.65s)
=== RUN   TestSessionCreate
=== PAUSE TestSessionCreate
=== RUN   TestSessionCreate_Delete
=== PAUSE TestSessionCreate_Delete
=== RUN   TestSessionCreate_DefaultCheck
TestServiceManager_Disabled - 2019/12/06 06:02:23.549586 [ERR] roots watch error: invalid type for roots response: <nil>
=== PAUSE TestSessionCreate_DefaultCheck
=== RUN   TestSessionCreate_NoCheck
=== PAUSE TestSessionCreate_NoCheck
=== RUN   TestFixupLockDelay
=== PAUSE TestFixupLockDelay
=== RUN   TestSessionDestroy
TestServiceManager_Disabled - 2019/12/06 06:02:23.550544 [ERR] leaf watch error: invalid type for leaf response: <nil>
=== PAUSE TestSessionDestroy
=== RUN   TestSessionCustomTTL
=== PAUSE TestSessionCustomTTL
=== RUN   TestSessionTTLRenew
--- SKIP: TestSessionTTLRenew (0.00s)
    session_endpoint_test.go:372: DM-skipped
=== RUN   TestSessionGet
=== PAUSE TestSessionGet
=== RUN   TestSessionList
=== RUN   TestSessionList/#00
WARNING: bootstrap = true: do not enable unless necessary
TestSessionList/#00 - 2019/12/06 06:02:23.627196 [WARN] agent: Node name "Node 59406d61-07ec-6a6b-0c02-4a5ebc6ef0f2" will not be discoverable via DNS due to invalid characters. Valid characters include all alpha-numerics and dashes.
TestSessionList/#00 - 2019/12/06 06:02:23.627860 [DEBUG] tlsutil: Update with version 1
TestSessionList/#00 - 2019/12/06 06:02:23.630644 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
2019/12/06 06:02:24 [INFO]  raft: Initial configuration (index=1): [{Suffrage:Voter ID:59406d61-07ec-6a6b-0c02-4a5ebc6ef0f2 Address:127.0.0.1:34180}]
2019/12/06 06:02:24 [INFO]  raft: Node at 127.0.0.1:34180 [Follower] entering Follower state (Leader: "")
TestSessionList/#00 - 2019/12/06 06:02:24.422216 [INFO] serf: EventMemberJoin: Node 59406d61-07ec-6a6b-0c02-4a5ebc6ef0f2.dc1 127.0.0.1
TestSessionList/#00 - 2019/12/06 06:02:24.427737 [INFO] serf: EventMemberJoin: Node 59406d61-07ec-6a6b-0c02-4a5ebc6ef0f2 127.0.0.1
TestSessionList/#00 - 2019/12/06 06:02:24.429484 [INFO] consul: Adding LAN server Node 59406d61-07ec-6a6b-0c02-4a5ebc6ef0f2 (Addr: tcp/127.0.0.1:34180) (DC: dc1)
TestSessionList/#00 - 2019/12/06 06:02:24.430391 [INFO] consul: Handled member-join event for server "Node 59406d61-07ec-6a6b-0c02-4a5ebc6ef0f2.dc1" in area "wan"
TestSessionList/#00 - 2019/12/06 06:02:24.432003 [INFO] agent: Started DNS server 127.0.0.1:34175 (tcp)
TestSessionList/#00 - 2019/12/06 06:02:24.433047 [INFO] agent: Started DNS server 127.0.0.1:34175 (udp)
TestSessionList/#00 - 2019/12/06 06:02:24.435512 [INFO] agent: Started HTTP server on 127.0.0.1:34176 (tcp)
TestSessionList/#00 - 2019/12/06 06:02:24.435637 [INFO] agent: started state syncer
2019/12/06 06:02:24 [WARN]  raft: Heartbeat timeout from "" reached, starting election
2019/12/06 06:02:24 [INFO]  raft: Node at 127.0.0.1:34180 [Candidate] entering Candidate state in term 2
2019/12/06 06:02:24 [INFO]  raft: Election won. Tally: 1
2019/12/06 06:02:24 [INFO]  raft: Node at 127.0.0.1:34180 [Leader] entering Leader state
TestSessionList/#00 - 2019/12/06 06:02:24.935229 [INFO] consul: cluster leadership acquired
TestSessionList/#00 - 2019/12/06 06:02:24.935722 [INFO] consul: New leader elected: Node 59406d61-07ec-6a6b-0c02-4a5ebc6ef0f2
TestSessionList/#00 - 2019/12/06 06:02:25.562016 [INFO] agent: Synced node info
TestSessionList/#00 - 2019/12/06 06:02:25.764968 [DEBUG] agent: Node info in sync
TestSessionList/#00 - 2019/12/06 06:02:25.765087 [DEBUG] agent: Node info in sync
TestSessionList/#00 - 2019/12/06 06:02:26.292788 [INFO] connect: initialized primary datacenter CA with provider "consul"
TestSessionList/#00 - 2019/12/06 06:02:26.293237 [DEBUG] consul: Skipping self join check for "Node 59406d61-07ec-6a6b-0c02-4a5ebc6ef0f2" since the cluster is too small
TestSessionList/#00 - 2019/12/06 06:02:26.293372 [INFO] consul: member 'Node 59406d61-07ec-6a6b-0c02-4a5ebc6ef0f2' joined, marking health alive
TestSessionList/#00 - 2019/12/06 06:02:26.493844 [INFO] agent: Requesting shutdown
TestSessionList/#00 - 2019/12/06 06:02:26.493951 [INFO] consul: shutting down server
TestSessionList/#00 - 2019/12/06 06:02:26.494005 [WARN] serf: Shutdown without a Leave
TestSessionList/#00 - 2019/12/06 06:02:26.550513 [WARN] serf: Shutdown without a Leave
TestSessionList/#00 - 2019/12/06 06:02:26.608920 [INFO] manager: shutting down
TestSessionList/#00 - 2019/12/06 06:02:26.609417 [INFO] agent: consul server down
TestSessionList/#00 - 2019/12/06 06:02:26.609475 [INFO] agent: shutdown complete
TestSessionList/#00 - 2019/12/06 06:02:26.609533 [INFO] agent: Stopping DNS server 127.0.0.1:34175 (tcp)
TestSessionList/#00 - 2019/12/06 06:02:26.609665 [INFO] agent: Stopping DNS server 127.0.0.1:34175 (udp)
TestSessionList/#00 - 2019/12/06 06:02:26.609817 [INFO] agent: Stopping HTTP server 127.0.0.1:34176 (tcp)
TestSessionList/#00 - 2019/12/06 06:02:26.610050 [INFO] agent: Waiting for endpoints to shut down
TestSessionList/#00 - 2019/12/06 06:02:26.610119 [INFO] agent: Endpoints down
=== RUN   TestSessionList/#01
WARNING: bootstrap = true: do not enable unless necessary
TestSessionList/#01 - 2019/12/06 06:02:26.671927 [WARN] agent: Node name "Node 1e73eb14-bf39-306f-a1cb-48877443e289" will not be discoverable via DNS due to invalid characters. Valid characters include all alpha-numerics and dashes.
TestSessionList/#01 - 2019/12/06 06:02:26.672571 [DEBUG] tlsutil: Update with version 1
TestSessionList/#01 - 2019/12/06 06:02:26.674985 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
2019/12/06 06:02:27 [INFO]  raft: Initial configuration (index=1): [{Suffrage:Voter ID:1e73eb14-bf39-306f-a1cb-48877443e289 Address:127.0.0.1:34186}]
2019/12/06 06:02:27 [INFO]  raft: Node at 127.0.0.1:34186 [Follower] entering Follower state (Leader: "")
TestSessionList/#01 - 2019/12/06 06:02:27.472600 [INFO] serf: EventMemberJoin: Node 1e73eb14-bf39-306f-a1cb-48877443e289.dc1 127.0.0.1
TestSessionList/#01 - 2019/12/06 06:02:27.478162 [INFO] serf: EventMemberJoin: Node 1e73eb14-bf39-306f-a1cb-48877443e289 127.0.0.1
TestSessionList/#01 - 2019/12/06 06:02:27.479860 [INFO] consul: Adding LAN server Node 1e73eb14-bf39-306f-a1cb-48877443e289 (Addr: tcp/127.0.0.1:34186) (DC: dc1)
TestSessionList/#01 - 2019/12/06 06:02:27.480601 [INFO] consul: Handled member-join event for server "Node 1e73eb14-bf39-306f-a1cb-48877443e289.dc1" in area "wan"
TestSessionList/#01 - 2019/12/06 06:02:27.482098 [INFO] agent: Started DNS server 127.0.0.1:34181 (tcp)
TestSessionList/#01 - 2019/12/06 06:02:27.482684 [INFO] agent: Started DNS server 127.0.0.1:34181 (udp)
TestSessionList/#01 - 2019/12/06 06:02:27.498500 [INFO] agent: Started HTTP server on 127.0.0.1:34182 (tcp)
TestSessionList/#01 - 2019/12/06 06:02:27.498742 [INFO] agent: started state syncer
2019/12/06 06:02:27 [WARN]  raft: Heartbeat timeout from "" reached, starting election
2019/12/06 06:02:27 [INFO]  raft: Node at 127.0.0.1:34186 [Candidate] entering Candidate state in term 2
2019/12/06 06:02:28 [INFO]  raft: Election won. Tally: 1
2019/12/06 06:02:28 [INFO]  raft: Node at 127.0.0.1:34186 [Leader] entering Leader state
TestSessionList/#01 - 2019/12/06 06:02:28.000997 [INFO] consul: cluster leadership acquired
TestSessionList/#01 - 2019/12/06 06:02:28.001452 [INFO] consul: New leader elected: Node 1e73eb14-bf39-306f-a1cb-48877443e289
TestSessionList/#01 - 2019/12/06 06:02:28.317979 [INFO] agent: Synced node info
TestSessionList/#01 - 2019/12/06 06:02:28.318094 [DEBUG] agent: Node info in sync
TestSessionList/#01 - 2019/12/06 06:02:29.201815 [INFO] connect: initialized primary datacenter CA with provider "consul"
TestSessionList/#01 - 2019/12/06 06:02:29.202586 [DEBUG] consul: Skipping self join check for "Node 1e73eb14-bf39-306f-a1cb-48877443e289" since the cluster is too small
TestSessionList/#01 - 2019/12/06 06:02:29.202760 [INFO] consul: member 'Node 1e73eb14-bf39-306f-a1cb-48877443e289' joined, marking health alive
TestSessionList/#01 - 2019/12/06 06:02:29.994653 [DEBUG] agent: Skipping remote check "serfHealth" since it is managed automatically
TestSessionList/#01 - 2019/12/06 06:02:29.994745 [DEBUG] agent: Node info in sync
TestSessionList/#01 - 2019/12/06 06:02:30.692636 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
TestSessionList/#01 - 2019/12/06 06:02:31.427078 [INFO] agent: Requesting shutdown
TestSessionList/#01 - 2019/12/06 06:02:31.427205 [INFO] consul: shutting down server
TestSessionList/#01 - 2019/12/06 06:02:31.427261 [WARN] serf: Shutdown without a Leave
TestSessionList/#01 - 2019/12/06 06:02:31.508860 [WARN] serf: Shutdown without a Leave
TestSessionList/#01 - 2019/12/06 06:02:31.567463 [INFO] manager: shutting down
TestSessionList/#01 - 2019/12/06 06:02:31.570078 [INFO] agent: consul server down
TestSessionList/#01 - 2019/12/06 06:02:31.570164 [INFO] agent: shutdown complete
TestSessionList/#01 - 2019/12/06 06:02:31.570230 [INFO] agent: Stopping DNS server 127.0.0.1:34181 (tcp)
TestSessionList/#01 - 2019/12/06 06:02:31.570376 [INFO] agent: Stopping DNS server 127.0.0.1:34181 (udp)
TestSessionList/#01 - 2019/12/06 06:02:31.570528 [INFO] agent: Stopping HTTP server 127.0.0.1:34182 (tcp)
TestSessionList/#01 - 2019/12/06 06:02:31.570713 [INFO] agent: Waiting for endpoints to shut down
TestSessionList/#01 - 2019/12/06 06:02:31.570774 [INFO] agent: Endpoints down
--- PASS: TestSessionList (8.02s)
    --- PASS: TestSessionList/#00 (3.06s)
    --- PASS: TestSessionList/#01 (4.96s)
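
The TestSessionList cases above exercise the session listing endpoint (/v1/session/list). Against a running agent the equivalent query goes through the api client's Session endpoint; a short sketch assuming the standard github.com/hashicorp/consul/api package and a local agent:

package main

import (
    "fmt"
    "log"

    "github.com/hashicorp/consul/api"
)

func main() {
    client, err := api.NewClient(api.DefaultConfig())
    if err != nil {
        log.Fatal(err)
    }

    // List every session known to the cluster, roughly what the
    // TestSessionList cases assert over HTTP.
    sessions, _, err := client.Session().List(nil)
    if err != nil {
        log.Fatal(err)
    }
    for _, s := range sessions {
        fmt.Println(s.ID, s.Node)
    }
}
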
=== RUN   TestSessionsForNode
=== PAUSE TestSessionsForNode
=== RUN   TestSessionDeleteDestroy
=== PAUSE TestSessionDeleteDestroy
=== RUN   TestAgent_sidecarServiceFromNodeService
=== RUN   TestAgent_sidecarServiceFromNodeService/no_sidecar
WARNING: bootstrap = true: do not enable unless necessary
jones - 2019/12/06 06:02:31.635542 [WARN] agent: Node name "Node a4d824d2-6dd5-669c-d5c6-f0df82645ad2" will not be discoverable via DNS due to invalid characters. Valid characters include all alpha-numerics and dashes.
jones - 2019/12/06 06:02:31.636253 [DEBUG] tlsutil: Update with version 1
jones - 2019/12/06 06:02:31.638884 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
2019/12/06 06:02:33 [INFO]  raft: Initial configuration (index=1): [{Suffrage:Voter ID:a4d824d2-6dd5-669c-d5c6-f0df82645ad2 Address:127.0.0.1:34192}]
2019/12/06 06:02:33 [INFO]  raft: Node at 127.0.0.1:34192 [Follower] entering Follower state (Leader: "")
jones - 2019/12/06 06:02:33.188418 [INFO] serf: EventMemberJoin: Node a4d824d2-6dd5-669c-d5c6-f0df82645ad2.dc1 127.0.0.1
jones - 2019/12/06 06:02:33.191895 [INFO] serf: EventMemberJoin: Node a4d824d2-6dd5-669c-d5c6-f0df82645ad2 127.0.0.1
jones - 2019/12/06 06:02:33.192940 [INFO] consul: Handled member-join event for server "Node a4d824d2-6dd5-669c-d5c6-f0df82645ad2.dc1" in area "wan"
jones - 2019/12/06 06:02:33.193150 [INFO] consul: Adding LAN server Node a4d824d2-6dd5-669c-d5c6-f0df82645ad2 (Addr: tcp/127.0.0.1:34192) (DC: dc1)
jones - 2019/12/06 06:02:33.194277 [INFO] agent: Started DNS server 127.0.0.1:34187 (tcp)
jones - 2019/12/06 06:02:33.194370 [INFO] agent: Started DNS server 127.0.0.1:34187 (udp)
jones - 2019/12/06 06:02:33.196767 [INFO] agent: Started HTTP server on 127.0.0.1:34188 (tcp)
jones - 2019/12/06 06:02:33.196857 [INFO] agent: started state syncer
2019/12/06 06:02:33 [WARN]  raft: Heartbeat timeout from "" reached, starting election
2019/12/06 06:02:33 [INFO]  raft: Node at 127.0.0.1:34192 [Candidate] entering Candidate state in term 2
2019/12/06 06:02:34 [INFO]  raft: Election won. Tally: 1
2019/12/06 06:02:34 [INFO]  raft: Node at 127.0.0.1:34192 [Leader] entering Leader state
jones - 2019/12/06 06:02:34.142702 [INFO] consul: cluster leadership acquired
jones - 2019/12/06 06:02:34.143163 [INFO] consul: New leader elected: Node a4d824d2-6dd5-669c-d5c6-f0df82645ad2
=== RUN   TestAgent_sidecarServiceFromNodeService/all_the_defaults
WARNING: bootstrap = true: do not enable unless necessary
jones - 2019/12/06 06:02:34.558617 [WARN] agent: Node name "Node 69e40561-243a-545f-340c-f7edd80028d7" will not be discoverable via DNS due to invalid characters. Valid characters include all alpha-numerics and dashes.
jones - 2019/12/06 06:02:34.559446 [DEBUG] tlsutil: Update with version 1
jones - 2019/12/06 06:02:34.567116 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
jones - 2019/12/06 06:02:34.676595 [INFO] agent: Synced node info
jones - 2019/12/06 06:02:34.676718 [DEBUG] agent: Node info in sync
jones - 2019/12/06 06:02:34.871137 [DEBUG] agent: Node info in sync
2019/12/06 06:02:35 [INFO]  raft: Initial configuration (index=1): [{Suffrage:Voter ID:69e40561-243a-545f-340c-f7edd80028d7 Address:127.0.0.1:34198}]
2019/12/06 06:02:35 [INFO]  raft: Node at 127.0.0.1:34198 [Follower] entering Follower state (Leader: "")
jones - 2019/12/06 06:02:35.788100 [INFO] serf: EventMemberJoin: Node 69e40561-243a-545f-340c-f7edd80028d7.dc1 127.0.0.1
jones - 2019/12/06 06:02:35.797328 [INFO] serf: EventMemberJoin: Node 69e40561-243a-545f-340c-f7edd80028d7 127.0.0.1
jones - 2019/12/06 06:02:35.798339 [INFO] consul: Adding LAN server Node 69e40561-243a-545f-340c-f7edd80028d7 (Addr: tcp/127.0.0.1:34198) (DC: dc1)
jones - 2019/12/06 06:02:35.798905 [INFO] consul: Handled member-join event for server "Node 69e40561-243a-545f-340c-f7edd80028d7.dc1" in area "wan"
jones - 2019/12/06 06:02:35.800604 [INFO] agent: Started DNS server 127.0.0.1:34193 (tcp)
jones - 2019/12/06 06:02:35.800961 [INFO] agent: Started DNS server 127.0.0.1:34193 (udp)
jones - 2019/12/06 06:02:35.803382 [INFO] agent: Started HTTP server on 127.0.0.1:34194 (tcp)
jones - 2019/12/06 06:02:35.803618 [INFO] agent: started state syncer
2019/12/06 06:02:35 [WARN]  raft: Heartbeat timeout from "" reached, starting election
2019/12/06 06:02:35 [INFO]  raft: Node at 127.0.0.1:34198 [Candidate] entering Candidate state in term 2
2019/12/06 06:02:36 [INFO]  raft: Election won. Tally: 1
2019/12/06 06:02:36 [INFO]  raft: Node at 127.0.0.1:34198 [Leader] entering Leader state
jones - 2019/12/06 06:02:36.766079 [INFO] connect: initialized primary datacenter CA with provider "consul"
jones - 2019/12/06 06:02:36.766591 [DEBUG] consul: Skipping self join check for "Node a4d824d2-6dd5-669c-d5c6-f0df82645ad2" since the cluster is too small
jones - 2019/12/06 06:02:36.766778 [INFO] consul: member 'Node a4d824d2-6dd5-669c-d5c6-f0df82645ad2' joined, marking health alive
jones - 2019/12/06 06:02:36.766982 [INFO] consul: cluster leadership acquired
jones - 2019/12/06 06:02:36.767389 [INFO] consul: New leader elected: Node 69e40561-243a-545f-340c-f7edd80028d7
jones - 2019/12/06 06:02:37.243047 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
jones - 2019/12/06 06:02:38.160329 [INFO] agent: Synced node info
=== RUN   TestAgent_sidecarServiceFromNodeService/all_the_allowed_overrides
WARNING: bootstrap = true: do not enable unless necessary
jones - 2019/12/06 06:02:38.226787 [WARN] agent: Node name "Node e7ff374a-e051-0dd1-c5b8-1cb2efc19141" will not be discoverable via DNS due to invalid characters. Valid characters include all alpha-numerics and dashes.
jones - 2019/12/06 06:02:38.227193 [DEBUG] tlsutil: Update with version 1
jones - 2019/12/06 06:02:38.229344 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
2019/12/06 06:02:39 [INFO]  raft: Initial configuration (index=1): [{Suffrage:Voter ID:e7ff374a-e051-0dd1-c5b8-1cb2efc19141 Address:127.0.0.1:34204}]
2019/12/06 06:02:39 [INFO]  raft: Node at 127.0.0.1:34204 [Follower] entering Follower state (Leader: "")
jones - 2019/12/06 06:02:39.563295 [INFO] serf: EventMemberJoin: Node e7ff374a-e051-0dd1-c5b8-1cb2efc19141.dc1 127.0.0.1
jones - 2019/12/06 06:02:39.566999 [INFO] serf: EventMemberJoin: Node e7ff374a-e051-0dd1-c5b8-1cb2efc19141 127.0.0.1
jones - 2019/12/06 06:02:39.568637 [INFO] consul: Adding LAN server Node e7ff374a-e051-0dd1-c5b8-1cb2efc19141 (Addr: tcp/127.0.0.1:34204) (DC: dc1)
jones - 2019/12/06 06:02:39.569069 [INFO] consul: Handled member-join event for server "Node e7ff374a-e051-0dd1-c5b8-1cb2efc19141.dc1" in area "wan"
jones - 2019/12/06 06:02:39.570662 [INFO] agent: Started DNS server 127.0.0.1:34199 (tcp)
jones - 2019/12/06 06:02:39.571354 [INFO] agent: Started DNS server 127.0.0.1:34199 (udp)
jones - 2019/12/06 06:02:39.573876 [INFO] agent: Started HTTP server on 127.0.0.1:34200 (tcp)
jones - 2019/12/06 06:02:39.574006 [INFO] agent: started state syncer
2019/12/06 06:02:39 [WARN]  raft: Heartbeat timeout from "" reached, starting election
2019/12/06 06:02:39 [INFO]  raft: Node at 127.0.0.1:34204 [Candidate] entering Candidate state in term 2
jones - 2019/12/06 06:02:39.635176 [INFO] connect: initialized primary datacenter CA with provider "consul"
jones - 2019/12/06 06:02:39.635727 [DEBUG] consul: Skipping self join check for "Node 69e40561-243a-545f-340c-f7edd80028d7" since the cluster is too small
jones - 2019/12/06 06:02:39.635991 [INFO] consul: member 'Node 69e40561-243a-545f-340c-f7edd80028d7' joined, marking health alive
2019/12/06 06:02:40 [INFO]  raft: Election won. Tally: 1
2019/12/06 06:02:40 [INFO]  raft: Node at 127.0.0.1:34204 [Leader] entering Leader state
jones - 2019/12/06 06:02:40.293291 [INFO] consul: cluster leadership acquired
jones - 2019/12/06 06:02:40.293780 [INFO] consul: New leader elected: Node e7ff374a-e051-0dd1-c5b8-1cb2efc19141
=== RUN   TestAgent_sidecarServiceFromNodeService/no_auto_ports_available
jones - 2019/12/06 06:02:40.405850 [DEBUG] agent: Skipping remote check "serfHealth" since it is managed automatically
jones - 2019/12/06 06:02:40.405922 [DEBUG] agent: Node info in sync
jones - 2019/12/06 06:02:40.406005 [DEBUG] agent: Node info in sync
WARNING: bootstrap = true: do not enable unless necessary
jones - 2019/12/06 06:02:40.438522 [WARN] agent: Node name "Node 7d555e7b-b226-ede2-fc13-7639f5dd2636" will not be discoverable via DNS due to invalid characters. Valid characters include all alpha-numerics and dashes.
jones - 2019/12/06 06:02:40.439131 [DEBUG] tlsutil: Update with version 1
jones - 2019/12/06 06:02:40.441460 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
jones - 2019/12/06 06:02:40.611222 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
jones - 2019/12/06 06:02:40.760547 [INFO] agent: Synced node info
jones - 2019/12/06 06:02:40.760666 [DEBUG] agent: Node info in sync
2019/12/06 06:02:41 [INFO]  raft: Initial configuration (index=1): [{Suffrage:Voter ID:7d555e7b-b226-ede2-fc13-7639f5dd2636 Address:127.0.0.1:34210}]
2019/12/06 06:02:41 [INFO]  raft: Node at 127.0.0.1:34210 [Follower] entering Follower state (Leader: "")
jones - 2019/12/06 06:02:41.479884 [INFO] serf: EventMemberJoin: Node 7d555e7b-b226-ede2-fc13-7639f5dd2636.dc1 127.0.0.1
jones - 2019/12/06 06:02:41.483230 [INFO] serf: EventMemberJoin: Node 7d555e7b-b226-ede2-fc13-7639f5dd2636 127.0.0.1
jones - 2019/12/06 06:02:41.483911 [INFO] consul: Adding LAN server Node 7d555e7b-b226-ede2-fc13-7639f5dd2636 (Addr: tcp/127.0.0.1:34210) (DC: dc1)
jones - 2019/12/06 06:02:41.483963 [INFO] consul: Handled member-join event for server "Node 7d555e7b-b226-ede2-fc13-7639f5dd2636.dc1" in area "wan"
jones - 2019/12/06 06:02:41.484594 [INFO] agent: Started DNS server 127.0.0.1:34205 (tcp)
jones - 2019/12/06 06:02:41.484675 [INFO] agent: Started DNS server 127.0.0.1:34205 (udp)
jones - 2019/12/06 06:02:41.487282 [INFO] agent: Started HTTP server on 127.0.0.1:34206 (tcp)
jones - 2019/12/06 06:02:41.487553 [INFO] agent: started state syncer
2019/12/06 06:02:41 [WARN]  raft: Heartbeat timeout from "" reached, starting election
2019/12/06 06:02:41 [INFO]  raft: Node at 127.0.0.1:34210 [Candidate] entering Candidate state in term 2
jones - 2019/12/06 06:02:42.151354 [INFO] connect: initialized primary datacenter CA with provider "consul"
jones - 2019/12/06 06:02:42.151842 [DEBUG] consul: Skipping self join check for "Node e7ff374a-e051-0dd1-c5b8-1cb2efc19141" since the cluster is too small
jones - 2019/12/06 06:02:42.151997 [INFO] consul: member 'Node e7ff374a-e051-0dd1-c5b8-1cb2efc19141' joined, marking health alive
2019/12/06 06:02:42 [INFO]  raft: Election won. Tally: 1
2019/12/06 06:02:42 [INFO]  raft: Node at 127.0.0.1:34210 [Leader] entering Leader state
jones - 2019/12/06 06:02:42.154820 [INFO] consul: cluster leadership acquired
jones - 2019/12/06 06:02:42.155219 [INFO] consul: New leader elected: Node 7d555e7b-b226-ede2-fc13-7639f5dd2636
jones - 2019/12/06 06:02:42.526520 [INFO] agent: Synced node info
=== RUN   TestAgent_sidecarServiceFromNodeService/auto_ports_disabled
jones - 2019/12/06 06:02:42.530643 [ERR] leaf watch error: invalid type for leaf response: <nil>
WARNING: bootstrap = true: do not enable unless necessary
jones - 2019/12/06 06:02:42.656870 [WARN] agent: Node name "Node a587e71c-195b-52ca-e2b1-0bac5467c444" will not be discoverable via DNS due to invalid characters. Valid characters include all alpha-numerics and dashes.
jones - 2019/12/06 06:02:42.657274 [DEBUG] tlsutil: Update with version 1
jones - 2019/12/06 06:02:42.664587 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
jones - 2019/12/06 06:02:42.918824 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
jones - 2019/12/06 06:02:43.510730 [INFO] agent: Synced service "api-proxy-sidecar"
jones - 2019/12/06 06:02:43.510844 [DEBUG] agent: Node info in sync
jones - 2019/12/06 06:02:43.510973 [DEBUG] agent: Service "api-proxy-sidecar" in sync
jones - 2019/12/06 06:02:43.511044 [DEBUG] agent: Node info in sync
jones - 2019/12/06 06:02:43.716681 [DEBUG] agent: Skipping remote check "serfHealth" since it is managed automatically
jones - 2019/12/06 06:02:43.716774 [DEBUG] agent: Node info in sync
2019/12/06 06:02:43 [INFO]  raft: Initial configuration (index=1): [{Suffrage:Voter ID:a587e71c-195b-52ca-e2b1-0bac5467c444 Address:127.0.0.1:34216}]
2019/12/06 06:02:43 [INFO]  raft: Node at 127.0.0.1:34216 [Follower] entering Follower state (Leader: "")
jones - 2019/12/06 06:02:43.938576 [INFO] serf: EventMemberJoin: Node a587e71c-195b-52ca-e2b1-0bac5467c444.dc1 127.0.0.1
jones - 2019/12/06 06:02:43.942269 [INFO] serf: EventMemberJoin: Node a587e71c-195b-52ca-e2b1-0bac5467c444 127.0.0.1
jones - 2019/12/06 06:02:43.943242 [INFO] consul: Handled member-join event for server "Node a587e71c-195b-52ca-e2b1-0bac5467c444.dc1" in area "wan"
jones - 2019/12/06 06:02:43.943529 [INFO] consul: Adding LAN server Node a587e71c-195b-52ca-e2b1-0bac5467c444 (Addr: tcp/127.0.0.1:34216) (DC: dc1)
jones - 2019/12/06 06:02:43.943973 [INFO] agent: Started DNS server 127.0.0.1:34211 (udp)
jones - 2019/12/06 06:02:43.944167 [INFO] agent: Started DNS server 127.0.0.1:34211 (tcp)
jones - 2019/12/06 06:02:43.946738 [INFO] agent: Started HTTP server on 127.0.0.1:34212 (tcp)
jones - 2019/12/06 06:02:43.946845 [INFO] agent: started state syncer
2019/12/06 06:02:44 [WARN]  raft: Heartbeat timeout from "" reached, starting election
2019/12/06 06:02:44 [INFO]  raft: Node at 127.0.0.1:34216 [Candidate] entering Candidate state in term 2
jones - 2019/12/06 06:02:44.276742 [INFO] connect: initialized primary datacenter CA with provider "consul"
jones - 2019/12/06 06:02:44.278840 [DEBUG] consul: Skipping self join check for "Node 7d555e7b-b226-ede2-fc13-7639f5dd2636" since the cluster is too small
jones - 2019/12/06 06:02:44.279535 [INFO] consul: member 'Node 7d555e7b-b226-ede2-fc13-7639f5dd2636' joined, marking health alive
jones - 2019/12/06 06:02:44.668573 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
2019/12/06 06:02:44 [INFO]  raft: Election won. Tally: 1
2019/12/06 06:02:44 [INFO]  raft: Node at 127.0.0.1:34216 [Leader] entering Leader state
jones - 2019/12/06 06:02:44.934573 [INFO] consul: cluster leadership acquired
jones - 2019/12/06 06:02:44.937135 [INFO] consul: New leader elected: Node a587e71c-195b-52ca-e2b1-0bac5467c444
=== RUN   TestAgent_sidecarServiceFromNodeService/inherit_tags_and_meta
WARNING: bootstrap = true: do not enable unless necessary
jones - 2019/12/06 06:02:45.040304 [WARN] agent: Node name "Node 5b00a3f9-83b7-6ff9-5316-c7daa24e44b4" will not be discoverable via DNS due to invalid characters. Valid characters include all alpha-numerics and dashes.
jones - 2019/12/06 06:02:45.041149 [DEBUG] tlsutil: Update with version 1
jones - 2019/12/06 06:02:45.044253 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
jones - 2019/12/06 06:02:45.985334 [INFO] agent: Synced node info
2019/12/06 06:02:46 [INFO]  raft: Initial configuration (index=1): [{Suffrage:Voter ID:5b00a3f9-83b7-6ff9-5316-c7daa24e44b4 Address:127.0.0.1:34222}]
2019/12/06 06:02:46 [INFO]  raft: Node at 127.0.0.1:34222 [Follower] entering Follower state (Leader: "")
jones - 2019/12/06 06:02:46.639851 [INFO] serf: EventMemberJoin: Node 5b00a3f9-83b7-6ff9-5316-c7daa24e44b4.dc1 127.0.0.1
jones - 2019/12/06 06:02:46.643372 [INFO] serf: EventMemberJoin: Node 5b00a3f9-83b7-6ff9-5316-c7daa24e44b4 127.0.0.1
jones - 2019/12/06 06:02:46.643980 [INFO] consul: Handled member-join event for server "Node 5b00a3f9-83b7-6ff9-5316-c7daa24e44b4.dc1" in area "wan"
jones - 2019/12/06 06:02:46.644326 [INFO] consul: Adding LAN server Node 5b00a3f9-83b7-6ff9-5316-c7daa24e44b4 (Addr: tcp/127.0.0.1:34222) (DC: dc1)
jones - 2019/12/06 06:02:46.644685 [INFO] agent: Started DNS server 127.0.0.1:34217 (tcp)
jones - 2019/12/06 06:02:46.644754 [INFO] agent: Started DNS server 127.0.0.1:34217 (udp)
jones - 2019/12/06 06:02:46.647075 [INFO] agent: Started HTTP server on 127.0.0.1:34218 (tcp)
jones - 2019/12/06 06:02:46.647178 [INFO] agent: started state syncer
2019/12/06 06:02:46 [WARN]  raft: Heartbeat timeout from "" reached, starting election
2019/12/06 06:02:46 [INFO]  raft: Node at 127.0.0.1:34222 [Candidate] entering Candidate state in term 2
2019/12/06 06:02:47 [INFO]  raft: Election won. Tally: 1
2019/12/06 06:02:47 [INFO]  raft: Node at 127.0.0.1:34222 [Leader] entering Leader state
jones - 2019/12/06 06:02:47.285494 [INFO] consul: cluster leadership acquired
jones - 2019/12/06 06:02:47.285852 [INFO] consul: New leader elected: Node 5b00a3f9-83b7-6ff9-5316-c7daa24e44b4
jones - 2019/12/06 06:02:47.473613 [DEBUG] agent: Node info in sync
jones - 2019/12/06 06:02:47.473720 [DEBUG] agent: Node info in sync
jones - 2019/12/06 06:02:47.545165 [INFO] connect: initialized primary datacenter CA with provider "consul"
jones - 2019/12/06 06:02:47.545644 [DEBUG] consul: Skipping self join check for "Node a587e71c-195b-52ca-e2b1-0bac5467c444" since the cluster is too small
jones - 2019/12/06 06:02:47.545796 [INFO] consul: member 'Node a587e71c-195b-52ca-e2b1-0bac5467c444' joined, marking health alive
jones - 2019/12/06 06:02:47.885674 [INFO] agent: Synced node info
=== RUN   TestAgent_sidecarServiceFromNodeService/invalid_check_type
WARNING: bootstrap = true: do not enable unless necessary
jones - 2019/12/06 06:02:47.986759 [WARN] agent: Node name "Node 1cd382dc-1b57-ab4c-91ac-41ec3d3abb1e" will not be discoverable via DNS due to invalid characters. Valid characters include all alpha-numerics and dashes.
jones - 2019/12/06 06:02:47.987559 [DEBUG] tlsutil: Update with version 1
jones - 2019/12/06 06:02:47.989977 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
jones - 2019/12/06 06:02:48.297372 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
jones - 2019/12/06 06:02:48.334091 [DEBUG] agent: Node info in sync
jones - 2019/12/06 06:02:48.334259 [DEBUG] agent: Node info in sync
2019/12/06 06:02:50 [INFO]  raft: Initial configuration (index=1): [{Suffrage:Voter ID:1cd382dc-1b57-ab4c-91ac-41ec3d3abb1e Address:127.0.0.1:34228}]
2019/12/06 06:02:50 [INFO]  raft: Node at 127.0.0.1:34228 [Follower] entering Follower state (Leader: "")
jones - 2019/12/06 06:02:50.148221 [INFO] serf: EventMemberJoin: Node 1cd382dc-1b57-ab4c-91ac-41ec3d3abb1e.dc1 127.0.0.1
jones - 2019/12/06 06:02:50.152589 [INFO] serf: EventMemberJoin: Node 1cd382dc-1b57-ab4c-91ac-41ec3d3abb1e 127.0.0.1
jones - 2019/12/06 06:02:50.153582 [INFO] consul: Adding LAN server Node 1cd382dc-1b57-ab4c-91ac-41ec3d3abb1e (Addr: tcp/127.0.0.1:34228) (DC: dc1)
jones - 2019/12/06 06:02:50.153893 [INFO] consul: Handled member-join event for server "Node 1cd382dc-1b57-ab4c-91ac-41ec3d3abb1e.dc1" in area "wan"
jones - 2019/12/06 06:02:50.154287 [INFO] agent: Started DNS server 127.0.0.1:34223 (udp)
jones - 2019/12/06 06:02:50.155026 [INFO] agent: Started DNS server 127.0.0.1:34223 (tcp)
jones - 2019/12/06 06:02:50.157472 [INFO] agent: Started HTTP server on 127.0.0.1:34224 (tcp)
jones - 2019/12/06 06:02:50.157593 [INFO] agent: started state syncer
2019/12/06 06:02:50 [WARN]  raft: Heartbeat timeout from "" reached, starting election
2019/12/06 06:02:50 [INFO]  raft: Node at 127.0.0.1:34228 [Candidate] entering Candidate state in term 2
jones - 2019/12/06 06:02:50.235695 [INFO] connect: initialized primary datacenter CA with provider "consul"
jones - 2019/12/06 06:02:50.236117 [DEBUG] consul: Skipping self join check for "Node 5b00a3f9-83b7-6ff9-5316-c7daa24e44b4" since the cluster is too small
jones - 2019/12/06 06:02:50.236260 [INFO] consul: member 'Node 5b00a3f9-83b7-6ff9-5316-c7daa24e44b4' joined, marking health alive
2019/12/06 06:02:50 [INFO]  raft: Election won. Tally: 1
2019/12/06 06:02:50 [INFO]  raft: Node at 127.0.0.1:34228 [Leader] entering Leader state
jones - 2019/12/06 06:02:50.935831 [INFO] consul: cluster leadership acquired
jones - 2019/12/06 06:02:50.936562 [INFO] consul: New leader elected: Node 1cd382dc-1b57-ab4c-91ac-41ec3d3abb1e
jones - 2019/12/06 06:02:51.203976 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
jones - 2019/12/06 06:02:51.403338 [INFO] agent: Synced node info
=== RUN   TestAgent_sidecarServiceFromNodeService/invalid_meta
jones - 2019/12/06 06:02:51.404851 [DEBUG] agent: Node info in sync
WARNING: bootstrap = true: do not enable unless necessary
jones - 2019/12/06 06:02:51.476380 [WARN] agent: Node name "Node 05379951-a361-1a11-bcaf-fc25c0b4fc85" will not be discoverable via DNS due to invalid characters. Valid characters include all alpha-numerics and dashes.
jones - 2019/12/06 06:02:51.476797 [DEBUG] tlsutil: Update with version 1
jones - 2019/12/06 06:02:51.479431 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
2019/12/06 06:02:52 [INFO]  raft: Initial configuration (index=1): [{Suffrage:Voter ID:05379951-a361-1a11-bcaf-fc25c0b4fc85 Address:127.0.0.1:34234}]
2019/12/06 06:02:52 [INFO]  raft: Node at 127.0.0.1:34234 [Follower] entering Follower state (Leader: "")
jones - 2019/12/06 06:02:52.605827 [INFO] serf: EventMemberJoin: Node 05379951-a361-1a11-bcaf-fc25c0b4fc85.dc1 127.0.0.1
jones - 2019/12/06 06:02:52.617925 [INFO] serf: EventMemberJoin: Node 05379951-a361-1a11-bcaf-fc25c0b4fc85 127.0.0.1
jones - 2019/12/06 06:02:52.619580 [INFO] agent: Started DNS server 127.0.0.1:34229 (udp)
jones - 2019/12/06 06:02:52.620924 [INFO] agent: Started DNS server 127.0.0.1:34229 (tcp)
jones - 2019/12/06 06:02:52.619652 [INFO] consul: Adding LAN server Node 05379951-a361-1a11-bcaf-fc25c0b4fc85 (Addr: tcp/127.0.0.1:34234) (DC: dc1)
jones - 2019/12/06 06:02:52.619827 [INFO] consul: Handled member-join event for server "Node 05379951-a361-1a11-bcaf-fc25c0b4fc85.dc1" in area "wan"
jones - 2019/12/06 06:02:52.623951 [INFO] agent: Started HTTP server on 127.0.0.1:34230 (tcp)
jones - 2019/12/06 06:02:52.624050 [INFO] agent: started state syncer
2019/12/06 06:02:52 [WARN]  raft: Heartbeat timeout from "" reached, starting election
2019/12/06 06:02:52 [INFO]  raft: Node at 127.0.0.1:34234 [Candidate] entering Candidate state in term 2
jones - 2019/12/06 06:02:52.718311 [INFO] connect: initialized primary datacenter CA with provider "consul"
jones - 2019/12/06 06:02:52.718968 [DEBUG] consul: Skipping self join check for "Node 1cd382dc-1b57-ab4c-91ac-41ec3d3abb1e" since the cluster is too small
jones - 2019/12/06 06:02:52.719258 [INFO] consul: member 'Node 1cd382dc-1b57-ab4c-91ac-41ec3d3abb1e' joined, marking health alive
jones - 2019/12/06 06:02:52.759165 [DEBUG] agent: Node info in sync
2019/12/06 06:02:53 [INFO]  raft: Election won. Tally: 1
2019/12/06 06:02:53 [INFO]  raft: Node at 127.0.0.1:34234 [Leader] entering Leader state
jones - 2019/12/06 06:02:53.493573 [INFO] consul: cluster leadership acquired
jones - 2019/12/06 06:02:53.493996 [INFO] consul: New leader elected: Node 05379951-a361-1a11-bcaf-fc25c0b4fc85
jones - 2019/12/06 06:02:53.623564 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
=== RUN   TestAgent_sidecarServiceFromNodeService/re-registering_same_sidecar_with_no_port_should_pick_same_one
WARNING: bootstrap = true: do not enable unless necessary
jones - 2019/12/06 06:02:53.726091 [WARN] agent: Node name "Node 48766921-a98a-d876-447b-181b21701746" will not be discoverable via DNS due to invalid characters. Valid characters include all alpha-numerics and dashes.
jones - 2019/12/06 06:02:53.726550 [DEBUG] tlsutil: Update with version 1
jones - 2019/12/06 06:02:53.728638 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
jones - 2019/12/06 06:02:53.901870 [INFO] agent: Synced node info
jones - 2019/12/06 06:02:54.009694 [DEBUG] agent: Node info in sync
jones - 2019/12/06 06:02:54.009821 [DEBUG] agent: Node info in sync
2019/12/06 06:02:55 [INFO]  raft: Initial configuration (index=1): [{Suffrage:Voter ID:48766921-a98a-d876-447b-181b21701746 Address:127.0.0.1:34240}]
jones - 2019/12/06 06:02:55.088356 [INFO] serf: EventMemberJoin: Node 48766921-a98a-d876-447b-181b21701746.dc1 127.0.0.1
2019/12/06 06:02:55 [INFO]  raft: Node at 127.0.0.1:34240 [Follower] entering Follower state (Leader: "")
jones - 2019/12/06 06:02:55.096330 [INFO] serf: EventMemberJoin: Node 48766921-a98a-d876-447b-181b21701746 127.0.0.1
jones - 2019/12/06 06:02:55.097999 [INFO] consul: Adding LAN server Node 48766921-a98a-d876-447b-181b21701746 (Addr: tcp/127.0.0.1:34240) (DC: dc1)
jones - 2019/12/06 06:02:55.098549 [INFO] consul: Handled member-join event for server "Node 48766921-a98a-d876-447b-181b21701746.dc1" in area "wan"
jones - 2019/12/06 06:02:55.099947 [INFO] agent: Started DNS server 127.0.0.1:34235 (tcp)
jones - 2019/12/06 06:02:55.100026 [INFO] agent: Started DNS server 127.0.0.1:34235 (udp)
jones - 2019/12/06 06:02:55.102236 [INFO] agent: Started HTTP server on 127.0.0.1:34236 (tcp)
jones - 2019/12/06 06:02:55.102316 [INFO] agent: started state syncer
2019/12/06 06:02:55 [WARN]  raft: Heartbeat timeout from "" reached, starting election
2019/12/06 06:02:55 [INFO]  raft: Node at 127.0.0.1:34240 [Candidate] entering Candidate state in term 2
jones - 2019/12/06 06:02:55.360118 [INFO] connect: initialized primary datacenter CA with provider "consul"
jones - 2019/12/06 06:02:55.360587 [DEBUG] consul: Skipping self join check for "Node 05379951-a361-1a11-bcaf-fc25c0b4fc85" since the cluster is too small
jones - 2019/12/06 06:02:55.360737 [INFO] consul: member 'Node 05379951-a361-1a11-bcaf-fc25c0b4fc85' joined, marking health alive
2019/12/06 06:02:55 [INFO]  raft: Election won. Tally: 1
2019/12/06 06:02:55 [INFO]  raft: Node at 127.0.0.1:34240 [Leader] entering Leader state
jones - 2019/12/06 06:02:55.962498 [INFO] consul: cluster leadership acquired
jones - 2019/12/06 06:02:55.962889 [INFO] consul: New leader elected: Node 48766921-a98a-d876-447b-181b21701746
--- PASS: TestAgent_sidecarServiceFromNodeService (24.40s)
    --- PASS: TestAgent_sidecarServiceFromNodeService/no_sidecar (2.83s)
    --- PASS: TestAgent_sidecarServiceFromNodeService/all_the_defaults (3.76s)
    --- PASS: TestAgent_sidecarServiceFromNodeService/all_the_allowed_overrides (2.17s)
    --- PASS: TestAgent_sidecarServiceFromNodeService/no_auto_ports_available (2.20s)
    --- PASS: TestAgent_sidecarServiceFromNodeService/auto_ports_disabled (2.43s)
    --- PASS: TestAgent_sidecarServiceFromNodeService/inherit_tags_and_meta (2.93s)
    --- PASS: TestAgent_sidecarServiceFromNodeService/invalid_check_type (3.52s)
    --- PASS: TestAgent_sidecarServiceFromNodeService/invalid_meta (2.26s)
    --- PASS: TestAgent_sidecarServiceFromNodeService/re-registering_same_sidecar_with_no_port_should_pick_same_one (2.31s)
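The subtests above derive a sidecar proxy registration from an ordinary service registration; the "web1-sidecar-proxy" entries later in this log come from the same mechanism. As a rough sketch only (it assumes the github.com/hashicorp/consul/api client and a locally reachable agent; it is not the package's test code), a registration that asks the agent to synthesize such a sidecar looks like:

    package main

    import (
        "log"

        "github.com/hashicorp/consul/api"
    )

    func main() {
        client, err := api.NewClient(api.DefaultConfig())
        if err != nil {
            log.Fatal(err)
        }

        // An empty sidecar_service stanza makes the agent register a companion
        // "web1-sidecar-proxy" service, picking a port automatically and
        // inheriting tags and meta from the parent service (the cases the
        // subtests above exercise).
        reg := &api.AgentServiceRegistration{
            Name: "web1",
            Port: 8080,
            Connect: &api.AgentServiceConnect{
                SidecarService: &api.AgentServiceRegistration{},
            },
        }
        if err := client.Agent().ServiceRegister(reg); err != nil {
            log.Fatal(err)
        }
    }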
=== RUN   TestSnapshot
--- SKIP: TestSnapshot (0.00s)
    snapshot_endpoint_test.go:16: DM-skipped
=== RUN   TestSnapshot_Options
=== PAUSE TestSnapshot_Options
=== RUN   TestStatusLeader
--- SKIP: TestStatusLeader (0.00s)
    status_endpoint_test.go:11: DM-skipped
=== RUN   TestStatusPeers
=== PAUSE TestStatusPeers
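TestStatusLeader is DM-skipped and TestStatusPeers is deferred to the parallel phase; both cover the agent's /v1/status endpoints, whose answers ("New leader elected", the Raft node at 127.0.0.1:34216, and so on) appear throughout the log above. For orientation only, an assumed sketch of querying those endpoints through the Go client, not the skipped test itself:

    package main

    import (
        "fmt"
        "log"

        "github.com/hashicorp/consul/api"
    )

    func main() {
        client, err := api.NewClient(api.DefaultConfig())
        if err != nil {
            log.Fatal(err)
        }

        // Address of the current Raft leader, e.g. "127.0.0.1:34216" above.
        leader, err := client.Status().Leader()
        if err != nil {
            log.Fatal(err)
        }

        // Raft peer set; a single-node test cluster reports only the leader.
        peers, err := client.Status().Peers()
        if err != nil {
            log.Fatal(err)
        }
        fmt.Println(leader, peers)
    }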
=== RUN   TestDefaultConfig
=== RUN   TestDefaultConfig/#00
=== PAUSE TestDefaultConfig/#00
=== RUN   TestDefaultConfig/#01
=== PAUSE TestDefaultConfig/#01
=== RUN   TestDefaultConfig/#02
=== PAUSE TestDefaultConfig/#02
=== RUN   TestDefaultConfig/#03
=== PAUSE TestDefaultConfig/#03
=== RUN   TestDefaultConfig/#04
=== PAUSE TestDefaultConfig/#04
=== RUN   TestDefaultConfig/#05
=== PAUSE TestDefaultConfig/#05
=== RUN   TestDefaultConfig/#06
=== PAUSE TestDefaultConfig/#06
=== RUN   TestDefaultConfig/#07
=== PAUSE TestDefaultConfig/#07
=== RUN   TestDefaultConfig/#08
=== PAUSE TestDefaultConfig/#08
=== RUN   TestDefaultConfig/#09
=== PAUSE TestDefaultConfig/#09
=== RUN   TestDefaultConfig/#10
=== PAUSE TestDefaultConfig/#10
=== RUN   TestDefaultConfig/#11
=== PAUSE TestDefaultConfig/#11
=== RUN   TestDefaultConfig/#12
=== PAUSE TestDefaultConfig/#12
=== RUN   TestDefaultConfig/#13
=== PAUSE TestDefaultConfig/#13
=== RUN   TestDefaultConfig/#14
=== PAUSE TestDefaultConfig/#14
=== RUN   TestDefaultConfig/#15
=== PAUSE TestDefaultConfig/#15
=== RUN   TestDefaultConfig/#16
=== PAUSE TestDefaultConfig/#16
=== RUN   TestDefaultConfig/#17
jones - 2019/12/06 06:02:55.978093 [ERR] leaf watch error: invalid type for leaf response: <nil>
=== PAUSE TestDefaultConfig/#17
=== RUN   TestDefaultConfig/#18
=== PAUSE TestDefaultConfig/#18
=== RUN   TestDefaultConfig/#19
=== PAUSE TestDefaultConfig/#19
=== RUN   TestDefaultConfig/#20
=== PAUSE TestDefaultConfig/#20
=== RUN   TestDefaultConfig/#21
=== PAUSE TestDefaultConfig/#21
=== RUN   TestDefaultConfig/#22
=== PAUSE TestDefaultConfig/#22
=== RUN   TestDefaultConfig/#23
=== PAUSE TestDefaultConfig/#23
=== RUN   TestDefaultConfig/#24
=== PAUSE TestDefaultConfig/#24
=== RUN   TestDefaultConfig/#25
=== PAUSE TestDefaultConfig/#25
=== RUN   TestDefaultConfig/#26
=== PAUSE TestDefaultConfig/#26
=== RUN   TestDefaultConfig/#27
=== PAUSE TestDefaultConfig/#27
=== RUN   TestDefaultConfig/#28
=== PAUSE TestDefaultConfig/#28
=== RUN   TestDefaultConfig/#29
=== PAUSE TestDefaultConfig/#29
=== RUN   TestDefaultConfig/#30
=== PAUSE TestDefaultConfig/#30
=== RUN   TestDefaultConfig/#31
=== PAUSE TestDefaultConfig/#31
=== RUN   TestDefaultConfig/#32
=== PAUSE TestDefaultConfig/#32
=== RUN   TestDefaultConfig/#33
=== PAUSE TestDefaultConfig/#33
=== RUN   TestDefaultConfig/#34
=== PAUSE TestDefaultConfig/#34
=== RUN   TestDefaultConfig/#35
=== PAUSE TestDefaultConfig/#35
=== RUN   TestDefaultConfig/#36
=== PAUSE TestDefaultConfig/#36
=== RUN   TestDefaultConfig/#37
=== PAUSE TestDefaultConfig/#37
=== RUN   TestDefaultConfig/#38
=== PAUSE TestDefaultConfig/#38
=== RUN   TestDefaultConfig/#39
=== PAUSE TestDefaultConfig/#39
=== RUN   TestDefaultConfig/#40
=== PAUSE TestDefaultConfig/#40
=== RUN   TestDefaultConfig/#41
=== PAUSE TestDefaultConfig/#41
=== RUN   TestDefaultConfig/#42
=== PAUSE TestDefaultConfig/#42
=== RUN   TestDefaultConfig/#43
=== PAUSE TestDefaultConfig/#43
=== RUN   TestDefaultConfig/#44
=== PAUSE TestDefaultConfig/#44
=== RUN   TestDefaultConfig/#45
=== PAUSE TestDefaultConfig/#45
=== RUN   TestDefaultConfig/#46
=== PAUSE TestDefaultConfig/#46
=== RUN   TestDefaultConfig/#47
=== PAUSE TestDefaultConfig/#47
=== RUN   TestDefaultConfig/#48
=== PAUSE TestDefaultConfig/#48
=== RUN   TestDefaultConfig/#49
=== PAUSE TestDefaultConfig/#49
=== RUN   TestDefaultConfig/#50
=== PAUSE TestDefaultConfig/#50
=== RUN   TestDefaultConfig/#51
=== PAUSE TestDefaultConfig/#51
=== RUN   TestDefaultConfig/#52
=== PAUSE TestDefaultConfig/#52
=== RUN   TestDefaultConfig/#53
=== PAUSE TestDefaultConfig/#53
=== RUN   TestDefaultConfig/#54
=== PAUSE TestDefaultConfig/#54
=== RUN   TestDefaultConfig/#55
=== PAUSE TestDefaultConfig/#55
=== RUN   TestDefaultConfig/#56
=== PAUSE TestDefaultConfig/#56
=== RUN   TestDefaultConfig/#57
=== PAUSE TestDefaultConfig/#57
=== RUN   TestDefaultConfig/#58
=== PAUSE TestDefaultConfig/#58
=== RUN   TestDefaultConfig/#59
=== PAUSE TestDefaultConfig/#59
=== RUN   TestDefaultConfig/#60
=== PAUSE TestDefaultConfig/#60
=== RUN   TestDefaultConfig/#61
=== PAUSE TestDefaultConfig/#61
=== RUN   TestDefaultConfig/#62
=== PAUSE TestDefaultConfig/#62
=== RUN   TestDefaultConfig/#63
=== PAUSE TestDefaultConfig/#63
=== RUN   TestDefaultConfig/#64
=== PAUSE TestDefaultConfig/#64
=== RUN   TestDefaultConfig/#65
=== PAUSE TestDefaultConfig/#65
=== RUN   TestDefaultConfig/#66
=== PAUSE TestDefaultConfig/#66
=== RUN   TestDefaultConfig/#67
=== PAUSE TestDefaultConfig/#67
=== RUN   TestDefaultConfig/#68
=== PAUSE TestDefaultConfig/#68
=== RUN   TestDefaultConfig/#69
=== PAUSE TestDefaultConfig/#69
=== RUN   TestDefaultConfig/#70
=== PAUSE TestDefaultConfig/#70
=== RUN   TestDefaultConfig/#71
=== PAUSE TestDefaultConfig/#71
=== RUN   TestDefaultConfig/#72
=== PAUSE TestDefaultConfig/#72
=== RUN   TestDefaultConfig/#73
=== PAUSE TestDefaultConfig/#73
=== RUN   TestDefaultConfig/#74
=== PAUSE TestDefaultConfig/#74
=== RUN   TestDefaultConfig/#75
=== PAUSE TestDefaultConfig/#75
=== RUN   TestDefaultConfig/#76
=== PAUSE TestDefaultConfig/#76
=== RUN   TestDefaultConfig/#77
=== PAUSE TestDefaultConfig/#77
=== RUN   TestDefaultConfig/#78
=== PAUSE TestDefaultConfig/#78
=== RUN   TestDefaultConfig/#79
=== PAUSE TestDefaultConfig/#79
=== RUN   TestDefaultConfig/#80
=== PAUSE TestDefaultConfig/#80
=== RUN   TestDefaultConfig/#81
=== PAUSE TestDefaultConfig/#81
=== RUN   TestDefaultConfig/#82
=== PAUSE TestDefaultConfig/#82
=== RUN   TestDefaultConfig/#83
=== PAUSE TestDefaultConfig/#83
=== RUN   TestDefaultConfig/#84
=== PAUSE TestDefaultConfig/#84
=== RUN   TestDefaultConfig/#85
=== PAUSE TestDefaultConfig/#85
=== RUN   TestDefaultConfig/#86
=== PAUSE TestDefaultConfig/#86
=== RUN   TestDefaultConfig/#87
=== PAUSE TestDefaultConfig/#87
=== RUN   TestDefaultConfig/#88
=== PAUSE TestDefaultConfig/#88
=== RUN   TestDefaultConfig/#89
=== PAUSE TestDefaultConfig/#89
=== RUN   TestDefaultConfig/#90
=== PAUSE TestDefaultConfig/#90
=== RUN   TestDefaultConfig/#91
=== PAUSE TestDefaultConfig/#91
=== RUN   TestDefaultConfig/#92
=== PAUSE TestDefaultConfig/#92
=== RUN   TestDefaultConfig/#93
=== PAUSE TestDefaultConfig/#93
=== RUN   TestDefaultConfig/#94
=== PAUSE TestDefaultConfig/#94
=== RUN   TestDefaultConfig/#95
=== PAUSE TestDefaultConfig/#95
=== RUN   TestDefaultConfig/#96
=== PAUSE TestDefaultConfig/#96
=== RUN   TestDefaultConfig/#97
=== PAUSE TestDefaultConfig/#97
=== RUN   TestDefaultConfig/#98
=== PAUSE TestDefaultConfig/#98
=== RUN   TestDefaultConfig/#99
=== PAUSE TestDefaultConfig/#99
=== RUN   TestDefaultConfig/#100
=== PAUSE TestDefaultConfig/#100
=== RUN   TestDefaultConfig/#101
=== PAUSE TestDefaultConfig/#101
=== RUN   TestDefaultConfig/#102
=== PAUSE TestDefaultConfig/#102
=== RUN   TestDefaultConfig/#103
=== PAUSE TestDefaultConfig/#103
=== RUN   TestDefaultConfig/#104
=== PAUSE TestDefaultConfig/#104
=== RUN   TestDefaultConfig/#105
=== PAUSE TestDefaultConfig/#105
=== RUN   TestDefaultConfig/#106
=== PAUSE TestDefaultConfig/#106
=== RUN   TestDefaultConfig/#107
=== PAUSE TestDefaultConfig/#107
=== RUN   TestDefaultConfig/#108
=== PAUSE TestDefaultConfig/#108
=== RUN   TestDefaultConfig/#109
=== PAUSE TestDefaultConfig/#109
=== RUN   TestDefaultConfig/#110
=== PAUSE TestDefaultConfig/#110
=== RUN   TestDefaultConfig/#111
=== PAUSE TestDefaultConfig/#111
=== RUN   TestDefaultConfig/#112
=== PAUSE TestDefaultConfig/#112
=== RUN   TestDefaultConfig/#113
=== PAUSE TestDefaultConfig/#113
=== RUN   TestDefaultConfig/#114
=== PAUSE TestDefaultConfig/#114
=== RUN   TestDefaultConfig/#115
=== PAUSE TestDefaultConfig/#115
=== RUN   TestDefaultConfig/#116
=== PAUSE TestDefaultConfig/#116
=== RUN   TestDefaultConfig/#117
=== PAUSE TestDefaultConfig/#117
=== RUN   TestDefaultConfig/#118
=== PAUSE TestDefaultConfig/#118
=== RUN   TestDefaultConfig/#119
=== PAUSE TestDefaultConfig/#119
=== RUN   TestDefaultConfig/#120
=== PAUSE TestDefaultConfig/#120
=== RUN   TestDefaultConfig/#121
=== PAUSE TestDefaultConfig/#121
=== RUN   TestDefaultConfig/#122
=== PAUSE TestDefaultConfig/#122
=== RUN   TestDefaultConfig/#123
=== PAUSE TestDefaultConfig/#123
=== RUN   TestDefaultConfig/#124
=== PAUSE TestDefaultConfig/#124
=== RUN   TestDefaultConfig/#125
=== PAUSE TestDefaultConfig/#125
=== RUN   TestDefaultConfig/#126
=== PAUSE TestDefaultConfig/#126
=== RUN   TestDefaultConfig/#127
=== PAUSE TestDefaultConfig/#127
=== RUN   TestDefaultConfig/#128
=== PAUSE TestDefaultConfig/#128
=== RUN   TestDefaultConfig/#129
=== PAUSE TestDefaultConfig/#129
=== RUN   TestDefaultConfig/#130
=== PAUSE TestDefaultConfig/#130
=== RUN   TestDefaultConfig/#131
=== PAUSE TestDefaultConfig/#131
=== RUN   TestDefaultConfig/#132
=== PAUSE TestDefaultConfig/#132
=== RUN   TestDefaultConfig/#133
=== PAUSE TestDefaultConfig/#133
=== RUN   TestDefaultConfig/#134
=== PAUSE TestDefaultConfig/#134
=== RUN   TestDefaultConfig/#135
=== PAUSE TestDefaultConfig/#135
=== RUN   TestDefaultConfig/#136
=== PAUSE TestDefaultConfig/#136
=== RUN   TestDefaultConfig/#137
=== PAUSE TestDefaultConfig/#137
=== RUN   TestDefaultConfig/#138
=== PAUSE TestDefaultConfig/#138
=== RUN   TestDefaultConfig/#139
=== PAUSE TestDefaultConfig/#139
=== RUN   TestDefaultConfig/#140
=== PAUSE TestDefaultConfig/#140
=== RUN   TestDefaultConfig/#141
=== PAUSE TestDefaultConfig/#141
=== RUN   TestDefaultConfig/#142
=== PAUSE TestDefaultConfig/#142
=== RUN   TestDefaultConfig/#143
=== PAUSE TestDefaultConfig/#143
=== RUN   TestDefaultConfig/#144
=== PAUSE TestDefaultConfig/#144
=== RUN   TestDefaultConfig/#145
=== PAUSE TestDefaultConfig/#145
=== RUN   TestDefaultConfig/#146
=== PAUSE TestDefaultConfig/#146
=== RUN   TestDefaultConfig/#147
=== PAUSE TestDefaultConfig/#147
=== RUN   TestDefaultConfig/#148
=== PAUSE TestDefaultConfig/#148
=== RUN   TestDefaultConfig/#149
=== PAUSE TestDefaultConfig/#149
=== RUN   TestDefaultConfig/#150
=== PAUSE TestDefaultConfig/#150
=== RUN   TestDefaultConfig/#151
=== PAUSE TestDefaultConfig/#151
=== RUN   TestDefaultConfig/#152
=== PAUSE TestDefaultConfig/#152
=== RUN   TestDefaultConfig/#153
=== PAUSE TestDefaultConfig/#153
=== RUN   TestDefaultConfig/#154
=== PAUSE TestDefaultConfig/#154
=== RUN   TestDefaultConfig/#155
=== PAUSE TestDefaultConfig/#155
=== RUN   TestDefaultConfig/#156
=== PAUSE TestDefaultConfig/#156
=== RUN   TestDefaultConfig/#157
=== PAUSE TestDefaultConfig/#157
=== RUN   TestDefaultConfig/#158
=== PAUSE TestDefaultConfig/#158
=== RUN   TestDefaultConfig/#159
=== PAUSE TestDefaultConfig/#159
=== RUN   TestDefaultConfig/#160
=== PAUSE TestDefaultConfig/#160
=== RUN   TestDefaultConfig/#161
=== PAUSE TestDefaultConfig/#161
=== RUN   TestDefaultConfig/#162
=== PAUSE TestDefaultConfig/#162
=== RUN   TestDefaultConfig/#163
=== PAUSE TestDefaultConfig/#163
=== RUN   TestDefaultConfig/#164
=== PAUSE TestDefaultConfig/#164
=== RUN   TestDefaultConfig/#165
=== PAUSE TestDefaultConfig/#165
=== RUN   TestDefaultConfig/#166
=== PAUSE TestDefaultConfig/#166
=== RUN   TestDefaultConfig/#167
=== PAUSE TestDefaultConfig/#167
=== RUN   TestDefaultConfig/#168
=== PAUSE TestDefaultConfig/#168
=== RUN   TestDefaultConfig/#169
=== PAUSE TestDefaultConfig/#169
=== RUN   TestDefaultConfig/#170
=== PAUSE TestDefaultConfig/#170
=== RUN   TestDefaultConfig/#171
=== PAUSE TestDefaultConfig/#171
=== RUN   TestDefaultConfig/#172
=== PAUSE TestDefaultConfig/#172
=== RUN   TestDefaultConfig/#173
=== PAUSE TestDefaultConfig/#173
=== RUN   TestDefaultConfig/#174
=== PAUSE TestDefaultConfig/#174
=== RUN   TestDefaultConfig/#175
=== PAUSE TestDefaultConfig/#175
=== RUN   TestDefaultConfig/#176
=== PAUSE TestDefaultConfig/#176
=== RUN   TestDefaultConfig/#177
=== PAUSE TestDefaultConfig/#177
=== RUN   TestDefaultConfig/#178
=== PAUSE TestDefaultConfig/#178
=== RUN   TestDefaultConfig/#179
=== PAUSE TestDefaultConfig/#179
=== RUN   TestDefaultConfig/#180
=== PAUSE TestDefaultConfig/#180
=== RUN   TestDefaultConfig/#181
=== PAUSE TestDefaultConfig/#181
=== RUN   TestDefaultConfig/#182
=== PAUSE TestDefaultConfig/#182
=== RUN   TestDefaultConfig/#183
=== PAUSE TestDefaultConfig/#183
=== RUN   TestDefaultConfig/#184
=== PAUSE TestDefaultConfig/#184
=== RUN   TestDefaultConfig/#185
=== PAUSE TestDefaultConfig/#185
=== RUN   TestDefaultConfig/#186
=== PAUSE TestDefaultConfig/#186
=== RUN   TestDefaultConfig/#187
=== PAUSE TestDefaultConfig/#187
=== RUN   TestDefaultConfig/#188
=== PAUSE TestDefaultConfig/#188
=== RUN   TestDefaultConfig/#189
=== PAUSE TestDefaultConfig/#189
=== RUN   TestDefaultConfig/#190
=== PAUSE TestDefaultConfig/#190
=== RUN   TestDefaultConfig/#191
=== PAUSE TestDefaultConfig/#191
=== RUN   TestDefaultConfig/#192
=== PAUSE TestDefaultConfig/#192
=== RUN   TestDefaultConfig/#193
=== PAUSE TestDefaultConfig/#193
=== RUN   TestDefaultConfig/#194
=== PAUSE TestDefaultConfig/#194
=== RUN   TestDefaultConfig/#195
=== PAUSE TestDefaultConfig/#195
=== RUN   TestDefaultConfig/#196
=== PAUSE TestDefaultConfig/#196
=== RUN   TestDefaultConfig/#197
=== PAUSE TestDefaultConfig/#197
=== RUN   TestDefaultConfig/#198
=== PAUSE TestDefaultConfig/#198
=== RUN   TestDefaultConfig/#199
=== PAUSE TestDefaultConfig/#199
=== RUN   TestDefaultConfig/#200
=== PAUSE TestDefaultConfig/#200
=== RUN   TestDefaultConfig/#201
=== PAUSE TestDefaultConfig/#201
=== RUN   TestDefaultConfig/#202
=== PAUSE TestDefaultConfig/#202
=== RUN   TestDefaultConfig/#203
=== PAUSE TestDefaultConfig/#203
=== RUN   TestDefaultConfig/#204
=== PAUSE TestDefaultConfig/#204
=== RUN   TestDefaultConfig/#205
=== PAUSE TestDefaultConfig/#205
=== RUN   TestDefaultConfig/#206
=== PAUSE TestDefaultConfig/#206
=== RUN   TestDefaultConfig/#207
=== PAUSE TestDefaultConfig/#207
=== RUN   TestDefaultConfig/#208
=== PAUSE TestDefaultConfig/#208
=== RUN   TestDefaultConfig/#209
=== PAUSE TestDefaultConfig/#209
=== RUN   TestDefaultConfig/#210
=== PAUSE TestDefaultConfig/#210
=== RUN   TestDefaultConfig/#211
=== PAUSE TestDefaultConfig/#211
=== RUN   TestDefaultConfig/#212
=== PAUSE TestDefaultConfig/#212
=== RUN   TestDefaultConfig/#213
=== PAUSE TestDefaultConfig/#213
=== RUN   TestDefaultConfig/#214
=== PAUSE TestDefaultConfig/#214
=== RUN   TestDefaultConfig/#215
=== PAUSE TestDefaultConfig/#215
=== RUN   TestDefaultConfig/#216
=== PAUSE TestDefaultConfig/#216
=== RUN   TestDefaultConfig/#217
=== PAUSE TestDefaultConfig/#217
=== RUN   TestDefaultConfig/#218
=== PAUSE TestDefaultConfig/#218
=== RUN   TestDefaultConfig/#219
=== PAUSE TestDefaultConfig/#219
=== RUN   TestDefaultConfig/#220
=== PAUSE TestDefaultConfig/#220
=== RUN   TestDefaultConfig/#221
=== PAUSE TestDefaultConfig/#221
=== RUN   TestDefaultConfig/#222
=== PAUSE TestDefaultConfig/#222
=== RUN   TestDefaultConfig/#223
=== PAUSE TestDefaultConfig/#223
=== RUN   TestDefaultConfig/#224
=== PAUSE TestDefaultConfig/#224
=== RUN   TestDefaultConfig/#225
=== PAUSE TestDefaultConfig/#225
=== RUN   TestDefaultConfig/#226
=== PAUSE TestDefaultConfig/#226
=== RUN   TestDefaultConfig/#227
=== PAUSE TestDefaultConfig/#227
=== RUN   TestDefaultConfig/#228
=== PAUSE TestDefaultConfig/#228
=== RUN   TestDefaultConfig/#229
=== PAUSE TestDefaultConfig/#229
=== RUN   TestDefaultConfig/#230
=== PAUSE TestDefaultConfig/#230
=== RUN   TestDefaultConfig/#231
=== PAUSE TestDefaultConfig/#231
=== RUN   TestDefaultConfig/#232
=== PAUSE TestDefaultConfig/#232
=== RUN   TestDefaultConfig/#233
=== PAUSE TestDefaultConfig/#233
=== RUN   TestDefaultConfig/#234
=== PAUSE TestDefaultConfig/#234
=== RUN   TestDefaultConfig/#235
=== PAUSE TestDefaultConfig/#235
=== RUN   TestDefaultConfig/#236
=== PAUSE TestDefaultConfig/#236
=== RUN   TestDefaultConfig/#237
=== PAUSE TestDefaultConfig/#237
=== RUN   TestDefaultConfig/#238
=== PAUSE TestDefaultConfig/#238
=== RUN   TestDefaultConfig/#239
=== PAUSE TestDefaultConfig/#239
=== RUN   TestDefaultConfig/#240
=== PAUSE TestDefaultConfig/#240
=== RUN   TestDefaultConfig/#241
=== PAUSE TestDefaultConfig/#241
=== RUN   TestDefaultConfig/#242
=== PAUSE TestDefaultConfig/#242
=== RUN   TestDefaultConfig/#243
=== PAUSE TestDefaultConfig/#243
=== RUN   TestDefaultConfig/#244
=== PAUSE TestDefaultConfig/#244
=== RUN   TestDefaultConfig/#245
=== PAUSE TestDefaultConfig/#245
=== RUN   TestDefaultConfig/#246
=== PAUSE TestDefaultConfig/#246
=== RUN   TestDefaultConfig/#247
=== PAUSE TestDefaultConfig/#247
=== RUN   TestDefaultConfig/#248
=== PAUSE TestDefaultConfig/#248
=== RUN   TestDefaultConfig/#249
=== PAUSE TestDefaultConfig/#249
=== RUN   TestDefaultConfig/#250
=== PAUSE TestDefaultConfig/#250
=== RUN   TestDefaultConfig/#251
=== PAUSE TestDefaultConfig/#251
=== RUN   TestDefaultConfig/#252
=== PAUSE TestDefaultConfig/#252
=== RUN   TestDefaultConfig/#253
=== PAUSE TestDefaultConfig/#253
=== RUN   TestDefaultConfig/#254
=== PAUSE TestDefaultConfig/#254
=== RUN   TestDefaultConfig/#255
=== PAUSE TestDefaultConfig/#255
=== RUN   TestDefaultConfig/#256
=== PAUSE TestDefaultConfig/#256
=== RUN   TestDefaultConfig/#257
=== PAUSE TestDefaultConfig/#257
=== RUN   TestDefaultConfig/#258
=== PAUSE TestDefaultConfig/#258
=== RUN   TestDefaultConfig/#259
=== PAUSE TestDefaultConfig/#259
=== RUN   TestDefaultConfig/#260
=== PAUSE TestDefaultConfig/#260
=== RUN   TestDefaultConfig/#261
=== PAUSE TestDefaultConfig/#261
=== RUN   TestDefaultConfig/#262
=== PAUSE TestDefaultConfig/#262
=== RUN   TestDefaultConfig/#263
=== PAUSE TestDefaultConfig/#263
=== RUN   TestDefaultConfig/#264
=== PAUSE TestDefaultConfig/#264
=== RUN   TestDefaultConfig/#265
=== PAUSE TestDefaultConfig/#265
=== RUN   TestDefaultConfig/#266
=== PAUSE TestDefaultConfig/#266
=== RUN   TestDefaultConfig/#267
=== PAUSE TestDefaultConfig/#267
=== RUN   TestDefaultConfig/#268
=== PAUSE TestDefaultConfig/#268
=== RUN   TestDefaultConfig/#269
=== PAUSE TestDefaultConfig/#269
=== RUN   TestDefaultConfig/#270
=== PAUSE TestDefaultConfig/#270
=== RUN   TestDefaultConfig/#271
=== PAUSE TestDefaultConfig/#271
=== RUN   TestDefaultConfig/#272
=== PAUSE TestDefaultConfig/#272
=== RUN   TestDefaultConfig/#273
=== PAUSE TestDefaultConfig/#273
=== RUN   TestDefaultConfig/#274
=== PAUSE TestDefaultConfig/#274
=== RUN   TestDefaultConfig/#275
=== PAUSE TestDefaultConfig/#275
=== RUN   TestDefaultConfig/#276
=== PAUSE TestDefaultConfig/#276
=== RUN   TestDefaultConfig/#277
=== PAUSE TestDefaultConfig/#277
=== RUN   TestDefaultConfig/#278
=== PAUSE TestDefaultConfig/#278
=== RUN   TestDefaultConfig/#279
=== PAUSE TestDefaultConfig/#279
=== RUN   TestDefaultConfig/#280
=== PAUSE TestDefaultConfig/#280
=== RUN   TestDefaultConfig/#281
=== PAUSE TestDefaultConfig/#281
=== RUN   TestDefaultConfig/#282
=== PAUSE TestDefaultConfig/#282
=== RUN   TestDefaultConfig/#283
=== PAUSE TestDefaultConfig/#283
=== RUN   TestDefaultConfig/#284
=== PAUSE TestDefaultConfig/#284
=== RUN   TestDefaultConfig/#285
=== PAUSE TestDefaultConfig/#285
=== RUN   TestDefaultConfig/#286
=== PAUSE TestDefaultConfig/#286
=== RUN   TestDefaultConfig/#287
=== PAUSE TestDefaultConfig/#287
=== RUN   TestDefaultConfig/#288
=== PAUSE TestDefaultConfig/#288
=== RUN   TestDefaultConfig/#289
=== PAUSE TestDefaultConfig/#289
=== RUN   TestDefaultConfig/#290
=== PAUSE TestDefaultConfig/#290
=== RUN   TestDefaultConfig/#291
=== PAUSE TestDefaultConfig/#291
=== RUN   TestDefaultConfig/#292
=== PAUSE TestDefaultConfig/#292
=== RUN   TestDefaultConfig/#293
=== PAUSE TestDefaultConfig/#293
=== RUN   TestDefaultConfig/#294
=== PAUSE TestDefaultConfig/#294
=== RUN   TestDefaultConfig/#295
=== PAUSE TestDefaultConfig/#295
=== RUN   TestDefaultConfig/#296
=== PAUSE TestDefaultConfig/#296
=== RUN   TestDefaultConfig/#297
=== PAUSE TestDefaultConfig/#297
=== RUN   TestDefaultConfig/#298
=== PAUSE TestDefaultConfig/#298
=== RUN   TestDefaultConfig/#299
=== PAUSE TestDefaultConfig/#299
=== RUN   TestDefaultConfig/#300
=== PAUSE TestDefaultConfig/#300
=== RUN   TestDefaultConfig/#301
=== PAUSE TestDefaultConfig/#301
=== RUN   TestDefaultConfig/#302
=== PAUSE TestDefaultConfig/#302
=== RUN   TestDefaultConfig/#303
=== PAUSE TestDefaultConfig/#303
=== RUN   TestDefaultConfig/#304
=== PAUSE TestDefaultConfig/#304
=== RUN   TestDefaultConfig/#305
=== PAUSE TestDefaultConfig/#305
=== RUN   TestDefaultConfig/#306
=== PAUSE TestDefaultConfig/#306
=== RUN   TestDefaultConfig/#307
=== PAUSE TestDefaultConfig/#307
=== RUN   TestDefaultConfig/#308
=== PAUSE TestDefaultConfig/#308
=== RUN   TestDefaultConfig/#309
=== PAUSE TestDefaultConfig/#309
=== RUN   TestDefaultConfig/#310
=== PAUSE TestDefaultConfig/#310
=== RUN   TestDefaultConfig/#311
=== PAUSE TestDefaultConfig/#311
=== RUN   TestDefaultConfig/#312
=== PAUSE TestDefaultConfig/#312
=== RUN   TestDefaultConfig/#313
=== PAUSE TestDefaultConfig/#313
=== RUN   TestDefaultConfig/#314
=== PAUSE TestDefaultConfig/#314
=== RUN   TestDefaultConfig/#315
=== PAUSE TestDefaultConfig/#315
=== RUN   TestDefaultConfig/#316
=== PAUSE TestDefaultConfig/#316
=== RUN   TestDefaultConfig/#317
=== PAUSE TestDefaultConfig/#317
=== RUN   TestDefaultConfig/#318
=== PAUSE TestDefaultConfig/#318
=== RUN   TestDefaultConfig/#319
=== PAUSE TestDefaultConfig/#319
=== RUN   TestDefaultConfig/#320
=== PAUSE TestDefaultConfig/#320
=== RUN   TestDefaultConfig/#321
=== PAUSE TestDefaultConfig/#321
=== RUN   TestDefaultConfig/#322
=== PAUSE TestDefaultConfig/#322
=== RUN   TestDefaultConfig/#323
=== PAUSE TestDefaultConfig/#323
=== RUN   TestDefaultConfig/#324
=== PAUSE TestDefaultConfig/#324
=== RUN   TestDefaultConfig/#325
=== PAUSE TestDefaultConfig/#325
=== RUN   TestDefaultConfig/#326
=== PAUSE TestDefaultConfig/#326
=== RUN   TestDefaultConfig/#327
=== PAUSE TestDefaultConfig/#327
=== RUN   TestDefaultConfig/#328
=== PAUSE TestDefaultConfig/#328
=== RUN   TestDefaultConfig/#329
=== PAUSE TestDefaultConfig/#329
=== RUN   TestDefaultConfig/#330
=== PAUSE TestDefaultConfig/#330
=== RUN   TestDefaultConfig/#331
=== PAUSE TestDefaultConfig/#331
=== RUN   TestDefaultConfig/#332
=== PAUSE TestDefaultConfig/#332
=== RUN   TestDefaultConfig/#333
=== PAUSE TestDefaultConfig/#333
=== RUN   TestDefaultConfig/#334
=== PAUSE TestDefaultConfig/#334
=== RUN   TestDefaultConfig/#335
=== PAUSE TestDefaultConfig/#335
=== RUN   TestDefaultConfig/#336
=== PAUSE TestDefaultConfig/#336
=== RUN   TestDefaultConfig/#337
=== PAUSE TestDefaultConfig/#337
=== RUN   TestDefaultConfig/#338
=== PAUSE TestDefaultConfig/#338
=== RUN   TestDefaultConfig/#339
=== PAUSE TestDefaultConfig/#339
=== RUN   TestDefaultConfig/#340
=== PAUSE TestDefaultConfig/#340
=== RUN   TestDefaultConfig/#341
=== PAUSE TestDefaultConfig/#341
=== RUN   TestDefaultConfig/#342
=== PAUSE TestDefaultConfig/#342
=== RUN   TestDefaultConfig/#343
=== PAUSE TestDefaultConfig/#343
=== RUN   TestDefaultConfig/#344
=== PAUSE TestDefaultConfig/#344
=== RUN   TestDefaultConfig/#345
=== PAUSE TestDefaultConfig/#345
=== RUN   TestDefaultConfig/#346
=== PAUSE TestDefaultConfig/#346
=== RUN   TestDefaultConfig/#347
=== PAUSE TestDefaultConfig/#347
=== RUN   TestDefaultConfig/#348
=== PAUSE TestDefaultConfig/#348
=== RUN   TestDefaultConfig/#349
=== PAUSE TestDefaultConfig/#349
=== RUN   TestDefaultConfig/#350
=== PAUSE TestDefaultConfig/#350
=== RUN   TestDefaultConfig/#351
=== PAUSE TestDefaultConfig/#351
=== RUN   TestDefaultConfig/#352
=== PAUSE TestDefaultConfig/#352
=== RUN   TestDefaultConfig/#353
=== PAUSE TestDefaultConfig/#353
=== RUN   TestDefaultConfig/#354
=== PAUSE TestDefaultConfig/#354
=== RUN   TestDefaultConfig/#355
=== PAUSE TestDefaultConfig/#355
=== RUN   TestDefaultConfig/#356
=== PAUSE TestDefaultConfig/#356
=== RUN   TestDefaultConfig/#357
=== PAUSE TestDefaultConfig/#357
=== RUN   TestDefaultConfig/#358
=== PAUSE TestDefaultConfig/#358
=== RUN   TestDefaultConfig/#359
=== PAUSE TestDefaultConfig/#359
=== RUN   TestDefaultConfig/#360
=== PAUSE TestDefaultConfig/#360
=== RUN   TestDefaultConfig/#361
=== PAUSE TestDefaultConfig/#361
=== RUN   TestDefaultConfig/#362
=== PAUSE TestDefaultConfig/#362
=== RUN   TestDefaultConfig/#363
=== PAUSE TestDefaultConfig/#363
=== RUN   TestDefaultConfig/#364
=== PAUSE TestDefaultConfig/#364
=== RUN   TestDefaultConfig/#365
=== PAUSE TestDefaultConfig/#365
=== RUN   TestDefaultConfig/#366
=== PAUSE TestDefaultConfig/#366
=== RUN   TestDefaultConfig/#367
=== PAUSE TestDefaultConfig/#367
=== RUN   TestDefaultConfig/#368
=== PAUSE TestDefaultConfig/#368
=== RUN   TestDefaultConfig/#369
=== PAUSE TestDefaultConfig/#369
=== RUN   TestDefaultConfig/#370
=== PAUSE TestDefaultConfig/#370
=== RUN   TestDefaultConfig/#371
=== PAUSE TestDefaultConfig/#371
=== RUN   TestDefaultConfig/#372
=== PAUSE TestDefaultConfig/#372
=== RUN   TestDefaultConfig/#373
=== PAUSE TestDefaultConfig/#373
=== RUN   TestDefaultConfig/#374
=== PAUSE TestDefaultConfig/#374
=== RUN   TestDefaultConfig/#375
=== PAUSE TestDefaultConfig/#375
=== RUN   TestDefaultConfig/#376
=== PAUSE TestDefaultConfig/#376
=== RUN   TestDefaultConfig/#377
=== PAUSE TestDefaultConfig/#377
=== RUN   TestDefaultConfig/#378
=== PAUSE TestDefaultConfig/#378
=== RUN   TestDefaultConfig/#379
=== PAUSE TestDefaultConfig/#379
=== RUN   TestDefaultConfig/#380
=== PAUSE TestDefaultConfig/#380
=== RUN   TestDefaultConfig/#381
=== PAUSE TestDefaultConfig/#381
=== RUN   TestDefaultConfig/#382
=== PAUSE TestDefaultConfig/#382
=== RUN   TestDefaultConfig/#383
=== PAUSE TestDefaultConfig/#383
=== RUN   TestDefaultConfig/#384
=== PAUSE TestDefaultConfig/#384
=== RUN   TestDefaultConfig/#385
=== PAUSE TestDefaultConfig/#385
=== RUN   TestDefaultConfig/#386
=== PAUSE TestDefaultConfig/#386
=== RUN   TestDefaultConfig/#387
=== PAUSE TestDefaultConfig/#387
=== RUN   TestDefaultConfig/#388
=== PAUSE TestDefaultConfig/#388
=== RUN   TestDefaultConfig/#389
=== PAUSE TestDefaultConfig/#389
=== RUN   TestDefaultConfig/#390
=== PAUSE TestDefaultConfig/#390
=== RUN   TestDefaultConfig/#391
=== PAUSE TestDefaultConfig/#391
=== RUN   TestDefaultConfig/#392
=== PAUSE TestDefaultConfig/#392
=== RUN   TestDefaultConfig/#393
=== PAUSE TestDefaultConfig/#393
=== RUN   TestDefaultConfig/#394
=== PAUSE TestDefaultConfig/#394
=== RUN   TestDefaultConfig/#395
=== PAUSE TestDefaultConfig/#395
=== RUN   TestDefaultConfig/#396
=== PAUSE TestDefaultConfig/#396
=== RUN   TestDefaultConfig/#397
=== PAUSE TestDefaultConfig/#397
=== RUN   TestDefaultConfig/#398
=== PAUSE TestDefaultConfig/#398
=== RUN   TestDefaultConfig/#399
=== PAUSE TestDefaultConfig/#399
=== RUN   TestDefaultConfig/#400
=== PAUSE TestDefaultConfig/#400
=== RUN   TestDefaultConfig/#401
=== PAUSE TestDefaultConfig/#401
=== RUN   TestDefaultConfig/#402
=== PAUSE TestDefaultConfig/#402
=== RUN   TestDefaultConfig/#403
=== PAUSE TestDefaultConfig/#403
=== RUN   TestDefaultConfig/#404
=== PAUSE TestDefaultConfig/#404
=== RUN   TestDefaultConfig/#405
=== PAUSE TestDefaultConfig/#405
=== RUN   TestDefaultConfig/#406
=== PAUSE TestDefaultConfig/#406
=== RUN   TestDefaultConfig/#407
=== PAUSE TestDefaultConfig/#407
=== RUN   TestDefaultConfig/#408
=== PAUSE TestDefaultConfig/#408
=== RUN   TestDefaultConfig/#409
=== PAUSE TestDefaultConfig/#409
=== RUN   TestDefaultConfig/#410
=== PAUSE TestDefaultConfig/#410
=== RUN   TestDefaultConfig/#411
=== PAUSE TestDefaultConfig/#411
=== RUN   TestDefaultConfig/#412
=== PAUSE TestDefaultConfig/#412
=== RUN   TestDefaultConfig/#413
=== PAUSE TestDefaultConfig/#413
=== RUN   TestDefaultConfig/#414
=== PAUSE TestDefaultConfig/#414
=== RUN   TestDefaultConfig/#415
=== PAUSE TestDefaultConfig/#415
=== RUN   TestDefaultConfig/#416
=== PAUSE TestDefaultConfig/#416
=== RUN   TestDefaultConfig/#417
=== PAUSE TestDefaultConfig/#417
=== RUN   TestDefaultConfig/#418
=== PAUSE TestDefaultConfig/#418
=== RUN   TestDefaultConfig/#419
=== PAUSE TestDefaultConfig/#419
=== RUN   TestDefaultConfig/#420
=== PAUSE TestDefaultConfig/#420
=== RUN   TestDefaultConfig/#421
=== PAUSE TestDefaultConfig/#421
=== RUN   TestDefaultConfig/#422
=== PAUSE TestDefaultConfig/#422
=== RUN   TestDefaultConfig/#423
=== PAUSE TestDefaultConfig/#423
=== RUN   TestDefaultConfig/#424
=== PAUSE TestDefaultConfig/#424
=== RUN   TestDefaultConfig/#425
=== PAUSE TestDefaultConfig/#425
=== RUN   TestDefaultConfig/#426
=== PAUSE TestDefaultConfig/#426
=== RUN   TestDefaultConfig/#427
=== PAUSE TestDefaultConfig/#427
=== RUN   TestDefaultConfig/#428
=== PAUSE TestDefaultConfig/#428
=== RUN   TestDefaultConfig/#429
=== PAUSE TestDefaultConfig/#429
=== RUN   TestDefaultConfig/#430
=== PAUSE TestDefaultConfig/#430
=== RUN   TestDefaultConfig/#431
=== PAUSE TestDefaultConfig/#431
=== RUN   TestDefaultConfig/#432
=== PAUSE TestDefaultConfig/#432
=== RUN   TestDefaultConfig/#433
=== PAUSE TestDefaultConfig/#433
=== RUN   TestDefaultConfig/#434
=== PAUSE TestDefaultConfig/#434
=== RUN   TestDefaultConfig/#435
=== PAUSE TestDefaultConfig/#435
=== RUN   TestDefaultConfig/#436
=== PAUSE TestDefaultConfig/#436
=== RUN   TestDefaultConfig/#437
=== PAUSE TestDefaultConfig/#437
=== RUN   TestDefaultConfig/#438
=== PAUSE TestDefaultConfig/#438
=== RUN   TestDefaultConfig/#439
=== PAUSE TestDefaultConfig/#439
=== RUN   TestDefaultConfig/#440
=== PAUSE TestDefaultConfig/#440
=== RUN   TestDefaultConfig/#441
=== PAUSE TestDefaultConfig/#441
=== RUN   TestDefaultConfig/#442
=== PAUSE TestDefaultConfig/#442
=== RUN   TestDefaultConfig/#443
=== PAUSE TestDefaultConfig/#443
=== RUN   TestDefaultConfig/#444
=== PAUSE TestDefaultConfig/#444
=== RUN   TestDefaultConfig/#445
=== PAUSE TestDefaultConfig/#445
=== RUN   TestDefaultConfig/#446
=== PAUSE TestDefaultConfig/#446
=== RUN   TestDefaultConfig/#447
=== PAUSE TestDefaultConfig/#447
=== RUN   TestDefaultConfig/#448
=== PAUSE TestDefaultConfig/#448
=== RUN   TestDefaultConfig/#449
=== PAUSE TestDefaultConfig/#449
=== RUN   TestDefaultConfig/#450
=== PAUSE TestDefaultConfig/#450
=== RUN   TestDefaultConfig/#451
=== PAUSE TestDefaultConfig/#451
=== RUN   TestDefaultConfig/#452
=== PAUSE TestDefaultConfig/#452
=== RUN   TestDefaultConfig/#453
=== PAUSE TestDefaultConfig/#453
=== RUN   TestDefaultConfig/#454
=== PAUSE TestDefaultConfig/#454
=== RUN   TestDefaultConfig/#455
=== PAUSE TestDefaultConfig/#455
=== RUN   TestDefaultConfig/#456
=== PAUSE TestDefaultConfig/#456
=== RUN   TestDefaultConfig/#457
=== PAUSE TestDefaultConfig/#457
=== RUN   TestDefaultConfig/#458
=== PAUSE TestDefaultConfig/#458
=== RUN   TestDefaultConfig/#459
=== PAUSE TestDefaultConfig/#459
=== RUN   TestDefaultConfig/#460
=== PAUSE TestDefaultConfig/#460
=== RUN   TestDefaultConfig/#461
=== PAUSE TestDefaultConfig/#461
=== RUN   TestDefaultConfig/#462
=== PAUSE TestDefaultConfig/#462
=== RUN   TestDefaultConfig/#463
=== PAUSE TestDefaultConfig/#463
=== RUN   TestDefaultConfig/#464
=== PAUSE TestDefaultConfig/#464
=== RUN   TestDefaultConfig/#465
=== PAUSE TestDefaultConfig/#465
=== RUN   TestDefaultConfig/#466
=== PAUSE TestDefaultConfig/#466
=== RUN   TestDefaultConfig/#467
=== PAUSE TestDefaultConfig/#467
=== RUN   TestDefaultConfig/#468
=== PAUSE TestDefaultConfig/#468
=== RUN   TestDefaultConfig/#469
=== PAUSE TestDefaultConfig/#469
=== RUN   TestDefaultConfig/#470
=== PAUSE TestDefaultConfig/#470
=== RUN   TestDefaultConfig/#471
=== PAUSE TestDefaultConfig/#471
=== RUN   TestDefaultConfig/#472
=== PAUSE TestDefaultConfig/#472
=== RUN   TestDefaultConfig/#473
=== PAUSE TestDefaultConfig/#473
=== RUN   TestDefaultConfig/#474
=== PAUSE TestDefaultConfig/#474
=== RUN   TestDefaultConfig/#475
=== PAUSE TestDefaultConfig/#475
=== RUN   TestDefaultConfig/#476
=== PAUSE TestDefaultConfig/#476
=== RUN   TestDefaultConfig/#477
=== PAUSE TestDefaultConfig/#477
=== RUN   TestDefaultConfig/#478
=== PAUSE TestDefaultConfig/#478
=== RUN   TestDefaultConfig/#479
=== PAUSE TestDefaultConfig/#479
=== RUN   TestDefaultConfig/#480
=== PAUSE TestDefaultConfig/#480
=== RUN   TestDefaultConfig/#481
=== PAUSE TestDefaultConfig/#481
=== RUN   TestDefaultConfig/#482
=== PAUSE TestDefaultConfig/#482
=== RUN   TestDefaultConfig/#483
=== PAUSE TestDefaultConfig/#483
=== RUN   TestDefaultConfig/#484
=== PAUSE TestDefaultConfig/#484
=== RUN   TestDefaultConfig/#485
=== PAUSE TestDefaultConfig/#485
=== RUN   TestDefaultConfig/#486
=== PAUSE TestDefaultConfig/#486
=== RUN   TestDefaultConfig/#487
=== PAUSE TestDefaultConfig/#487
=== RUN   TestDefaultConfig/#488
=== PAUSE TestDefaultConfig/#488
=== RUN   TestDefaultConfig/#489
=== PAUSE TestDefaultConfig/#489
=== RUN   TestDefaultConfig/#490
=== PAUSE TestDefaultConfig/#490
=== RUN   TestDefaultConfig/#491
=== PAUSE TestDefaultConfig/#491
=== RUN   TestDefaultConfig/#492
=== PAUSE TestDefaultConfig/#492
=== RUN   TestDefaultConfig/#493
=== PAUSE TestDefaultConfig/#493
=== RUN   TestDefaultConfig/#494
=== PAUSE TestDefaultConfig/#494
=== RUN   TestDefaultConfig/#495
=== PAUSE TestDefaultConfig/#495
=== RUN   TestDefaultConfig/#496
=== PAUSE TestDefaultConfig/#496
=== RUN   TestDefaultConfig/#497
=== PAUSE TestDefaultConfig/#497
=== RUN   TestDefaultConfig/#498
=== PAUSE TestDefaultConfig/#498
=== RUN   TestDefaultConfig/#499
=== PAUSE TestDefaultConfig/#499
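The long RUN/PAUSE block above, and the CONT block that follows, is ordinary "go test -v" output for table-driven parallel subtests: subtests started with an empty name are auto-numbered #00, #01, ..., each prints "=== PAUSE" when it calls t.Parallel(), and "=== CONT" once the test binary resumes it. A minimal sketch of the pattern that produces this output (not consul's actual TestDefaultConfig):

    package config

    import "testing"

    func TestDefaultConfig(t *testing.T) {
        // 500 anonymous cases; "go test" names them TestDefaultConfig/#00 ... /#499.
        for i := 0; i < 500; i++ {
            i := i // capture the loop variable for the parallel closure
            t.Run("", func(t *testing.T) {
                t.Parallel() // prints "=== PAUSE"; later resumed as "=== CONT"
                _ = i        // the real test would build and check a default runtime config here
            })
        }
    }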
=== CONT  TestDefaultConfig/#489
=== CONT  TestDefaultConfig/#00
=== CONT  TestDefaultConfig/#488
=== CONT  TestDefaultConfig/#474
jones - 2019/12/06 06:02:56.259951 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
=== CONT  TestDefaultConfig/#422
=== CONT  TestDefaultConfig/#421
=== CONT  TestDefaultConfig/#420
=== CONT  TestDefaultConfig/#419
=== CONT  TestDefaultConfig/#418
=== CONT  TestDefaultConfig/#417
=== CONT  TestDefaultConfig/#416
=== CONT  TestDefaultConfig/#415
jones - 2019/12/06 06:02:56.622077 [INFO] agent: Synced service "web1-sidecar-proxy"
jones - 2019/12/06 06:02:56.622154 [DEBUG] agent: Node info in sync
=== CONT  TestDefaultConfig/#414
=== CONT  TestDefaultConfig/#413
=== CONT  TestDefaultConfig/#412
=== CONT  TestDefaultConfig/#411
=== CONT  TestDefaultConfig/#410
=== CONT  TestDefaultConfig/#409
=== CONT  TestDefaultConfig/#408
=== CONT  TestDefaultConfig/#407
=== CONT  TestDefaultConfig/#406
=== CONT  TestDefaultConfig/#405
=== CONT  TestDefaultConfig/#404
=== CONT  TestDefaultConfig/#403
=== CONT  TestDefaultConfig/#402
=== CONT  TestDefaultConfig/#401
=== CONT  TestDefaultConfig/#400
=== CONT  TestDefaultConfig/#399
=== CONT  TestDefaultConfig/#398
=== CONT  TestDefaultConfig/#397
=== CONT  TestDefaultConfig/#396
=== CONT  TestDefaultConfig/#395
=== CONT  TestDefaultConfig/#394
jones - 2019/12/06 06:02:57.800469 [DEBUG] agent: Service "web1-sidecar-proxy" in sync
jones - 2019/12/06 06:02:57.800564 [DEBUG] agent: Node info in sync
jones - 2019/12/06 06:02:57.800660 [DEBUG] agent: Service "web1-sidecar-proxy" in sync
jones - 2019/12/06 06:02:57.800699 [DEBUG] agent: Node info in sync
=== CONT  TestDefaultConfig/#393
=== CONT  TestDefaultConfig/#392
=== CONT  TestDefaultConfig/#391
=== CONT  TestDefaultConfig/#390
=== CONT  TestDefaultConfig/#389
=== CONT  TestDefaultConfig/#388
=== CONT  TestDefaultConfig/#387
=== CONT  TestDefaultConfig/#386
=== CONT  TestDefaultConfig/#385
=== CONT  TestDefaultConfig/#384
=== CONT  TestDefaultConfig/#373
=== CONT  TestDefaultConfig/#383
=== CONT  TestDefaultConfig/#382
=== CONT  TestDefaultConfig/#381
=== CONT  TestDefaultConfig/#380
jones - 2019/12/06 06:02:58.501640 [INFO] connect: initialized primary datacenter CA with provider "consul"
jones - 2019/12/06 06:02:58.502285 [DEBUG] consul: Skipping self join check for "Node 48766921-a98a-d876-447b-181b21701746" since the cluster is too small
jones - 2019/12/06 06:02:58.502535 [INFO] consul: member 'Node 48766921-a98a-d876-447b-181b21701746' joined, marking health alive
=== CONT  TestDefaultConfig/#379
=== CONT  TestDefaultConfig/#378
=== CONT  TestDefaultConfig/#377
=== CONT  TestDefaultConfig/#376
=== CONT  TestDefaultConfig/#375
=== CONT  TestDefaultConfig/#374
=== CONT  TestDefaultConfig/#372
=== CONT  TestDefaultConfig/#371
=== CONT  TestDefaultConfig/#370
=== CONT  TestDefaultConfig/#369
jones - 2019/12/06 06:02:59.034885 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
=== CONT  TestDefaultConfig/#368
=== CONT  TestDefaultConfig/#367
=== CONT  TestDefaultConfig/#366
=== CONT  TestDefaultConfig/#365
=== CONT  TestDefaultConfig/#364
=== CONT  TestDefaultConfig/#363
=== CONT  TestDefaultConfig/#362
=== CONT  TestDefaultConfig/#361
=== CONT  TestDefaultConfig/#360
=== CONT  TestDefaultConfig/#231
=== CONT  TestDefaultConfig/#359
=== CONT  TestDefaultConfig/#358
=== CONT  TestDefaultConfig/#357
=== CONT  TestDefaultConfig/#356
=== CONT  TestDefaultConfig/#355
=== CONT  TestDefaultConfig/#354
=== CONT  TestDefaultConfig/#353
=== CONT  TestDefaultConfig/#352
=== CONT  TestDefaultConfig/#351
=== CONT  TestDefaultConfig/#350
=== CONT  TestDefaultConfig/#349
=== CONT  TestDefaultConfig/#348
=== CONT  TestDefaultConfig/#347
=== CONT  TestDefaultConfig/#346
=== CONT  TestDefaultConfig/#345
=== CONT  TestDefaultConfig/#344
=== CONT  TestDefaultConfig/#343
=== CONT  TestDefaultConfig/#342
=== CONT  TestDefaultConfig/#341
=== CONT  TestDefaultConfig/#340
=== CONT  TestDefaultConfig/#339
=== CONT  TestDefaultConfig/#338
=== CONT  TestDefaultConfig/#337
=== CONT  TestDefaultConfig/#336
=== CONT  TestDefaultConfig/#335
=== CONT  TestDefaultConfig/#334
=== CONT  TestDefaultConfig/#333
=== CONT  TestDefaultConfig/#332
=== CONT  TestDefaultConfig/#331
=== CONT  TestDefaultConfig/#330
=== CONT  TestDefaultConfig/#329
=== CONT  TestDefaultConfig/#328
=== CONT  TestDefaultConfig/#327
=== CONT  TestDefaultConfig/#326
=== CONT  TestDefaultConfig/#325
=== CONT  TestDefaultConfig/#324
=== CONT  TestDefaultConfig/#308
=== CONT  TestDefaultConfig/#323
=== CONT  TestDefaultConfig/#322
=== CONT  TestDefaultConfig/#321
=== CONT  TestDefaultConfig/#320
=== CONT  TestDefaultConfig/#319
=== CONT  TestDefaultConfig/#318
=== CONT  TestDefaultConfig/#317
=== CONT  TestDefaultConfig/#316
=== CONT  TestDefaultConfig/#315
=== CONT  TestDefaultConfig/#314
=== CONT  TestDefaultConfig/#313
=== CONT  TestDefaultConfig/#312
=== CONT  TestDefaultConfig/#311
=== CONT  TestDefaultConfig/#310
=== CONT  TestDefaultConfig/#309
=== CONT  TestDefaultConfig/#307
=== CONT  TestDefaultConfig/#306
=== CONT  TestDefaultConfig/#305
=== CONT  TestDefaultConfig/#304
=== CONT  TestDefaultConfig/#303
=== CONT  TestDefaultConfig/#302
=== CONT  TestDefaultConfig/#301
=== CONT  TestDefaultConfig/#300
=== CONT  TestDefaultConfig/#299
=== CONT  TestDefaultConfig/#298
=== CONT  TestDefaultConfig/#297
=== CONT  TestDefaultConfig/#296
=== CONT  TestDefaultConfig/#295
=== CONT  TestDefaultConfig/#294
=== CONT  TestDefaultConfig/#293
=== CONT  TestDefaultConfig/#292
=== CONT  TestDefaultConfig/#291
=== CONT  TestDefaultConfig/#290
=== CONT  TestDefaultConfig/#289
=== CONT  TestDefaultConfig/#288
=== CONT  TestDefaultConfig/#287
=== CONT  TestDefaultConfig/#286
=== CONT  TestDefaultConfig/#285
=== CONT  TestDefaultConfig/#284
=== CONT  TestDefaultConfig/#283
=== CONT  TestDefaultConfig/#282
=== CONT  TestDefaultConfig/#281
=== CONT  TestDefaultConfig/#280
=== CONT  TestDefaultConfig/#279
=== CONT  TestDefaultConfig/#278
=== CONT  TestDefaultConfig/#277
=== CONT  TestDefaultConfig/#276
=== CONT  TestDefaultConfig/#275
=== CONT  TestDefaultConfig/#274
=== CONT  TestDefaultConfig/#273
=== CONT  TestDefaultConfig/#272
=== CONT  TestDefaultConfig/#271
=== CONT  TestDefaultConfig/#270
=== CONT  TestDefaultConfig/#269
=== CONT  TestDefaultConfig/#268
=== CONT  TestDefaultConfig/#267
=== CONT  TestDefaultConfig/#266
=== CONT  TestDefaultConfig/#265
=== CONT  TestDefaultConfig/#264
=== CONT  TestDefaultConfig/#263
=== CONT  TestDefaultConfig/#258
=== CONT  TestDefaultConfig/#262
=== CONT  TestDefaultConfig/#261
=== CONT  TestDefaultConfig/#260
=== CONT  TestDefaultConfig/#259
=== CONT  TestDefaultConfig/#257
=== CONT  TestDefaultConfig/#256
=== CONT  TestDefaultConfig/#255
=== CONT  TestDefaultConfig/#254
=== CONT  TestDefaultConfig/#253
=== CONT  TestDefaultConfig/#252
=== CONT  TestDefaultConfig/#251
=== CONT  TestDefaultConfig/#250
=== CONT  TestDefaultConfig/#249
=== CONT  TestDefaultConfig/#248
=== CONT  TestDefaultConfig/#247
=== CONT  TestDefaultConfig/#246
=== CONT  TestDefaultConfig/#245
=== CONT  TestDefaultConfig/#244
=== CONT  TestDefaultConfig/#243
=== CONT  TestDefaultConfig/#242
=== CONT  TestDefaultConfig/#241
=== CONT  TestDefaultConfig/#240
=== CONT  TestDefaultConfig/#239
=== CONT  TestDefaultConfig/#238
=== CONT  TestDefaultConfig/#237
=== CONT  TestDefaultConfig/#236
=== CONT  TestDefaultConfig/#235
=== CONT  TestDefaultConfig/#234
=== CONT  TestDefaultConfig/#233
=== CONT  TestDefaultConfig/#232
=== CONT  TestDefaultConfig/#102
=== CONT  TestDefaultConfig/#499
=== CONT  TestDefaultConfig/#498
=== CONT  TestDefaultConfig/#497
=== CONT  TestDefaultConfig/#496
=== CONT  TestDefaultConfig/#495
=== CONT  TestDefaultConfig/#494
=== CONT  TestDefaultConfig/#493
=== CONT  TestDefaultConfig/#492
=== CONT  TestDefaultConfig/#491
=== CONT  TestDefaultConfig/#490
=== CONT  TestDefaultConfig/#487
=== CONT  TestDefaultConfig/#486
=== CONT  TestDefaultConfig/#485
=== CONT  TestDefaultConfig/#484
=== CONT  TestDefaultConfig/#483
=== CONT  TestDefaultConfig/#482
=== CONT  TestDefaultConfig/#481
=== CONT  TestDefaultConfig/#480
=== CONT  TestDefaultConfig/#479
=== CONT  TestDefaultConfig/#478
=== CONT  TestDefaultConfig/#477
=== CONT  TestDefaultConfig/#476
=== CONT  TestDefaultConfig/#475
=== CONT  TestDefaultConfig/#473
=== CONT  TestDefaultConfig/#472
=== CONT  TestDefaultConfig/#471
=== CONT  TestDefaultConfig/#470
=== CONT  TestDefaultConfig/#469
=== CONT  TestDefaultConfig/#468
=== CONT  TestDefaultConfig/#467
=== CONT  TestDefaultConfig/#466
=== CONT  TestDefaultConfig/#465
=== CONT  TestDefaultConfig/#464
=== CONT  TestDefaultConfig/#463
=== CONT  TestDefaultConfig/#462
=== CONT  TestDefaultConfig/#461
=== CONT  TestDefaultConfig/#460
=== CONT  TestDefaultConfig/#459
=== CONT  TestDefaultConfig/#458
=== CONT  TestDefaultConfig/#457
=== CONT  TestDefaultConfig/#456
=== CONT  TestDefaultConfig/#455
=== CONT  TestDefaultConfig/#453
=== CONT  TestDefaultConfig/#454
=== CONT  TestDefaultConfig/#452
=== CONT  TestDefaultConfig/#451
=== CONT  TestDefaultConfig/#450
=== CONT  TestDefaultConfig/#449
=== CONT  TestDefaultConfig/#448
=== CONT  TestDefaultConfig/#447
=== CONT  TestDefaultConfig/#446
=== CONT  TestDefaultConfig/#445
=== CONT  TestDefaultConfig/#444
=== CONT  TestDefaultConfig/#443
=== CONT  TestDefaultConfig/#442
=== CONT  TestDefaultConfig/#441
=== CONT  TestDefaultConfig/#440
=== CONT  TestDefaultConfig/#439
=== CONT  TestDefaultConfig/#438
=== CONT  TestDefaultConfig/#437
=== CONT  TestDefaultConfig/#436
=== CONT  TestDefaultConfig/#435
=== CONT  TestDefaultConfig/#434
=== CONT  TestDefaultConfig/#433
=== CONT  TestDefaultConfig/#432
=== CONT  TestDefaultConfig/#431
=== CONT  TestDefaultConfig/#430
=== CONT  TestDefaultConfig/#429
=== CONT  TestDefaultConfig/#428
=== CONT  TestDefaultConfig/#427
=== CONT  TestDefaultConfig/#426
=== CONT  TestDefaultConfig/#425
=== CONT  TestDefaultConfig/#424
=== CONT  TestDefaultConfig/#423
=== CONT  TestDefaultConfig/#230
=== CONT  TestDefaultConfig/#229
=== CONT  TestDefaultConfig/#228
=== CONT  TestDefaultConfig/#227
=== CONT  TestDefaultConfig/#226
=== CONT  TestDefaultConfig/#225
=== CONT  TestDefaultConfig/#224
=== CONT  TestDefaultConfig/#223
=== CONT  TestDefaultConfig/#222
=== CONT  TestDefaultConfig/#221
=== CONT  TestDefaultConfig/#220
=== CONT  TestDefaultConfig/#219
=== CONT  TestDefaultConfig/#218
=== CONT  TestDefaultConfig/#217
=== CONT  TestDefaultConfig/#216
=== CONT  TestDefaultConfig/#215
=== CONT  TestDefaultConfig/#214
=== CONT  TestDefaultConfig/#213
=== CONT  TestDefaultConfig/#212
=== CONT  TestDefaultConfig/#211
=== CONT  TestDefaultConfig/#210
=== CONT  TestDefaultConfig/#209
=== CONT  TestDefaultConfig/#208
=== CONT  TestDefaultConfig/#207
=== CONT  TestDefaultConfig/#206
=== CONT  TestDefaultConfig/#205
=== CONT  TestDefaultConfig/#204
=== CONT  TestDefaultConfig/#203
=== CONT  TestDefaultConfig/#202
=== CONT  TestDefaultConfig/#201
=== CONT  TestDefaultConfig/#200
=== CONT  TestDefaultConfig/#199
=== CONT  TestDefaultConfig/#198
=== CONT  TestDefaultConfig/#197
=== CONT  TestDefaultConfig/#196
=== CONT  TestDefaultConfig/#195
=== CONT  TestDefaultConfig/#194
=== CONT  TestDefaultConfig/#193
=== CONT  TestDefaultConfig/#192
=== CONT  TestDefaultConfig/#191
=== CONT  TestDefaultConfig/#190
=== CONT  TestDefaultConfig/#189
=== CONT  TestDefaultConfig/#188
=== CONT  TestDefaultConfig/#187
=== CONT  TestDefaultConfig/#186
=== CONT  TestDefaultConfig/#185
=== CONT  TestDefaultConfig/#184
=== CONT  TestDefaultConfig/#183
=== CONT  TestDefaultConfig/#182
=== CONT  TestDefaultConfig/#181
=== CONT  TestDefaultConfig/#180
=== CONT  TestDefaultConfig/#179
=== CONT  TestDefaultConfig/#178
=== CONT  TestDefaultConfig/#177
=== CONT  TestDefaultConfig/#176
=== CONT  TestDefaultConfig/#175
=== CONT  TestDefaultConfig/#174
=== CONT  TestDefaultConfig/#173
=== CONT  TestDefaultConfig/#172
=== CONT  TestDefaultConfig/#171
=== CONT  TestDefaultConfig/#170
=== CONT  TestDefaultConfig/#169
=== CONT  TestDefaultConfig/#168
=== CONT  TestDefaultConfig/#167
=== CONT  TestDefaultConfig/#166
=== CONT  TestDefaultConfig/#165
=== CONT  TestDefaultConfig/#164
=== CONT  TestDefaultConfig/#163
=== CONT  TestDefaultConfig/#162
=== CONT  TestDefaultConfig/#161
=== CONT  TestDefaultConfig/#160
=== CONT  TestDefaultConfig/#159
=== CONT  TestDefaultConfig/#158
=== CONT  TestDefaultConfig/#157
=== CONT  TestDefaultConfig/#156
=== CONT  TestDefaultConfig/#155
=== CONT  TestDefaultConfig/#154
=== CONT  TestDefaultConfig/#153
=== CONT  TestDefaultConfig/#152
=== CONT  TestDefaultConfig/#151
=== CONT  TestDefaultConfig/#150
=== CONT  TestDefaultConfig/#149
=== CONT  TestDefaultConfig/#148
=== CONT  TestDefaultConfig/#147
=== CONT  TestDefaultConfig/#146
=== CONT  TestDefaultConfig/#145
=== CONT  TestDefaultConfig/#144
=== CONT  TestDefaultConfig/#143
=== CONT  TestDefaultConfig/#142
=== CONT  TestDefaultConfig/#141
=== CONT  TestDefaultConfig/#140
=== CONT  TestDefaultConfig/#139
=== CONT  TestDefaultConfig/#138
=== CONT  TestDefaultConfig/#137
=== CONT  TestDefaultConfig/#136
=== CONT  TestDefaultConfig/#135
=== CONT  TestDefaultConfig/#134
=== CONT  TestDefaultConfig/#133
=== CONT  TestDefaultConfig/#132
=== CONT  TestDefaultConfig/#131
=== CONT  TestDefaultConfig/#130
=== CONT  TestDefaultConfig/#129
=== CONT  TestDefaultConfig/#128
=== CONT  TestDefaultConfig/#127
=== CONT  TestDefaultConfig/#126
=== CONT  TestDefaultConfig/#125
=== CONT  TestDefaultConfig/#124
=== CONT  TestDefaultConfig/#123
=== CONT  TestDefaultConfig/#122
=== CONT  TestDefaultConfig/#121
=== CONT  TestDefaultConfig/#120
=== CONT  TestDefaultConfig/#119
=== CONT  TestDefaultConfig/#118
=== CONT  TestDefaultConfig/#117
=== CONT  TestDefaultConfig/#116
=== CONT  TestDefaultConfig/#115
=== CONT  TestDefaultConfig/#114
=== CONT  TestDefaultConfig/#113
=== CONT  TestDefaultConfig/#112
=== CONT  TestDefaultConfig/#111
=== CONT  TestDefaultConfig/#110
=== CONT  TestDefaultConfig/#109
=== CONT  TestDefaultConfig/#108
=== CONT  TestDefaultConfig/#107
=== CONT  TestDefaultConfig/#106
=== CONT  TestDefaultConfig/#105
=== CONT  TestDefaultConfig/#104
=== CONT  TestDefaultConfig/#103
=== CONT  TestDefaultConfig/#101
=== CONT  TestDefaultConfig/#100
=== CONT  TestDefaultConfig/#99
=== CONT  TestDefaultConfig/#98
=== CONT  TestDefaultConfig/#97
=== CONT  TestDefaultConfig/#96
=== CONT  TestDefaultConfig/#95
=== CONT  TestDefaultConfig/#94
=== CONT  TestDefaultConfig/#93
=== CONT  TestDefaultConfig/#92
=== CONT  TestDefaultConfig/#91
=== CONT  TestDefaultConfig/#90
=== CONT  TestDefaultConfig/#89
=== CONT  TestDefaultConfig/#88
=== CONT  TestDefaultConfig/#87
=== CONT  TestDefaultConfig/#86
=== CONT  TestDefaultConfig/#85
=== CONT  TestDefaultConfig/#84
=== CONT  TestDefaultConfig/#83
=== CONT  TestDefaultConfig/#82
=== CONT  TestDefaultConfig/#81
=== CONT  TestDefaultConfig/#80
=== CONT  TestDefaultConfig/#79
=== CONT  TestDefaultConfig/#78
=== CONT  TestDefaultConfig/#77
=== CONT  TestDefaultConfig/#76
=== CONT  TestDefaultConfig/#75
=== CONT  TestDefaultConfig/#74
=== CONT  TestDefaultConfig/#73
=== CONT  TestDefaultConfig/#72
=== CONT  TestDefaultConfig/#71
=== CONT  TestDefaultConfig/#70
=== CONT  TestDefaultConfig/#69
=== CONT  TestDefaultConfig/#68
=== CONT  TestDefaultConfig/#67
=== CONT  TestDefaultConfig/#66
=== CONT  TestDefaultConfig/#65
=== CONT  TestDefaultConfig/#64
=== CONT  TestDefaultConfig/#63
=== CONT  TestDefaultConfig/#62
=== CONT  TestDefaultConfig/#61
=== CONT  TestDefaultConfig/#60
=== CONT  TestDefaultConfig/#59
=== CONT  TestDefaultConfig/#58
=== CONT  TestDefaultConfig/#57
=== CONT  TestDefaultConfig/#56
=== CONT  TestDefaultConfig/#55
=== CONT  TestDefaultConfig/#54
=== CONT  TestDefaultConfig/#53
=== CONT  TestDefaultConfig/#52
=== CONT  TestDefaultConfig/#51
=== CONT  TestDefaultConfig/#50
=== CONT  TestDefaultConfig/#49
=== CONT  TestDefaultConfig/#48
=== CONT  TestDefaultConfig/#47
=== CONT  TestDefaultConfig/#46
=== CONT  TestDefaultConfig/#45
=== CONT  TestDefaultConfig/#44
=== CONT  TestDefaultConfig/#43
=== CONT  TestDefaultConfig/#42
=== CONT  TestDefaultConfig/#41
=== CONT  TestDefaultConfig/#40
=== CONT  TestDefaultConfig/#39
=== CONT  TestDefaultConfig/#38
=== CONT  TestDefaultConfig/#37
=== CONT  TestDefaultConfig/#36
=== CONT  TestDefaultConfig/#35
=== CONT  TestDefaultConfig/#34
=== CONT  TestDefaultConfig/#33
=== CONT  TestDefaultConfig/#32
=== CONT  TestDefaultConfig/#31
=== CONT  TestDefaultConfig/#30
=== CONT  TestDefaultConfig/#29
=== CONT  TestDefaultConfig/#28
=== CONT  TestDefaultConfig/#27
=== CONT  TestDefaultConfig/#26
=== CONT  TestDefaultConfig/#25
=== CONT  TestDefaultConfig/#24
=== CONT  TestDefaultConfig/#23
=== CONT  TestDefaultConfig/#22
=== CONT  TestDefaultConfig/#21
=== CONT  TestDefaultConfig/#20
=== CONT  TestDefaultConfig/#19
=== CONT  TestDefaultConfig/#18
=== CONT  TestDefaultConfig/#17
=== CONT  TestDefaultConfig/#16
=== CONT  TestDefaultConfig/#15
=== CONT  TestDefaultConfig/#14
=== CONT  TestDefaultConfig/#13
=== CONT  TestDefaultConfig/#12
=== CONT  TestDefaultConfig/#11
=== CONT  TestDefaultConfig/#10
=== CONT  TestDefaultConfig/#09
=== CONT  TestDefaultConfig/#08
=== CONT  TestDefaultConfig/#07
=== CONT  TestDefaultConfig/#06
=== CONT  TestDefaultConfig/#05
=== CONT  TestDefaultConfig/#04
=== CONT  TestDefaultConfig/#03
=== CONT  TestDefaultConfig/#02
=== CONT  TestDefaultConfig/#01
--- PASS: TestDefaultConfig (0.10s)
    --- PASS: TestDefaultConfig/#489 (0.20s)
    --- PASS: TestDefaultConfig/#488 (0.26s)
    --- PASS: TestDefaultConfig/#474 (0.30s)
    --- PASS: TestDefaultConfig/#00 (0.34s)
    --- PASS: TestDefaultConfig/#421 (0.14s)
    --- PASS: TestDefaultConfig/#422 (0.26s)
    --- PASS: TestDefaultConfig/#420 (0.16s)
    --- PASS: TestDefaultConfig/#419 (0.14s)
    --- PASS: TestDefaultConfig/#418 (0.16s)
    --- PASS: TestDefaultConfig/#417 (0.13s)
    --- PASS: TestDefaultConfig/#416 (0.14s)
    --- PASS: TestDefaultConfig/#415 (0.22s)
    --- PASS: TestDefaultConfig/#414 (0.19s)
    --- PASS: TestDefaultConfig/#412 (0.15s)
    --- PASS: TestDefaultConfig/#413 (0.22s)
    --- PASS: TestDefaultConfig/#411 (0.36s)
    --- PASS: TestDefaultConfig/#410 (0.35s)
    --- PASS: TestDefaultConfig/#409 (0.35s)
    --- PASS: TestDefaultConfig/#407 (0.10s)
    --- PASS: TestDefaultConfig/#408 (0.39s)
    --- PASS: TestDefaultConfig/#405 (0.13s)
    --- PASS: TestDefaultConfig/#404 (0.14s)
    --- PASS: TestDefaultConfig/#406 (0.21s)
    --- PASS: TestDefaultConfig/#403 (0.15s)
    --- PASS: TestDefaultConfig/#402 (0.19s)
    --- PASS: TestDefaultConfig/#401 (0.15s)
    --- PASS: TestDefaultConfig/#400 (0.17s)
    --- PASS: TestDefaultConfig/#399 (0.32s)
    --- PASS: TestDefaultConfig/#398 (0.28s)
    --- PASS: TestDefaultConfig/#397 (0.29s)
    --- PASS: TestDefaultConfig/#396 (0.34s)
    --- PASS: TestDefaultConfig/#395 (0.20s)
    --- PASS: TestDefaultConfig/#394 (0.21s)
    --- PASS: TestDefaultConfig/#392 (0.12s)
    --- PASS: TestDefaultConfig/#393 (0.21s)
    --- PASS: TestDefaultConfig/#391 (0.12s)
    --- PASS: TestDefaultConfig/#389 (0.12s)
    --- PASS: TestDefaultConfig/#388 (0.15s)
    --- PASS: TestDefaultConfig/#387 (0.12s)
    --- PASS: TestDefaultConfig/#390 (0.21s)
    --- PASS: TestDefaultConfig/#385 (0.10s)
    --- PASS: TestDefaultConfig/#386 (0.25s)
    --- PASS: TestDefaultConfig/#373 (0.26s)
    --- PASS: TestDefaultConfig/#384 (0.30s)
    --- PASS: TestDefaultConfig/#383 (0.29s)
    --- PASS: TestDefaultConfig/#380 (0.17s)
    --- PASS: TestDefaultConfig/#382 (0.33s)
    --- PASS: TestDefaultConfig/#381 (0.27s)
    --- PASS: TestDefaultConfig/#379 (0.20s)
    --- PASS: TestDefaultConfig/#378 (0.19s)
    --- PASS: TestDefaultConfig/#377 (0.15s)
    --- PASS: TestDefaultConfig/#376 (0.16s)
    --- PASS: TestDefaultConfig/#375 (0.20s)
    --- PASS: TestDefaultConfig/#374 (0.13s)
    --- PASS: TestDefaultConfig/#372 (0.19s)
    --- PASS: TestDefaultConfig/#371 (0.28s)
    --- PASS: TestDefaultConfig/#370 (0.36s)
    --- PASS: TestDefaultConfig/#369 (0.43s)
    --- PASS: TestDefaultConfig/#368 (0.45s)
    --- PASS: TestDefaultConfig/#365 (0.16s)
    --- PASS: TestDefaultConfig/#366 (0.24s)
    --- PASS: TestDefaultConfig/#367 (0.38s)
    --- PASS: TestDefaultConfig/#364 (0.19s)
    --- PASS: TestDefaultConfig/#362 (0.16s)
    --- PASS: TestDefaultConfig/#363 (0.19s)
    --- PASS: TestDefaultConfig/#361 (0.19s)
    --- PASS: TestDefaultConfig/#360 (0.16s)
    --- PASS: TestDefaultConfig/#359 (0.13s)
    --- PASS: TestDefaultConfig/#231 (0.14s)
    --- PASS: TestDefaultConfig/#358 (0.41s)
    --- PASS: TestDefaultConfig/#356 (0.48s)
    --- PASS: TestDefaultConfig/#355 (0.50s)
    --- PASS: TestDefaultConfig/#357 (0.58s)
    --- PASS: TestDefaultConfig/#354 (0.27s)
    --- PASS: TestDefaultConfig/#353 (0.20s)
    --- PASS: TestDefaultConfig/#352 (0.20s)
    --- PASS: TestDefaultConfig/#351 (0.18s)
    --- PASS: TestDefaultConfig/#350 (0.17s)
    --- PASS: TestDefaultConfig/#349 (0.21s)
    --- PASS: TestDefaultConfig/#346 (0.17s)
    --- PASS: TestDefaultConfig/#347 (0.20s)
    --- PASS: TestDefaultConfig/#348 (0.23s)
    --- PASS: TestDefaultConfig/#344 (0.25s)
    --- PASS: TestDefaultConfig/#345 (0.36s)
    --- PASS: TestDefaultConfig/#343 (0.44s)
    --- PASS: TestDefaultConfig/#342 (0.49s)
    --- PASS: TestDefaultConfig/#339 (0.18s)
    --- PASS: TestDefaultConfig/#341 (0.39s)
    --- PASS: TestDefaultConfig/#340 (0.32s)
    --- PASS: TestDefaultConfig/#338 (0.22s)
    --- PASS: TestDefaultConfig/#337 (0.16s)
    --- PASS: TestDefaultConfig/#336 (0.15s)
    --- PASS: TestDefaultConfig/#335 (0.17s)
    --- PASS: TestDefaultConfig/#332 (0.14s)
    --- PASS: TestDefaultConfig/#331 (0.16s)
    --- PASS: TestDefaultConfig/#333 (0.18s)
    --- PASS: TestDefaultConfig/#334 (0.33s)
    --- PASS: TestDefaultConfig/#329 (0.37s)
    --- PASS: TestDefaultConfig/#330 (0.41s)
    --- PASS: TestDefaultConfig/#328 (0.44s)
    --- PASS: TestDefaultConfig/#327 (0.39s)
    --- PASS: TestDefaultConfig/#325 (0.25s)
    --- PASS: TestDefaultConfig/#326 (0.28s)
    --- PASS: TestDefaultConfig/#308 (0.16s)
    --- PASS: TestDefaultConfig/#324 (0.22s)
    --- PASS: TestDefaultConfig/#323 (0.17s)
    --- PASS: TestDefaultConfig/#321 (0.16s)
    --- PASS: TestDefaultConfig/#322 (0.19s)
    --- PASS: TestDefaultConfig/#320 (0.18s)
    --- PASS: TestDefaultConfig/#319 (0.25s)
    --- PASS: TestDefaultConfig/#318 (0.27s)
    --- PASS: TestDefaultConfig/#316 (0.33s)
    --- PASS: TestDefaultConfig/#317 (0.37s)
    --- PASS: TestDefaultConfig/#313 (0.34s)
    --- PASS: TestDefaultConfig/#314 (0.43s)
    --- PASS: TestDefaultConfig/#312 (0.33s)
    --- PASS: TestDefaultConfig/#315 (0.51s)
    --- PASS: TestDefaultConfig/#311 (0.18s)
    --- PASS: TestDefaultConfig/#310 (0.18s)
    --- PASS: TestDefaultConfig/#309 (0.25s)
    --- PASS: TestDefaultConfig/#307 (0.24s)
    --- PASS: TestDefaultConfig/#305 (0.21s)
    --- PASS: TestDefaultConfig/#306 (0.24s)
    --- PASS: TestDefaultConfig/#304 (0.14s)
    --- PASS: TestDefaultConfig/#303 (0.23s)
    --- PASS: TestDefaultConfig/#302 (0.28s)
    --- PASS: TestDefaultConfig/#301 (0.44s)
    --- PASS: TestDefaultConfig/#299 (0.47s)
    --- PASS: TestDefaultConfig/#300 (0.58s)
    --- PASS: TestDefaultConfig/#298 (0.53s)
    --- PASS: TestDefaultConfig/#296 (0.29s)
    --- PASS: TestDefaultConfig/#297 (0.44s)
    --- PASS: TestDefaultConfig/#295 (0.29s)
    --- PASS: TestDefaultConfig/#294 (0.16s)
    --- PASS: TestDefaultConfig/#293 (0.16s)
    --- PASS: TestDefaultConfig/#291 (0.16s)
    --- PASS: TestDefaultConfig/#292 (0.18s)
    --- PASS: TestDefaultConfig/#290 (0.18s)
    --- PASS: TestDefaultConfig/#289 (0.19s)
    --- PASS: TestDefaultConfig/#287 (0.18s)
    --- PASS: TestDefaultConfig/#288 (0.23s)
    --- PASS: TestDefaultConfig/#286 (0.38s)
    --- PASS: TestDefaultConfig/#285 (0.32s)
    --- PASS: TestDefaultConfig/#284 (0.37s)
    --- PASS: TestDefaultConfig/#283 (0.38s)
    --- PASS: TestDefaultConfig/#281 (0.16s)
    --- PASS: TestDefaultConfig/#282 (0.22s)
    --- PASS: TestDefaultConfig/#279 (0.18s)
    --- PASS: TestDefaultConfig/#278 (0.18s)
    --- PASS: TestDefaultConfig/#280 (0.30s)
    --- PASS: TestDefaultConfig/#277 (0.20s)
    --- PASS: TestDefaultConfig/#276 (0.21s)
    --- PASS: TestDefaultConfig/#274 (0.15s)
    --- PASS: TestDefaultConfig/#275 (0.22s)
    --- PASS: TestDefaultConfig/#273 (0.31s)
    --- PASS: TestDefaultConfig/#272 (0.34s)
    --- PASS: TestDefaultConfig/#270 (0.30s)
    --- PASS: TestDefaultConfig/#271 (0.38s)
    --- PASS: TestDefaultConfig/#269 (0.19s)
    --- PASS: TestDefaultConfig/#268 (0.17s)
    --- PASS: TestDefaultConfig/#266 (0.14s)
    --- PASS: TestDefaultConfig/#267 (0.22s)
    --- PASS: TestDefaultConfig/#265 (0.21s)
    --- PASS: TestDefaultConfig/#263 (0.14s)
    --- PASS: TestDefaultConfig/#264 (0.20s)
    --- PASS: TestDefaultConfig/#258 (0.15s)
    --- PASS: TestDefaultConfig/#262 (0.24s)
    --- PASS: TestDefaultConfig/#261 (0.28s)
    --- PASS: TestDefaultConfig/#259 (0.24s)
    --- PASS: TestDefaultConfig/#260 (0.34s)
    --- PASS: TestDefaultConfig/#257 (0.25s)
    --- PASS: TestDefaultConfig/#255 (0.15s)
    --- PASS: TestDefaultConfig/#256 (0.21s)
    --- PASS: TestDefaultConfig/#254 (0.19s)
    --- PASS: TestDefaultConfig/#253 (0.19s)
    --- PASS: TestDefaultConfig/#252 (0.20s)
    --- PASS: TestDefaultConfig/#251 (0.20s)
    --- PASS: TestDefaultConfig/#250 (0.14s)
    --- PASS: TestDefaultConfig/#248 (0.14s)
    --- PASS: TestDefaultConfig/#246 (0.27s)
    --- PASS: TestDefaultConfig/#249 (0.38s)
    --- PASS: TestDefaultConfig/#247 (0.37s)
    --- PASS: TestDefaultConfig/#245 (0.34s)
    --- PASS: TestDefaultConfig/#243 (0.20s)
    --- PASS: TestDefaultConfig/#242 (0.21s)
    --- PASS: TestDefaultConfig/#244 (0.28s)
    --- PASS: TestDefaultConfig/#241 (0.21s)
    --- PASS: TestDefaultConfig/#240 (0.17s)
    --- PASS: TestDefaultConfig/#238 (0.16s)
    --- PASS: TestDefaultConfig/#239 (0.20s)
    --- PASS: TestDefaultConfig/#237 (0.14s)
    --- PASS: TestDefaultConfig/#236 (0.15s)
    --- PASS: TestDefaultConfig/#235 (0.18s)
    --- PASS: TestDefaultConfig/#233 (0.23s)
    --- PASS: TestDefaultConfig/#234 (0.39s)
    --- PASS: TestDefaultConfig/#232 (0.33s)
    --- PASS: TestDefaultConfig/#102 (0.27s)
    --- PASS: TestDefaultConfig/#499 (0.19s)
    --- PASS: TestDefaultConfig/#498 (0.14s)
    --- PASS: TestDefaultConfig/#495 (0.15s)
    --- PASS: TestDefaultConfig/#496 (0.19s)
    --- PASS: TestDefaultConfig/#497 (0.21s)
    --- PASS: TestDefaultConfig/#494 (0.20s)
    --- PASS: TestDefaultConfig/#493 (0.19s)
    --- PASS: TestDefaultConfig/#492 (0.20s)
    --- PASS: TestDefaultConfig/#491 (0.21s)
    --- PASS: TestDefaultConfig/#490 (0.28s)
    --- PASS: TestDefaultConfig/#487 (0.36s)
    --- PASS: TestDefaultConfig/#485 (0.35s)
    --- PASS: TestDefaultConfig/#486 (0.47s)
    --- PASS: TestDefaultConfig/#483 (0.19s)
    --- PASS: TestDefaultConfig/#484 (0.34s)
    --- PASS: TestDefaultConfig/#482 (0.19s)
    --- PASS: TestDefaultConfig/#481 (0.19s)
    --- PASS: TestDefaultConfig/#480 (0.21s)
    --- PASS: TestDefaultConfig/#479 (0.19s)
    --- PASS: TestDefaultConfig/#477 (0.15s)
    --- PASS: TestDefaultConfig/#478 (0.24s)
    --- PASS: TestDefaultConfig/#475 (0.16s)
    --- PASS: TestDefaultConfig/#476 (0.17s)
    --- PASS: TestDefaultConfig/#472 (0.27s)
    --- PASS: TestDefaultConfig/#473 (0.30s)
    --- PASS: TestDefaultConfig/#470 (0.43s)
    --- PASS: TestDefaultConfig/#471 (0.50s)
    --- PASS: TestDefaultConfig/#468 (0.33s)
    --- PASS: TestDefaultConfig/#469 (0.38s)
    --- PASS: TestDefaultConfig/#467 (0.22s)
    --- PASS: TestDefaultConfig/#465 (0.14s)
    --- PASS: TestDefaultConfig/#466 (0.20s)
    --- PASS: TestDefaultConfig/#464 (0.17s)
    --- PASS: TestDefaultConfig/#463 (0.18s)
    --- PASS: TestDefaultConfig/#461 (0.16s)
    --- PASS: TestDefaultConfig/#462 (0.18s)
    --- PASS: TestDefaultConfig/#460 (0.14s)
    --- PASS: TestDefaultConfig/#457 (0.32s)
    --- PASS: TestDefaultConfig/#459 (0.42s)
    --- PASS: TestDefaultConfig/#458 (0.40s)
    --- PASS: TestDefaultConfig/#456 (0.43s)
    --- PASS: TestDefaultConfig/#454 (0.23s)
    --- PASS: TestDefaultConfig/#452 (0.21s)
    --- PASS: TestDefaultConfig/#455 (0.28s)
    --- PASS: TestDefaultConfig/#453 (0.32s)
    --- PASS: TestDefaultConfig/#449 (0.19s)
    --- PASS: TestDefaultConfig/#451 (0.21s)
    --- PASS: TestDefaultConfig/#448 (0.19s)
    --- PASS: TestDefaultConfig/#450 (0.26s)
    --- PASS: TestDefaultConfig/#447 (0.20s)
    --- PASS: TestDefaultConfig/#446 (0.22s)
    --- PASS: TestDefaultConfig/#445 (0.21s)
    --- PASS: TestDefaultConfig/#444 (0.41s)
    --- PASS: TestDefaultConfig/#442 (0.36s)
    --- PASS: TestDefaultConfig/#443 (0.40s)
    --- PASS: TestDefaultConfig/#441 (0.40s)
    --- PASS: TestDefaultConfig/#440 (0.18s)
    --- PASS: TestDefaultConfig/#439 (0.16s)
    --- PASS: TestDefaultConfig/#438 (0.16s)
    --- PASS: TestDefaultConfig/#436 (0.17s)
    --- PASS: TestDefaultConfig/#435 (0.13s)
    --- PASS: TestDefaultConfig/#434 (0.14s)
    --- PASS: TestDefaultConfig/#437 (0.25s)
    --- PASS: TestDefaultConfig/#431 (0.10s)
    --- PASS: TestDefaultConfig/#432 (0.14s)
    --- PASS: TestDefaultConfig/#430 (0.25s)
    --- PASS: TestDefaultConfig/#433 (0.31s)
    --- PASS: TestDefaultConfig/#428 (0.22s)
    --- PASS: TestDefaultConfig/#429 (0.28s)
    --- PASS: TestDefaultConfig/#427 (0.19s)
    --- PASS: TestDefaultConfig/#426 (0.21s)
    --- PASS: TestDefaultConfig/#424 (0.16s)
    --- PASS: TestDefaultConfig/#425 (0.24s)
    --- PASS: TestDefaultConfig/#423 (0.18s)
    --- PASS: TestDefaultConfig/#230 (0.14s)
    --- PASS: TestDefaultConfig/#229 (0.19s)
    --- PASS: TestDefaultConfig/#228 (0.16s)
    --- PASS: TestDefaultConfig/#227 (0.13s)
    --- PASS: TestDefaultConfig/#225 (0.23s)
    --- PASS: TestDefaultConfig/#226 (0.30s)
    --- PASS: TestDefaultConfig/#224 (0.30s)
    --- PASS: TestDefaultConfig/#223 (0.33s)
    --- PASS: TestDefaultConfig/#222 (0.18s)
    --- PASS: TestDefaultConfig/#221 (0.21s)
    --- PASS: TestDefaultConfig/#218 (0.13s)
    --- PASS: TestDefaultConfig/#220 (0.21s)
    --- PASS: TestDefaultConfig/#219 (0.18s)
    --- PASS: TestDefaultConfig/#217 (0.17s)
    --- PASS: TestDefaultConfig/#216 (0.12s)
    --- PASS: TestDefaultConfig/#215 (0.17s)
    --- PASS: TestDefaultConfig/#214 (0.16s)
    --- PASS: TestDefaultConfig/#213 (0.14s)
    --- PASS: TestDefaultConfig/#212 (0.24s)
    --- PASS: TestDefaultConfig/#210 (0.31s)
    --- PASS: TestDefaultConfig/#211 (0.36s)
    --- PASS: TestDefaultConfig/#208 (0.32s)
    --- PASS: TestDefaultConfig/#209 (0.45s)
    --- PASS: TestDefaultConfig/#207 (0.20s)
    --- PASS: TestDefaultConfig/#206 (0.24s)
    --- PASS: TestDefaultConfig/#204 (0.15s)
    --- PASS: TestDefaultConfig/#205 (0.17s)
    --- PASS: TestDefaultConfig/#203 (0.19s)
    --- PASS: TestDefaultConfig/#202 (0.16s)
    --- PASS: TestDefaultConfig/#200 (0.15s)
    --- PASS: TestDefaultConfig/#201 (0.20s)
    --- PASS: TestDefaultConfig/#199 (0.34s)
    --- PASS: TestDefaultConfig/#198 (0.31s)
    --- PASS: TestDefaultConfig/#197 (0.34s)
    --- PASS: TestDefaultConfig/#196 (0.37s)
    --- PASS: TestDefaultConfig/#193 (0.18s)
    --- PASS: TestDefaultConfig/#194 (0.29s)
    --- PASS: TestDefaultConfig/#195 (0.34s)
    --- PASS: TestDefaultConfig/#192 (0.24s)
    --- PASS: TestDefaultConfig/#191 (0.15s)
    --- PASS: TestDefaultConfig/#189 (0.17s)
    --- PASS: TestDefaultConfig/#190 (0.21s)
    --- PASS: TestDefaultConfig/#188 (0.13s)
    --- PASS: TestDefaultConfig/#187 (0.18s)
    --- PASS: TestDefaultConfig/#186 (0.11s)
    --- PASS: TestDefaultConfig/#185 (0.20s)
    --- PASS: TestDefaultConfig/#184 (0.23s)
    --- PASS: TestDefaultConfig/#182 (0.30s)
    --- PASS: TestDefaultConfig/#183 (0.31s)
    --- PASS: TestDefaultConfig/#181 (0.34s)
    --- PASS: TestDefaultConfig/#180 (0.38s)
    --- PASS: TestDefaultConfig/#178 (0.26s)
    --- PASS: TestDefaultConfig/#179 (0.32s)
    --- PASS: TestDefaultConfig/#177 (0.15s)
    --- PASS: TestDefaultConfig/#175 (0.17s)
    --- PASS: TestDefaultConfig/#176 (0.18s)
    --- PASS: TestDefaultConfig/#173 (0.15s)
    --- PASS: TestDefaultConfig/#174 (0.18s)
    --- PASS: TestDefaultConfig/#172 (0.17s)
    --- PASS: TestDefaultConfig/#171 (0.21s)
    --- PASS: TestDefaultConfig/#169 (0.16s)
    --- PASS: TestDefaultConfig/#170 (0.30s)
    --- PASS: TestDefaultConfig/#168 (0.41s)
    --- PASS: TestDefaultConfig/#166 (0.39s)
    --- PASS: TestDefaultConfig/#167 (0.41s)
    --- PASS: TestDefaultConfig/#165 (0.29s)
    --- PASS: TestDefaultConfig/#164 (0.19s)
    --- PASS: TestDefaultConfig/#162 (0.16s)
    --- PASS: TestDefaultConfig/#161 (0.17s)
    --- PASS: TestDefaultConfig/#163 (0.23s)
    --- PASS: TestDefaultConfig/#159 (0.16s)
    --- PASS: TestDefaultConfig/#158 (0.17s)
    --- PASS: TestDefaultConfig/#160 (0.21s)
    --- PASS: TestDefaultConfig/#157 (0.20s)
    --- PASS: TestDefaultConfig/#156 (0.18s)
    --- PASS: TestDefaultConfig/#154 (0.29s)
    --- PASS: TestDefaultConfig/#155 (0.33s)
    --- PASS: TestDefaultConfig/#152 (0.21s)
    --- PASS: TestDefaultConfig/#153 (0.34s)
    --- PASS: TestDefaultConfig/#150 (0.15s)
    --- PASS: TestDefaultConfig/#151 (0.20s)
    --- PASS: TestDefaultConfig/#148 (0.15s)
    --- PASS: TestDefaultConfig/#149 (0.20s)
    --- PASS: TestDefaultConfig/#146 (0.15s)
    --- PASS: TestDefaultConfig/#147 (0.18s)
    --- PASS: TestDefaultConfig/#145 (0.16s)
    --- PASS: TestDefaultConfig/#144 (0.18s)
    --- PASS: TestDefaultConfig/#143 (0.17s)
    --- PASS: TestDefaultConfig/#142 (0.25s)
    --- PASS: TestDefaultConfig/#141 (0.28s)
    --- PASS: TestDefaultConfig/#140 (0.28s)
    --- PASS: TestDefaultConfig/#138 (0.18s)
    --- PASS: TestDefaultConfig/#139 (0.35s)
    --- PASS: TestDefaultConfig/#136 (0.19s)
    --- PASS: TestDefaultConfig/#135 (0.14s)
    --- PASS: TestDefaultConfig/#137 (0.25s)
    --- PASS: TestDefaultConfig/#134 (0.21s)
    --- PASS: TestDefaultConfig/#133 (0.19s)
    --- PASS: TestDefaultConfig/#131 (0.15s)
    --- PASS: TestDefaultConfig/#132 (0.19s)
    --- PASS: TestDefaultConfig/#130 (0.29s)
    --- PASS: TestDefaultConfig/#127 (0.30s)
    --- PASS: TestDefaultConfig/#129 (0.37s)
    --- PASS: TestDefaultConfig/#128 (0.48s)
    --- PASS: TestDefaultConfig/#126 (0.29s)
    --- PASS: TestDefaultConfig/#125 (0.29s)
    --- PASS: TestDefaultConfig/#124 (0.27s)
    --- PASS: TestDefaultConfig/#123 (0.20s)
    --- PASS: TestDefaultConfig/#122 (0.17s)
    --- PASS: TestDefaultConfig/#121 (0.17s)
    --- PASS: TestDefaultConfig/#119 (0.13s)
    --- PASS: TestDefaultConfig/#118 (0.10s)
    --- PASS: TestDefaultConfig/#120 (0.18s)
    --- PASS: TestDefaultConfig/#117 (0.16s)
    --- PASS: TestDefaultConfig/#116 (0.14s)
    --- PASS: TestDefaultConfig/#115 (0.14s)
    --- PASS: TestDefaultConfig/#114 (0.24s)
    --- PASS: TestDefaultConfig/#113 (0.32s)
    --- PASS: TestDefaultConfig/#111 (0.29s)
    --- PASS: TestDefaultConfig/#110 (0.25s)
    --- PASS: TestDefaultConfig/#112 (0.45s)
    --- PASS: TestDefaultConfig/#109 (0.19s)
    --- PASS: TestDefaultConfig/#108 (0.19s)
    --- PASS: TestDefaultConfig/#107 (0.16s)
    --- PASS: TestDefaultConfig/#106 (0.14s)
    --- PASS: TestDefaultConfig/#105 (0.13s)
    --- PASS: TestDefaultConfig/#104 (0.12s)
    --- PASS: TestDefaultConfig/#103 (0.11s)
    --- PASS: TestDefaultConfig/#100 (0.27s)
    --- PASS: TestDefaultConfig/#101 (0.29s)
    --- PASS: TestDefaultConfig/#98 (0.34s)
    --- PASS: TestDefaultConfig/#99 (0.39s)
    --- PASS: TestDefaultConfig/#97 (0.24s)
    --- PASS: TestDefaultConfig/#96 (0.25s)
    --- PASS: TestDefaultConfig/#94 (0.15s)
    --- PASS: TestDefaultConfig/#95 (0.24s)
    --- PASS: TestDefaultConfig/#93 (0.22s)
    --- PASS: TestDefaultConfig/#92 (0.22s)
    --- PASS: TestDefaultConfig/#91 (0.19s)
    --- PASS: TestDefaultConfig/#90 (0.21s)
    --- PASS: TestDefaultConfig/#89 (0.18s)
    --- PASS: TestDefaultConfig/#88 (0.19s)
    --- PASS: TestDefaultConfig/#87 (0.38s)
    --- PASS: TestDefaultConfig/#86 (0.39s)
    --- PASS: TestDefaultConfig/#84 (0.42s)
    --- PASS: TestDefaultConfig/#85 (0.49s)
    --- PASS: TestDefaultConfig/#83 (0.29s)
    --- PASS: TestDefaultConfig/#82 (0.25s)
    --- PASS: TestDefaultConfig/#81 (0.14s)
    --- PASS: TestDefaultConfig/#80 (0.15s)
    --- PASS: TestDefaultConfig/#78 (0.14s)
    --- PASS: TestDefaultConfig/#79 (0.20s)
    --- PASS: TestDefaultConfig/#77 (0.17s)
    --- PASS: TestDefaultConfig/#76 (0.20s)
    --- PASS: TestDefaultConfig/#74 (0.16s)
    --- PASS: TestDefaultConfig/#75 (0.18s)
    --- PASS: TestDefaultConfig/#73 (0.30s)
    --- PASS: TestDefaultConfig/#70 (0.36s)
    --- PASS: TestDefaultConfig/#72 (0.45s)
    --- PASS: TestDefaultConfig/#71 (0.41s)
    --- PASS: TestDefaultConfig/#69 (0.29s)
    --- PASS: TestDefaultConfig/#68 (0.21s)
    --- PASS: TestDefaultConfig/#67 (0.20s)
    --- PASS: TestDefaultConfig/#66 (0.21s)
    --- PASS: TestDefaultConfig/#65 (0.19s)
    --- PASS: TestDefaultConfig/#64 (0.13s)
    --- PASS: TestDefaultConfig/#63 (0.17s)
    --- PASS: TestDefaultConfig/#61 (0.14s)
    --- PASS: TestDefaultConfig/#62 (0.20s)
    --- PASS: TestDefaultConfig/#60 (0.18s)
    --- PASS: TestDefaultConfig/#57 (0.34s)
    --- PASS: TestDefaultConfig/#58 (0.41s)
    --- PASS: TestDefaultConfig/#59 (0.43s)
    --- PASS: TestDefaultConfig/#56 (0.36s)
    --- PASS: TestDefaultConfig/#55 (0.18s)
    --- PASS: TestDefaultConfig/#54 (0.21s)
    --- PASS: TestDefaultConfig/#53 (0.21s)
    --- PASS: TestDefaultConfig/#52 (0.15s)
    --- PASS: TestDefaultConfig/#49 (0.16s)
    --- PASS: TestDefaultConfig/#50 (0.17s)
    --- PASS: TestDefaultConfig/#51 (0.24s)
    --- PASS: TestDefaultConfig/#48 (0.17s)
    --- PASS: TestDefaultConfig/#45 (0.14s)
    --- PASS: TestDefaultConfig/#47 (0.23s)
    --- PASS: TestDefaultConfig/#46 (0.28s)
    --- PASS: TestDefaultConfig/#44 (0.31s)
    --- PASS: TestDefaultConfig/#42 (0.26s)
    --- PASS: TestDefaultConfig/#41 (0.20s)
    --- PASS: TestDefaultConfig/#43 (0.33s)
    --- PASS: TestDefaultConfig/#40 (0.18s)
    --- PASS: TestDefaultConfig/#37 (0.13s)
    --- PASS: TestDefaultConfig/#39 (0.17s)
    --- PASS: TestDefaultConfig/#36 (0.15s)
    --- PASS: TestDefaultConfig/#38 (0.20s)
    --- PASS: TestDefaultConfig/#35 (0.17s)
    --- PASS: TestDefaultConfig/#33 (0.18s)
    --- PASS: TestDefaultConfig/#34 (0.22s)
    --- PASS: TestDefaultConfig/#32 (0.20s)
    --- PASS: TestDefaultConfig/#31 (0.29s)
    --- PASS: TestDefaultConfig/#29 (0.41s)
    --- PASS: TestDefaultConfig/#28 (0.49s)
    --- PASS: TestDefaultConfig/#30 (0.54s)
    --- PASS: TestDefaultConfig/#27 (0.36s)
    --- PASS: TestDefaultConfig/#26 (0.24s)
    --- PASS: TestDefaultConfig/#25 (0.22s)
    --- PASS: TestDefaultConfig/#24 (0.21s)
    --- PASS: TestDefaultConfig/#23 (0.23s)
    --- PASS: TestDefaultConfig/#22 (0.21s)
    --- PASS: TestDefaultConfig/#21 (0.21s)
    --- PASS: TestDefaultConfig/#20 (0.26s)
    --- PASS: TestDefaultConfig/#19 (0.18s)
    --- PASS: TestDefaultConfig/#18 (0.31s)
    --- PASS: TestDefaultConfig/#17 (0.30s)
    --- PASS: TestDefaultConfig/#15 (0.47s)
    --- PASS: TestDefaultConfig/#16 (0.54s)
    --- PASS: TestDefaultConfig/#13 (0.32s)
    --- PASS: TestDefaultConfig/#14 (0.50s)
    --- PASS: TestDefaultConfig/#12 (0.27s)
    --- PASS: TestDefaultConfig/#11 (0.22s)
    --- PASS: TestDefaultConfig/#10 (0.20s)
    --- PASS: TestDefaultConfig/#09 (0.18s)
    --- PASS: TestDefaultConfig/#07 (0.17s)
    --- PASS: TestDefaultConfig/#08 (0.18s)
    --- PASS: TestDefaultConfig/#06 (0.21s)
    --- PASS: TestDefaultConfig/#05 (0.22s)
    --- PASS: TestDefaultConfig/#04 (0.21s)
    --- PASS: TestDefaultConfig/#03 (0.29s)
    --- PASS: TestDefaultConfig/#02 (0.36s)
    --- PASS: TestDefaultConfig/#01 (0.33s)
=== RUN   TestTxnEndpoint_Bad_JSON
=== PAUSE TestTxnEndpoint_Bad_JSON
=== RUN   TestTxnEndpoint_Bad_Size_Item
=== PAUSE TestTxnEndpoint_Bad_Size_Item
=== RUN   TestTxnEndpoint_Bad_Size_Net
=== PAUSE TestTxnEndpoint_Bad_Size_Net
=== RUN   TestTxnEndpoint_Bad_Size_Ops
=== PAUSE TestTxnEndpoint_Bad_Size_Ops
=== RUN   TestTxnEndpoint_KV_Actions
=== PAUSE TestTxnEndpoint_KV_Actions
=== RUN   TestTxnEndpoint_UpdateCheck
=== PAUSE TestTxnEndpoint_UpdateCheck
=== RUN   TestUiIndex
=== PAUSE TestUiIndex
=== RUN   TestUiNodes
=== PAUSE TestUiNodes
=== RUN   TestUiNodes_Filter
=== PAUSE TestUiNodes_Filter
=== RUN   TestUiNodeInfo
=== PAUSE TestUiNodeInfo
=== RUN   TestUiServices
=== PAUSE TestUiServices
=== RUN   TestValidateUserEventParams
=== PAUSE TestValidateUserEventParams
=== RUN   TestShouldProcessUserEvent
=== PAUSE TestShouldProcessUserEvent
=== RUN   TestIngestUserEvent
=== PAUSE TestIngestUserEvent
=== RUN   TestFireReceiveEvent
=== PAUSE TestFireReceiveEvent
=== RUN   TestUserEventToken
=== PAUSE TestUserEventToken
=== RUN   TestStringHash
=== PAUSE TestStringHash
=== RUN   TestSetFilePermissions
=== PAUSE TestSetFilePermissions
=== RUN   TestDurationFixer
--- PASS: TestDurationFixer (0.00s)
=== RUN   TestHelperProcess
--- PASS: TestHelperProcess (0.00s)
=== RUN   TestForwardSignals
=== RUN   TestForwardSignals/signal-interrupt
=== RUN   TestForwardSignals/signal-terminated
--- PASS: TestForwardSignals (1.28s)
    --- PASS: TestForwardSignals/signal-interrupt (0.78s)
    --- PASS: TestForwardSignals/signal-terminated (0.50s)
=== RUN   TestMakeWatchHandler
=== PAUSE TestMakeWatchHandler
=== RUN   TestMakeHTTPWatchHandler
=== PAUSE TestMakeHTTPWatchHandler
=== CONT  TestACL_Legacy_Disabled_Response
=== CONT  TestMakeHTTPWatchHandler
=== CONT  TestKVSEndpoint_AcquireRelease
=== CONT  TestKVSEndpoint_GET_Raw
2019/12/06 06:03:28 [TRACE] agent: http watch handler 'http://127.0.0.1:42807' output: Ok, i see
--- PASS: TestMakeHTTPWatchHandler (0.01s)
=== CONT  TestHTTPServer_UnixSocket
WARNING: bootstrap = true: do not enable unless necessary
WARNING: bootstrap = true: do not enable unless necessary
TestKVSEndpoint_GET_Raw - 2019/12/06 06:03:28.178211 [WARN] agent: Node name "Node f5e03ea8-dfa9-3089-fc1d-472260237d87" will not be discoverable via DNS due to invalid characters. Valid characters include all alpha-numerics and dashes.
TestKVSEndpoint_GET_Raw - 2019/12/06 06:03:28.178608 [DEBUG] tlsutil: Update with version 1
TestKVSEndpoint_AcquireRelease - 2019/12/06 06:03:28.180691 [WARN] agent: Node name "Node 285e4e50-7ab2-a42c-c769-d17aa094a17a" will not be discoverable via DNS due to invalid characters. Valid characters include all alpha-numerics and dashes.
TestKVSEndpoint_AcquireRelease - 2019/12/06 06:03:28.181212 [DEBUG] tlsutil: Update with version 1
WARNING: bootstrap = true: do not enable unless necessary
TestACL_Legacy_Disabled_Response - 2019/12/06 06:03:28.188635 [WARN] agent: Node name "Node 5f8034c6-a546-5aca-018e-bb291677d649" will not be discoverable via DNS due to invalid characters. Valid characters include all alpha-numerics and dashes.
TestACL_Legacy_Disabled_Response - 2019/12/06 06:03:28.189232 [DEBUG] tlsutil: Update with version 1
TestACL_Legacy_Disabled_Response - 2019/12/06 06:03:28.192981 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
TestKVSEndpoint_GET_Raw - 2019/12/06 06:03:28.193662 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
TestKVSEndpoint_AcquireRelease - 2019/12/06 06:03:28.204811 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
WARNING: bootstrap = true: do not enable unless necessary
TestHTTPServer_UnixSocket - 2019/12/06 06:03:28.228839 [WARN] agent: Node name "Node 524e350a-f016-ac9a-17a5-5feb62ec35e3" will not be discoverable via DNS due to invalid characters. Valid characters include all alpha-numerics and dashes.
TestHTTPServer_UnixSocket - 2019/12/06 06:03:28.229524 [DEBUG] tlsutil: Update with version 1
TestHTTPServer_UnixSocket - 2019/12/06 06:03:28.231956 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
2019/12/06 06:03:29 [INFO]  raft: Initial configuration (index=1): [{Suffrage:Voter ID:f5e03ea8-dfa9-3089-fc1d-472260237d87 Address:127.0.0.1:34258}]
2019/12/06 06:03:29 [INFO]  raft: Node at 127.0.0.1:34258 [Follower] entering Follower state (Leader: "")
TestKVSEndpoint_GET_Raw - 2019/12/06 06:03:29.385219 [INFO] serf: EventMemberJoin: Node f5e03ea8-dfa9-3089-fc1d-472260237d87.dc1 127.0.0.1
TestKVSEndpoint_GET_Raw - 2019/12/06 06:03:29.388699 [INFO] serf: EventMemberJoin: Node f5e03ea8-dfa9-3089-fc1d-472260237d87 127.0.0.1
TestKVSEndpoint_GET_Raw - 2019/12/06 06:03:29.390151 [INFO] consul: Adding LAN server Node f5e03ea8-dfa9-3089-fc1d-472260237d87 (Addr: tcp/127.0.0.1:34258) (DC: dc1)
TestKVSEndpoint_GET_Raw - 2019/12/06 06:03:29.390344 [INFO] consul: Handled member-join event for server "Node f5e03ea8-dfa9-3089-fc1d-472260237d87.dc1" in area "wan"
TestKVSEndpoint_GET_Raw - 2019/12/06 06:03:29.392116 [INFO] agent: Started DNS server 127.0.0.1:34253 (tcp)
TestKVSEndpoint_GET_Raw - 2019/12/06 06:03:29.392187 [INFO] agent: Started DNS server 127.0.0.1:34253 (udp)
TestKVSEndpoint_GET_Raw - 2019/12/06 06:03:29.395905 [INFO] agent: Started HTTP server on 127.0.0.1:34254 (tcp)
TestKVSEndpoint_GET_Raw - 2019/12/06 06:03:29.396011 [INFO] agent: started state syncer
2019/12/06 06:03:29 [WARN]  raft: Heartbeat timeout from "" reached, starting election
2019/12/06 06:03:29 [INFO]  raft: Node at 127.0.0.1:34258 [Candidate] entering Candidate state in term 2
2019/12/06 06:03:29 [INFO]  raft: Initial configuration (index=1): [{Suffrage:Voter ID:285e4e50-7ab2-a42c-c769-d17aa094a17a Address:127.0.0.1:34252}]
2019/12/06 06:03:29 [INFO]  raft: Node at 127.0.0.1:34252 [Follower] entering Follower state (Leader: "")
TestKVSEndpoint_AcquireRelease - 2019/12/06 06:03:29.506928 [INFO] serf: EventMemberJoin: Node 285e4e50-7ab2-a42c-c769-d17aa094a17a.dc1 127.0.0.1
TestKVSEndpoint_AcquireRelease - 2019/12/06 06:03:29.510529 [INFO] serf: EventMemberJoin: Node 285e4e50-7ab2-a42c-c769-d17aa094a17a 127.0.0.1
TestKVSEndpoint_AcquireRelease - 2019/12/06 06:03:29.511416 [INFO] consul: Handled member-join event for server "Node 285e4e50-7ab2-a42c-c769-d17aa094a17a.dc1" in area "wan"
TestKVSEndpoint_AcquireRelease - 2019/12/06 06:03:29.511748 [INFO] consul: Adding LAN server Node 285e4e50-7ab2-a42c-c769-d17aa094a17a (Addr: tcp/127.0.0.1:34252) (DC: dc1)
TestKVSEndpoint_AcquireRelease - 2019/12/06 06:03:29.512220 [INFO] agent: Started DNS server 127.0.0.1:34247 (udp)
TestKVSEndpoint_AcquireRelease - 2019/12/06 06:03:29.512418 [INFO] agent: Started DNS server 127.0.0.1:34247 (tcp)
TestKVSEndpoint_AcquireRelease - 2019/12/06 06:03:29.515090 [INFO] agent: Started HTTP server on 127.0.0.1:34248 (tcp)
TestKVSEndpoint_AcquireRelease - 2019/12/06 06:03:29.515231 [INFO] agent: started state syncer
2019/12/06 06:03:29 [WARN]  raft: Heartbeat timeout from "" reached, starting election
2019/12/06 06:03:29 [INFO]  raft: Node at 127.0.0.1:34252 [Candidate] entering Candidate state in term 2
2019/12/06 06:03:29 [INFO]  raft: Initial configuration (index=1): [{Suffrage:Voter ID:5f8034c6-a546-5aca-018e-bb291677d649 Address:127.0.0.1:34246}]
2019/12/06 06:03:29 [INFO]  raft: Initial configuration (index=1): [{Suffrage:Voter ID:524e350a-f016-ac9a-17a5-5feb62ec35e3 Address:127.0.0.1:34264}]
TestHTTPServer_UnixSocket - 2019/12/06 06:03:29.623322 [INFO] serf: EventMemberJoin: Node 524e350a-f016-ac9a-17a5-5feb62ec35e3.dc1 127.0.0.1
2019/12/06 06:03:29 [INFO]  raft: Node at 127.0.0.1:34264 [Follower] entering Follower state (Leader: "")
2019/12/06 06:03:29 [INFO]  raft: Node at 127.0.0.1:34246 [Follower] entering Follower state (Leader: "")
TestACL_Legacy_Disabled_Response - 2019/12/06 06:03:29.627264 [INFO] serf: EventMemberJoin: Node 5f8034c6-a546-5aca-018e-bb291677d649.dc1 127.0.0.1
TestHTTPServer_UnixSocket - 2019/12/06 06:03:29.628734 [INFO] serf: EventMemberJoin: Node 524e350a-f016-ac9a-17a5-5feb62ec35e3 127.0.0.1
TestHTTPServer_UnixSocket - 2019/12/06 06:03:29.629483 [INFO] consul: Handled member-join event for server "Node 524e350a-f016-ac9a-17a5-5feb62ec35e3.dc1" in area "wan"
TestHTTPServer_UnixSocket - 2019/12/06 06:03:29.629851 [INFO] consul: Adding LAN server Node 524e350a-f016-ac9a-17a5-5feb62ec35e3 (Addr: tcp/127.0.0.1:34264) (DC: dc1)
TestHTTPServer_UnixSocket - 2019/12/06 06:03:29.630143 [INFO] agent: Started DNS server 127.0.0.1:34259 (udp)
TestHTTPServer_UnixSocket - 2019/12/06 06:03:29.630473 [INFO] agent: Started DNS server 127.0.0.1:34259 (tcp)
TestHTTPServer_UnixSocket - 2019/12/06 06:03:29.633333 [INFO] agent: Started HTTP server on /tmp/consul-test/TestHTTPServer_UnixSocket-consul358413100/test.sock (unix)
TestHTTPServer_UnixSocket - 2019/12/06 06:03:29.633436 [INFO] agent: started state syncer
TestACL_Legacy_Disabled_Response - 2019/12/06 06:03:29.637344 [INFO] serf: EventMemberJoin: Node 5f8034c6-a546-5aca-018e-bb291677d649 127.0.0.1
TestACL_Legacy_Disabled_Response - 2019/12/06 06:03:29.638958 [INFO] consul: Handled member-join event for server "Node 5f8034c6-a546-5aca-018e-bb291677d649.dc1" in area "wan"
TestACL_Legacy_Disabled_Response - 2019/12/06 06:03:29.639354 [INFO] consul: Adding LAN server Node 5f8034c6-a546-5aca-018e-bb291677d649 (Addr: tcp/127.0.0.1:34246) (DC: dc1)
TestACL_Legacy_Disabled_Response - 2019/12/06 06:03:29.640692 [INFO] agent: Started DNS server 127.0.0.1:34241 (tcp)
TestACL_Legacy_Disabled_Response - 2019/12/06 06:03:29.641174 [INFO] agent: Started DNS server 127.0.0.1:34241 (udp)
TestACL_Legacy_Disabled_Response - 2019/12/06 06:03:29.643657 [INFO] agent: Started HTTP server on 127.0.0.1:34242 (tcp)
TestACL_Legacy_Disabled_Response - 2019/12/06 06:03:29.643741 [INFO] agent: started state syncer
2019/12/06 06:03:29 [WARN]  raft: Heartbeat timeout from "" reached, starting election
2019/12/06 06:03:29 [INFO]  raft: Node at 127.0.0.1:34246 [Candidate] entering Candidate state in term 2
2019/12/06 06:03:29 [WARN]  raft: Heartbeat timeout from "" reached, starting election
2019/12/06 06:03:29 [INFO]  raft: Node at 127.0.0.1:34264 [Candidate] entering Candidate state in term 2
2019/12/06 06:03:30 [INFO]  raft: Election won. Tally: 1
2019/12/06 06:03:30 [INFO]  raft: Node at 127.0.0.1:34252 [Leader] entering Leader state
2019/12/06 06:03:30 [INFO]  raft: Election won. Tally: 1
2019/12/06 06:03:30 [INFO]  raft: Node at 127.0.0.1:34258 [Leader] entering Leader state
TestKVSEndpoint_AcquireRelease - 2019/12/06 06:03:30.303571 [INFO] consul: cluster leadership acquired
TestKVSEndpoint_AcquireRelease - 2019/12/06 06:03:30.304066 [INFO] consul: New leader elected: Node 285e4e50-7ab2-a42c-c769-d17aa094a17a
TestKVSEndpoint_GET_Raw - 2019/12/06 06:03:30.305600 [INFO] consul: cluster leadership acquired
TestKVSEndpoint_GET_Raw - 2019/12/06 06:03:30.306096 [INFO] consul: New leader elected: Node f5e03ea8-dfa9-3089-fc1d-472260237d87
2019/12/06 06:03:30 [INFO]  raft: Election won. Tally: 1
2019/12/06 06:03:30 [INFO]  raft: Node at 127.0.0.1:34246 [Leader] entering Leader state
2019/12/06 06:03:30 [INFO]  raft: Election won. Tally: 1
2019/12/06 06:03:30 [INFO]  raft: Node at 127.0.0.1:34264 [Leader] entering Leader state
TestHTTPServer_UnixSocket - 2019/12/06 06:03:30.430999 [INFO] consul: cluster leadership acquired
TestHTTPServer_UnixSocket - 2019/12/06 06:03:30.431451 [INFO] consul: New leader elected: Node 524e350a-f016-ac9a-17a5-5feb62ec35e3
TestACL_Legacy_Disabled_Response - 2019/12/06 06:03:30.431751 [INFO] consul: cluster leadership acquired
TestACL_Legacy_Disabled_Response - 2019/12/06 06:03:30.432173 [INFO] consul: New leader elected: Node 5f8034c6-a546-5aca-018e-bb291677d649
TestKVSEndpoint_AcquireRelease - 2019/12/06 06:03:30.785662 [INFO] agent: Synced node info
TestKVSEndpoint_AcquireRelease - 2019/12/06 06:03:30.785767 [DEBUG] agent: Node info in sync
TestKVSEndpoint_GET_Raw - 2019/12/06 06:03:30.786043 [INFO] agent: Synced node info
TestKVSEndpoint_GET_Raw - 2019/12/06 06:03:30.788585 [INFO] agent: Requesting shutdown
TestKVSEndpoint_GET_Raw - 2019/12/06 06:03:30.788683 [INFO] consul: shutting down server
TestKVSEndpoint_GET_Raw - 2019/12/06 06:03:30.788731 [WARN] serf: Shutdown without a Leave
TestHTTPServer_UnixSocket - 2019/12/06 06:03:30.913161 [INFO] agent: Synced node info
TestKVSEndpoint_GET_Raw - 2019/12/06 06:03:30.924375 [WARN] serf: Shutdown without a Leave
TestHTTPServer_UnixSocket - 2019/12/06 06:03:30.929891 [DEBUG] http: Request GET /v1/agent/self (265.267478ms) from=@
TestHTTPServer_UnixSocket - 2019/12/06 06:03:30.944918 [INFO] agent: Requesting shutdown
TestHTTPServer_UnixSocket - 2019/12/06 06:03:30.945258 [INFO] consul: shutting down server
TestHTTPServer_UnixSocket - 2019/12/06 06:03:30.945437 [WARN] serf: Shutdown without a Leave
TestKVSEndpoint_GET_Raw - 2019/12/06 06:03:31.013344 [INFO] manager: shutting down
TestHTTPServer_UnixSocket - 2019/12/06 06:03:31.118110 [WARN] serf: Shutdown without a Leave
TestHTTPServer_UnixSocket - 2019/12/06 06:03:31.198002 [INFO] manager: shutting down
TestKVSEndpoint_GET_Raw - 2019/12/06 06:03:31.409862 [ERR] consul: failed to establish leadership: leadership lost while committing log
TestHTTPServer_UnixSocket - 2019/12/06 06:03:31.410251 [INFO] agent: consul server down
TestHTTPServer_UnixSocket - 2019/12/06 06:03:31.410304 [INFO] agent: shutdown complete
TestHTTPServer_UnixSocket - 2019/12/06 06:03:31.410370 [INFO] agent: Stopping DNS server 127.0.0.1:34259 (tcp)
TestHTTPServer_UnixSocket - 2019/12/06 06:03:31.410532 [INFO] agent: Stopping DNS server 127.0.0.1:34259 (udp)
TestHTTPServer_UnixSocket - 2019/12/06 06:03:31.410702 [INFO] agent: Stopping HTTP server /tmp/consul-test/TestHTTPServer_UnixSocket-consul358413100/test.sock (unix)
TestHTTPServer_UnixSocket - 2019/12/06 06:03:31.411844 [INFO] agent: Waiting for endpoints to shut down
TestHTTPServer_UnixSocket - 2019/12/06 06:03:31.412081 [INFO] agent: Endpoints down
TestHTTPServer_UnixSocket - 2019/12/06 06:03:31.412559 [ERR] consul: failed to establish leadership: leadership lost while committing log
--- PASS: TestHTTPServer_UnixSocket (3.33s)
=== CONT  TestFilterNonPassing
--- PASS: TestFilterNonPassing (0.00s)
=== CONT  TestHealthConnectServiceNodes_PassingFilter
TestKVSEndpoint_GET_Raw - 2019/12/06 06:03:31.418636 [INFO] agent: consul server down
TestKVSEndpoint_GET_Raw - 2019/12/06 06:03:31.418830 [INFO] agent: shutdown complete
TestKVSEndpoint_GET_Raw - 2019/12/06 06:03:31.418986 [INFO] agent: Stopping DNS server 127.0.0.1:34253 (tcp)
TestKVSEndpoint_GET_Raw - 2019/12/06 06:03:31.419270 [INFO] agent: Stopping DNS server 127.0.0.1:34253 (udp)
TestKVSEndpoint_GET_Raw - 2019/12/06 06:03:31.419540 [INFO] agent: Stopping HTTP server 127.0.0.1:34254 (tcp)
TestKVSEndpoint_GET_Raw - 2019/12/06 06:03:31.419846 [INFO] agent: Waiting for endpoints to shut down
TestKVSEndpoint_GET_Raw - 2019/12/06 06:03:31.420033 [INFO] agent: Endpoints down
--- PASS: TestKVSEndpoint_GET_Raw (3.34s)
=== CONT  TestHealthConnectServiceNodes_Filter
TestACL_Legacy_Disabled_Response - 2019/12/06 06:03:31.424411 [INFO] agent: Synced node info
=== RUN   TestACL_Legacy_Disabled_Response/0
=== RUN   TestACL_Legacy_Disabled_Response/1
=== RUN   TestACL_Legacy_Disabled_Response/2
=== RUN   TestACL_Legacy_Disabled_Response/3
=== RUN   TestACL_Legacy_Disabled_Response/4
=== RUN   TestACL_Legacy_Disabled_Response/5
TestACL_Legacy_Disabled_Response - 2019/12/06 06:03:31.426756 [INFO] agent: Requesting shutdown
TestACL_Legacy_Disabled_Response - 2019/12/06 06:03:31.426830 [INFO] consul: shutting down server
TestACL_Legacy_Disabled_Response - 2019/12/06 06:03:31.426873 [WARN] serf: Shutdown without a Leave
TestHTTPServer_UnixSocket - 2019/12/06 06:03:31.430016 [ERR] consul: failed to transfer leadership attempt 0/3: raft is already shutdown
TestHTTPServer_UnixSocket - 2019/12/06 06:03:31.430210 [ERR] consul: failed to transfer leadership attempt 1/3: raft is already shutdown
TestACL_Legacy_Disabled_Response - 2019/12/06 06:03:31.586039 [WARN] serf: Shutdown without a Leave
WARNING: bootstrap = true: do not enable unless necessary
TestHealthConnectServiceNodes_PassingFilter - 2019/12/06 06:03:31.668129 [WARN] agent: Node name "Node e4a39e12-f72d-2739-84d6-48bac2620e89" will not be discoverable via DNS due to invalid characters. Valid characters include all alpha-numerics and dashes.
TestHealthConnectServiceNodes_PassingFilter - 2019/12/06 06:03:31.668760 [DEBUG] tlsutil: Update with version 1
TestHealthConnectServiceNodes_PassingFilter - 2019/12/06 06:03:31.672288 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
TestACL_Legacy_Disabled_Response - 2019/12/06 06:03:31.724358 [INFO] manager: shutting down
WARNING: bootstrap = true: do not enable unless necessary
TestHealthConnectServiceNodes_Filter - 2019/12/06 06:03:31.781533 [WARN] agent: Node name "Node 027ef2f1-9d88-dfcb-186e-64aff56ba964" will not be discoverable via DNS due to invalid characters. Valid characters include all alpha-numerics and dashes.
TestHealthConnectServiceNodes_Filter - 2019/12/06 06:03:31.785135 [DEBUG] tlsutil: Update with version 1
TestHealthConnectServiceNodes_Filter - 2019/12/06 06:03:31.800808 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
TestACL_Legacy_Disabled_Response - 2019/12/06 06:03:32.002680 [INFO] agent: consul server down
TestACL_Legacy_Disabled_Response - 2019/12/06 06:03:32.002748 [INFO] agent: shutdown complete
TestACL_Legacy_Disabled_Response - 2019/12/06 06:03:32.002806 [INFO] agent: Stopping DNS server 127.0.0.1:34241 (tcp)
TestACL_Legacy_Disabled_Response - 2019/12/06 06:03:32.002938 [INFO] agent: Stopping DNS server 127.0.0.1:34241 (udp)
TestACL_Legacy_Disabled_Response - 2019/12/06 06:03:32.003082 [INFO] agent: Stopping HTTP server 127.0.0.1:34242 (tcp)
TestACL_Legacy_Disabled_Response - 2019/12/06 06:03:32.003273 [INFO] agent: Waiting for endpoints to shut down
TestACL_Legacy_Disabled_Response - 2019/12/06 06:03:32.003341 [INFO] agent: Endpoints down
--- PASS: TestACL_Legacy_Disabled_Response (3.93s)
    --- PASS: TestACL_Legacy_Disabled_Response/0 (0.00s)
    --- PASS: TestACL_Legacy_Disabled_Response/1 (0.00s)
    --- PASS: TestACL_Legacy_Disabled_Response/2 (0.00s)
    --- PASS: TestACL_Legacy_Disabled_Response/3 (0.00s)
    --- PASS: TestACL_Legacy_Disabled_Response/4 (0.00s)
    --- PASS: TestACL_Legacy_Disabled_Response/5 (0.00s)
TestACL_Legacy_Disabled_Response - 2019/12/06 06:03:32.003524 [ERR] consul: failed to establish leadership: error configuring provider: leadership lost while committing log
=== CONT  TestHealthConnectServiceNodes
TestACL_Legacy_Disabled_Response - 2019/12/06 06:03:32.004352 [ERR] consul: failed to transfer leadership attempt 0/3: raft is already shutdown
WARNING: bootstrap = true: do not enable unless necessary
TestHealthConnectServiceNodes - 2019/12/06 06:03:32.061546 [WARN] agent: Node name "Node c19bace4-5a37-d387-f1ac-d339869e8533" will not be discoverable via DNS due to invalid characters. Valid characters include all alpha-numerics and dashes.
TestHealthConnectServiceNodes - 2019/12/06 06:03:32.062010 [DEBUG] tlsutil: Update with version 1
TestHealthConnectServiceNodes - 2019/12/06 06:03:32.064275 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
TestKVSEndpoint_AcquireRelease - 2019/12/06 06:03:32.195550 [INFO] connect: initialized primary datacenter CA with provider "consul"
TestKVSEndpoint_AcquireRelease - 2019/12/06 06:03:32.196057 [DEBUG] consul: Skipping self join check for "Node 285e4e50-7ab2-a42c-c769-d17aa094a17a" since the cluster is too small
TestKVSEndpoint_AcquireRelease - 2019/12/06 06:03:32.196219 [INFO] consul: member 'Node 285e4e50-7ab2-a42c-c769-d17aa094a17a' joined, marking health alive
TestKVSEndpoint_AcquireRelease - 2019/12/06 06:03:32.281233 [DEBUG] agent: Node info in sync
2019/12/06 06:03:32 [INFO]  raft: Initial configuration (index=1): [{Suffrage:Voter ID:e4a39e12-f72d-2739-84d6-48bac2620e89 Address:127.0.0.1:34270}]
2019/12/06 06:03:32 [INFO]  raft: Initial configuration (index=1): [{Suffrage:Voter ID:027ef2f1-9d88-dfcb-186e-64aff56ba964 Address:127.0.0.1:34276}]
2019/12/06 06:03:32 [INFO]  raft: Node at 127.0.0.1:34270 [Follower] entering Follower state (Leader: "")
2019/12/06 06:03:32 [INFO]  raft: Node at 127.0.0.1:34276 [Follower] entering Follower state (Leader: "")
TestHealthConnectServiceNodes_PassingFilter - 2019/12/06 06:03:32.912792 [INFO] serf: EventMemberJoin: Node e4a39e12-f72d-2739-84d6-48bac2620e89.dc1 127.0.0.1
TestHealthConnectServiceNodes_Filter - 2019/12/06 06:03:32.913665 [INFO] serf: EventMemberJoin: Node 027ef2f1-9d88-dfcb-186e-64aff56ba964.dc1 127.0.0.1
TestHealthConnectServiceNodes_PassingFilter - 2019/12/06 06:03:32.923029 [INFO] serf: EventMemberJoin: Node e4a39e12-f72d-2739-84d6-48bac2620e89 127.0.0.1
TestHealthConnectServiceNodes_PassingFilter - 2019/12/06 06:03:32.925026 [INFO] consul: Adding LAN server Node e4a39e12-f72d-2739-84d6-48bac2620e89 (Addr: tcp/127.0.0.1:34270) (DC: dc1)
TestHealthConnectServiceNodes_Filter - 2019/12/06 06:03:32.929042 [INFO] serf: EventMemberJoin: Node 027ef2f1-9d88-dfcb-186e-64aff56ba964 127.0.0.1
TestHealthConnectServiceNodes_PassingFilter - 2019/12/06 06:03:32.931967 [INFO] agent: Started DNS server 127.0.0.1:34265 (udp)
TestHealthConnectServiceNodes_PassingFilter - 2019/12/06 06:03:32.932232 [INFO] consul: Handled member-join event for server "Node e4a39e12-f72d-2739-84d6-48bac2620e89.dc1" in area "wan"
TestHealthConnectServiceNodes_PassingFilter - 2019/12/06 06:03:32.932799 [INFO] agent: Started DNS server 127.0.0.1:34265 (tcp)
TestHealthConnectServiceNodes_Filter - 2019/12/06 06:03:32.935984 [INFO] consul: Adding LAN server Node 027ef2f1-9d88-dfcb-186e-64aff56ba964 (Addr: tcp/127.0.0.1:34276) (DC: dc1)
TestHealthConnectServiceNodes_Filter - 2019/12/06 06:03:32.935989 [INFO] consul: Handled member-join event for server "Node 027ef2f1-9d88-dfcb-186e-64aff56ba964.dc1" in area "wan"
TestHealthConnectServiceNodes_Filter - 2019/12/06 06:03:32.941384 [INFO] agent: Started DNS server 127.0.0.1:34271 (tcp)
TestHealthConnectServiceNodes_Filter - 2019/12/06 06:03:32.941904 [INFO] agent: Started DNS server 127.0.0.1:34271 (udp)
TestHealthConnectServiceNodes_PassingFilter - 2019/12/06 06:03:32.945728 [INFO] agent: Started HTTP server on 127.0.0.1:34266 (tcp)
TestHealthConnectServiceNodes_PassingFilter - 2019/12/06 06:03:32.946063 [INFO] agent: started state syncer
TestHealthConnectServiceNodes_Filter - 2019/12/06 06:03:32.950432 [INFO] agent: Started HTTP server on 127.0.0.1:34272 (tcp)
TestHealthConnectServiceNodes_Filter - 2019/12/06 06:03:32.951259 [INFO] agent: started state syncer
2019/12/06 06:03:32 [WARN]  raft: Heartbeat timeout from "" reached, starting election
2019/12/06 06:03:32 [INFO]  raft: Node at 127.0.0.1:34276 [Candidate] entering Candidate state in term 2
2019/12/06 06:03:32 [WARN]  raft: Heartbeat timeout from "" reached, starting election
2019/12/06 06:03:32 [INFO]  raft: Node at 127.0.0.1:34270 [Candidate] entering Candidate state in term 2
TestKVSEndpoint_AcquireRelease - 2019/12/06 06:03:33.169343 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
TestKVSEndpoint_AcquireRelease - 2019/12/06 06:03:33.537132 [INFO] agent: Requesting shutdown
TestKVSEndpoint_AcquireRelease - 2019/12/06 06:03:33.537226 [INFO] consul: shutting down server
TestKVSEndpoint_AcquireRelease - 2019/12/06 06:03:33.537271 [WARN] serf: Shutdown without a Leave
2019/12/06 06:03:33 [INFO]  raft: Initial configuration (index=1): [{Suffrage:Voter ID:c19bace4-5a37-d387-f1ac-d339869e8533 Address:127.0.0.1:34282}]
2019/12/06 06:03:33 [INFO]  raft: Node at 127.0.0.1:34282 [Follower] entering Follower state (Leader: "")
TestHealthConnectServiceNodes - 2019/12/06 06:03:33.574164 [INFO] serf: EventMemberJoin: Node c19bace4-5a37-d387-f1ac-d339869e8533.dc1 127.0.0.1
TestHealthConnectServiceNodes - 2019/12/06 06:03:33.586318 [INFO] serf: EventMemberJoin: Node c19bace4-5a37-d387-f1ac-d339869e8533 127.0.0.1
TestHealthConnectServiceNodes - 2019/12/06 06:03:33.587435 [INFO] consul: Handled member-join event for server "Node c19bace4-5a37-d387-f1ac-d339869e8533.dc1" in area "wan"
TestHealthConnectServiceNodes - 2019/12/06 06:03:33.587495 [INFO] consul: Adding LAN server Node c19bace4-5a37-d387-f1ac-d339869e8533 (Addr: tcp/127.0.0.1:34282) (DC: dc1)
TestHealthConnectServiceNodes - 2019/12/06 06:03:33.589022 [INFO] agent: Started DNS server 127.0.0.1:34277 (tcp)
TestHealthConnectServiceNodes - 2019/12/06 06:03:33.592920 [INFO] agent: Started DNS server 127.0.0.1:34277 (udp)
TestHealthConnectServiceNodes - 2019/12/06 06:03:33.595331 [INFO] agent: Started HTTP server on 127.0.0.1:34278 (tcp)
TestHealthConnectServiceNodes - 2019/12/06 06:03:33.595421 [INFO] agent: started state syncer
2019/12/06 06:03:33 [WARN]  raft: Heartbeat timeout from "" reached, starting election
2019/12/06 06:03:33 [INFO]  raft: Node at 127.0.0.1:34282 [Candidate] entering Candidate state in term 2
TestKVSEndpoint_AcquireRelease - 2019/12/06 06:03:33.726511 [WARN] serf: Shutdown without a Leave
2019/12/06 06:03:33 [INFO]  raft: Election won. Tally: 1
2019/12/06 06:03:33 [INFO]  raft: Node at 127.0.0.1:34276 [Leader] entering Leader state
TestHealthConnectServiceNodes_Filter - 2019/12/06 06:03:33.728535 [INFO] consul: cluster leadership acquired
TestHealthConnectServiceNodes_Filter - 2019/12/06 06:03:33.728972 [INFO] consul: New leader elected: Node 027ef2f1-9d88-dfcb-186e-64aff56ba964
2019/12/06 06:03:33 [INFO]  raft: Election won. Tally: 1
2019/12/06 06:03:33 [INFO]  raft: Node at 127.0.0.1:34270 [Leader] entering Leader state
TestKVSEndpoint_AcquireRelease - 2019/12/06 06:03:33.828891 [INFO] manager: shutting down
TestKVSEndpoint_AcquireRelease - 2019/12/06 06:03:33.829713 [INFO] agent: consul server down
TestKVSEndpoint_AcquireRelease - 2019/12/06 06:03:33.829769 [INFO] agent: shutdown complete
TestKVSEndpoint_AcquireRelease - 2019/12/06 06:03:33.829840 [INFO] agent: Stopping DNS server 127.0.0.1:34247 (tcp)
TestKVSEndpoint_AcquireRelease - 2019/12/06 06:03:33.829980 [INFO] agent: Stopping DNS server 127.0.0.1:34247 (udp)
TestKVSEndpoint_AcquireRelease - 2019/12/06 06:03:33.830129 [INFO] agent: Stopping HTTP server 127.0.0.1:34248 (tcp)
TestKVSEndpoint_AcquireRelease - 2019/12/06 06:03:33.830320 [INFO] agent: Waiting for endpoints to shut down
TestKVSEndpoint_AcquireRelease - 2019/12/06 06:03:33.830384 [INFO] agent: Endpoints down
--- PASS: TestKVSEndpoint_AcquireRelease (5.76s)
=== CONT  TestHealthServiceNodes_WanTranslation
TestHealthConnectServiceNodes_PassingFilter - 2019/12/06 06:03:33.835093 [INFO] consul: cluster leadership acquired
TestHealthConnectServiceNodes_PassingFilter - 2019/12/06 06:03:33.835500 [INFO] consul: New leader elected: Node e4a39e12-f72d-2739-84d6-48bac2620e89
WARNING: The 'acl_datacenter' field is deprecated. Use the 'primary_datacenter' field instead.
WARNING: bootstrap = true: do not enable unless necessary
TestHealthServiceNodes_WanTranslation - 2019/12/06 06:03:33.918829 [WARN] agent: Node name "Node 54629263-dee8-02b1-6f86-cfd8821ec6bf" will not be discoverable via DNS due to invalid characters. Valid characters include all alpha-numerics and dashes.
TestHealthServiceNodes_WanTranslation - 2019/12/06 06:03:33.919404 [DEBUG] tlsutil: Update with version 1
TestHealthServiceNodes_WanTranslation - 2019/12/06 06:03:33.921858 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
TestHealthConnectServiceNodes_Filter - 2019/12/06 06:03:34.331371 [INFO] agent: Synced node info
TestHealthConnectServiceNodes_Filter - 2019/12/06 06:03:34.331501 [DEBUG] agent: Node info in sync
=== RUN   TestHealthConnectServiceNodes_PassingFilter/bc_no_query_value
TestHealthConnectServiceNodes_PassingFilter - 2019/12/06 06:03:34.596909 [INFO] agent: Synced node info
2019/12/06 06:03:34 [INFO]  raft: Election won. Tally: 1
=== RUN   TestHealthConnectServiceNodes_PassingFilter/passing_true
2019/12/06 06:03:34 [INFO]  raft: Node at 127.0.0.1:34282 [Leader] entering Leader state
=== RUN   TestHealthConnectServiceNodes_PassingFilter/passing_false
=== RUN   TestHealthConnectServiceNodes_PassingFilter/passing_bad
TestHealthConnectServiceNodes_PassingFilter - 2019/12/06 06:03:34.600581 [INFO] agent: Requesting shutdown
TestHealthConnectServiceNodes_PassingFilter - 2019/12/06 06:03:34.600794 [INFO] consul: shutting down server
TestHealthConnectServiceNodes_PassingFilter - 2019/12/06 06:03:34.600942 [WARN] serf: Shutdown without a Leave
jones - 2019/12/06 06:03:34.612504 [DEBUG] consul: Skipping self join check for "Node a4d824d2-6dd5-669c-d5c6-f0df82645ad2" since the cluster is too small
TestHealthConnectServiceNodes - 2019/12/06 06:03:34.616044 [INFO] consul: cluster leadership acquired
TestHealthConnectServiceNodes - 2019/12/06 06:03:34.616437 [INFO] consul: New leader elected: Node c19bace4-5a37-d387-f1ac-d339869e8533
TestHealthConnectServiceNodes_PassingFilter - 2019/12/06 06:03:34.718172 [WARN] serf: Shutdown without a Leave
TestHealthConnectServiceNodes_PassingFilter - 2019/12/06 06:03:34.826657 [INFO] manager: shutting down
TestHealthConnectServiceNodes_PassingFilter - 2019/12/06 06:03:34.918618 [INFO] agent: consul server down
TestHealthConnectServiceNodes_PassingFilter - 2019/12/06 06:03:34.918719 [INFO] agent: shutdown complete
TestHealthConnectServiceNodes_PassingFilter - 2019/12/06 06:03:34.918815 [INFO] agent: Stopping DNS server 127.0.0.1:34265 (tcp)
TestHealthConnectServiceNodes_PassingFilter - 2019/12/06 06:03:34.919001 [INFO] agent: Stopping DNS server 127.0.0.1:34265 (udp)
TestHealthConnectServiceNodes_PassingFilter - 2019/12/06 06:03:34.919319 [INFO] agent: Stopping HTTP server 127.0.0.1:34266 (tcp)
TestHealthConnectServiceNodes_PassingFilter - 2019/12/06 06:03:34.919580 [INFO] agent: Waiting for endpoints to shut down
TestHealthConnectServiceNodes_PassingFilter - 2019/12/06 06:03:34.919665 [INFO] agent: Endpoints down
--- PASS: TestHealthConnectServiceNodes_PassingFilter (3.51s)
    --- PASS: TestHealthConnectServiceNodes_PassingFilter/bc_no_query_value (0.00s)
    --- PASS: TestHealthConnectServiceNodes_PassingFilter/passing_true (0.00s)
    --- PASS: TestHealthConnectServiceNodes_PassingFilter/passing_false (0.00s)
    --- PASS: TestHealthConnectServiceNodes_PassingFilter/passing_bad (0.00s)
=== CONT  TestHealthServiceNodes_DistanceSort
TestHealthConnectServiceNodes_PassingFilter - 2019/12/06 06:03:34.930138 [ERR] autopilot: failed to initialize config: leadership lost while committing log
TestHealthConnectServiceNodes_PassingFilter - 2019/12/06 06:03:34.930516 [ERR] consul: failed to establish leadership: raft is already shutdown
WARNING: bootstrap = true: do not enable unless necessary
TestHealthServiceNodes_DistanceSort - 2019/12/06 06:03:35.038661 [WARN] agent: Node name "Node 5fd5edb1-a39b-b2f1-d07c-19d2c014098e" will not be discoverable via DNS due to invalid characters. Valid characters include all alpha-numerics and dashes.
TestHealthServiceNodes_DistanceSort - 2019/12/06 06:03:35.039256 [DEBUG] tlsutil: Update with version 1
TestHealthServiceNodes_DistanceSort - 2019/12/06 06:03:35.043953 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
TestHealthConnectServiceNodes - 2019/12/06 06:03:35.135698 [INFO] agent: Synced node info
TestHealthConnectServiceNodes - 2019/12/06 06:03:35.135820 [DEBUG] agent: Node info in sync
TestHealthConnectServiceNodes - 2019/12/06 06:03:35.136521 [INFO] agent: Requesting shutdown
TestHealthConnectServiceNodes - 2019/12/06 06:03:35.136600 [INFO] consul: shutting down server
TestHealthConnectServiceNodes - 2019/12/06 06:03:35.136643 [WARN] serf: Shutdown without a Leave
TestHealthConnectServiceNodes - 2019/12/06 06:03:35.228116 [WARN] serf: Shutdown without a Leave
TestHealthConnectServiceNodes - 2019/12/06 06:03:35.353158 [INFO] manager: shutting down
TestHealthConnectServiceNodes - 2019/12/06 06:03:35.485706 [INFO] agent: consul server down
TestHealthConnectServiceNodes - 2019/12/06 06:03:35.485783 [INFO] agent: shutdown complete
TestHealthConnectServiceNodes - 2019/12/06 06:03:35.485849 [INFO] agent: Stopping DNS server 127.0.0.1:34277 (tcp)
TestHealthConnectServiceNodes - 2019/12/06 06:03:35.486012 [INFO] agent: Stopping DNS server 127.0.0.1:34277 (udp)
TestHealthConnectServiceNodes - 2019/12/06 06:03:35.486189 [INFO] agent: Stopping HTTP server 127.0.0.1:34278 (tcp)
TestHealthConnectServiceNodes - 2019/12/06 06:03:35.486409 [INFO] agent: Waiting for endpoints to shut down
TestHealthConnectServiceNodes - 2019/12/06 06:03:35.486483 [INFO] agent: Endpoints down
2019/12/06 06:03:35 [INFO]  raft: Initial configuration (index=1): [{Suffrage:Voter ID:54629263-dee8-02b1-6f86-cfd8821ec6bf Address:127.0.0.1:34288}]
--- PASS: TestHealthConnectServiceNodes (3.48s)
=== CONT  TestHealthServiceNodes_NodeMetaFilter
2019/12/06 06:03:35 [INFO]  raft: Node at 127.0.0.1:34288 [Follower] entering Follower state (Leader: "")
TestHealthServiceNodes_WanTranslation - 2019/12/06 06:03:35.496310 [INFO] serf: EventMemberJoin: Node 54629263-dee8-02b1-6f86-cfd8821ec6bf.dc1 127.0.0.1
TestHealthConnectServiceNodes - 2019/12/06 06:03:35.497741 [ERR] autopilot: failed to initialize config: leadership lost while committing log
TestHealthConnectServiceNodes - 2019/12/06 06:03:35.498014 [ERR] consul: failed to establish leadership: raft is already shutdown
TestHealthServiceNodes_WanTranslation - 2019/12/06 06:03:35.510818 [INFO] serf: EventMemberJoin: Node 54629263-dee8-02b1-6f86-cfd8821ec6bf 127.0.0.1
TestHealthServiceNodes_WanTranslation - 2019/12/06 06:03:35.513231 [INFO] consul: Adding LAN server Node 54629263-dee8-02b1-6f86-cfd8821ec6bf (Addr: tcp/127.0.0.1:34288) (DC: dc1)
TestHealthServiceNodes_WanTranslation - 2019/12/06 06:03:35.514163 [INFO] consul: Handled member-join event for server "Node 54629263-dee8-02b1-6f86-cfd8821ec6bf.dc1" in area "wan"
TestHealthServiceNodes_WanTranslation - 2019/12/06 06:03:35.517705 [INFO] agent: Started DNS server 127.0.0.1:34283 (tcp)
TestHealthServiceNodes_WanTranslation - 2019/12/06 06:03:35.518219 [INFO] agent: Started DNS server 127.0.0.1:34283 (udp)
TestHealthServiceNodes_WanTranslation - 2019/12/06 06:03:35.532635 [INFO] agent: Started HTTP server on 127.0.0.1:34284 (tcp)
TestHealthServiceNodes_WanTranslation - 2019/12/06 06:03:35.535462 [INFO] agent: started state syncer
2019/12/06 06:03:35 [WARN]  raft: Heartbeat timeout from "" reached, starting election
2019/12/06 06:03:35 [INFO]  raft: Node at 127.0.0.1:34288 [Candidate] entering Candidate state in term 2
WARNING: bootstrap = true: do not enable unless necessary
TestHealthServiceNodes_NodeMetaFilter - 2019/12/06 06:03:35.594987 [WARN] agent: Node name "Node a28c7be9-2fb2-bb32-a227-b193b1a1112f" will not be discoverable via DNS due to invalid characters. Valid characters include all alpha-numerics and dashes.
TestHealthServiceNodes_NodeMetaFilter - 2019/12/06 06:03:35.595407 [DEBUG] tlsutil: Update with version 1
TestHealthServiceNodes_NodeMetaFilter - 2019/12/06 06:03:35.597517 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
TestHealthConnectServiceNodes_Filter - 2019/12/06 06:03:35.663216 [INFO] agent: Requesting shutdown
TestHealthConnectServiceNodes_Filter - 2019/12/06 06:03:35.663679 [INFO] consul: shutting down server
TestHealthConnectServiceNodes_Filter - 2019/12/06 06:03:35.664285 [WARN] serf: Shutdown without a Leave
TestHealthConnectServiceNodes_Filter - 2019/12/06 06:03:35.868249 [WARN] serf: Shutdown without a Leave
TestHealthConnectServiceNodes_Filter - 2019/12/06 06:03:35.969332 [INFO] manager: shutting down
TestHealthConnectServiceNodes_Filter - 2019/12/06 06:03:35.971615 [INFO] agent: consul server down
TestHealthConnectServiceNodes_Filter - 2019/12/06 06:03:35.971681 [INFO] agent: shutdown complete
TestHealthConnectServiceNodes_Filter - 2019/12/06 06:03:35.971652 [ERR] consul: failed to establish leadership: error configuring provider: leadership lost while committing log
TestHealthConnectServiceNodes_Filter - 2019/12/06 06:03:35.971734 [INFO] agent: Stopping DNS server 127.0.0.1:34271 (tcp)
TestHealthConnectServiceNodes_Filter - 2019/12/06 06:03:35.971849 [ERR] consul: failed to transfer leadership attempt 0/3: raft is already shutdown
TestHealthConnectServiceNodes_Filter - 2019/12/06 06:03:35.971906 [ERR] consul: failed to transfer leadership attempt 1/3: raft is already shutdown
TestHealthConnectServiceNodes_Filter - 2019/12/06 06:03:35.971921 [INFO] agent: Stopping DNS server 127.0.0.1:34271 (udp)
TestHealthConnectServiceNodes_Filter - 2019/12/06 06:03:35.972106 [INFO] agent: Stopping HTTP server 127.0.0.1:34272 (tcp)
TestHealthConnectServiceNodes_Filter - 2019/12/06 06:03:35.972329 [INFO] agent: Waiting for endpoints to shut down
TestHealthConnectServiceNodes_Filter - 2019/12/06 06:03:35.972401 [INFO] agent: Endpoints down
--- PASS: TestHealthConnectServiceNodes_Filter (4.55s)
=== CONT  TestHealthServiceNodes
WARNING: bootstrap = true: do not enable unless necessary
TestHealthServiceNodes - 2019/12/06 06:03:36.054503 [WARN] agent: Node name "Node d5ba7f96-83a6-470e-44d1-e86d35158c12" will not be discoverable via DNS due to invalid characters. Valid characters include all alpha-numerics and dashes.
TestHealthServiceNodes - 2019/12/06 06:03:36.055019 [DEBUG] tlsutil: Update with version 1
TestHealthServiceNodes - 2019/12/06 06:03:36.057243 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
2019/12/06 06:03:36 [INFO]  raft: Initial configuration (index=1): [{Suffrage:Voter ID:5fd5edb1-a39b-b2f1-d07c-19d2c014098e Address:127.0.0.1:34294}]
2019/12/06 06:03:36 [INFO]  raft: Node at 127.0.0.1:34294 [Follower] entering Follower state (Leader: "")
2019/12/06 06:03:36 [INFO]  raft: Election won. Tally: 1
2019/12/06 06:03:36 [INFO]  raft: Node at 127.0.0.1:34288 [Leader] entering Leader state
TestHealthServiceNodes_WanTranslation - 2019/12/06 06:03:36.367841 [INFO] consul: cluster leadership acquired
TestHealthServiceNodes_WanTranslation - 2019/12/06 06:03:36.368282 [INFO] consul: New leader elected: Node 54629263-dee8-02b1-6f86-cfd8821ec6bf
TestHealthServiceNodes_DistanceSort - 2019/12/06 06:03:36.373411 [INFO] serf: EventMemberJoin: Node 5fd5edb1-a39b-b2f1-d07c-19d2c014098e.dc1 127.0.0.1
TestHealthServiceNodes_DistanceSort - 2019/12/06 06:03:36.387496 [INFO] serf: EventMemberJoin: Node 5fd5edb1-a39b-b2f1-d07c-19d2c014098e 127.0.0.1
TestHealthServiceNodes_DistanceSort - 2019/12/06 06:03:36.389910 [INFO] consul: Adding LAN server Node 5fd5edb1-a39b-b2f1-d07c-19d2c014098e (Addr: tcp/127.0.0.1:34294) (DC: dc1)
TestHealthServiceNodes_DistanceSort - 2019/12/06 06:03:36.390790 [INFO] consul: Handled member-join event for server "Node 5fd5edb1-a39b-b2f1-d07c-19d2c014098e.dc1" in area "wan"
TestHealthServiceNodes_DistanceSort - 2019/12/06 06:03:36.394672 [INFO] agent: Started DNS server 127.0.0.1:34289 (tcp)
TestHealthServiceNodes_DistanceSort - 2019/12/06 06:03:36.394924 [INFO] agent: Started DNS server 127.0.0.1:34289 (udp)
TestHealthServiceNodes_DistanceSort - 2019/12/06 06:03:36.398597 [INFO] agent: Started HTTP server on 127.0.0.1:34290 (tcp)
TestHealthServiceNodes_DistanceSort - 2019/12/06 06:03:36.398692 [INFO] agent: started state syncer
2019/12/06 06:03:36 [WARN]  raft: Heartbeat timeout from "" reached, starting election
2019/12/06 06:03:36 [INFO]  raft: Node at 127.0.0.1:34294 [Candidate] entering Candidate state in term 2
TestHealthServiceNodes_WanTranslation - 2019/12/06 06:03:36.426316 [ERR] agent: failed to sync remote state: ACL not found
TestHealthServiceNodes_WanTranslation - 2019/12/06 06:03:36.818787 [INFO] acl: initializing acls
2019/12/06 06:03:36 [INFO]  raft: Initial configuration (index=1): [{Suffrage:Voter ID:a28c7be9-2fb2-bb32-a227-b193b1a1112f Address:127.0.0.1:34300}]
2019/12/06 06:03:36 [INFO]  raft: Node at 127.0.0.1:34300 [Follower] entering Follower state (Leader: "")
TestHealthServiceNodes_NodeMetaFilter - 2019/12/06 06:03:36.955330 [INFO] serf: EventMemberJoin: Node a28c7be9-2fb2-bb32-a227-b193b1a1112f.dc1 127.0.0.1
TestHealthServiceNodes_NodeMetaFilter - 2019/12/06 06:03:36.959296 [INFO] serf: EventMemberJoin: Node a28c7be9-2fb2-bb32-a227-b193b1a1112f 127.0.0.1
TestHealthServiceNodes_NodeMetaFilter - 2019/12/06 06:03:36.960322 [INFO] consul: Handled member-join event for server "Node a28c7be9-2fb2-bb32-a227-b193b1a1112f.dc1" in area "wan"
TestHealthServiceNodes_NodeMetaFilter - 2019/12/06 06:03:36.960714 [INFO] consul: Adding LAN server Node a28c7be9-2fb2-bb32-a227-b193b1a1112f (Addr: tcp/127.0.0.1:34300) (DC: dc1)
TestHealthServiceNodes_NodeMetaFilter - 2019/12/06 06:03:36.960812 [INFO] agent: Started DNS server 127.0.0.1:34295 (udp)
TestHealthServiceNodes_NodeMetaFilter - 2019/12/06 06:03:36.961208 [INFO] agent: Started DNS server 127.0.0.1:34295 (tcp)
TestHealthServiceNodes_NodeMetaFilter - 2019/12/06 06:03:36.963816 [INFO] agent: Started HTTP server on 127.0.0.1:34296 (tcp)
TestHealthServiceNodes_NodeMetaFilter - 2019/12/06 06:03:36.963970 [INFO] agent: started state syncer
2019/12/06 06:03:36 [WARN]  raft: Heartbeat timeout from "" reached, starting election
2019/12/06 06:03:36 [INFO]  raft: Node at 127.0.0.1:34300 [Candidate] entering Candidate state in term 2
jones - 2019/12/06 06:03:37.064503 [DEBUG] consul: Skipping self join check for "Node 69e40561-243a-545f-340c-f7edd80028d7" since the cluster is too small
TestHealthServiceNodes_WanTranslation - 2019/12/06 06:03:37.065889 [INFO] consul: Created ACL 'global-management' policy
TestHealthServiceNodes_WanTranslation - 2019/12/06 06:03:37.072346 [INFO] acl: initializing acls
2019/12/06 06:03:37 [INFO]  raft: Election won. Tally: 1
2019/12/06 06:03:37 [INFO]  raft: Node at 127.0.0.1:34294 [Leader] entering Leader state
TestHealthServiceNodes_DistanceSort - 2019/12/06 06:03:37.167618 [INFO] consul: cluster leadership acquired
TestHealthServiceNodes_DistanceSort - 2019/12/06 06:03:37.168165 [INFO] consul: New leader elected: Node 5fd5edb1-a39b-b2f1-d07c-19d2c014098e
2019/12/06 06:03:37 [INFO]  raft: Initial configuration (index=1): [{Suffrage:Voter ID:d5ba7f96-83a6-470e-44d1-e86d35158c12 Address:127.0.0.1:34306}]
2019/12/06 06:03:37 [INFO]  raft: Node at 127.0.0.1:34306 [Follower] entering Follower state (Leader: "")
TestHealthServiceNodes - 2019/12/06 06:03:37.264963 [INFO] serf: EventMemberJoin: Node d5ba7f96-83a6-470e-44d1-e86d35158c12.dc1 127.0.0.1
TestHealthServiceNodes - 2019/12/06 06:03:37.280638 [INFO] serf: EventMemberJoin: Node d5ba7f96-83a6-470e-44d1-e86d35158c12 127.0.0.1
TestHealthServiceNodes - 2019/12/06 06:03:37.282431 [INFO] consul: Adding LAN server Node d5ba7f96-83a6-470e-44d1-e86d35158c12 (Addr: tcp/127.0.0.1:34306) (DC: dc1)
TestHealthServiceNodes - 2019/12/06 06:03:37.283311 [INFO] consul: Handled member-join event for server "Node d5ba7f96-83a6-470e-44d1-e86d35158c12.dc1" in area "wan"
TestHealthServiceNodes - 2019/12/06 06:03:37.285149 [INFO] agent: Started DNS server 127.0.0.1:34301 (tcp)
TestHealthServiceNodes - 2019/12/06 06:03:37.286143 [INFO] agent: Started DNS server 127.0.0.1:34301 (udp)
TestHealthServiceNodes - 2019/12/06 06:03:37.290246 [INFO] agent: Started HTTP server on 127.0.0.1:34302 (tcp)
TestHealthServiceNodes - 2019/12/06 06:03:37.290364 [INFO] agent: started state syncer
2019/12/06 06:03:37 [WARN]  raft: Heartbeat timeout from "" reached, starting election
2019/12/06 06:03:37 [INFO]  raft: Node at 127.0.0.1:34306 [Candidate] entering Candidate state in term 2
TestHealthServiceNodes_WanTranslation - 2019/12/06 06:03:37.895176 [INFO] consul: Created ACL anonymous token from configuration
TestHealthServiceNodes_WanTranslation - 2019/12/06 06:03:37.895598 [INFO] consul: Created ACL anonymous token from configuration
TestHealthServiceNodes_WanTranslation - 2019/12/06 06:03:37.895657 [DEBUG] acl: transitioning out of legacy ACL mode
TestHealthServiceNodes_WanTranslation - 2019/12/06 06:03:37.896122 [INFO] serf: EventMemberUpdate: Node 54629263-dee8-02b1-6f86-cfd8821ec6bf
TestHealthServiceNodes_WanTranslation - 2019/12/06 06:03:37.896893 [INFO] serf: EventMemberUpdate: Node 54629263-dee8-02b1-6f86-cfd8821ec6bf.dc1
TestHealthServiceNodes_WanTranslation - 2019/12/06 06:03:37.900751 [INFO] serf: EventMemberUpdate: Node 54629263-dee8-02b1-6f86-cfd8821ec6bf
TestHealthServiceNodes_WanTranslation - 2019/12/06 06:03:37.901528 [INFO] serf: EventMemberUpdate: Node 54629263-dee8-02b1-6f86-cfd8821ec6bf.dc1
2019/12/06 06:03:37 [INFO]  raft: Election won. Tally: 1
2019/12/06 06:03:37 [INFO]  raft: Node at 127.0.0.1:34300 [Leader] entering Leader state
TestHealthServiceNodes_NodeMetaFilter - 2019/12/06 06:03:38.004333 [INFO] consul: cluster leadership acquired
TestHealthServiceNodes_NodeMetaFilter - 2019/12/06 06:03:38.005434 [INFO] consul: New leader elected: Node a28c7be9-2fb2-bb32-a227-b193b1a1112f
TestHealthServiceNodes_DistanceSort - 2019/12/06 06:03:38.106694 [INFO] agent: Synced node info
TestHealthServiceNodes_DistanceSort - 2019/12/06 06:03:38.106833 [DEBUG] agent: Node info in sync
2019/12/06 06:03:38 [INFO]  raft: Election won. Tally: 1
2019/12/06 06:03:38 [INFO]  raft: Node at 127.0.0.1:34306 [Leader] entering Leader state
TestHealthServiceNodes - 2019/12/06 06:03:38.308609 [INFO] consul: cluster leadership acquired
TestHealthServiceNodes - 2019/12/06 06:03:38.309261 [INFO] consul: New leader elected: Node d5ba7f96-83a6-470e-44d1-e86d35158c12
TestHealthServiceNodes_NodeMetaFilter - 2019/12/06 06:03:38.378209 [INFO] agent: Synced node info
TestHealthServiceNodes - 2019/12/06 06:03:39.166253 [INFO] agent: Synced node info
TestHealthServiceNodes_NodeMetaFilter - 2019/12/06 06:03:39.280783 [INFO] agent: Requesting shutdown
TestHealthServiceNodes_NodeMetaFilter - 2019/12/06 06:03:39.281088 [INFO] consul: shutting down server
TestHealthServiceNodes_NodeMetaFilter - 2019/12/06 06:03:39.281199 [WARN] serf: Shutdown without a Leave
TestHealthServiceNodes_NodeMetaFilter - 2019/12/06 06:03:39.385529 [WARN] serf: Shutdown without a Leave
TestHealthServiceNodes_WanTranslation - 2019/12/06 06:03:39.394987 [INFO] agent: Synced node info
TestHealthServiceNodes_WanTranslation - 2019/12/06 06:03:39.395195 [DEBUG] agent: Node info in sync
WARNING: The 'acl_datacenter' field is deprecated. Use the 'primary_datacenter' field instead.
WARNING: bootstrap = true: do not enable unless necessary
TestHealthServiceNodes_WanTranslation - 2019/12/06 06:03:39.490608 [WARN] agent: Node name "Node 9979c85c-53bd-4f35-b984-b7e3770ea730" will not be discoverable via DNS due to invalid characters. Valid characters include all alpha-numerics and dashes.
TestHealthServiceNodes_WanTranslation - 2019/12/06 06:03:39.491290 [DEBUG] tlsutil: Update with version 1
TestHealthServiceNodes_WanTranslation - 2019/12/06 06:03:39.493513 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
TestHealthServiceNodes_NodeMetaFilter - 2019/12/06 06:03:39.593288 [INFO] manager: shutting down
TestHealthServiceNodes_WanTranslation - 2019/12/06 06:03:39.768957 [INFO] connect: initialized primary datacenter CA with provider "consul"
TestHealthServiceNodes_WanTranslation - 2019/12/06 06:03:39.769481 [DEBUG] consul: Skipping self join check for "Node 54629263-dee8-02b1-6f86-cfd8821ec6bf" since the cluster is too small
TestHealthServiceNodes_WanTranslation - 2019/12/06 06:03:39.769672 [INFO] consul: member 'Node 54629263-dee8-02b1-6f86-cfd8821ec6bf' joined, marking health alive
TestHealthServiceNodes_NodeMetaFilter - 2019/12/06 06:03:39.774977 [INFO] agent: consul server down
TestHealthServiceNodes_NodeMetaFilter - 2019/12/06 06:03:39.775063 [INFO] agent: shutdown complete
TestHealthServiceNodes_NodeMetaFilter - 2019/12/06 06:03:39.775127 [INFO] agent: Stopping DNS server 127.0.0.1:34295 (tcp)
TestHealthServiceNodes_NodeMetaFilter - 2019/12/06 06:03:39.775291 [INFO] agent: Stopping DNS server 127.0.0.1:34295 (udp)
TestHealthServiceNodes_NodeMetaFilter - 2019/12/06 06:03:39.775469 [INFO] agent: Stopping HTTP server 127.0.0.1:34296 (tcp)
TestHealthServiceNodes_NodeMetaFilter - 2019/12/06 06:03:39.775734 [INFO] agent: Waiting for endpoints to shut down
TestHealthServiceNodes_NodeMetaFilter - 2019/12/06 06:03:39.775814 [INFO] agent: Endpoints down
--- PASS: TestHealthServiceNodes_NodeMetaFilter (4.29s)
=== CONT  TestHealthServiceChecks_DistanceSort
TestHealthServiceNodes_NodeMetaFilter - 2019/12/06 06:03:39.776857 [ERR] consul: failed to establish leadership: leadership lost while committing log
WARNING: bootstrap = true: do not enable unless necessary
TestHealthServiceChecks_DistanceSort - 2019/12/06 06:03:39.857560 [WARN] agent: Node name "Node 66796bd2-0fcf-2e17-020b-677578e3e865" will not be discoverable via DNS due to invalid characters. Valid characters include all alpha-numerics and dashes.
TestHealthServiceChecks_DistanceSort - 2019/12/06 06:03:39.858103 [DEBUG] tlsutil: Update with version 1
TestHealthServiceChecks_DistanceSort - 2019/12/06 06:03:39.860704 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
TestHealthServiceNodes_WanTranslation - 2019/12/06 06:03:39.990466 [DEBUG] consul: Skipping self join check for "Node 54629263-dee8-02b1-6f86-cfd8821ec6bf" since the cluster is too small
TestHealthServiceNodes_WanTranslation - 2019/12/06 06:03:39.991145 [DEBUG] consul: Skipping self join check for "Node 54629263-dee8-02b1-6f86-cfd8821ec6bf" since the cluster is too small
TestHealthServiceNodes_WanTranslation - 2019/12/06 06:03:40.108025 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
TestHealthServiceNodes_DistanceSort - 2019/12/06 06:03:40.109792 [INFO] agent: Requesting shutdown
TestHealthServiceNodes_DistanceSort - 2019/12/06 06:03:40.109875 [INFO] consul: shutting down server
TestHealthServiceNodes_DistanceSort - 2019/12/06 06:03:40.109921 [WARN] serf: Shutdown without a Leave
TestHealthServiceNodes_DistanceSort - 2019/12/06 06:03:40.245543 [DEBUG] agent: Node info in sync
TestHealthServiceNodes_DistanceSort - 2019/12/06 06:03:40.268401 [WARN] serf: Shutdown without a Leave
TestHealthServiceNodes_DistanceSort - 2019/12/06 06:03:40.387638 [INFO] manager: shutting down
TestHealthServiceNodes - 2019/12/06 06:03:40.540898 [DEBUG] agent: Node info in sync
TestHealthServiceNodes - 2019/12/06 06:03:40.541004 [DEBUG] agent: Node info in sync
jones - 2019/12/06 06:03:40.545233 [DEBUG] consul: Skipping self join check for "Node e7ff374a-e051-0dd1-c5b8-1cb2efc19141" since the cluster is too small
TestHealthServiceNodes - 2019/12/06 06:03:40.629223 [INFO] connect: initialized primary datacenter CA with provider "consul"
TestHealthServiceNodes - 2019/12/06 06:03:40.629798 [DEBUG] consul: Skipping self join check for "Node d5ba7f96-83a6-470e-44d1-e86d35158c12" since the cluster is too small
TestHealthServiceNodes - 2019/12/06 06:03:40.629977 [INFO] consul: member 'Node d5ba7f96-83a6-470e-44d1-e86d35158c12' joined, marking health alive
TestHealthServiceNodes_DistanceSort - 2019/12/06 06:03:40.638413 [INFO] agent: consul server down
TestHealthServiceNodes_DistanceSort - 2019/12/06 06:03:40.638493 [INFO] agent: shutdown complete
TestHealthServiceNodes_DistanceSort - 2019/12/06 06:03:40.638559 [INFO] agent: Stopping DNS server 127.0.0.1:34289 (tcp)
TestHealthServiceNodes_DistanceSort - 2019/12/06 06:03:40.638726 [INFO] agent: Stopping DNS server 127.0.0.1:34289 (udp)
TestHealthServiceNodes_DistanceSort - 2019/12/06 06:03:40.638899 [INFO] agent: Stopping HTTP server 127.0.0.1:34290 (tcp)
TestHealthServiceNodes_DistanceSort - 2019/12/06 06:03:40.639109 [INFO] agent: Waiting for endpoints to shut down
TestHealthServiceNodes_DistanceSort - 2019/12/06 06:03:40.639257 [INFO] agent: Endpoints down
--- PASS: TestHealthServiceNodes_DistanceSort (5.72s)
=== CONT  TestHealthServiceChecks_Filtering
TestHealthServiceNodes_DistanceSort - 2019/12/06 06:03:40.667943 [ERR] connect: Apply failed leadership lost while committing log
TestHealthServiceNodes_DistanceSort - 2019/12/06 06:03:40.668021 [ERR] consul: failed to establish leadership: leadership lost while committing log
2019/12/06 06:03:40 [INFO]  raft: Initial configuration (index=1): [{Suffrage:Voter ID:9979c85c-53bd-4f35-b984-b7e3770ea730 Address:127.0.0.1:34312}]
TestHealthServiceNodes_WanTranslation - 2019/12/06 06:03:40.748751 [INFO] serf: EventMemberJoin: Node 9979c85c-53bd-4f35-b984-b7e3770ea730.dc2 127.0.0.1
TestHealthServiceNodes_WanTranslation - 2019/12/06 06:03:40.751993 [INFO] serf: EventMemberJoin: Node 9979c85c-53bd-4f35-b984-b7e3770ea730 127.0.0.1
TestHealthServiceNodes_WanTranslation - 2019/12/06 06:03:40.753157 [INFO] agent: Started DNS server 127.0.0.1:34307 (udp)
2019/12/06 06:03:40 [INFO]  raft: Node at 127.0.0.1:34312 [Follower] entering Follower state (Leader: "")
TestHealthServiceNodes_WanTranslation - 2019/12/06 06:03:40.755232 [INFO] consul: Adding LAN server Node 9979c85c-53bd-4f35-b984-b7e3770ea730 (Addr: tcp/127.0.0.1:34312) (DC: dc2)
TestHealthServiceNodes_WanTranslation - 2019/12/06 06:03:40.755441 [INFO] consul: Handled member-join event for server "Node 9979c85c-53bd-4f35-b984-b7e3770ea730.dc2" in area "wan"
TestHealthServiceNodes_WanTranslation - 2019/12/06 06:03:40.755916 [INFO] agent: Started DNS server 127.0.0.1:34307 (tcp)
TestHealthServiceNodes_WanTranslation - 2019/12/06 06:03:40.758296 [INFO] agent: Started HTTP server on 127.0.0.1:34308 (tcp)
TestHealthServiceNodes_WanTranslation - 2019/12/06 06:03:40.758376 [INFO] agent: started state syncer
WARNING: bootstrap = true: do not enable unless necessary
TestHealthServiceChecks_Filtering - 2019/12/06 06:03:40.775452 [WARN] agent: Node name "Node 4c155c66-fba5-cf3e-c44a-03835e7f2a29" will not be discoverable via DNS due to invalid characters. Valid characters include all alpha-numerics and dashes.
TestHealthServiceChecks_Filtering - 2019/12/06 06:03:40.776152 [DEBUG] tlsutil: Update with version 1
TestHealthServiceChecks_Filtering - 2019/12/06 06:03:40.779100 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
2019/12/06 06:03:40 [WARN]  raft: Heartbeat timeout from "" reached, starting election
2019/12/06 06:03:40 [INFO]  raft: Node at 127.0.0.1:34312 [Candidate] entering Candidate state in term 2
TestHealthServiceNodes - 2019/12/06 06:03:41.442051 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
2019/12/06 06:03:41 [INFO]  raft: Initial configuration (index=1): [{Suffrage:Voter ID:66796bd2-0fcf-2e17-020b-677578e3e865 Address:127.0.0.1:34318}]
2019/12/06 06:03:41 [INFO]  raft: Node at 127.0.0.1:34318 [Follower] entering Follower state (Leader: "")
TestHealthServiceChecks_DistanceSort - 2019/12/06 06:03:41.462999 [INFO] serf: EventMemberJoin: Node 66796bd2-0fcf-2e17-020b-677578e3e865.dc1 127.0.0.1
TestHealthServiceChecks_DistanceSort - 2019/12/06 06:03:41.480796 [INFO] serf: EventMemberJoin: Node 66796bd2-0fcf-2e17-020b-677578e3e865 127.0.0.1
TestHealthServiceChecks_DistanceSort - 2019/12/06 06:03:41.482215 [INFO] consul: Handled member-join event for server "Node 66796bd2-0fcf-2e17-020b-677578e3e865.dc1" in area "wan"
TestHealthServiceChecks_DistanceSort - 2019/12/06 06:03:41.482314 [INFO] consul: Adding LAN server Node 66796bd2-0fcf-2e17-020b-677578e3e865 (Addr: tcp/127.0.0.1:34318) (DC: dc1)
TestHealthServiceChecks_DistanceSort - 2019/12/06 06:03:41.483236 [INFO] agent: Started DNS server 127.0.0.1:34313 (tcp)
TestHealthServiceChecks_DistanceSort - 2019/12/06 06:03:41.483675 [INFO] agent: Started DNS server 127.0.0.1:34313 (udp)
TestHealthServiceChecks_DistanceSort - 2019/12/06 06:03:41.486219 [INFO] agent: Started HTTP server on 127.0.0.1:34314 (tcp)
TestHealthServiceChecks_DistanceSort - 2019/12/06 06:03:41.486329 [INFO] agent: started state syncer
2019/12/06 06:03:41 [WARN]  raft: Heartbeat timeout from "" reached, starting election
2019/12/06 06:03:41 [INFO]  raft: Node at 127.0.0.1:34318 [Candidate] entering Candidate state in term 2
TestHealthServiceNodes - 2019/12/06 06:03:42.379915 [INFO] agent: Requesting shutdown
TestHealthServiceNodes - 2019/12/06 06:03:42.380012 [INFO] consul: shutting down server
TestHealthServiceNodes - 2019/12/06 06:03:42.380065 [WARN] serf: Shutdown without a Leave
jones - 2019/12/06 06:03:42.481184 [DEBUG] consul: Skipping self join check for "Node 7d555e7b-b226-ede2-fc13-7639f5dd2636" since the cluster is too small
TestHealthServiceNodes - 2019/12/06 06:03:42.561411 [WARN] serf: Shutdown without a Leave
2019/12/06 06:03:42 [INFO]  raft: Election won. Tally: 1
2019/12/06 06:03:42 [INFO]  raft: Node at 127.0.0.1:34312 [Leader] entering Leader state
TestHealthServiceNodes_WanTranslation - 2019/12/06 06:03:42.711422 [INFO] consul: cluster leadership acquired
TestHealthServiceNodes_WanTranslation - 2019/12/06 06:03:42.711867 [INFO] consul: New leader elected: Node 9979c85c-53bd-4f35-b984-b7e3770ea730
TestHealthServiceNodes - 2019/12/06 06:03:42.818808 [INFO] manager: shutting down
TestHealthServiceNodes - 2019/12/06 06:03:42.820306 [INFO] agent: consul server down
TestHealthServiceNodes - 2019/12/06 06:03:42.820399 [INFO] agent: shutdown complete
TestHealthServiceNodes - 2019/12/06 06:03:42.820480 [INFO] agent: Stopping DNS server 127.0.0.1:34301 (tcp)
TestHealthServiceNodes - 2019/12/06 06:03:42.820741 [INFO] agent: Stopping DNS server 127.0.0.1:34301 (udp)
TestHealthServiceNodes - 2019/12/06 06:03:42.820950 [INFO] agent: Stopping HTTP server 127.0.0.1:34302 (tcp)
TestHealthServiceNodes - 2019/12/06 06:03:42.821189 [INFO] agent: Waiting for endpoints to shut down
TestHealthServiceNodes - 2019/12/06 06:03:42.821257 [INFO] agent: Endpoints down
--- PASS: TestHealthServiceNodes (6.85s)
=== CONT  TestHealthServiceChecks_NodeMetaFilter
TestHealthServiceNodes_WanTranslation - 2019/12/06 06:03:42.884544 [ERR] agent: failed to sync remote state: ACL not found
WARNING: bootstrap = true: do not enable unless necessary
TestHealthServiceChecks_NodeMetaFilter - 2019/12/06 06:03:42.897620 [WARN] agent: Node name "Node e4f02f99-6fb6-5cd9-f923-2f93c9256b75" will not be discoverable via DNS due to invalid characters. Valid characters include all alpha-numerics and dashes.
TestHealthServiceChecks_NodeMetaFilter - 2019/12/06 06:03:42.898200 [DEBUG] tlsutil: Update with version 1
TestHealthServiceChecks_NodeMetaFilter - 2019/12/06 06:03:42.900686 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
2019/12/06 06:03:43 [INFO]  raft: Election won. Tally: 1
2019/12/06 06:03:43 [INFO]  raft: Node at 127.0.0.1:34318 [Leader] entering Leader state
TestHealthServiceChecks_DistanceSort - 2019/12/06 06:03:43.003483 [INFO] consul: cluster leadership acquired
TestHealthServiceChecks_DistanceSort - 2019/12/06 06:03:43.003989 [INFO] consul: New leader elected: Node 66796bd2-0fcf-2e17-020b-677578e3e865
TestHealthServiceNodes_WanTranslation - 2019/12/06 06:03:43.210680 [INFO] acl: initializing acls
2019/12/06 06:03:43 [INFO]  raft: Initial configuration (index=1): [{Suffrage:Voter ID:4c155c66-fba5-cf3e-c44a-03835e7f2a29 Address:127.0.0.1:34324}]
TestHealthServiceChecks_Filtering - 2019/12/06 06:03:43.215620 [INFO] serf: EventMemberJoin: Node 4c155c66-fba5-cf3e-c44a-03835e7f2a29.dc1 127.0.0.1
2019/12/06 06:03:43 [INFO]  raft: Node at 127.0.0.1:34324 [Follower] entering Follower state (Leader: "")
TestHealthServiceChecks_Filtering - 2019/12/06 06:03:43.221644 [INFO] serf: EventMemberJoin: Node 4c155c66-fba5-cf3e-c44a-03835e7f2a29 127.0.0.1
TestHealthServiceChecks_Filtering - 2019/12/06 06:03:43.223423 [INFO] consul: Adding LAN server Node 4c155c66-fba5-cf3e-c44a-03835e7f2a29 (Addr: tcp/127.0.0.1:34324) (DC: dc1)
TestHealthServiceChecks_Filtering - 2019/12/06 06:03:43.224106 [INFO] consul: Handled member-join event for server "Node 4c155c66-fba5-cf3e-c44a-03835e7f2a29.dc1" in area "wan"
TestHealthServiceChecks_Filtering - 2019/12/06 06:03:43.225914 [INFO] agent: Started DNS server 127.0.0.1:34319 (udp)
TestHealthServiceChecks_Filtering - 2019/12/06 06:03:43.226881 [INFO] agent: Started DNS server 127.0.0.1:34319 (tcp)
TestHealthServiceNodes_WanTranslation - 2019/12/06 06:03:43.230209 [ERR] agent: failed to sync remote state: ACL not found
TestHealthServiceChecks_Filtering - 2019/12/06 06:03:43.230584 [INFO] agent: Started HTTP server on 127.0.0.1:34320 (tcp)
TestHealthServiceChecks_Filtering - 2019/12/06 06:03:43.230771 [INFO] agent: started state syncer
2019/12/06 06:03:43 [WARN]  raft: Heartbeat timeout from "" reached, starting election
2019/12/06 06:03:43 [INFO]  raft: Node at 127.0.0.1:34324 [Candidate] entering Candidate state in term 2
TestHealthServiceChecks_DistanceSort - 2019/12/06 06:03:43.494450 [INFO] agent: Synced node info
TestHealthServiceNodes_WanTranslation - 2019/12/06 06:03:43.570830 [INFO] consul: Created ACL 'global-management' policy
TestHealthServiceNodes_WanTranslation - 2019/12/06 06:03:43.821624 [INFO] consul: Created ACL anonymous token from configuration
TestHealthServiceNodes_WanTranslation - 2019/12/06 06:03:43.822950 [INFO] serf: EventMemberUpdate: Node 9979c85c-53bd-4f35-b984-b7e3770ea730
TestHealthServiceNodes_WanTranslation - 2019/12/06 06:03:43.823709 [INFO] serf: EventMemberUpdate: Node 9979c85c-53bd-4f35-b984-b7e3770ea730.dc2
TestHealthServiceChecks_DistanceSort - 2019/12/06 06:03:44.332811 [DEBUG] agent: Node info in sync
TestHealthServiceChecks_DistanceSort - 2019/12/06 06:03:44.332909 [DEBUG] agent: Node info in sync
2019/12/06 06:03:44 [INFO]  raft: Election won. Tally: 1
2019/12/06 06:03:44 [INFO]  raft: Node at 127.0.0.1:34324 [Leader] entering Leader state
TestHealthServiceChecks_Filtering - 2019/12/06 06:03:44.449679 [INFO] consul: cluster leadership acquired
TestHealthServiceChecks_Filtering - 2019/12/06 06:03:44.450044 [INFO] consul: New leader elected: Node 4c155c66-fba5-cf3e-c44a-03835e7f2a29
2019/12/06 06:03:44 [INFO]  raft: Initial configuration (index=1): [{Suffrage:Voter ID:e4f02f99-6fb6-5cd9-f923-2f93c9256b75 Address:127.0.0.1:34330}]
2019/12/06 06:03:44 [INFO]  raft: Node at 127.0.0.1:34330 [Follower] entering Follower state (Leader: "")
TestHealthServiceChecks_NodeMetaFilter - 2019/12/06 06:03:44.607109 [INFO] serf: EventMemberJoin: Node e4f02f99-6fb6-5cd9-f923-2f93c9256b75.dc1 127.0.0.1
TestHealthServiceChecks_NodeMetaFilter - 2019/12/06 06:03:44.622080 [INFO] serf: EventMemberJoin: Node e4f02f99-6fb6-5cd9-f923-2f93c9256b75 127.0.0.1
TestHealthServiceChecks_NodeMetaFilter - 2019/12/06 06:03:44.622986 [INFO] consul: Adding LAN server Node e4f02f99-6fb6-5cd9-f923-2f93c9256b75 (Addr: tcp/127.0.0.1:34330) (DC: dc1)
TestHealthServiceChecks_NodeMetaFilter - 2019/12/06 06:03:44.623565 [INFO] consul: Handled member-join event for server "Node e4f02f99-6fb6-5cd9-f923-2f93c9256b75.dc1" in area "wan"
TestHealthServiceChecks_NodeMetaFilter - 2019/12/06 06:03:44.624908 [INFO] agent: Started DNS server 127.0.0.1:34325 (tcp)
TestHealthServiceChecks_NodeMetaFilter - 2019/12/06 06:03:44.625177 [INFO] agent: Started DNS server 127.0.0.1:34325 (udp)
TestHealthServiceChecks_NodeMetaFilter - 2019/12/06 06:03:44.627638 [INFO] agent: Started HTTP server on 127.0.0.1:34326 (tcp)
TestHealthServiceChecks_NodeMetaFilter - 2019/12/06 06:03:44.627748 [INFO] agent: started state syncer
2019/12/06 06:03:44 [WARN]  raft: Heartbeat timeout from "" reached, starting election
2019/12/06 06:03:44 [INFO]  raft: Node at 127.0.0.1:34330 [Candidate] entering Candidate state in term 2
TestHealthServiceChecks_Filtering - 2019/12/06 06:03:44.933679 [INFO] agent: Synced node info
TestHealthServiceChecks_Filtering - 2019/12/06 06:03:44.933797 [DEBUG] agent: Node info in sync
jones - 2019/12/06 06:03:45.172939 [DEBUG] consul: Skipping self join check for "Node a587e71c-195b-52ca-e2b1-0bac5467c444" since the cluster is too small
TestHealthServiceChecks_DistanceSort - 2019/12/06 06:03:45.177549 [INFO] connect: initialized primary datacenter CA with provider "consul"
TestHealthServiceChecks_DistanceSort - 2019/12/06 06:03:45.178024 [DEBUG] consul: Skipping self join check for "Node 66796bd2-0fcf-2e17-020b-677578e3e865" since the cluster is too small
TestHealthServiceChecks_DistanceSort - 2019/12/06 06:03:45.178170 [INFO] consul: member 'Node 66796bd2-0fcf-2e17-020b-677578e3e865' joined, marking health alive
2019/12/06 06:03:45 [INFO]  raft: Election won. Tally: 1
2019/12/06 06:03:45 [INFO]  raft: Node at 127.0.0.1:34330 [Leader] entering Leader state
TestHealthServiceChecks_NodeMetaFilter - 2019/12/06 06:03:45.523713 [INFO] consul: cluster leadership acquired
TestHealthServiceChecks_NodeMetaFilter - 2019/12/06 06:03:45.524137 [INFO] consul: New leader elected: Node e4f02f99-6fb6-5cd9-f923-2f93c9256b75
TestHealthServiceNodes_WanTranslation - 2019/12/06 06:03:45.745231 [INFO] connect: initialized primary datacenter CA with provider "consul"
TestHealthServiceNodes_WanTranslation - 2019/12/06 06:03:45.745923 [DEBUG] consul: Skipping self join check for "Node 9979c85c-53bd-4f35-b984-b7e3770ea730" since the cluster is too small
TestHealthServiceNodes_WanTranslation - 2019/12/06 06:03:45.746059 [INFO] consul: member 'Node 9979c85c-53bd-4f35-b984-b7e3770ea730' joined, marking health alive
TestHealthServiceChecks_DistanceSort - 2019/12/06 06:03:46.177156 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
TestHealthServiceChecks_NodeMetaFilter - 2019/12/06 06:03:46.319446 [INFO] agent: Synced node info
TestHealthServiceNodes_WanTranslation - 2019/12/06 06:03:46.321137 [DEBUG] consul: Skipping self join check for "Node 9979c85c-53bd-4f35-b984-b7e3770ea730" since the cluster is too small
TestHealthServiceNodes_WanTranslation - 2019/12/06 06:03:46.330322 [INFO] agent: (WAN) joining: [127.0.0.1:34287]
TestHealthServiceNodes_WanTranslation - 2019/12/06 06:03:46.334966 [DEBUG] memberlist: Stream connection from=127.0.0.1:51646
TestHealthServiceNodes_WanTranslation - 2019/12/06 06:03:46.335267 [DEBUG] memberlist: Initiating push/pull sync with: 127.0.0.1:34287
TestHealthServiceNodes_WanTranslation - 2019/12/06 06:03:46.341043 [INFO] serf: EventMemberJoin: Node 9979c85c-53bd-4f35-b984-b7e3770ea730.dc2 127.0.0.1
TestHealthServiceNodes_WanTranslation - 2019/12/06 06:03:46.341565 [INFO] consul: Handled member-join event for server "Node 9979c85c-53bd-4f35-b984-b7e3770ea730.dc2" in area "wan"
TestHealthServiceNodes_WanTranslation - 2019/12/06 06:03:46.344414 [INFO] serf: EventMemberJoin: Node 54629263-dee8-02b1-6f86-cfd8821ec6bf.dc1 127.0.0.1
TestHealthServiceNodes_WanTranslation - 2019/12/06 06:03:46.345001 [INFO] consul: Handled member-join event for server "Node 54629263-dee8-02b1-6f86-cfd8821ec6bf.dc1" in area "wan"
TestHealthServiceNodes_WanTranslation - 2019/12/06 06:03:46.345290 [INFO] agent: (WAN) joined: 1
TestHealthServiceNodes_WanTranslation - 2019/12/06 06:03:46.643755 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
TestHealthServiceNodes_WanTranslation - 2019/12/06 06:03:46.645898 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
TestHealthServiceNodes_WanTranslation - 2019/12/06 06:03:46.653066 [INFO] agent: Requesting shutdown
TestHealthServiceNodes_WanTranslation - 2019/12/06 06:03:46.653175 [INFO] consul: shutting down server
TestHealthServiceNodes_WanTranslation - 2019/12/06 06:03:46.653242 [WARN] serf: Shutdown without a Leave
TestHealthServiceNodes_WanTranslation - 2019/12/06 06:03:46.735032 [WARN] serf: Shutdown without a Leave
TestHealthServiceChecks_DistanceSort - 2019/12/06 06:03:46.755965 [INFO] agent: Requesting shutdown
TestHealthServiceChecks_DistanceSort - 2019/12/06 06:03:46.756060 [INFO] consul: shutting down server
TestHealthServiceChecks_DistanceSort - 2019/12/06 06:03:46.756106 [WARN] serf: Shutdown without a Leave
TestHealthServiceNodes_WanTranslation - 2019/12/06 06:03:46.844984 [INFO] manager: shutting down
TestHealthServiceNodes_WanTranslation - 2019/12/06 06:03:46.845083 [INFO] manager: shutting down
TestHealthServiceNodes_WanTranslation - 2019/12/06 06:03:46.845885 [INFO] agent: consul server down
TestHealthServiceNodes_WanTranslation - 2019/12/06 06:03:46.845954 [INFO] agent: shutdown complete
TestHealthServiceNodes_WanTranslation - 2019/12/06 06:03:46.846029 [INFO] agent: Stopping DNS server 127.0.0.1:34307 (tcp)
TestHealthServiceChecks_Filtering - 2019/12/06 06:03:46.845908 [INFO] connect: initialized primary datacenter CA with provider "consul"
TestHealthServiceChecks_Filtering - 2019/12/06 06:03:46.846526 [DEBUG] consul: Skipping self join check for "Node 4c155c66-fba5-cf3e-c44a-03835e7f2a29" since the cluster is too small
TestHealthServiceNodes_WanTranslation - 2019/12/06 06:03:46.846577 [INFO] agent: Stopping DNS server 127.0.0.1:34307 (udp)
TestHealthServiceChecks_Filtering - 2019/12/06 06:03:46.846707 [INFO] consul: member 'Node 4c155c66-fba5-cf3e-c44a-03835e7f2a29' joined, marking health alive
TestHealthServiceNodes_WanTranslation - 2019/12/06 06:03:46.846748 [INFO] agent: Stopping HTTP server 127.0.0.1:34308 (tcp)
TestHealthServiceNodes_WanTranslation - 2019/12/06 06:03:46.846992 [INFO] agent: Waiting for endpoints to shut down
TestHealthServiceNodes_WanTranslation - 2019/12/06 06:03:46.847063 [INFO] agent: Endpoints down
TestHealthServiceNodes_WanTranslation - 2019/12/06 06:03:46.847106 [INFO] agent: Requesting shutdown
TestHealthServiceNodes_WanTranslation - 2019/12/06 06:03:46.847156 [INFO] consul: shutting down server
TestHealthServiceNodes_WanTranslation - 2019/12/06 06:03:46.847199 [WARN] serf: Shutdown without a Leave
TestHealthServiceChecks_DistanceSort - 2019/12/06 06:03:46.848762 [WARN] serf: Shutdown without a Leave
TestHealthServiceChecks_DistanceSort - 2019/12/06 06:03:47.014430 [INFO] manager: shutting down
TestHealthServiceChecks_DistanceSort - 2019/12/06 06:03:47.015254 [INFO] agent: consul server down
TestHealthServiceChecks_DistanceSort - 2019/12/06 06:03:47.015326 [INFO] agent: shutdown complete
TestHealthServiceChecks_DistanceSort - 2019/12/06 06:03:47.015395 [INFO] agent: Stopping DNS server 127.0.0.1:34313 (tcp)
TestHealthServiceChecks_DistanceSort - 2019/12/06 06:03:47.015568 [INFO] agent: Stopping DNS server 127.0.0.1:34313 (udp)
TestHealthServiceChecks_DistanceSort - 2019/12/06 06:03:47.015738 [INFO] agent: Stopping HTTP server 127.0.0.1:34314 (tcp)
TestHealthServiceChecks_DistanceSort - 2019/12/06 06:03:47.015956 [INFO] agent: Waiting for endpoints to shut down
TestHealthServiceChecks_DistanceSort - 2019/12/06 06:03:47.016033 [INFO] agent: Endpoints down
--- PASS: TestHealthServiceChecks_DistanceSort (7.24s)
=== CONT  TestHealthServiceChecks
TestHealthServiceNodes_WanTranslation - 2019/12/06 06:03:47.018081 [WARN] serf: Shutdown without a Leave
WARNING: bootstrap = true: do not enable unless necessary
TestHealthServiceChecks - 2019/12/06 06:03:47.103594 [WARN] agent: Node name "Node 4fd3506a-c7a4-a189-3840-447b3f0da4b5" will not be discoverable via DNS due to invalid characters. Valid characters include all alpha-numerics and dashes.
TestHealthServiceChecks - 2019/12/06 06:03:47.104088 [DEBUG] tlsutil: Update with version 1
TestHealthServiceChecks - 2019/12/06 06:03:47.106704 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
TestHealthServiceNodes_WanTranslation - 2019/12/06 06:03:47.178816 [INFO] manager: shutting down
TestHealthServiceNodes_WanTranslation - 2019/12/06 06:03:47.178920 [INFO] manager: shutting down
TestHealthServiceNodes_WanTranslation - 2019/12/06 06:03:47.180085 [INFO] agent: consul server down
TestHealthServiceNodes_WanTranslation - 2019/12/06 06:03:47.180156 [INFO] agent: shutdown complete
TestHealthServiceNodes_WanTranslation - 2019/12/06 06:03:47.180217 [INFO] agent: Stopping DNS server 127.0.0.1:34283 (tcp)
TestHealthServiceNodes_WanTranslation - 2019/12/06 06:03:47.180368 [INFO] agent: Stopping DNS server 127.0.0.1:34283 (udp)
TestHealthServiceNodes_WanTranslation - 2019/12/06 06:03:47.180536 [INFO] agent: Stopping HTTP server 127.0.0.1:34284 (tcp)
TestHealthServiceNodes_WanTranslation - 2019/12/06 06:03:47.180749 [INFO] agent: Waiting for endpoints to shut down
TestHealthServiceNodes_WanTranslation - 2019/12/06 06:03:47.180822 [INFO] agent: Endpoints down
--- PASS: TestHealthServiceNodes_WanTranslation (13.35s)
=== CONT  TestHealthNodeChecks_Filtering
WARNING: bootstrap = true: do not enable unless necessary
TestHealthNodeChecks_Filtering - 2019/12/06 06:03:47.274453 [WARN] agent: Node name "Node fc92e9c6-65cc-22c8-60c2-0230feff2661" will not be discoverable via DNS due to invalid characters. Valid characters include all alpha-numerics and dashes.
TestHealthNodeChecks_Filtering - 2019/12/06 06:03:47.274954 [DEBUG] tlsutil: Update with version 1
TestHealthNodeChecks_Filtering - 2019/12/06 06:03:47.277609 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
TestHealthServiceChecks_Filtering - 2019/12/06 06:03:47.343775 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
TestHealthServiceChecks_NodeMetaFilter - 2019/12/06 06:03:47.410929 [INFO] connect: initialized primary datacenter CA with provider "consul"
TestHealthServiceChecks_NodeMetaFilter - 2019/12/06 06:03:47.411486 [DEBUG] consul: Skipping self join check for "Node e4f02f99-6fb6-5cd9-f923-2f93c9256b75" since the cluster is too small
TestHealthServiceChecks_NodeMetaFilter - 2019/12/06 06:03:47.411659 [INFO] consul: member 'Node e4f02f99-6fb6-5cd9-f923-2f93c9256b75' joined, marking health alive
jones - 2019/12/06 06:03:47.486174 [DEBUG] consul: Skipping self join check for "Node 5b00a3f9-83b7-6ff9-5316-c7daa24e44b4" since the cluster is too small
TestHealthServiceChecks_Filtering - 2019/12/06 06:03:47.501890 [INFO] agent: Requesting shutdown
TestHealthServiceChecks_Filtering - 2019/12/06 06:03:47.502126 [INFO] consul: shutting down server
TestHealthServiceChecks_Filtering - 2019/12/06 06:03:47.502427 [WARN] serf: Shutdown without a Leave
TestHealthServiceChecks_Filtering - 2019/12/06 06:03:47.660033 [WARN] serf: Shutdown without a Leave
TestHealthServiceChecks_NodeMetaFilter - 2019/12/06 06:03:47.712775 [DEBUG] agent: Skipping remote check "serfHealth" since it is managed automatically
TestHealthServiceChecks_NodeMetaFilter - 2019/12/06 06:03:47.712844 [DEBUG] agent: Node info in sync
TestHealthServiceChecks_NodeMetaFilter - 2019/12/06 06:03:47.712906 [DEBUG] agent: Node info in sync
TestHealthServiceChecks_Filtering - 2019/12/06 06:03:47.735157 [INFO] manager: shutting down
TestHealthServiceChecks_Filtering - 2019/12/06 06:03:47.736050 [INFO] agent: consul server down
TestHealthServiceChecks_Filtering - 2019/12/06 06:03:47.736118 [INFO] agent: shutdown complete
TestHealthServiceChecks_Filtering - 2019/12/06 06:03:47.736182 [INFO] agent: Stopping DNS server 127.0.0.1:34319 (tcp)
TestHealthServiceChecks_Filtering - 2019/12/06 06:03:47.736405 [INFO] agent: Stopping DNS server 127.0.0.1:34319 (udp)
TestHealthServiceChecks_Filtering - 2019/12/06 06:03:47.736575 [INFO] agent: Stopping HTTP server 127.0.0.1:34320 (tcp)
TestHealthServiceChecks_Filtering - 2019/12/06 06:03:47.736775 [INFO] agent: Waiting for endpoints to shut down
TestHealthServiceChecks_Filtering - 2019/12/06 06:03:47.736842 [INFO] agent: Endpoints down
--- PASS: TestHealthServiceChecks_Filtering (7.10s)
=== CONT  TestAgent_GetCoordinate
TestHealthServiceChecks_NodeMetaFilter - 2019/12/06 06:03:47.813804 [INFO] agent: Requesting shutdown
TestHealthServiceChecks_NodeMetaFilter - 2019/12/06 06:03:47.813903 [INFO] consul: shutting down server
TestHealthServiceChecks_NodeMetaFilter - 2019/12/06 06:03:47.813950 [WARN] serf: Shutdown without a Leave
WARNING: bootstrap = true: do not enable unless necessary
TestAgent_GetCoordinate - 2019/12/06 06:03:47.834701 [WARN] agent: Node name "Node 3bada048-6621-1f3b-2e37-066ee09fd889" will not be discoverable via DNS due to invalid characters. Valid characters include all alpha-numerics and dashes.
TestAgent_GetCoordinate - 2019/12/06 06:03:47.835211 [DEBUG] tlsutil: Update with version 1
TestAgent_GetCoordinate - 2019/12/06 06:03:47.837431 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
TestHealthServiceChecks_NodeMetaFilter - 2019/12/06 06:03:47.901752 [WARN] serf: Shutdown without a Leave
TestHealthServiceChecks_NodeMetaFilter - 2019/12/06 06:03:48.068552 [INFO] manager: shutting down
TestHealthServiceChecks_NodeMetaFilter - 2019/12/06 06:03:48.069231 [INFO] agent: consul server down
TestHealthServiceChecks_NodeMetaFilter - 2019/12/06 06:03:48.069305 [INFO] agent: shutdown complete
TestHealthServiceChecks_NodeMetaFilter - 2019/12/06 06:03:48.069364 [INFO] agent: Stopping DNS server 127.0.0.1:34325 (tcp)
TestHealthServiceChecks_NodeMetaFilter - 2019/12/06 06:03:48.069543 [INFO] agent: Stopping DNS server 127.0.0.1:34325 (udp)
TestHealthServiceChecks_NodeMetaFilter - 2019/12/06 06:03:48.069722 [INFO] agent: Stopping HTTP server 127.0.0.1:34326 (tcp)
TestHealthServiceChecks_NodeMetaFilter - 2019/12/06 06:03:48.069947 [INFO] agent: Waiting for endpoints to shut down
TestHealthServiceChecks_NodeMetaFilter - 2019/12/06 06:03:48.070039 [INFO] agent: Endpoints down
--- PASS: TestHealthServiceChecks_NodeMetaFilter (5.25s)
=== CONT  TestHealthNodeChecks
WARNING: bootstrap = true: do not enable unless necessary
TestHealthNodeChecks - 2019/12/06 06:03:48.231133 [WARN] agent: Node name "Node c1d51db3-7cc8-8006-ae51-cac45ee6c3bd" will not be discoverable via DNS due to invalid characters. Valid characters include all alpha-numerics and dashes.
TestHealthNodeChecks - 2019/12/06 06:03:48.233501 [DEBUG] tlsutil: Update with version 1
TestHealthNodeChecks - 2019/12/06 06:03:48.236151 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
2019/12/06 06:03:48 [INFO]  raft: Initial configuration (index=1): [{Suffrage:Voter ID:4fd3506a-c7a4-a189-3840-447b3f0da4b5 Address:127.0.0.1:34336}]
2019/12/06 06:03:48 [INFO]  raft: Node at 127.0.0.1:34336 [Follower] entering Follower state (Leader: "")
TestHealthServiceChecks - 2019/12/06 06:03:48.381574 [INFO] serf: EventMemberJoin: Node 4fd3506a-c7a4-a189-3840-447b3f0da4b5.dc1 127.0.0.1
TestHealthServiceChecks - 2019/12/06 06:03:48.385092 [INFO] serf: EventMemberJoin: Node 4fd3506a-c7a4-a189-3840-447b3f0da4b5 127.0.0.1
TestHealthServiceChecks - 2019/12/06 06:03:48.386728 [INFO] consul: Adding LAN server Node 4fd3506a-c7a4-a189-3840-447b3f0da4b5 (Addr: tcp/127.0.0.1:34336) (DC: dc1)
TestHealthServiceChecks - 2019/12/06 06:03:48.386966 [INFO] consul: Handled member-join event for server "Node 4fd3506a-c7a4-a189-3840-447b3f0da4b5.dc1" in area "wan"
TestHealthServiceChecks - 2019/12/06 06:03:48.389677 [INFO] agent: Started DNS server 127.0.0.1:34331 (udp)
TestHealthServiceChecks - 2019/12/06 06:03:48.390853 [INFO] agent: Started DNS server 127.0.0.1:34331 (tcp)
TestHealthServiceChecks - 2019/12/06 06:03:48.396225 [INFO] agent: Started HTTP server on 127.0.0.1:34332 (tcp)
TestHealthServiceChecks - 2019/12/06 06:03:48.396509 [INFO] agent: started state syncer
2019/12/06 06:03:48 [WARN]  raft: Heartbeat timeout from "" reached, starting election
2019/12/06 06:03:48 [INFO]  raft: Node at 127.0.0.1:34336 [Candidate] entering Candidate state in term 2
2019/12/06 06:03:49 [INFO]  raft: Initial configuration (index=1): [{Suffrage:Voter ID:fc92e9c6-65cc-22c8-60c2-0230feff2661 Address:127.0.0.1:34342}]
2019/12/06 06:03:49 [INFO]  raft: Node at 127.0.0.1:34342 [Follower] entering Follower state (Leader: "")
TestHealthNodeChecks_Filtering - 2019/12/06 06:03:49.020636 [INFO] serf: EventMemberJoin: Node fc92e9c6-65cc-22c8-60c2-0230feff2661.dc1 127.0.0.1
TestHealthNodeChecks_Filtering - 2019/12/06 06:03:49.024098 [INFO] serf: EventMemberJoin: Node fc92e9c6-65cc-22c8-60c2-0230feff2661 127.0.0.1
TestHealthNodeChecks_Filtering - 2019/12/06 06:03:49.025504 [INFO] consul: Handled member-join event for server "Node fc92e9c6-65cc-22c8-60c2-0230feff2661.dc1" in area "wan"
TestHealthNodeChecks_Filtering - 2019/12/06 06:03:49.025611 [INFO] consul: Adding LAN server Node fc92e9c6-65cc-22c8-60c2-0230feff2661 (Addr: tcp/127.0.0.1:34342) (DC: dc1)
TestHealthNodeChecks_Filtering - 2019/12/06 06:03:49.025815 [INFO] agent: Started DNS server 127.0.0.1:34337 (udp)
TestHealthNodeChecks_Filtering - 2019/12/06 06:03:49.026065 [INFO] agent: Started DNS server 127.0.0.1:34337 (tcp)
TestHealthNodeChecks_Filtering - 2019/12/06 06:03:49.028556 [INFO] agent: Started HTTP server on 127.0.0.1:34338 (tcp)
TestHealthNodeChecks_Filtering - 2019/12/06 06:03:49.028692 [INFO] agent: started state syncer
2019/12/06 06:03:49 [WARN]  raft: Heartbeat timeout from "" reached, starting election
2019/12/06 06:03:49 [INFO]  raft: Node at 127.0.0.1:34342 [Candidate] entering Candidate state in term 2
2019/12/06 06:03:49 [INFO]  raft: Initial configuration (index=1): [{Suffrage:Voter ID:3bada048-6621-1f3b-2e37-066ee09fd889 Address:127.0.0.1:34348}]
2019/12/06 06:03:49 [INFO]  raft: Election won. Tally: 1
2019/12/06 06:03:49 [INFO]  raft: Node at 127.0.0.1:34336 [Leader] entering Leader state
2019/12/06 06:03:49 [INFO]  raft: Node at 127.0.0.1:34348 [Follower] entering Follower state (Leader: "")
TestHealthServiceChecks - 2019/12/06 06:03:49.842636 [INFO] consul: cluster leadership acquired
TestHealthServiceChecks - 2019/12/06 06:03:49.843011 [INFO] consul: New leader elected: Node 4fd3506a-c7a4-a189-3840-447b3f0da4b5
TestAgent_GetCoordinate - 2019/12/06 06:03:49.839160 [INFO] serf: EventMemberJoin: Node 3bada048-6621-1f3b-2e37-066ee09fd889.dc1 127.0.0.1
TestAgent_GetCoordinate - 2019/12/06 06:03:49.853014 [INFO] serf: EventMemberJoin: Node 3bada048-6621-1f3b-2e37-066ee09fd889 127.0.0.1
TestAgent_GetCoordinate - 2019/12/06 06:03:49.854420 [INFO] consul: Adding LAN server Node 3bada048-6621-1f3b-2e37-066ee09fd889 (Addr: tcp/127.0.0.1:34348) (DC: dc1)
TestAgent_GetCoordinate - 2019/12/06 06:03:49.855699 [INFO] consul: Handled member-join event for server "Node 3bada048-6621-1f3b-2e37-066ee09fd889.dc1" in area "wan"
TestAgent_GetCoordinate - 2019/12/06 06:03:49.859541 [INFO] agent: Started DNS server 127.0.0.1:34343 (tcp)
TestAgent_GetCoordinate - 2019/12/06 06:03:49.859868 [INFO] agent: Started DNS server 127.0.0.1:34343 (udp)
TestAgent_GetCoordinate - 2019/12/06 06:03:49.867223 [INFO] agent: Started HTTP server on 127.0.0.1:34344 (tcp)
TestAgent_GetCoordinate - 2019/12/06 06:03:49.867527 [INFO] agent: started state syncer
2019/12/06 06:03:49 [WARN]  raft: Heartbeat timeout from "" reached, starting election
2019/12/06 06:03:49 [INFO]  raft: Node at 127.0.0.1:34348 [Candidate] entering Candidate state in term 2
2019/12/06 06:03:49 [INFO]  raft: Election won. Tally: 1
2019/12/06 06:03:49 [INFO]  raft: Node at 127.0.0.1:34342 [Leader] entering Leader state
TestHealthNodeChecks_Filtering - 2019/12/06 06:03:49.910428 [INFO] consul: cluster leadership acquired
TestHealthNodeChecks_Filtering - 2019/12/06 06:03:49.910851 [INFO] consul: New leader elected: Node fc92e9c6-65cc-22c8-60c2-0230feff2661
2019/12/06 06:03:50 [INFO]  raft: Initial configuration (index=1): [{Suffrage:Voter ID:c1d51db3-7cc8-8006-ae51-cac45ee6c3bd Address:127.0.0.1:34354}]
2019/12/06 06:03:50 [INFO]  raft: Node at 127.0.0.1:34354 [Follower] entering Follower state (Leader: "")
TestHealthNodeChecks - 2019/12/06 06:03:50.076971 [INFO] serf: EventMemberJoin: Node c1d51db3-7cc8-8006-ae51-cac45ee6c3bd.dc1 127.0.0.1
TestHealthNodeChecks - 2019/12/06 06:03:50.085058 [INFO] serf: EventMemberJoin: Node c1d51db3-7cc8-8006-ae51-cac45ee6c3bd 127.0.0.1
TestHealthNodeChecks - 2019/12/06 06:03:50.086554 [INFO] agent: Started DNS server 127.0.0.1:34349 (udp)
TestHealthNodeChecks - 2019/12/06 06:03:50.087117 [INFO] consul: Adding LAN server Node c1d51db3-7cc8-8006-ae51-cac45ee6c3bd (Addr: tcp/127.0.0.1:34354) (DC: dc1)
TestHealthNodeChecks - 2019/12/06 06:03:50.087408 [INFO] consul: Handled member-join event for server "Node c1d51db3-7cc8-8006-ae51-cac45ee6c3bd.dc1" in area "wan"
TestHealthNodeChecks - 2019/12/06 06:03:50.087950 [INFO] agent: Started DNS server 127.0.0.1:34349 (tcp)
TestHealthNodeChecks - 2019/12/06 06:03:50.093319 [INFO] agent: Started HTTP server on 127.0.0.1:34350 (tcp)
TestHealthNodeChecks - 2019/12/06 06:03:50.093441 [INFO] agent: started state syncer
2019/12/06 06:03:50 [WARN]  raft: Heartbeat timeout from "" reached, starting election
2019/12/06 06:03:50 [INFO]  raft: Node at 127.0.0.1:34354 [Candidate] entering Candidate state in term 2
TestHealthNodeChecks_Filtering - 2019/12/06 06:03:50.336973 [INFO] agent: Synced node info
2019/12/06 06:03:50 [INFO]  raft: Election won. Tally: 1
2019/12/06 06:03:50 [INFO]  raft: Node at 127.0.0.1:34348 [Leader] entering Leader state
TestAgent_GetCoordinate - 2019/12/06 06:03:50.579270 [INFO] consul: cluster leadership acquired
TestAgent_GetCoordinate - 2019/12/06 06:03:50.579694 [INFO] consul: New leader elected: Node 3bada048-6621-1f3b-2e37-066ee09fd889
TestHealthServiceChecks - 2019/12/06 06:03:50.696632 [INFO] agent: Synced node info
TestAgent_GetCoordinate - 2019/12/06 06:03:50.770730 [INFO] agent: Requesting shutdown
TestAgent_GetCoordinate - 2019/12/06 06:03:50.770835 [INFO] consul: shutting down server
TestAgent_GetCoordinate - 2019/12/06 06:03:50.770885 [WARN] serf: Shutdown without a Leave
TestAgent_GetCoordinate - 2019/12/06 06:03:50.923713 [WARN] serf: Shutdown without a Leave
2019/12/06 06:03:51 [INFO]  raft: Election won. Tally: 1
2019/12/06 06:03:51 [INFO]  raft: Node at 127.0.0.1:34354 [Leader] entering Leader state
TestAgent_GetCoordinate - 2019/12/06 06:03:51.030615 [INFO] agent: Synced node info
TestAgent_GetCoordinate - 2019/12/06 06:03:51.033307 [INFO] manager: shutting down
TestHealthNodeChecks - 2019/12/06 06:03:51.035506 [INFO] consul: cluster leadership acquired
TestHealthNodeChecks - 2019/12/06 06:03:51.035874 [INFO] consul: New leader elected: Node c1d51db3-7cc8-8006-ae51-cac45ee6c3bd
TestHealthServiceChecks - 2019/12/06 06:03:51.059156 [DEBUG] agent: Node info in sync
TestHealthServiceChecks - 2019/12/06 06:03:51.059313 [DEBUG] agent: Node info in sync
jones - 2019/12/06 06:03:51.118850 [DEBUG] consul: Skipping self join check for "Node 1cd382dc-1b57-ab4c-91ac-41ec3d3abb1e" since the cluster is too small
TestAgent_GetCoordinate - 2019/12/06 06:03:51.279490 [INFO] agent: consul server down
TestAgent_GetCoordinate - 2019/12/06 06:03:51.279565 [INFO] agent: shutdown complete
TestAgent_GetCoordinate - 2019/12/06 06:03:51.279625 [INFO] agent: Stopping DNS server 127.0.0.1:34343 (tcp)
TestAgent_GetCoordinate - 2019/12/06 06:03:51.279760 [INFO] agent: Stopping DNS server 127.0.0.1:34343 (udp)
TestAgent_GetCoordinate - 2019/12/06 06:03:51.279913 [INFO] agent: Stopping HTTP server 127.0.0.1:34344 (tcp)
TestAgent_GetCoordinate - 2019/12/06 06:03:51.280095 [INFO] agent: Waiting for endpoints to shut down
TestAgent_GetCoordinate - 2019/12/06 06:03:51.280157 [INFO] agent: Endpoints down
TestAgent_GetCoordinate - 2019/12/06 06:03:51.290650 [ERR] autopilot: failed to initialize config: leadership lost while committing log
TestAgent_GetCoordinate - 2019/12/06 06:03:51.290968 [ERR] consul: failed to establish leadership: raft is already shutdown
WARNING: bootstrap = true: do not enable unless necessary
TestAgent_GetCoordinate - 2019/12/06 06:03:51.373209 [WARN] agent: Node name "Node 57a34851-fe83-ee48-2ee6-1c2861d2e924" will not be discoverable via DNS due to invalid characters. Valid characters include all alpha-numerics and dashes.
TestAgent_GetCoordinate - 2019/12/06 06:03:51.373643 [DEBUG] tlsutil: Update with version 1
TestAgent_GetCoordinate - 2019/12/06 06:03:51.375797 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
TestHealthNodeChecks - 2019/12/06 06:03:51.460848 [INFO] agent: Synced node info
TestHealthNodeChecks_Filtering - 2019/12/06 06:03:51.610075 [DEBUG] agent: Node info in sync
TestHealthNodeChecks_Filtering - 2019/12/06 06:03:51.610325 [DEBUG] agent: Node info in sync
TestHealthServiceChecks - 2019/12/06 06:03:51.803367 [INFO] connect: initialized primary datacenter CA with provider "consul"
TestHealthNodeChecks_Filtering - 2019/12/06 06:03:51.804723 [INFO] connect: initialized primary datacenter CA with provider "consul"
TestHealthServiceChecks - 2019/12/06 06:03:51.804757 [DEBUG] consul: Skipping self join check for "Node 4fd3506a-c7a4-a189-3840-447b3f0da4b5" since the cluster is too small
TestHealthServiceChecks - 2019/12/06 06:03:51.804903 [INFO] consul: member 'Node 4fd3506a-c7a4-a189-3840-447b3f0da4b5' joined, marking health alive
TestHealthNodeChecks_Filtering - 2019/12/06 06:03:51.805100 [DEBUG] consul: Skipping self join check for "Node fc92e9c6-65cc-22c8-60c2-0230feff2661" since the cluster is too small
TestHealthNodeChecks_Filtering - 2019/12/06 06:03:51.805231 [INFO] consul: member 'Node fc92e9c6-65cc-22c8-60c2-0230feff2661' joined, marking health alive
TestHealthServiceChecks - 2019/12/06 06:03:52.371085 [INFO] agent: Requesting shutdown
TestHealthServiceChecks - 2019/12/06 06:03:52.371172 [INFO] consul: shutting down server
TestHealthServiceChecks - 2019/12/06 06:03:52.371219 [WARN] serf: Shutdown without a Leave
TestHealthServiceChecks - 2019/12/06 06:03:52.476735 [WARN] serf: Shutdown without a Leave
TestHealthServiceChecks - 2019/12/06 06:03:52.543543 [INFO] manager: shutting down
TestHealthServiceChecks - 2019/12/06 06:03:52.544095 [INFO] agent: consul server down
TestHealthServiceChecks - 2019/12/06 06:03:52.544160 [INFO] agent: shutdown complete
TestHealthServiceChecks - 2019/12/06 06:03:52.544276 [INFO] agent: Stopping DNS server 127.0.0.1:34331 (tcp)
TestHealthServiceChecks - 2019/12/06 06:03:52.544448 [INFO] agent: Stopping DNS server 127.0.0.1:34331 (udp)
TestHealthServiceChecks - 2019/12/06 06:03:52.544656 [INFO] agent: Stopping HTTP server 127.0.0.1:34332 (tcp)
TestHealthServiceChecks - 2019/12/06 06:03:52.544918 [INFO] agent: Waiting for endpoints to shut down
TestHealthServiceChecks - 2019/12/06 06:03:52.544997 [INFO] agent: Endpoints down
--- PASS: TestHealthServiceChecks (5.53s)
=== CONT  TestHealthChecksInState_DistanceSort
WARNING: bootstrap = true: do not enable unless necessary
TestHealthChecksInState_DistanceSort - 2019/12/06 06:03:52.610018 [WARN] agent: Node name "Node 5b2c502f-58a9-a076-2905-84243c72f3b6" will not be discoverable via DNS due to invalid characters. Valid characters include all alpha-numerics and dashes.
TestHealthChecksInState_DistanceSort - 2019/12/06 06:03:52.610597 [DEBUG] tlsutil: Update with version 1
TestHealthChecksInState_DistanceSort - 2019/12/06 06:03:52.614838 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
TestHealthNodeChecks_Filtering - 2019/12/06 06:03:52.627228 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
TestHealthNodeChecks - 2019/12/06 06:03:52.627740 [INFO] connect: initialized primary datacenter CA with provider "consul"
TestHealthNodeChecks - 2019/12/06 06:03:52.628189 [DEBUG] consul: Skipping self join check for "Node c1d51db3-7cc8-8006-ae51-cac45ee6c3bd" since the cluster is too small
TestHealthNodeChecks - 2019/12/06 06:03:52.628377 [INFO] consul: member 'Node c1d51db3-7cc8-8006-ae51-cac45ee6c3bd' joined, marking health alive
TestHealthNodeChecks_Filtering - 2019/12/06 06:03:52.629989 [INFO] agent: Requesting shutdown
TestHealthNodeChecks_Filtering - 2019/12/06 06:03:52.630070 [INFO] consul: shutting down server
TestHealthNodeChecks_Filtering - 2019/12/06 06:03:52.630119 [WARN] serf: Shutdown without a Leave
2019/12/06 06:03:52 [INFO]  raft: Initial configuration (index=1): [{Suffrage:Voter ID:57a34851-fe83-ee48-2ee6-1c2861d2e924 Address:127.0.0.1:34360}]
2019/12/06 06:03:52 [INFO]  raft: Node at 127.0.0.1:34360 [Follower] entering Follower state (Leader: "")
TestAgent_GetCoordinate - 2019/12/06 06:03:52.635240 [INFO] serf: EventMemberJoin: Node 57a34851-fe83-ee48-2ee6-1c2861d2e924.dc1 127.0.0.1
TestAgent_GetCoordinate - 2019/12/06 06:03:52.638289 [INFO] serf: EventMemberJoin: Node 57a34851-fe83-ee48-2ee6-1c2861d2e924 127.0.0.1
TestAgent_GetCoordinate - 2019/12/06 06:03:52.638773 [INFO] consul: Handled member-join event for server "Node 57a34851-fe83-ee48-2ee6-1c2861d2e924.dc1" in area "wan"
TestAgent_GetCoordinate - 2019/12/06 06:03:52.639061 [INFO] consul: Adding LAN server Node 57a34851-fe83-ee48-2ee6-1c2861d2e924 (Addr: tcp/127.0.0.1:34360) (DC: dc1)
TestAgent_GetCoordinate - 2019/12/06 06:03:52.639455 [INFO] agent: Started DNS server 127.0.0.1:34355 (tcp)
TestHealthNodeChecks_Filtering - 2019/12/06 06:03:52.640143 [WARN] consul: error getting server health from "Node fc92e9c6-65cc-22c8-60c2-0230feff2661": rpc error making call: EOF
TestAgent_GetCoordinate - 2019/12/06 06:03:52.640676 [INFO] agent: Started DNS server 127.0.0.1:34355 (udp)
TestAgent_GetCoordinate - 2019/12/06 06:03:52.643383 [INFO] agent: Started HTTP server on 127.0.0.1:34356 (tcp)
TestAgent_GetCoordinate - 2019/12/06 06:03:52.643478 [INFO] agent: started state syncer
2019/12/06 06:03:52 [WARN]  raft: Heartbeat timeout from "" reached, starting election
2019/12/06 06:03:52 [INFO]  raft: Node at 127.0.0.1:34360 [Candidate] entering Candidate state in term 2
TestHealthNodeChecks_Filtering - 2019/12/06 06:03:52.736142 [WARN] serf: Shutdown without a Leave
TestHealthNodeChecks - 2019/12/06 06:03:52.898154 [INFO] agent: Requesting shutdown
TestHealthNodeChecks - 2019/12/06 06:03:52.898290 [INFO] consul: shutting down server
TestHealthNodeChecks - 2019/12/06 06:03:52.898354 [WARN] serf: Shutdown without a Leave
TestHealthNodeChecks_Filtering - 2019/12/06 06:03:52.899133 [INFO] manager: shutting down
TestHealthNodeChecks_Filtering - 2019/12/06 06:03:52.899607 [INFO] agent: consul server down
TestHealthNodeChecks_Filtering - 2019/12/06 06:03:52.899656 [INFO] agent: shutdown complete
TestHealthNodeChecks_Filtering - 2019/12/06 06:03:52.899709 [INFO] agent: Stopping DNS server 127.0.0.1:34337 (tcp)
TestHealthNodeChecks_Filtering - 2019/12/06 06:03:52.899831 [INFO] agent: Stopping DNS server 127.0.0.1:34337 (udp)
TestHealthNodeChecks_Filtering - 2019/12/06 06:03:52.899973 [INFO] agent: Stopping HTTP server 127.0.0.1:34338 (tcp)
TestHealthNodeChecks_Filtering - 2019/12/06 06:03:52.900156 [INFO] agent: Waiting for endpoints to shut down
TestHealthNodeChecks_Filtering - 2019/12/06 06:03:52.900228 [INFO] agent: Endpoints down
--- PASS: TestHealthNodeChecks_Filtering (5.72s)
=== CONT  TestHealthChecksInState_Filter
WARNING: bootstrap = true: do not enable unless necessary
TestHealthChecksInState_Filter - 2019/12/06 06:03:52.977652 [WARN] agent: Node name "Node a1559eda-016c-a378-9b47-db962eb7aa5e" will not be discoverable via DNS due to invalid characters. Valid characters include all alpha-numerics and dashes.
TestHealthChecksInState_Filter - 2019/12/06 06:03:52.978149 [DEBUG] tlsutil: Update with version 1
TestHealthChecksInState_Filter - 2019/12/06 06:03:52.980446 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
TestHealthNodeChecks - 2019/12/06 06:03:53.076779 [WARN] serf: Shutdown without a Leave
TestHealthNodeChecks - 2019/12/06 06:03:53.177001 [INFO] manager: shutting down
TestHealthNodeChecks - 2019/12/06 06:03:53.178082 [INFO] agent: consul server down
TestHealthNodeChecks - 2019/12/06 06:03:53.178140 [INFO] agent: shutdown complete
TestHealthNodeChecks - 2019/12/06 06:03:53.178193 [INFO] agent: Stopping DNS server 127.0.0.1:34349 (tcp)
TestHealthNodeChecks - 2019/12/06 06:03:53.178322 [INFO] agent: Stopping DNS server 127.0.0.1:34349 (udp)
TestHealthNodeChecks - 2019/12/06 06:03:53.178466 [INFO] agent: Stopping HTTP server 127.0.0.1:34350 (tcp)
TestHealthNodeChecks - 2019/12/06 06:03:53.178658 [INFO] agent: Waiting for endpoints to shut down
TestHealthNodeChecks - 2019/12/06 06:03:53.178724 [INFO] agent: Endpoints down
--- PASS: TestHealthNodeChecks (5.11s)
=== CONT  TestHealthChecksInState_NodeMetaFilter
WARNING: bootstrap = true: do not enable unless necessary
TestHealthChecksInState_NodeMetaFilter - 2019/12/06 06:03:53.270940 [WARN] agent: Node name "Node 6b015a18-4b01-5d0c-7419-548750b683a9" will not be discoverable via DNS due to invalid characters. Valid characters include all alpha-numerics and dashes.
TestHealthChecksInState_NodeMetaFilter - 2019/12/06 06:03:53.273689 [DEBUG] tlsutil: Update with version 1
TestHealthChecksInState_NodeMetaFilter - 2019/12/06 06:03:53.282972 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
2019/12/06 06:03:53 [INFO]  raft: Election won. Tally: 1
2019/12/06 06:03:53 [INFO]  raft: Node at 127.0.0.1:34360 [Leader] entering Leader state
TestAgent_GetCoordinate - 2019/12/06 06:03:53.402360 [INFO] consul: cluster leadership acquired
TestAgent_GetCoordinate - 2019/12/06 06:03:53.402828 [INFO] consul: New leader elected: Node 57a34851-fe83-ee48-2ee6-1c2861d2e924
TestAgent_GetCoordinate - 2019/12/06 06:03:53.447350 [INFO] agent: Requesting shutdown
TestAgent_GetCoordinate - 2019/12/06 06:03:53.447452 [INFO] consul: shutting down server
TestAgent_GetCoordinate - 2019/12/06 06:03:53.447502 [WARN] serf: Shutdown without a Leave
TestAgent_GetCoordinate - 2019/12/06 06:03:53.447623 [ERR] agent: failed to sync remote state: No cluster leader
TestAgent_GetCoordinate - 2019/12/06 06:03:53.596505 [WARN] serf: Shutdown without a Leave
TestHealthNodeChecks_Filtering - 2019/12/06 06:03:53.627287 [WARN] consul: error getting server health from "Node fc92e9c6-65cc-22c8-60c2-0230feff2661": context deadline exceeded
TestAgent_GetCoordinate - 2019/12/06 06:03:53.803379 [INFO] manager: shutting down
jones - 2019/12/06 06:03:53.803894 [DEBUG] consul: Skipping self join check for "Node 05379951-a361-1a11-bcaf-fc25c0b4fc85" since the cluster is too small
TestAgent_GetCoordinate - 2019/12/06 06:03:53.889474 [INFO] agent: consul server down
TestAgent_GetCoordinate - 2019/12/06 06:03:53.889544 [INFO] agent: shutdown complete
TestAgent_GetCoordinate - 2019/12/06 06:03:53.889605 [INFO] agent: Stopping DNS server 127.0.0.1:34355 (tcp)
TestAgent_GetCoordinate - 2019/12/06 06:03:53.889740 [INFO] agent: Stopping DNS server 127.0.0.1:34355 (udp)
2019/12/06 06:03:53 [INFO]  raft: Initial configuration (index=1): [{Suffrage:Voter ID:5b2c502f-58a9-a076-2905-84243c72f3b6 Address:127.0.0.1:34366}]
TestAgent_GetCoordinate - 2019/12/06 06:03:53.889890 [INFO] agent: Stopping HTTP server 127.0.0.1:34356 (tcp)
TestAgent_GetCoordinate - 2019/12/06 06:03:53.890110 [INFO] agent: Waiting for endpoints to shut down
TestAgent_GetCoordinate - 2019/12/06 06:03:53.890180 [INFO] agent: Endpoints down
--- PASS: TestAgent_GetCoordinate (6.15s)
=== CONT  TestUUIDToUint64
--- PASS: TestUUIDToUint64 (0.00s)
=== CONT  TestEventList_EventBufOrder
TestHealthChecksInState_DistanceSort - 2019/12/06 06:03:53.893187 [INFO] serf: EventMemberJoin: Node 5b2c502f-58a9-a076-2905-84243c72f3b6.dc1 127.0.0.1
TestAgent_GetCoordinate - 2019/12/06 06:03:53.894681 [ERR] consul: failed to wait for barrier: leadership lost while committing log
2019/12/06 06:03:53 [INFO]  raft: Node at 127.0.0.1:34366 [Follower] entering Follower state (Leader: "")
2019/12/06 06:03:53 [WARN]  raft: Heartbeat timeout from "" reached, starting election
2019/12/06 06:03:53 [INFO]  raft: Node at 127.0.0.1:34366 [Candidate] entering Candidate state in term 2
TestHealthChecksInState_DistanceSort - 2019/12/06 06:03:53.961934 [INFO] serf: EventMemberJoin: Node 5b2c502f-58a9-a076-2905-84243c72f3b6 127.0.0.1
TestHealthChecksInState_DistanceSort - 2019/12/06 06:03:53.980854 [INFO] consul: Adding LAN server Node 5b2c502f-58a9-a076-2905-84243c72f3b6 (Addr: tcp/127.0.0.1:34366) (DC: dc1)
TestHealthChecksInState_DistanceSort - 2019/12/06 06:03:53.986147 [INFO] agent: Started DNS server 127.0.0.1:34361 (udp)
TestHealthChecksInState_DistanceSort - 2019/12/06 06:03:53.987945 [INFO] consul: Handled member-join event for server "Node 5b2c502f-58a9-a076-2905-84243c72f3b6.dc1" in area "wan"
TestHealthChecksInState_DistanceSort - 2019/12/06 06:03:53.993809 [INFO] agent: Started DNS server 127.0.0.1:34361 (tcp)
TestHealthChecksInState_DistanceSort - 2019/12/06 06:03:53.998063 [INFO] agent: Started HTTP server on 127.0.0.1:34362 (tcp)
TestHealthChecksInState_DistanceSort - 2019/12/06 06:03:53.998311 [INFO] agent: started state syncer
WARNING: bootstrap = true: do not enable unless necessary
TestEventList_EventBufOrder - 2019/12/06 06:03:54.087744 [WARN] agent: Node name "Node 6f389020-a333-d64c-149b-ad56c3d6a2a0" will not be discoverable via DNS due to invalid characters. Valid characters include all alpha-numerics and dashes.
TestEventList_EventBufOrder - 2019/12/06 06:03:54.088242 [DEBUG] tlsutil: Update with version 1
TestEventList_EventBufOrder - 2019/12/06 06:03:54.090493 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
2019/12/06 06:03:54 [INFO]  raft: Initial configuration (index=1): [{Suffrage:Voter ID:a1559eda-016c-a378-9b47-db962eb7aa5e Address:127.0.0.1:34372}]
2019/12/06 06:03:54 [INFO]  raft: Node at 127.0.0.1:34372 [Follower] entering Follower state (Leader: "")
TestHealthChecksInState_Filter - 2019/12/06 06:03:54.211043 [INFO] serf: EventMemberJoin: Node a1559eda-016c-a378-9b47-db962eb7aa5e.dc1 127.0.0.1
TestHealthChecksInState_Filter - 2019/12/06 06:03:54.246437 [INFO] serf: EventMemberJoin: Node a1559eda-016c-a378-9b47-db962eb7aa5e 127.0.0.1
TestHealthChecksInState_Filter - 2019/12/06 06:03:54.248878 [INFO] consul: Adding LAN server Node a1559eda-016c-a378-9b47-db962eb7aa5e (Addr: tcp/127.0.0.1:34372) (DC: dc1)
TestHealthChecksInState_Filter - 2019/12/06 06:03:54.251576 [INFO] consul: Handled member-join event for server "Node a1559eda-016c-a378-9b47-db962eb7aa5e.dc1" in area "wan"
TestHealthChecksInState_Filter - 2019/12/06 06:03:54.254567 [INFO] agent: Started DNS server 127.0.0.1:34367 (tcp)
2019/12/06 06:03:54 [WARN]  raft: Heartbeat timeout from "" reached, starting election
2019/12/06 06:03:54 [INFO]  raft: Node at 127.0.0.1:34372 [Candidate] entering Candidate state in term 2
TestHealthChecksInState_Filter - 2019/12/06 06:03:54.259269 [INFO] agent: Started DNS server 127.0.0.1:34367 (udp)
TestHealthChecksInState_Filter - 2019/12/06 06:03:54.262016 [INFO] agent: Started HTTP server on 127.0.0.1:34368 (tcp)
TestHealthChecksInState_Filter - 2019/12/06 06:03:54.262139 [INFO] agent: started state syncer
jones - 2019/12/06 06:03:54.604401 [DEBUG] agent: Skipping remote check "serfHealth" since it is managed automatically
jones - 2019/12/06 06:03:54.604480 [DEBUG] agent: Node info in sync
2019/12/06 06:03:56 [INFO]  raft: Initial configuration (index=1): [{Suffrage:Voter ID:6b015a18-4b01-5d0c-7419-548750b683a9 Address:127.0.0.1:34378}]
2019/12/06 06:03:56 [INFO]  raft: Node at 127.0.0.1:34378 [Follower] entering Follower state (Leader: "")
TestHealthChecksInState_NodeMetaFilter - 2019/12/06 06:03:56.083301 [INFO] serf: EventMemberJoin: Node 6b015a18-4b01-5d0c-7419-548750b683a9.dc1 127.0.0.1
TestHealthChecksInState_NodeMetaFilter - 2019/12/06 06:03:56.091791 [INFO] serf: EventMemberJoin: Node 6b015a18-4b01-5d0c-7419-548750b683a9 127.0.0.1
TestHealthChecksInState_NodeMetaFilter - 2019/12/06 06:03:56.093111 [INFO] consul: Adding LAN server Node 6b015a18-4b01-5d0c-7419-548750b683a9 (Addr: tcp/127.0.0.1:34378) (DC: dc1)
TestHealthChecksInState_NodeMetaFilter - 2019/12/06 06:03:56.093398 [INFO] consul: Handled member-join event for server "Node 6b015a18-4b01-5d0c-7419-548750b683a9.dc1" in area "wan"
TestHealthChecksInState_NodeMetaFilter - 2019/12/06 06:03:56.093852 [INFO] agent: Started DNS server 127.0.0.1:34373 (tcp)
TestHealthChecksInState_NodeMetaFilter - 2019/12/06 06:03:56.093936 [INFO] agent: Started DNS server 127.0.0.1:34373 (udp)
TestHealthChecksInState_NodeMetaFilter - 2019/12/06 06:03:56.096431 [INFO] agent: Started HTTP server on 127.0.0.1:34374 (tcp)
TestHealthChecksInState_NodeMetaFilter - 2019/12/06 06:03:56.096546 [INFO] agent: started state syncer
2019/12/06 06:03:56 [WARN]  raft: Heartbeat timeout from "" reached, starting election
2019/12/06 06:03:56 [INFO]  raft: Node at 127.0.0.1:34378 [Candidate] entering Candidate state in term 2
2019/12/06 06:03:56 [INFO]  raft: Election won. Tally: 1
2019/12/06 06:03:56 [INFO]  raft: Node at 127.0.0.1:34366 [Leader] entering Leader state
jones - 2019/12/06 06:03:56.277333 [DEBUG] consul: Skipping self join check for "Node 48766921-a98a-d876-447b-181b21701746" since the cluster is too small
TestHealthChecksInState_DistanceSort - 2019/12/06 06:03:56.277760 [INFO] consul: cluster leadership acquired
TestHealthChecksInState_DistanceSort - 2019/12/06 06:03:56.278240 [INFO] consul: New leader elected: Node 5b2c502f-58a9-a076-2905-84243c72f3b6
2019/12/06 06:03:56 [INFO]  raft: Election won. Tally: 1
2019/12/06 06:03:56 [INFO]  raft: Node at 127.0.0.1:34372 [Leader] entering Leader state
TestHealthChecksInState_Filter - 2019/12/06 06:03:56.627459 [INFO] consul: cluster leadership acquired
TestHealthChecksInState_Filter - 2019/12/06 06:03:56.627902 [INFO] consul: New leader elected: Node a1559eda-016c-a378-9b47-db962eb7aa5e
TestHealthChecksInState_DistanceSort - 2019/12/06 06:03:56.825067 [INFO] agent: Synced node info
2019/12/06 06:03:56 [INFO]  raft: Initial configuration (index=1): [{Suffrage:Voter ID:6f389020-a333-d64c-149b-ad56c3d6a2a0 Address:127.0.0.1:34384}]
2019/12/06 06:03:56 [INFO]  raft: Node at 127.0.0.1:34384 [Follower] entering Follower state (Leader: "")
TestEventList_EventBufOrder - 2019/12/06 06:03:56.833258 [INFO] serf: EventMemberJoin: Node 6f389020-a333-d64c-149b-ad56c3d6a2a0.dc1 127.0.0.1
TestEventList_EventBufOrder - 2019/12/06 06:03:56.836598 [INFO] serf: EventMemberJoin: Node 6f389020-a333-d64c-149b-ad56c3d6a2a0 127.0.0.1
TestEventList_EventBufOrder - 2019/12/06 06:03:56.838064 [INFO] agent: Started DNS server 127.0.0.1:34379 (tcp)
TestEventList_EventBufOrder - 2019/12/06 06:03:56.838264 [INFO] consul: Handled member-join event for server "Node 6f389020-a333-d64c-149b-ad56c3d6a2a0.dc1" in area "wan"
TestEventList_EventBufOrder - 2019/12/06 06:03:56.838290 [INFO] consul: Adding LAN server Node 6f389020-a333-d64c-149b-ad56c3d6a2a0 (Addr: tcp/127.0.0.1:34384) (DC: dc1)
TestEventList_EventBufOrder - 2019/12/06 06:03:56.844694 [INFO] agent: Started DNS server 127.0.0.1:34379 (udp)
TestEventList_EventBufOrder - 2019/12/06 06:03:56.847586 [INFO] agent: Started HTTP server on 127.0.0.1:34380 (tcp)
TestEventList_EventBufOrder - 2019/12/06 06:03:56.847701 [INFO] agent: started state syncer
2019/12/06 06:03:56 [WARN]  raft: Heartbeat timeout from "" reached, starting election
2019/12/06 06:03:56 [INFO]  raft: Node at 127.0.0.1:34384 [Candidate] entering Candidate state in term 2
2019/12/06 06:03:57 [INFO]  raft: Election won. Tally: 1
2019/12/06 06:03:57 [INFO]  raft: Node at 127.0.0.1:34378 [Leader] entering Leader state
TestHealthChecksInState_NodeMetaFilter - 2019/12/06 06:03:57.033426 [INFO] consul: cluster leadership acquired
TestHealthChecksInState_NodeMetaFilter - 2019/12/06 06:03:57.033866 [INFO] consul: New leader elected: Node 6b015a18-4b01-5d0c-7419-548750b683a9
TestHealthChecksInState_Filter - 2019/12/06 06:03:57.151263 [INFO] agent: Synced node info
TestHealthChecksInState_Filter - 2019/12/06 06:03:57.151384 [DEBUG] agent: Node info in sync
TestHealthChecksInState_Filter - 2019/12/06 06:03:57.504687 [DEBUG] agent: Node info in sync
2019/12/06 06:03:57 [INFO]  raft: Election won. Tally: 1
2019/12/06 06:03:57 [INFO]  raft: Node at 127.0.0.1:34384 [Leader] entering Leader state
TestEventList_EventBufOrder - 2019/12/06 06:03:57.581325 [INFO] consul: cluster leadership acquired
TestEventList_EventBufOrder - 2019/12/06 06:03:57.581933 [INFO] consul: New leader elected: Node 6f389020-a333-d64c-149b-ad56c3d6a2a0
TestHealthChecksInState_NodeMetaFilter - 2019/12/06 06:03:57.656388 [INFO] agent: Synced node info
TestHealthChecksInState_NodeMetaFilter - 2019/12/06 06:03:57.656542 [DEBUG] agent: Node info in sync
TestHealthChecksInState_NodeMetaFilter - 2019/12/06 06:03:57.657782 [INFO] agent: Requesting shutdown
TestHealthChecksInState_NodeMetaFilter - 2019/12/06 06:03:57.657878 [INFO] consul: shutting down server
TestHealthChecksInState_NodeMetaFilter - 2019/12/06 06:03:57.658880 [WARN] serf: Shutdown without a Leave
TestHealthChecksInState_NodeMetaFilter - 2019/12/06 06:03:57.743553 [WARN] serf: Shutdown without a Leave
TestHealthChecksInState_NodeMetaFilter - 2019/12/06 06:03:57.811959 [DEBUG] agent: Node info in sync
TestHealthChecksInState_NodeMetaFilter - 2019/12/06 06:03:57.822544 [INFO] manager: shutting down
TestHealthChecksInState_Filter - 2019/12/06 06:03:57.824152 [INFO] agent: Requesting shutdown
TestHealthChecksInState_Filter - 2019/12/06 06:03:57.824326 [INFO] consul: shutting down server
TestHealthChecksInState_Filter - 2019/12/06 06:03:57.824379 [WARN] serf: Shutdown without a Leave
TestHealthChecksInState_NodeMetaFilter - 2019/12/06 06:03:57.902296 [INFO] agent: consul server down
TestHealthChecksInState_NodeMetaFilter - 2019/12/06 06:03:57.902393 [INFO] agent: shutdown complete
TestHealthChecksInState_NodeMetaFilter - 2019/12/06 06:03:57.902474 [INFO] agent: Stopping DNS server 127.0.0.1:34373 (tcp)
TestHealthChecksInState_NodeMetaFilter - 2019/12/06 06:03:57.902672 [INFO] agent: Stopping DNS server 127.0.0.1:34373 (udp)
TestHealthChecksInState_NodeMetaFilter - 2019/12/06 06:03:57.902868 [INFO] agent: Stopping HTTP server 127.0.0.1:34374 (tcp)
TestEventList_EventBufOrder - 2019/12/06 06:03:57.903023 [INFO] agent: Synced node info
TestHealthChecksInState_NodeMetaFilter - 2019/12/06 06:03:57.903126 [INFO] agent: Waiting for endpoints to shut down
TestHealthChecksInState_NodeMetaFilter - 2019/12/06 06:03:57.903198 [INFO] agent: Endpoints down
--- PASS: TestHealthChecksInState_NodeMetaFilter (4.72s)
=== CONT  TestEventList_Blocking
TestHealthChecksInState_Filter - 2019/12/06 06:03:57.906705 [WARN] serf: Shutdown without a Leave
TestHealthChecksInState_NodeMetaFilter - 2019/12/06 06:03:57.906769 [ERR] autopilot: failed to initialize config: leadership lost while committing log
TestHealthChecksInState_NodeMetaFilter - 2019/12/06 06:03:57.907102 [ERR] consul: failed to establish leadership: raft is already shutdown
TestHealthChecksInState_NodeMetaFilter - 2019/12/06 06:03:57.907262 [ERR] consul: failed to transfer leadership attempt 0/3: raft is already shutdown
TestHealthChecksInState_NodeMetaFilter - 2019/12/06 06:03:57.907326 [ERR] consul: failed to transfer leadership attempt 1/3: raft is already shutdown
WARNING: bootstrap = true: do not enable unless necessary
TestEventList_Blocking - 2019/12/06 06:03:57.962014 [WARN] agent: Node name "Node 02cf80d2-d43c-b227-60c5-790042b65e80" will not be discoverable via DNS due to invalid characters. Valid characters include all alpha-numerics and dashes.
TestEventList_Blocking - 2019/12/06 06:03:57.962413 [DEBUG] tlsutil: Update with version 1
TestEventList_Blocking - 2019/12/06 06:03:57.964571 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
TestHealthChecksInState_Filter - 2019/12/06 06:03:58.025654 [INFO] manager: shutting down
TestHealthChecksInState_Filter - 2019/12/06 06:03:58.026437 [INFO] agent: consul server down
TestHealthChecksInState_Filter - 2019/12/06 06:03:58.026565 [INFO] agent: shutdown complete
TestHealthChecksInState_Filter - 2019/12/06 06:03:58.026651 [INFO] agent: Stopping DNS server 127.0.0.1:34367 (tcp)
TestHealthChecksInState_Filter - 2019/12/06 06:03:58.026880 [INFO] agent: Stopping DNS server 127.0.0.1:34367 (udp)
TestHealthChecksInState_Filter - 2019/12/06 06:03:58.027107 [INFO] agent: Stopping HTTP server 127.0.0.1:34368 (tcp)
TestHealthChecksInState_Filter - 2019/12/06 06:03:58.027383 [INFO] agent: Waiting for endpoints to shut down
TestHealthChecksInState_Filter - 2019/12/06 06:03:58.027472 [INFO] agent: Endpoints down
--- PASS: TestHealthChecksInState_Filter (5.13s)
=== CONT  TestEventList_ACLFilter
TestHealthChecksInState_DistanceSort - 2019/12/06 06:03:58.029713 [INFO] agent: Requesting shutdown
TestHealthChecksInState_DistanceSort - 2019/12/06 06:03:58.029818 [INFO] consul: shutting down server
TestHealthChecksInState_DistanceSort - 2019/12/06 06:03:58.029867 [WARN] serf: Shutdown without a Leave
TestHealthChecksInState_Filter - 2019/12/06 06:03:58.045662 [ERR] consul: failed to establish leadership: error generating CA root certificate: raft is already shutdown
TestHealthChecksInState_Filter - 2019/12/06 06:03:58.050325 [ERR] consul: failed to transfer leadership attempt 0/3: raft is already shutdown
TestHealthChecksInState_Filter - 2019/12/06 06:03:58.052636 [ERR] consul: failed to transfer leadership attempt 1/3: raft is already shutdown
TestHealthChecksInState_Filter - 2019/12/06 06:03:58.052939 [ERR] consul: failed to transfer leadership attempt 2/3: raft is already shutdown
TestHealthChecksInState_Filter - 2019/12/06 06:03:58.053119 [ERR] consul: failed to transfer leadership in 3 attempts
WARNING: The 'acl_datacenter' field is deprecated. Use the 'primary_datacenter' field instead.
WARNING: bootstrap = true: do not enable unless necessary
TestEventList_ACLFilter - 2019/12/06 06:03:58.112820 [WARN] agent: Node name "Node f3b49bd4-e640-c384-83cd-e1708866be21" will not be discoverable via DNS due to invalid characters. Valid characters include all alpha-numerics and dashes.
TestEventList_ACLFilter - 2019/12/06 06:03:58.113264 [DEBUG] tlsutil: Update with version 1
TestEventList_ACLFilter - 2019/12/06 06:03:58.115788 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
TestHealthChecksInState_DistanceSort - 2019/12/06 06:03:58.130032 [WARN] serf: Shutdown without a Leave
TestHealthChecksInState_DistanceSort - 2019/12/06 06:03:58.324786 [INFO] manager: shutting down
TestHealthChecksInState_DistanceSort - 2019/12/06 06:03:58.628044 [ERR] connect: Apply failed leadership lost while committing log
TestHealthChecksInState_DistanceSort - 2019/12/06 06:03:58.628130 [ERR] consul: failed to establish leadership: leadership lost while committing log
TestHealthChecksInState_DistanceSort - 2019/12/06 06:03:58.628366 [INFO] agent: consul server down
TestHealthChecksInState_DistanceSort - 2019/12/06 06:03:58.628416 [INFO] agent: shutdown complete
TestHealthChecksInState_DistanceSort - 2019/12/06 06:03:58.628470 [INFO] agent: Stopping DNS server 127.0.0.1:34361 (tcp)
TestHealthChecksInState_DistanceSort - 2019/12/06 06:03:58.628622 [INFO] agent: Stopping DNS server 127.0.0.1:34361 (udp)
TestHealthChecksInState_DistanceSort - 2019/12/06 06:03:58.628782 [INFO] agent: Stopping HTTP server 127.0.0.1:34362 (tcp)
TestHealthChecksInState_DistanceSort - 2019/12/06 06:03:58.628999 [INFO] agent: Waiting for endpoints to shut down
TestHealthChecksInState_DistanceSort - 2019/12/06 06:03:58.629086 [INFO] agent: Endpoints down
--- PASS: TestHealthChecksInState_DistanceSort (6.08s)
=== CONT  TestEventList_Filter
WARNING: bootstrap = true: do not enable unless necessary
TestEventList_Filter - 2019/12/06 06:03:58.745590 [WARN] agent: Node name "Node 977415bf-24a4-a237-ce4d-39395970d064" will not be discoverable via DNS due to invalid characters. Valid characters include all alpha-numerics and dashes.
TestEventList_Filter - 2019/12/06 06:03:58.746057 [DEBUG] tlsutil: Update with version 1
TestEventList_Filter - 2019/12/06 06:03:58.748202 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
2019/12/06 06:03:59 [INFO]  raft: Initial configuration (index=1): [{Suffrage:Voter ID:02cf80d2-d43c-b227-60c5-790042b65e80 Address:127.0.0.1:34390}]
2019/12/06 06:03:59 [INFO]  raft: Node at 127.0.0.1:34390 [Follower] entering Follower state (Leader: "")
TestEventList_Blocking - 2019/12/06 06:03:59.326271 [INFO] serf: EventMemberJoin: Node 02cf80d2-d43c-b227-60c5-790042b65e80.dc1 127.0.0.1
TestEventList_Blocking - 2019/12/06 06:03:59.329979 [INFO] serf: EventMemberJoin: Node 02cf80d2-d43c-b227-60c5-790042b65e80 127.0.0.1
TestEventList_Blocking - 2019/12/06 06:03:59.330815 [INFO] consul: Handled member-join event for server "Node 02cf80d2-d43c-b227-60c5-790042b65e80.dc1" in area "wan"
TestEventList_Blocking - 2019/12/06 06:03:59.331118 [INFO] consul: Adding LAN server Node 02cf80d2-d43c-b227-60c5-790042b65e80 (Addr: tcp/127.0.0.1:34390) (DC: dc1)
TestEventList_Blocking - 2019/12/06 06:03:59.331509 [INFO] agent: Started DNS server 127.0.0.1:34385 (tcp)
TestEventList_Blocking - 2019/12/06 06:03:59.331806 [INFO] agent: Started DNS server 127.0.0.1:34385 (udp)
TestEventList_Blocking - 2019/12/06 06:03:59.334472 [INFO] agent: Started HTTP server on 127.0.0.1:34386 (tcp)
TestEventList_Blocking - 2019/12/06 06:03:59.334580 [INFO] agent: started state syncer
2019/12/06 06:03:59 [WARN]  raft: Heartbeat timeout from "" reached, starting election
2019/12/06 06:03:59 [INFO]  raft: Node at 127.0.0.1:34390 [Candidate] entering Candidate state in term 2
TestEventList_EventBufOrder - 2019/12/06 06:03:59.452165 [DEBUG] agent: Node info in sync
TestEventList_EventBufOrder - 2019/12/06 06:03:59.452314 [DEBUG] agent: Node info in sync
TestEventList_EventBufOrder - 2019/12/06 06:03:59.453461 [INFO] connect: initialized primary datacenter CA with provider "consul"
TestEventList_EventBufOrder - 2019/12/06 06:03:59.453977 [DEBUG] consul: Skipping self join check for "Node 6f389020-a333-d64c-149b-ad56c3d6a2a0" since the cluster is too small
TestEventList_EventBufOrder - 2019/12/06 06:03:59.454162 [INFO] consul: member 'Node 6f389020-a333-d64c-149b-ad56c3d6a2a0' joined, marking health alive
2019/12/06 06:03:59 [INFO]  raft: Initial configuration (index=1): [{Suffrage:Voter ID:f3b49bd4-e640-c384-83cd-e1708866be21 Address:127.0.0.1:34396}]
2019/12/06 06:03:59 [INFO]  raft: Node at 127.0.0.1:34396 [Follower] entering Follower state (Leader: "")
TestEventList_ACLFilter - 2019/12/06 06:03:59.656545 [INFO] serf: EventMemberJoin: Node f3b49bd4-e640-c384-83cd-e1708866be21.dc1 127.0.0.1
TestEventList_ACLFilter - 2019/12/06 06:03:59.661798 [INFO] serf: EventMemberJoin: Node f3b49bd4-e640-c384-83cd-e1708866be21 127.0.0.1
TestEventList_ACLFilter - 2019/12/06 06:03:59.666003 [INFO] consul: Handled member-join event for server "Node f3b49bd4-e640-c384-83cd-e1708866be21.dc1" in area "wan"
TestEventList_ACLFilter - 2019/12/06 06:03:59.666116 [INFO] consul: Adding LAN server Node f3b49bd4-e640-c384-83cd-e1708866be21 (Addr: tcp/127.0.0.1:34396) (DC: dc1)
TestEventList_ACLFilter - 2019/12/06 06:03:59.666653 [INFO] agent: Started DNS server 127.0.0.1:34391 (udp)
TestEventList_ACLFilter - 2019/12/06 06:03:59.666734 [INFO] agent: Started DNS server 127.0.0.1:34391 (tcp)
TestEventList_ACLFilter - 2019/12/06 06:03:59.669084 [INFO] agent: Started HTTP server on 127.0.0.1:34392 (tcp)
TestEventList_ACLFilter - 2019/12/06 06:03:59.669413 [INFO] agent: started state syncer
2019/12/06 06:03:59 [WARN]  raft: Heartbeat timeout from "" reached, starting election
2019/12/06 06:03:59 [INFO]  raft: Node at 127.0.0.1:34396 [Candidate] entering Candidate state in term 2
TestEventList_EventBufOrder - 2019/12/06 06:03:59.826869 [DEBUG] consul: User event: foo
TestEventList_EventBufOrder - 2019/12/06 06:03:59.827050 [DEBUG] agent: new event: foo (9c65ff92-c96c-b4b8-fb19-5e854154b47f)
TestEventList_EventBufOrder - 2019/12/06 06:03:59.827392 [DEBUG] consul: User event: bar
TestEventList_EventBufOrder - 2019/12/06 06:03:59.827456 [DEBUG] consul: User event: foo
TestEventList_EventBufOrder - 2019/12/06 06:03:59.827578 [DEBUG] agent: new event: bar (4f52ef71-7d92-1fba-40a4-a7b7d6d5c715)
TestEventList_EventBufOrder - 2019/12/06 06:03:59.827799 [DEBUG] consul: User event: foo
TestEventList_EventBufOrder - 2019/12/06 06:03:59.827854 [DEBUG] consul: User event: bar
TestEventList_EventBufOrder - 2019/12/06 06:03:59.828106 [DEBUG] agent: new event: foo (437f2d9b-7de0-fb7a-f757-a7d16b451364)
TestEventList_EventBufOrder - 2019/12/06 06:03:59.828233 [DEBUG] agent: new event: foo (729d8616-0538-c93d-8fb0-973e1a3f2804)
TestEventList_EventBufOrder - 2019/12/06 06:03:59.828332 [DEBUG] agent: new event: bar (a3c5cc3a-adb1-a9d7-86bb-6182c6babb97)
TestEventList_EventBufOrder - 2019/12/06 06:03:59.853358 [INFO] agent: Requesting shutdown
TestEventList_EventBufOrder - 2019/12/06 06:03:59.853476 [INFO] consul: shutting down server
TestEventList_EventBufOrder - 2019/12/06 06:03:59.853537 [WARN] serf: Shutdown without a Leave
TestEventList_EventBufOrder - 2019/12/06 06:04:00.060288 [WARN] serf: Shutdown without a Leave
TestEventList_EventBufOrder - 2019/12/06 06:04:00.193712 [INFO] manager: shutting down
2019/12/06 06:04:00 [INFO]  raft: Initial configuration (index=1): [{Suffrage:Voter ID:977415bf-24a4-a237-ce4d-39395970d064 Address:127.0.0.1:34402}]
TestEventList_EventBufOrder - 2019/12/06 06:04:00.194347 [INFO] agent: consul server down
TestEventList_EventBufOrder - 2019/12/06 06:04:00.194397 [INFO] agent: shutdown complete
TestEventList_EventBufOrder - 2019/12/06 06:04:00.194453 [INFO] agent: Stopping DNS server 127.0.0.1:34379 (tcp)
TestEventList_EventBufOrder - 2019/12/06 06:04:00.194634 [INFO] agent: Stopping DNS server 127.0.0.1:34379 (udp)
TestEventList_EventBufOrder - 2019/12/06 06:04:00.194845 [INFO] agent: Stopping HTTP server 127.0.0.1:34380 (tcp)
TestEventList_EventBufOrder - 2019/12/06 06:04:00.195100 [INFO] agent: Waiting for endpoints to shut down
TestEventList_EventBufOrder - 2019/12/06 06:04:00.195181 [INFO] agent: Endpoints down
--- PASS: TestEventList_EventBufOrder (6.30s)
=== CONT  TestEventList
2019/12/06 06:04:00 [INFO]  raft: Node at 127.0.0.1:34402 [Follower] entering Follower state (Leader: "")
TestEventList_Filter - 2019/12/06 06:04:00.198479 [INFO] serf: EventMemberJoin: Node 977415bf-24a4-a237-ce4d-39395970d064.dc1 127.0.0.1
2019/12/06 06:04:00 [INFO]  raft: Election won. Tally: 1
2019/12/06 06:04:00 [INFO]  raft: Node at 127.0.0.1:34390 [Leader] entering Leader state
TestEventList_Blocking - 2019/12/06 06:04:00.200407 [INFO] consul: cluster leadership acquired
TestEventList_Blocking - 2019/12/06 06:04:00.200805 [INFO] consul: New leader elected: Node 02cf80d2-d43c-b227-60c5-790042b65e80
TestEventList_Filter - 2019/12/06 06:04:00.203637 [INFO] serf: EventMemberJoin: Node 977415bf-24a4-a237-ce4d-39395970d064 127.0.0.1
TestEventList_Filter - 2019/12/06 06:04:00.204472 [INFO] consul: Handled member-join event for server "Node 977415bf-24a4-a237-ce4d-39395970d064.dc1" in area "wan"
TestEventList_Filter - 2019/12/06 06:04:00.204800 [INFO] consul: Adding LAN server Node 977415bf-24a4-a237-ce4d-39395970d064 (Addr: tcp/127.0.0.1:34402) (DC: dc1)
TestEventList_Filter - 2019/12/06 06:04:00.210225 [INFO] agent: Started DNS server 127.0.0.1:34397 (udp)
TestEventList_Filter - 2019/12/06 06:04:00.210761 [INFO] agent: Started DNS server 127.0.0.1:34397 (tcp)
TestEventList_Filter - 2019/12/06 06:04:00.213075 [INFO] agent: Started HTTP server on 127.0.0.1:34398 (tcp)
TestEventList_Filter - 2019/12/06 06:04:00.213165 [INFO] agent: started state syncer
2019/12/06 06:04:00 [WARN]  raft: Heartbeat timeout from "" reached, starting election
2019/12/06 06:04:00 [INFO]  raft: Node at 127.0.0.1:34402 [Candidate] entering Candidate state in term 2
WARNING: bootstrap = true: do not enable unless necessary
TestEventList - 2019/12/06 06:04:00.284425 [WARN] agent: Node name "Node 2766dbc2-49b0-4522-46ee-8b57dad64eb2" will not be discoverable via DNS due to invalid characters. Valid characters include all alpha-numerics and dashes.
TestEventList - 2019/12/06 06:04:00.284919 [DEBUG] tlsutil: Update with version 1
TestEventList - 2019/12/06 06:04:00.287063 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
2019/12/06 06:04:00 [INFO]  raft: Election won. Tally: 1
2019/12/06 06:04:00 [INFO]  raft: Node at 127.0.0.1:34396 [Leader] entering Leader state
TestEventList_ACLFilter - 2019/12/06 06:04:00.693977 [INFO] consul: cluster leadership acquired
TestEventList_ACLFilter - 2019/12/06 06:04:00.694506 [INFO] consul: New leader elected: Node f3b49bd4-e640-c384-83cd-e1708866be21
TestEventList_ACLFilter - 2019/12/06 06:04:00.765033 [ERR] agent: failed to sync remote state: ACL not found
TestEventList_Blocking - 2019/12/06 06:04:00.928084 [INFO] agent: Synced node info
2019/12/06 06:04:01 [INFO]  raft: Election won. Tally: 1
2019/12/06 06:04:01 [INFO]  raft: Node at 127.0.0.1:34402 [Leader] entering Leader state
TestEventList_Filter - 2019/12/06 06:04:01.070561 [INFO] consul: cluster leadership acquired
TestEventList_Filter - 2019/12/06 06:04:01.073541 [INFO] consul: New leader elected: Node 977415bf-24a4-a237-ce4d-39395970d064
TestEventList_ACLFilter - 2019/12/06 06:04:01.193941 [INFO] acl: initializing acls
TestEventList_ACLFilter - 2019/12/06 06:04:01.217347 [INFO] acl: initializing acls
TestEventList_Blocking - 2019/12/06 06:04:01.410726 [DEBUG] agent: Node info in sync
TestEventList_Blocking - 2019/12/06 06:04:01.410859 [DEBUG] agent: Node info in sync
TestEventList_Filter - 2019/12/06 06:04:01.546124 [INFO] agent: Synced node info
TestEventList_Filter - 2019/12/06 06:04:01.546278 [DEBUG] agent: Node info in sync
2019/12/06 06:04:01 [INFO]  raft: Initial configuration (index=1): [{Suffrage:Voter ID:2766dbc2-49b0-4522-46ee-8b57dad64eb2 Address:127.0.0.1:34408}]
2019/12/06 06:04:01 [INFO]  raft: Node at 127.0.0.1:34408 [Follower] entering Follower state (Leader: "")
TestEventList - 2019/12/06 06:04:01.555491 [INFO] serf: EventMemberJoin: Node 2766dbc2-49b0-4522-46ee-8b57dad64eb2.dc1 127.0.0.1
TestEventList - 2019/12/06 06:04:01.558773 [INFO] serf: EventMemberJoin: Node 2766dbc2-49b0-4522-46ee-8b57dad64eb2 127.0.0.1
TestEventList - 2019/12/06 06:04:01.559544 [INFO] consul: Handled member-join event for server "Node 2766dbc2-49b0-4522-46ee-8b57dad64eb2.dc1" in area "wan"
TestEventList - 2019/12/06 06:04:01.559865 [INFO] consul: Adding LAN server Node 2766dbc2-49b0-4522-46ee-8b57dad64eb2 (Addr: tcp/127.0.0.1:34408) (DC: dc1)
TestEventList - 2019/12/06 06:04:01.560127 [INFO] agent: Started DNS server 127.0.0.1:34403 (udp)
TestEventList - 2019/12/06 06:04:01.560322 [INFO] agent: Started DNS server 127.0.0.1:34403 (tcp)
TestEventList - 2019/12/06 06:04:01.562613 [INFO] agent: Started HTTP server on 127.0.0.1:34404 (tcp)
TestEventList - 2019/12/06 06:04:01.562717 [INFO] agent: started state syncer
2019/12/06 06:04:01 [WARN]  raft: Heartbeat timeout from "" reached, starting election
2019/12/06 06:04:01 [INFO]  raft: Node at 127.0.0.1:34408 [Candidate] entering Candidate state in term 2
TestEventList_ACLFilter - 2019/12/06 06:04:01.711068 [INFO] consul: Created ACL 'global-management' policy
TestEventList_ACLFilter - 2019/12/06 06:04:01.711164 [WARN] consul: Configuring a non-UUID master token is deprecated
TestEventList_ACLFilter - 2019/12/06 06:04:01.711441 [INFO] consul: Created ACL 'global-management' policy
TestEventList_ACLFilter - 2019/12/06 06:04:01.711500 [WARN] consul: Configuring a non-UUID master token is deprecated
TestEventList_ACLFilter - 2019/12/06 06:04:02.328587 [INFO] consul: Bootstrapped ACL master token from configuration
TestEventList_ACLFilter - 2019/12/06 06:04:02.329934 [INFO] consul: Bootstrapped ACL master token from configuration
2019/12/06 06:04:02 [INFO]  raft: Election won. Tally: 1
2019/12/06 06:04:02 [INFO]  raft: Node at 127.0.0.1:34408 [Leader] entering Leader state
TestEventList - 2019/12/06 06:04:02.466698 [INFO] consul: cluster leadership acquired
TestEventList - 2019/12/06 06:04:02.467393 [INFO] consul: New leader elected: Node 2766dbc2-49b0-4522-46ee-8b57dad64eb2
TestEventList_Blocking - 2019/12/06 06:04:02.600340 [INFO] connect: initialized primary datacenter CA with provider "consul"
TestEventList_Blocking - 2019/12/06 06:04:02.600787 [DEBUG] consul: Skipping self join check for "Node 02cf80d2-d43c-b227-60c5-790042b65e80" since the cluster is too small
TestEventList_Blocking - 2019/12/06 06:04:02.600959 [INFO] consul: member 'Node 02cf80d2-d43c-b227-60c5-790042b65e80' joined, marking health alive
TestEventList_ACLFilter - 2019/12/06 06:04:02.795254 [INFO] consul: Created ACL anonymous token from configuration
TestEventList_ACLFilter - 2019/12/06 06:04:02.795371 [DEBUG] acl: transitioning out of legacy ACL mode
TestEventList_ACLFilter - 2019/12/06 06:04:02.795999 [INFO] consul: Created ACL anonymous token from configuration
TestEventList_ACLFilter - 2019/12/06 06:04:02.796379 [INFO] serf: EventMemberUpdate: Node f3b49bd4-e640-c384-83cd-e1708866be21
TestEventList_ACLFilter - 2019/12/06 06:04:02.797133 [INFO] serf: EventMemberUpdate: Node f3b49bd4-e640-c384-83cd-e1708866be21.dc1
TestEventList_ACLFilter - 2019/12/06 06:04:02.798032 [INFO] serf: EventMemberUpdate: Node f3b49bd4-e640-c384-83cd-e1708866be21
TestEventList_ACLFilter - 2019/12/06 06:04:02.798657 [INFO] serf: EventMemberUpdate: Node f3b49bd4-e640-c384-83cd-e1708866be21.dc1
TestEventList - 2019/12/06 06:04:02.927878 [INFO] agent: Synced node info
TestEventList_Blocking - 2019/12/06 06:04:02.941884 [DEBUG] consul: User event: test
TestEventList_Blocking - 2019/12/06 06:04:02.942049 [DEBUG] agent: new event: test (10547ff7-83b9-1544-63e1-7621b88fff03)
TestEventList_Blocking - 2019/12/06 06:04:02.992794 [DEBUG] consul: User event: second
TestEventList_Blocking - 2019/12/06 06:04:02.992985 [DEBUG] agent: new event: second (597ef61e-4cac-2ead-75c2-dc980dd6de6e)
TestEventList_Blocking - 2019/12/06 06:04:02.993249 [INFO] agent: Requesting shutdown
TestEventList_Blocking - 2019/12/06 06:04:02.993328 [INFO] consul: shutting down server
TestEventList_Blocking - 2019/12/06 06:04:02.993378 [WARN] serf: Shutdown without a Leave
TestEventList_Filter - 2019/12/06 06:04:03.211075 [INFO] connect: initialized primary datacenter CA with provider "consul"
TestEventList_Filter - 2019/12/06 06:04:03.211587 [DEBUG] consul: Skipping self join check for "Node 977415bf-24a4-a237-ce4d-39395970d064" since the cluster is too small
TestEventList_Filter - 2019/12/06 06:04:03.211742 [INFO] consul: member 'Node 977415bf-24a4-a237-ce4d-39395970d064' joined, marking health alive
TestEventList_Blocking - 2019/12/06 06:04:03.215592 [WARN] serf: Shutdown without a Leave
TestEventList_Filter - 2019/12/06 06:04:03.267456 [DEBUG] agent: Node info in sync
TestEventList_ACLFilter - 2019/12/06 06:04:03.345546 [INFO] agent: Synced node info
TestEventList_ACLFilter - 2019/12/06 06:04:03.345681 [DEBUG] agent: Node info in sync
TestEventList_ACLFilter - 2019/12/06 06:04:03.353395 [DEBUG] consul: dropping node "Node f3b49bd4-e640-c384-83cd-e1708866be21" from result due to ACLs
=== RUN   TestEventList_ACLFilter/no_token
=== RUN   TestEventList_ACLFilter/root_token
TestEventList_ACLFilter - 2019/12/06 06:04:03.355318 [DEBUG] consul: User event: foo
TestEventList_ACLFilter - 2019/12/06 06:04:03.355479 [DEBUG] agent: new event: foo (0c4ee45d-9d8c-2ed7-8452-932af5876c1d)
TestEventList_ACLFilter - 2019/12/06 06:04:03.381234 [INFO] agent: Requesting shutdown
TestEventList_ACLFilter - 2019/12/06 06:04:03.381345 [INFO] consul: shutting down server
TestEventList_ACLFilter - 2019/12/06 06:04:03.381416 [WARN] serf: Shutdown without a Leave
TestEventList_Blocking - 2019/12/06 06:04:03.489274 [INFO] manager: shutting down
TestEventList_Blocking - 2019/12/06 06:04:03.489772 [INFO] agent: consul server down
TestEventList_Blocking - 2019/12/06 06:04:03.489830 [INFO] agent: shutdown complete
TestEventList_Blocking - 2019/12/06 06:04:03.489889 [INFO] agent: Stopping DNS server 127.0.0.1:34385 (tcp)
TestEventList_Blocking - 2019/12/06 06:04:03.490034 [INFO] agent: Stopping DNS server 127.0.0.1:34385 (udp)
TestEventList_Blocking - 2019/12/06 06:04:03.490210 [INFO] agent: Stopping HTTP server 127.0.0.1:34386 (tcp)
TestEventList_Blocking - 2019/12/06 06:04:03.490441 [INFO] agent: Waiting for endpoints to shut down
TestEventList_Blocking - 2019/12/06 06:04:03.490512 [INFO] agent: Endpoints down
--- PASS: TestEventList_Blocking (5.59s)
=== CONT  TestEventFire_token
TestEventList_ACLFilter - 2019/12/06 06:04:03.606549 [WARN] serf: Shutdown without a Leave
TestEventList_Filter - 2019/12/06 06:04:03.656201 [DEBUG] consul: User event: test
TestEventList_Filter - 2019/12/06 06:04:03.656319 [DEBUG] consul: User event: foo
TestEventList_Filter - 2019/12/06 06:04:03.656550 [DEBUG] agent: new event: test (cdf856d4-7510-cbf2-28b4-64cc3355e83b)
TestEventList_Filter - 2019/12/06 06:04:03.656676 [DEBUG] agent: new event: foo (cd3965a5-841d-57b7-85dd-151146f6f652)
TestEventList_Filter - 2019/12/06 06:04:03.688855 [INFO] agent: Requesting shutdown
TestEventList_Filter - 2019/12/06 06:04:03.689114 [INFO] consul: shutting down server
TestEventList_Filter - 2019/12/06 06:04:03.689314 [WARN] serf: Shutdown without a Leave
WARNING: The 'acl_datacenter' field is deprecated. Use the 'primary_datacenter' field instead.
WARNING: bootstrap = true: do not enable unless necessary
TestEventList_ACLFilter - 2019/12/06 06:04:03.714507 [INFO] manager: shutting down
TestEventFire_token - 2019/12/06 06:04:03.718811 [WARN] agent: Node name "Node 56448cea-91ce-aa36-bb66-f089a4ab4dc0" will not be discoverable via DNS due to invalid characters. Valid characters include all alpha-numerics and dashes.
TestEventFire_token - 2019/12/06 06:04:03.719578 [DEBUG] tlsutil: Update with version 1
TestEventFire_token - 2019/12/06 06:04:03.722055 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
TestEventList_Filter - 2019/12/06 06:04:03.804257 [WARN] serf: Shutdown without a Leave
TestEventList_Filter - 2019/12/06 06:04:04.002052 [INFO] manager: shutting down
TestEventList_ACLFilter - 2019/12/06 06:04:04.002318 [INFO] agent: consul server down
TestEventList_ACLFilter - 2019/12/06 06:04:04.002369 [INFO] agent: shutdown complete
TestEventList_ACLFilter - 2019/12/06 06:04:04.002430 [INFO] agent: Stopping DNS server 127.0.0.1:34391 (tcp)
TestEventList_Filter - 2019/12/06 06:04:04.002459 [INFO] agent: consul server down
TestEventList_Filter - 2019/12/06 06:04:04.002501 [INFO] agent: shutdown complete
TestEventList_Filter - 2019/12/06 06:04:04.002578 [INFO] agent: Stopping DNS server 127.0.0.1:34397 (tcp)
TestEventList_ACLFilter - 2019/12/06 06:04:04.002635 [INFO] agent: Stopping DNS server 127.0.0.1:34391 (udp)
TestEventList_Filter - 2019/12/06 06:04:04.002717 [INFO] agent: Stopping DNS server 127.0.0.1:34397 (udp)
TestEventList_Filter - 2019/12/06 06:04:04.002870 [INFO] agent: Stopping HTTP server 127.0.0.1:34398 (tcp)
TestEventList_ACLFilter - 2019/12/06 06:04:04.002870 [INFO] agent: Stopping HTTP server 127.0.0.1:34392 (tcp)
TestEventList_Filter - 2019/12/06 06:04:04.003098 [INFO] agent: Waiting for endpoints to shut down
TestEventList_Filter - 2019/12/06 06:04:04.003177 [INFO] agent: Endpoints down
--- PASS: TestEventList_Filter (5.37s)
TestEventList_ACLFilter - 2019/12/06 06:04:04.003292 [INFO] agent: Waiting for endpoints to shut down
=== CONT  TestEventFire
TestEventList_ACLFilter - 2019/12/06 06:04:04.003356 [INFO] agent: Endpoints down
--- PASS: TestEventList_ACLFilter (5.98s)
    --- PASS: TestEventList_ACLFilter/no_token (0.00s)
    --- PASS: TestEventList_ACLFilter/root_token (0.03s)
=== CONT  TestDNS_trimUDPResponse_NoTrim
TestEventList_ACLFilter - 2019/12/06 06:04:04.002953 [ERR] consul: failed to establish leadership: error configuring provider: leadership lost while committing log
--- PASS: TestDNS_trimUDPResponse_NoTrim (0.07s)
=== CONT  TestDNS_ReloadConfig_DuringQuery
WARNING: bootstrap = true: do not enable unless necessary
TestEventFire - 2019/12/06 06:04:04.089796 [WARN] agent: Node name "Node 782a9117-4f49-2c82-baae-9b5cea4ab169" will not be discoverable via DNS due to invalid characters. Valid characters include all alpha-numerics and dashes.
TestEventFire - 2019/12/06 06:04:04.090238 [DEBUG] tlsutil: Update with version 1
TestEventFire - 2019/12/06 06:04:04.092533 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
WARNING: bootstrap = true: do not enable unless necessary
TestDNS_ReloadConfig_DuringQuery - 2019/12/06 06:04:04.152242 [WARN] agent: Node name "Node 5a650eeb-a5da-7076-2036-fbb97f36e94d" will not be discoverable via DNS due to invalid characters. Valid characters include all alpha-numerics and dashes.
TestDNS_ReloadConfig_DuringQuery - 2019/12/06 06:04:04.152820 [DEBUG] tlsutil: Update with version 1
TestDNS_ReloadConfig_DuringQuery - 2019/12/06 06:04:04.155501 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
TestEventList - 2019/12/06 06:04:04.794565 [INFO] connect: initialized primary datacenter CA with provider "consul"
TestEventList - 2019/12/06 06:04:04.795092 [DEBUG] consul: Skipping self join check for "Node 2766dbc2-49b0-4522-46ee-8b57dad64eb2" since the cluster is too small
TestEventList - 2019/12/06 06:04:04.795247 [INFO] consul: member 'Node 2766dbc2-49b0-4522-46ee-8b57dad64eb2' joined, marking health alive
2019/12/06 06:04:04 [INFO]  raft: Initial configuration (index=1): [{Suffrage:Voter ID:56448cea-91ce-aa36-bb66-f089a4ab4dc0 Address:127.0.0.1:34414}]
2019/12/06 06:04:04 [INFO]  raft: Node at 127.0.0.1:34414 [Follower] entering Follower state (Leader: "")
TestEventFire_token - 2019/12/06 06:04:04.925469 [INFO] serf: EventMemberJoin: Node 56448cea-91ce-aa36-bb66-f089a4ab4dc0.dc1 127.0.0.1
TestEventFire_token - 2019/12/06 06:04:04.932156 [INFO] serf: EventMemberJoin: Node 56448cea-91ce-aa36-bb66-f089a4ab4dc0 127.0.0.1
TestEventFire_token - 2019/12/06 06:04:04.933833 [INFO] consul: Adding LAN server Node 56448cea-91ce-aa36-bb66-f089a4ab4dc0 (Addr: tcp/127.0.0.1:34414) (DC: dc1)
TestEventFire_token - 2019/12/06 06:04:04.934046 [INFO] consul: Handled member-join event for server "Node 56448cea-91ce-aa36-bb66-f089a4ab4dc0.dc1" in area "wan"
TestEventFire_token - 2019/12/06 06:04:04.935401 [INFO] agent: Started DNS server 127.0.0.1:34409 (tcp)
TestEventFire_token - 2019/12/06 06:04:04.935506 [INFO] agent: Started DNS server 127.0.0.1:34409 (udp)
TestEventFire_token - 2019/12/06 06:04:04.937825 [INFO] agent: Started HTTP server on 127.0.0.1:34410 (tcp)
TestEventFire_token - 2019/12/06 06:04:04.937919 [INFO] agent: started state syncer
2019/12/06 06:04:04 [WARN]  raft: Heartbeat timeout from "" reached, starting election
2019/12/06 06:04:04 [INFO]  raft: Node at 127.0.0.1:34414 [Candidate] entering Candidate state in term 2
TestEventList - 2019/12/06 06:04:05.053831 [DEBUG] consul: User event: test
TestEventList - 2019/12/06 06:04:05.054016 [DEBUG] agent: new event: test (512899bb-8352-9df0-82ce-e90412339f62)
TestEventList - 2019/12/06 06:04:05.079418 [INFO] agent: Requesting shutdown
TestEventList - 2019/12/06 06:04:05.079524 [INFO] consul: shutting down server
TestEventList - 2019/12/06 06:04:05.079580 [WARN] serf: Shutdown without a Leave
TestEventList - 2019/12/06 06:04:05.178333 [WARN] serf: Shutdown without a Leave
2019/12/06 06:04:05 [INFO]  raft: Initial configuration (index=1): [{Suffrage:Voter ID:782a9117-4f49-2c82-baae-9b5cea4ab169 Address:127.0.0.1:34420}]
2019/12/06 06:04:05 [INFO]  raft: Node at 127.0.0.1:34420 [Follower] entering Follower state (Leader: "")
TestEventList - 2019/12/06 06:04:05.253936 [INFO] agent: consul server down
TestEventList - 2019/12/06 06:04:05.253993 [INFO] agent: shutdown complete
TestEventList - 2019/12/06 06:04:05.254048 [INFO] agent: Stopping DNS server 127.0.0.1:34403 (tcp)
TestEventList - 2019/12/06 06:04:05.254254 [INFO] agent: Stopping DNS server 127.0.0.1:34403 (udp)
TestEventList - 2019/12/06 06:04:05.254438 [INFO] agent: Stopping HTTP server 127.0.0.1:34404 (tcp)
TestEventList - 2019/12/06 06:04:05.254683 [INFO] agent: Waiting for endpoints to shut down
TestEventList - 2019/12/06 06:04:05.254751 [INFO] agent: Endpoints down
--- PASS: TestEventList (5.06s)
=== CONT  TestDNS_ConfigReload
TestEventList - 2019/12/06 06:04:05.253356 [INFO] manager: shutting down
TestEventFire - 2019/12/06 06:04:05.258578 [INFO] serf: EventMemberJoin: Node 782a9117-4f49-2c82-baae-9b5cea4ab169.dc1 127.0.0.1
TestEventFire - 2019/12/06 06:04:05.267320 [INFO] serf: EventMemberJoin: Node 782a9117-4f49-2c82-baae-9b5cea4ab169 127.0.0.1
TestEventFire - 2019/12/06 06:04:05.268455 [INFO] consul: Adding LAN server Node 782a9117-4f49-2c82-baae-9b5cea4ab169 (Addr: tcp/127.0.0.1:34420) (DC: dc1)
TestEventFire - 2019/12/06 06:04:05.278021 [INFO] consul: Handled member-join event for server "Node 782a9117-4f49-2c82-baae-9b5cea4ab169.dc1" in area "wan"
TestEventFire - 2019/12/06 06:04:05.278638 [INFO] agent: Started DNS server 127.0.0.1:34415 (tcp)
TestEventFire - 2019/12/06 06:04:05.278709 [INFO] agent: Started DNS server 127.0.0.1:34415 (udp)
TestEventFire - 2019/12/06 06:04:05.281114 [INFO] agent: Started HTTP server on 127.0.0.1:34416 (tcp)
TestEventFire - 2019/12/06 06:04:05.281208 [INFO] agent: started state syncer
2019/12/06 06:04:05 [WARN]  raft: Heartbeat timeout from "" reached, starting election
2019/12/06 06:04:05 [INFO]  raft: Node at 127.0.0.1:34420 [Candidate] entering Candidate state in term 2
WARNING: bootstrap = true: do not enable unless necessary
TestDNS_ConfigReload - 2019/12/06 06:04:05.410070 [WARN] agent: Node name "Node efd31d9b-1b33-16f1-c1a0-fd0ec573432d" will not be discoverable via DNS due to invalid characters. Valid characters include all alpha-numerics and dashes.
TestDNS_ConfigReload - 2019/12/06 06:04:05.410538 [DEBUG] tlsutil: Update with version 1
TestDNS_ConfigReload - 2019/12/06 06:04:05.412697 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
2019/12/06 06:04:05 [INFO]  raft: Initial configuration (index=1): [{Suffrage:Voter ID:5a650eeb-a5da-7076-2036-fbb97f36e94d Address:127.0.0.1:34426}]
2019/12/06 06:04:05 [INFO]  raft: Node at 127.0.0.1:34426 [Follower] entering Follower state (Leader: "")
TestDNS_ReloadConfig_DuringQuery - 2019/12/06 06:04:05.548674 [INFO] serf: EventMemberJoin: Node 5a650eeb-a5da-7076-2036-fbb97f36e94d.dc1 127.0.0.1
TestDNS_ReloadConfig_DuringQuery - 2019/12/06 06:04:05.554090 [INFO] serf: EventMemberJoin: Node 5a650eeb-a5da-7076-2036-fbb97f36e94d 127.0.0.1
TestDNS_ReloadConfig_DuringQuery - 2019/12/06 06:04:05.554931 [INFO] consul: Handled member-join event for server "Node 5a650eeb-a5da-7076-2036-fbb97f36e94d.dc1" in area "wan"
TestDNS_ReloadConfig_DuringQuery - 2019/12/06 06:04:05.555314 [INFO] consul: Adding LAN server Node 5a650eeb-a5da-7076-2036-fbb97f36e94d (Addr: tcp/127.0.0.1:34426) (DC: dc1)
TestDNS_ReloadConfig_DuringQuery - 2019/12/06 06:04:05.555361 [INFO] agent: Started DNS server 127.0.0.1:34421 (udp)
TestDNS_ReloadConfig_DuringQuery - 2019/12/06 06:04:05.555942 [INFO] agent: Started DNS server 127.0.0.1:34421 (tcp)
TestDNS_ReloadConfig_DuringQuery - 2019/12/06 06:04:05.558191 [INFO] agent: Started HTTP server on 127.0.0.1:34422 (tcp)
TestDNS_ReloadConfig_DuringQuery - 2019/12/06 06:04:05.558284 [INFO] agent: started state syncer
2019/12/06 06:04:05 [WARN]  raft: Heartbeat timeout from "" reached, starting election
2019/12/06 06:04:05 [INFO]  raft: Node at 127.0.0.1:34426 [Candidate] entering Candidate state in term 2
2019/12/06 06:04:05 [INFO]  raft: Election won. Tally: 1
2019/12/06 06:04:05 [INFO]  raft: Node at 127.0.0.1:34414 [Leader] entering Leader state
TestEventFire_token - 2019/12/06 06:04:05.686392 [INFO] consul: cluster leadership acquired
TestEventFire_token - 2019/12/06 06:04:05.687007 [INFO] consul: New leader elected: Node 56448cea-91ce-aa36-bb66-f089a4ab4dc0
TestEventFire_token - 2019/12/06 06:04:05.859664 [ERR] agent: failed to sync remote state: ACL not found
2019/12/06 06:04:06 [INFO]  raft: Election won. Tally: 1
2019/12/06 06:04:06 [INFO]  raft: Node at 127.0.0.1:34420 [Leader] entering Leader state
TestEventFire - 2019/12/06 06:04:06.169085 [INFO] consul: cluster leadership acquired
TestEventFire - 2019/12/06 06:04:06.169695 [INFO] consul: New leader elected: Node 782a9117-4f49-2c82-baae-9b5cea4ab169
TestEventFire_token - 2019/12/06 06:04:06.278650 [INFO] acl: initializing acls
2019/12/06 06:04:06 [INFO]  raft: Election won. Tally: 1
2019/12/06 06:04:06 [INFO]  raft: Node at 127.0.0.1:34426 [Leader] entering Leader state
TestDNS_ReloadConfig_DuringQuery - 2019/12/06 06:04:06.378452 [INFO] consul: cluster leadership acquired
TestDNS_ReloadConfig_DuringQuery - 2019/12/06 06:04:06.378896 [INFO] consul: New leader elected: Node 5a650eeb-a5da-7076-2036-fbb97f36e94d
TestEventFire_token - 2019/12/06 06:04:06.469616 [INFO] consul: Created ACL 'global-management' policy
TestEventFire_token - 2019/12/06 06:04:06.469707 [WARN] consul: Configuring a non-UUID master token is deprecated
TestEventFire_token - 2019/12/06 06:04:06.485475 [INFO] acl: initializing acls
TestEventFire_token - 2019/12/06 06:04:06.485790 [WARN] consul: Configuring a non-UUID master token is deprecated
TestEventFire - 2019/12/06 06:04:06.586648 [INFO] agent: Synced node info
2019/12/06 06:04:06 [INFO]  raft: Initial configuration (index=1): [{Suffrage:Voter ID:efd31d9b-1b33-16f1-c1a0-fd0ec573432d Address:127.0.0.1:34432}]
TestEventFire_token - 2019/12/06 06:04:06.705234 [INFO] consul: Bootstrapped ACL master token from configuration
2019/12/06 06:04:06 [INFO]  raft: Node at 127.0.0.1:34432 [Follower] entering Follower state (Leader: "")
TestDNS_ConfigReload - 2019/12/06 06:04:06.706845 [INFO] serf: EventMemberJoin: Node efd31d9b-1b33-16f1-c1a0-fd0ec573432d.dc1 127.0.0.1
TestDNS_ConfigReload - 2019/12/06 06:04:06.711078 [INFO] serf: EventMemberJoin: Node efd31d9b-1b33-16f1-c1a0-fd0ec573432d 127.0.0.1
TestDNS_ConfigReload - 2019/12/06 06:04:06.711956 [INFO] consul: Adding LAN server Node efd31d9b-1b33-16f1-c1a0-fd0ec573432d (Addr: tcp/127.0.0.1:34432) (DC: dc1)
TestDNS_ConfigReload - 2019/12/06 06:04:06.712546 [INFO] consul: Handled member-join event for server "Node efd31d9b-1b33-16f1-c1a0-fd0ec573432d.dc1" in area "wan"
TestDNS_ConfigReload - 2019/12/06 06:04:06.713246 [DEBUG] dns: recursor enabled
TestDNS_ConfigReload - 2019/12/06 06:04:06.713278 [DEBUG] dns: recursor enabled
TestDNS_ConfigReload - 2019/12/06 06:04:06.713618 [INFO] agent: Started DNS server 127.0.0.1:34427 (tcp)
TestDNS_ConfigReload - 2019/12/06 06:04:06.714262 [INFO] agent: Started DNS server 127.0.0.1:34427 (udp)
TestDNS_ConfigReload - 2019/12/06 06:04:06.716640 [INFO] agent: Started HTTP server on 127.0.0.1:34428 (tcp)
TestDNS_ConfigReload - 2019/12/06 06:04:06.716748 [INFO] agent: started state syncer
2019/12/06 06:04:06 [WARN]  raft: Heartbeat timeout from "" reached, starting election
2019/12/06 06:04:06 [INFO]  raft: Node at 127.0.0.1:34432 [Candidate] entering Candidate state in term 2
TestDNS_ReloadConfig_DuringQuery - 2019/12/06 06:04:06.828257 [INFO] agent: Synced node info
TestDNS_ReloadConfig_DuringQuery - 2019/12/06 06:04:06.837634 [WARN] consul: endpoint injected; this should only be used for testing
TestDNS_ReloadConfig_DuringQuery - 2019/12/06 06:04:06.888763 [DEBUG] tlsutil: Update with version 2
TestDNS_ReloadConfig_DuringQuery - 2019/12/06 06:04:06.940551 [DEBUG] dns: request for name nope.query.consul. type A class IN (took 101.074342ms) from client 127.0.0.1:56147 (udp)
TestDNS_ReloadConfig_DuringQuery - 2019/12/06 06:04:06.944792 [INFO] agent: Requesting shutdown
TestDNS_ReloadConfig_DuringQuery - 2019/12/06 06:04:06.944918 [INFO] consul: shutting down server
TestDNS_ReloadConfig_DuringQuery - 2019/12/06 06:04:06.944994 [WARN] serf: Shutdown without a Leave
TestDNS_ReloadConfig_DuringQuery - 2019/12/06 06:04:07.061479 [WARN] serf: Shutdown without a Leave
TestDNS_ReloadConfig_DuringQuery - 2019/12/06 06:04:07.169775 [INFO] manager: shutting down
TestDNS_ReloadConfig_DuringQuery - 2019/12/06 06:04:07.170297 [INFO] agent: consul server down
TestDNS_ReloadConfig_DuringQuery - 2019/12/06 06:04:07.170361 [INFO] agent: shutdown complete
TestDNS_ReloadConfig_DuringQuery - 2019/12/06 06:04:07.170420 [INFO] agent: Stopping DNS server 127.0.0.1:34421 (tcp)
TestDNS_ReloadConfig_DuringQuery - 2019/12/06 06:04:07.170634 [INFO] agent: Stopping DNS server 127.0.0.1:34421 (udp)
TestDNS_ReloadConfig_DuringQuery - 2019/12/06 06:04:07.170837 [INFO] agent: Stopping HTTP server 127.0.0.1:34422 (tcp)
TestDNS_ReloadConfig_DuringQuery - 2019/12/06 06:04:07.171085 [INFO] agent: Waiting for endpoints to shut down
TestDNS_ReloadConfig_DuringQuery - 2019/12/06 06:04:07.171166 [INFO] agent: Endpoints down
--- PASS: TestDNS_ReloadConfig_DuringQuery (3.10s)
=== CONT  TestDNS_Compression_Recurse
TestEventFire_token - 2019/12/06 06:04:07.172494 [INFO] consul: Bootstrapped ACL master token from configuration
TestDNS_ReloadConfig_DuringQuery - 2019/12/06 06:04:07.173070 [ERR] consul: failed to establish leadership: raft is already shutdown
TestDNS_ReloadConfig_DuringQuery - 2019/12/06 06:04:07.173458 [ERR] consul: failed to transfer leadership attempt 0/3: raft is already shutdown
WARNING: bootstrap = true: do not enable unless necessary
TestDNS_Compression_Recurse - 2019/12/06 06:04:07.283657 [WARN] agent: Node name "Node ca277ab7-0bcd-e3f7-70b5-0b78d1d36c80" will not be discoverable via DNS due to invalid characters. Valid characters include all alpha-numerics and dashes.
TestDNS_Compression_Recurse - 2019/12/06 06:04:07.284455 [DEBUG] tlsutil: Update with version 1
TestDNS_Compression_Recurse - 2019/12/06 06:04:07.286995 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
2019/12/06 06:04:07 [INFO]  raft: Election won. Tally: 1
2019/12/06 06:04:07 [INFO]  raft: Node at 127.0.0.1:34432 [Leader] entering Leader state
TestDNS_ConfigReload - 2019/12/06 06:04:07.486991 [INFO] consul: cluster leadership acquired
TestDNS_ConfigReload - 2019/12/06 06:04:07.487387 [INFO] consul: New leader elected: Node efd31d9b-1b33-16f1-c1a0-fd0ec573432d
TestEventFire_token - 2019/12/06 06:04:07.613331 [INFO] consul: Created ACL anonymous token from configuration
TestEventFire_token - 2019/12/06 06:04:07.614241 [INFO] serf: EventMemberUpdate: Node 56448cea-91ce-aa36-bb66-f089a4ab4dc0
TestEventFire_token - 2019/12/06 06:04:07.614785 [INFO] agent: Synced node info
TestEventFire_token - 2019/12/06 06:04:07.614878 [INFO] serf: EventMemberUpdate: Node 56448cea-91ce-aa36-bb66-f089a4ab4dc0.dc1
TestEventFire_token - 2019/12/06 06:04:07.614924 [INFO] consul: Created ACL anonymous token from configuration
TestEventFire_token - 2019/12/06 06:04:07.614967 [DEBUG] acl: transitioning out of legacy ACL mode
TestEventFire_token - 2019/12/06 06:04:07.615641 [INFO] serf: EventMemberUpdate: Node 56448cea-91ce-aa36-bb66-f089a4ab4dc0
TestEventFire_token - 2019/12/06 06:04:07.616325 [INFO] serf: EventMemberUpdate: Node 56448cea-91ce-aa36-bb66-f089a4ab4dc0.dc1
TestEventFire_token - 2019/12/06 06:04:07.614899 [DEBUG] agent: Node info in sync
TestEventFire_token - 2019/12/06 06:04:07.632382 [DEBUG] consul: dropping node "Node 56448cea-91ce-aa36-bb66-f089a4ab4dc0" from result due to ACLs
TestEventFire_token - 2019/12/06 06:04:07.633094 [DEBUG] consul: dropping node "Node 56448cea-91ce-aa36-bb66-f089a4ab4dc0" from result due to ACLs
TestEventFire - 2019/12/06 06:04:08.036498 [DEBUG] agent: Node info in sync
TestEventFire - 2019/12/06 06:04:08.036634 [DEBUG] agent: Node info in sync
TestEventFire - 2019/12/06 06:04:08.045056 [INFO] connect: initialized primary datacenter CA with provider "consul"
TestEventFire - 2019/12/06 06:04:08.045577 [DEBUG] consul: Skipping self join check for "Node 782a9117-4f49-2c82-baae-9b5cea4ab169" since the cluster is too small
TestEventFire - 2019/12/06 06:04:08.045733 [INFO] consul: member 'Node 782a9117-4f49-2c82-baae-9b5cea4ab169' joined, marking health alive
TestDNS_ConfigReload - 2019/12/06 06:04:08.048842 [INFO] agent: Synced node info
TestDNS_ConfigReload - 2019/12/06 06:04:08.048962 [DEBUG] agent: Node info in sync
TestDNS_ConfigReload - 2019/12/06 06:04:08.070657 [DEBUG] tlsutil: Update with version 2
TestDNS_ConfigReload - 2019/12/06 06:04:08.072488 [INFO] agent: Requesting shutdown
TestDNS_ConfigReload - 2019/12/06 06:04:08.072574 [INFO] consul: shutting down server
TestDNS_ConfigReload - 2019/12/06 06:04:08.072625 [WARN] serf: Shutdown without a Leave
TestEventFire_token - 2019/12/06 06:04:08.138599 [WARN] consul: user event "foo" blocked by ACLs
TestEventFire_token - 2019/12/06 06:04:08.139327 [WARN] consul: user event "bar" blocked by ACLs
TestEventFire_token - 2019/12/06 06:04:08.140045 [INFO] agent: Requesting shutdown
TestEventFire_token - 2019/12/06 06:04:08.140125 [INFO] consul: shutting down server
TestEventFire_token - 2019/12/06 06:04:08.140174 [WARN] serf: Shutdown without a Leave
TestDNS_ConfigReload - 2019/12/06 06:04:08.243936 [WARN] serf: Shutdown without a Leave
TestEventFire_token - 2019/12/06 06:04:08.310387 [WARN] serf: Shutdown without a Leave
TestDNS_ConfigReload - 2019/12/06 06:04:08.313522 [ERR] autopilot: failed to initialize config: leadership lost while committing log
TestDNS_ConfigReload - 2019/12/06 06:04:08.313805 [ERR] consul: failed to establish leadership: raft is already shutdown
TestDNS_ConfigReload - 2019/12/06 06:04:08.314103 [INFO] manager: shutting down
TestDNS_ConfigReload - 2019/12/06 06:04:08.314505 [INFO] agent: consul server down
TestDNS_ConfigReload - 2019/12/06 06:04:08.314555 [INFO] agent: shutdown complete
TestDNS_ConfigReload - 2019/12/06 06:04:08.314605 [INFO] agent: Stopping DNS server 127.0.0.1:34427 (tcp)
TestDNS_ConfigReload - 2019/12/06 06:04:08.314730 [INFO] agent: Stopping DNS server 127.0.0.1:34427 (udp)
TestDNS_ConfigReload - 2019/12/06 06:04:08.314883 [INFO] agent: Stopping HTTP server 127.0.0.1:34428 (tcp)
TestDNS_ConfigReload - 2019/12/06 06:04:08.315912 [INFO] agent: Waiting for endpoints to shut down
TestDNS_ConfigReload - 2019/12/06 06:04:08.315997 [INFO] agent: Endpoints down
--- PASS: TestDNS_ConfigReload (3.06s)
=== CONT  TestDNS_Compression_ReverseLookup
TestEventFire - 2019/12/06 06:04:08.324600 [INFO] agent: Requesting shutdown
TestEventFire - 2019/12/06 06:04:08.324890 [INFO] consul: shutting down server
TestEventFire - 2019/12/06 06:04:08.325032 [WARN] serf: Shutdown without a Leave
TestEventFire - 2019/12/06 06:04:08.393711 [WARN] serf: Shutdown without a Leave
TestEventFire_token - 2019/12/06 06:04:08.393843 [INFO] manager: shutting down
TestEventFire_token - 2019/12/06 06:04:08.394066 [INFO] agent: consul server down
TestEventFire_token - 2019/12/06 06:04:08.394113 [INFO] agent: shutdown complete
TestEventFire_token - 2019/12/06 06:04:08.394169 [INFO] agent: Stopping DNS server 127.0.0.1:34409 (tcp)
TestEventFire_token - 2019/12/06 06:04:08.394389 [INFO] agent: Stopping DNS server 127.0.0.1:34409 (udp)
TestEventFire_token - 2019/12/06 06:04:08.394571 [INFO] agent: Stopping HTTP server 127.0.0.1:34410 (tcp)
TestEventFire_token - 2019/12/06 06:04:08.394801 [INFO] agent: Waiting for endpoints to shut down
TestEventFire_token - 2019/12/06 06:04:08.394874 [INFO] agent: Endpoints down
--- PASS: TestEventFire_token (4.90s)
=== CONT  TestDNS_Compression_Query
WARNING: bootstrap = true: do not enable unless necessary
TestDNS_Compression_ReverseLookup - 2019/12/06 06:04:08.413434 [WARN] agent: Node name "Node b2dd851e-837b-7c02-2f4e-6117033684c7" will not be discoverable via DNS due to invalid characters. Valid characters include all alpha-numerics and dashes.
TestDNS_Compression_ReverseLookup - 2019/12/06 06:04:08.413857 [DEBUG] tlsutil: Update with version 1
TestDNS_Compression_ReverseLookup - 2019/12/06 06:04:08.415956 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
WARNING: bootstrap = true: do not enable unless necessary
TestDNS_Compression_Query - 2019/12/06 06:04:08.480364 [WARN] agent: Node name "Node d21fd331-f0c3-9273-d38c-c6e1e2d3fe23" will not be discoverable via DNS due to invalid characters. Valid characters include all alpha-numerics and dashes.
TestDNS_Compression_Query - 2019/12/06 06:04:08.480927 [DEBUG] tlsutil: Update with version 1
TestDNS_Compression_Query - 2019/12/06 06:04:08.484000 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
TestEventFire - 2019/12/06 06:04:08.487425 [INFO] manager: shutting down
TestEventFire - 2019/12/06 06:04:08.487785 [INFO] agent: consul server down
TestEventFire - 2019/12/06 06:04:08.487839 [INFO] agent: shutdown complete
TestEventFire - 2019/12/06 06:04:08.487894 [INFO] agent: Stopping DNS server 127.0.0.1:34415 (tcp)
TestEventFire - 2019/12/06 06:04:08.488027 [INFO] agent: Stopping DNS server 127.0.0.1:34415 (udp)
TestEventFire - 2019/12/06 06:04:08.488177 [INFO] agent: Stopping HTTP server 127.0.0.1:34416 (tcp)
TestEventFire - 2019/12/06 06:04:08.488396 [INFO] agent: Waiting for endpoints to shut down
TestEventFire - 2019/12/06 06:04:08.488455 [INFO] agent: Endpoints down
--- PASS: TestEventFire (4.49s)
=== CONT  TestDNS_Compression_trimUDPResponse
=== CONT  TestDNS_syncExtra
--- PASS: TestDNS_syncExtra (0.00s)
=== CONT  TestDNS_trimUDPResponse_TrimSizeEDNS
--- PASS: TestDNS_Compression_trimUDPResponse (0.05s)
2019/12/06 06:04:08 [INFO]  raft: Initial configuration (index=1): [{Suffrage:Voter ID:ca277ab7-0bcd-e3f7-70b5-0b78d1d36c80 Address:127.0.0.1:34438}]
TestDNS_Compression_Recurse - 2019/12/06 06:04:08.581383 [INFO] serf: EventMemberJoin: Node ca277ab7-0bcd-e3f7-70b5-0b78d1d36c80.dc1 127.0.0.1
2019/12/06 06:04:08 [INFO]  raft: Node at 127.0.0.1:34438 [Follower] entering Follower state (Leader: "")
TestDNS_Compression_Recurse - 2019/12/06 06:04:08.600571 [INFO] serf: EventMemberJoin: Node ca277ab7-0bcd-e3f7-70b5-0b78d1d36c80 127.0.0.1
TestDNS_Compression_Recurse - 2019/12/06 06:04:08.602820 [INFO] consul: Adding LAN server Node ca277ab7-0bcd-e3f7-70b5-0b78d1d36c80 (Addr: tcp/127.0.0.1:34438) (DC: dc1)
TestDNS_Compression_Recurse - 2019/12/06 06:04:08.604236 [DEBUG] dns: recursor enabled
--- PASS: TestDNS_trimUDPResponse_TrimSizeEDNS (0.07s)
=== CONT  TestDNS_trimUDPResponse_TrimSize
TestDNS_Compression_Recurse - 2019/12/06 06:04:08.604741 [INFO] consul: Handled member-join event for server "Node ca277ab7-0bcd-e3f7-70b5-0b78d1d36c80.dc1" in area "wan"
TestDNS_Compression_Recurse - 2019/12/06 06:04:08.604850 [DEBUG] dns: recursor enabled
TestDNS_Compression_Recurse - 2019/12/06 06:04:08.607484 [INFO] agent: Started DNS server 127.0.0.1:34433 (tcp)
TestDNS_Compression_Recurse - 2019/12/06 06:04:08.609321 [INFO] agent: Started DNS server 127.0.0.1:34433 (udp)
TestEventFire_token - 2019/12/06 06:04:08.617361 [ERR] acl: failed to apply acl token upgrade batch: raft is already shutdown
TestDNS_Compression_Recurse - 2019/12/06 06:04:08.621490 [INFO] agent: Started HTTP server on 127.0.0.1:34434 (tcp)
TestDNS_Compression_Recurse - 2019/12/06 06:04:08.623803 [INFO] agent: started state syncer
2019/12/06 06:04:08 [WARN]  raft: Heartbeat timeout from "" reached, starting election
2019/12/06 06:04:08 [INFO]  raft: Node at 127.0.0.1:34438 [Candidate] entering Candidate state in term 2
=== CONT  TestDNS_trimUDPResponse_TrimLimit
--- PASS: TestDNS_trimUDPResponse_TrimSize (0.08s)
=== CONT  TestDNS_PreparedQuery_AgentSource
--- PASS: TestDNS_trimUDPResponse_TrimLimit (0.08s)
WARNING: bootstrap = true: do not enable unless necessary
TestDNS_PreparedQuery_AgentSource - 2019/12/06 06:04:08.901645 [WARN] agent: Node name "Node f34f2ac3-3609-9c63-d098-b721ca8ca693" will not be discoverable via DNS due to invalid characters. Valid characters include all alpha-numerics and dashes.
TestDNS_PreparedQuery_AgentSource - 2019/12/06 06:04:08.903212 [DEBUG] tlsutil: Update with version 1
TestDNS_PreparedQuery_AgentSource - 2019/12/06 06:04:08.909138 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
2019/12/06 06:04:09 [INFO]  raft: Election won. Tally: 1
2019/12/06 06:04:09 [INFO]  raft: Node at 127.0.0.1:34438 [Leader] entering Leader state
TestDNS_Compression_Recurse - 2019/12/06 06:04:09.312647 [INFO] consul: cluster leadership acquired
TestDNS_Compression_Recurse - 2019/12/06 06:04:09.313116 [INFO] consul: New leader elected: Node ca277ab7-0bcd-e3f7-70b5-0b78d1d36c80
2019/12/06 06:04:09 [INFO]  raft: Initial configuration (index=1): [{Suffrage:Voter ID:b2dd851e-837b-7c02-2f4e-6117033684c7 Address:127.0.0.1:34444}]
2019/12/06 06:04:09 [INFO]  raft: Node at 127.0.0.1:34444 [Follower] entering Follower state (Leader: "")
TestDNS_Compression_ReverseLookup - 2019/12/06 06:04:09.513632 [INFO] serf: EventMemberJoin: Node b2dd851e-837b-7c02-2f4e-6117033684c7.dc1 127.0.0.1
2019/12/06 06:04:09 [WARN]  raft: Heartbeat timeout from "" reached, starting election
2019/12/06 06:04:09 [INFO]  raft: Node at 127.0.0.1:34444 [Candidate] entering Candidate state in term 2
TestDNS_Compression_ReverseLookup - 2019/12/06 06:04:09.550045 [INFO] serf: EventMemberJoin: Node b2dd851e-837b-7c02-2f4e-6117033684c7 127.0.0.1
TestDNS_Compression_ReverseLookup - 2019/12/06 06:04:09.568814 [INFO] consul: Adding LAN server Node b2dd851e-837b-7c02-2f4e-6117033684c7 (Addr: tcp/127.0.0.1:34444) (DC: dc1)
TestDNS_Compression_ReverseLookup - 2019/12/06 06:04:09.570685 [INFO] consul: Handled member-join event for server "Node b2dd851e-837b-7c02-2f4e-6117033684c7.dc1" in area "wan"
TestDNS_Compression_ReverseLookup - 2019/12/06 06:04:09.573620 [INFO] agent: Started DNS server 127.0.0.1:34439 (tcp)
TestDNS_Compression_ReverseLookup - 2019/12/06 06:04:09.577074 [INFO] agent: Started DNS server 127.0.0.1:34439 (udp)
TestDNS_Compression_ReverseLookup - 2019/12/06 06:04:09.581611 [INFO] agent: Started HTTP server on 127.0.0.1:34440 (tcp)
TestDNS_Compression_ReverseLookup - 2019/12/06 06:04:09.581720 [INFO] agent: started state syncer
TestEventFire_token - 2019/12/06 06:04:09.617376 [ERR] acl: failed to apply acl token upgrade batch: raft is already shutdown
2019/12/06 06:04:09 [INFO]  raft: Initial configuration (index=1): [{Suffrage:Voter ID:d21fd331-f0c3-9273-d38c-c6e1e2d3fe23 Address:127.0.0.1:34450}]
2019/12/06 06:04:09 [INFO]  raft: Node at 127.0.0.1:34450 [Follower] entering Follower state (Leader: "")
TestDNS_Compression_Query - 2019/12/06 06:04:09.673051 [INFO] serf: EventMemberJoin: Node d21fd331-f0c3-9273-d38c-c6e1e2d3fe23.dc1 127.0.0.1
TestDNS_Compression_Query - 2019/12/06 06:04:09.678737 [INFO] serf: EventMemberJoin: Node d21fd331-f0c3-9273-d38c-c6e1e2d3fe23 127.0.0.1
TestDNS_Compression_Query - 2019/12/06 06:04:09.680162 [INFO] consul: Adding LAN server Node d21fd331-f0c3-9273-d38c-c6e1e2d3fe23 (Addr: tcp/127.0.0.1:34450) (DC: dc1)
TestDNS_Compression_Query - 2019/12/06 06:04:09.680472 [INFO] consul: Handled member-join event for server "Node d21fd331-f0c3-9273-d38c-c6e1e2d3fe23.dc1" in area "wan"
TestDNS_Compression_Query - 2019/12/06 06:04:09.683412 [INFO] agent: Started DNS server 127.0.0.1:34445 (tcp)
TestDNS_Compression_Query - 2019/12/06 06:04:09.683829 [INFO] agent: Started DNS server 127.0.0.1:34445 (udp)
TestDNS_Compression_Query - 2019/12/06 06:04:09.686413 [INFO] agent: Started HTTP server on 127.0.0.1:34446 (tcp)
TestDNS_Compression_Query - 2019/12/06 06:04:09.686630 [INFO] agent: started state syncer
2019/12/06 06:04:09 [WARN]  raft: Heartbeat timeout from "" reached, starting election
2019/12/06 06:04:09 [INFO]  raft: Node at 127.0.0.1:34450 [Candidate] entering Candidate state in term 2
TestDNS_Compression_Recurse - 2019/12/06 06:04:09.745170 [INFO] agent: Synced node info
TestDNS_Compression_Recurse - 2019/12/06 06:04:09.745294 [DEBUG] agent: Node info in sync
2019/12/06 06:04:09 [INFO]  raft: Initial configuration (index=1): [{Suffrage:Voter ID:f34f2ac3-3609-9c63-d098-b721ca8ca693 Address:127.0.0.1:34456}]
2019/12/06 06:04:09 [INFO]  raft: Node at 127.0.0.1:34456 [Follower] entering Follower state (Leader: "")
TestDNS_PreparedQuery_AgentSource - 2019/12/06 06:04:09.891582 [INFO] serf: EventMemberJoin: Node f34f2ac3-3609-9c63-d098-b721ca8ca693.dc1 127.0.0.1
TestDNS_PreparedQuery_AgentSource - 2019/12/06 06:04:09.895974 [INFO] serf: EventMemberJoin: Node f34f2ac3-3609-9c63-d098-b721ca8ca693 127.0.0.1
TestDNS_PreparedQuery_AgentSource - 2019/12/06 06:04:09.897237 [INFO] consul: Adding LAN server Node f34f2ac3-3609-9c63-d098-b721ca8ca693 (Addr: tcp/127.0.0.1:34456) (DC: dc1)
TestDNS_PreparedQuery_AgentSource - 2019/12/06 06:04:09.897780 [INFO] consul: Handled member-join event for server "Node f34f2ac3-3609-9c63-d098-b721ca8ca693.dc1" in area "wan"
TestDNS_PreparedQuery_AgentSource - 2019/12/06 06:04:09.898826 [INFO] agent: Started DNS server 127.0.0.1:34451 (tcp)
TestDNS_PreparedQuery_AgentSource - 2019/12/06 06:04:09.904017 [INFO] agent: Started DNS server 127.0.0.1:34451 (udp)
TestDNS_PreparedQuery_AgentSource - 2019/12/06 06:04:09.906646 [INFO] agent: Started HTTP server on 127.0.0.1:34452 (tcp)
TestDNS_PreparedQuery_AgentSource - 2019/12/06 06:04:09.906762 [INFO] agent: started state syncer
2019/12/06 06:04:09 [WARN]  raft: Heartbeat timeout from "" reached, starting election
2019/12/06 06:04:09 [INFO]  raft: Node at 127.0.0.1:34456 [Candidate] entering Candidate state in term 2
2019/12/06 06:04:10 [INFO]  raft: Election won. Tally: 1
2019/12/06 06:04:10 [INFO]  raft: Node at 127.0.0.1:34444 [Leader] entering Leader state
TestDNS_Compression_ReverseLookup - 2019/12/06 06:04:10.154240 [INFO] consul: cluster leadership acquired
TestDNS_Compression_ReverseLookup - 2019/12/06 06:04:10.154759 [INFO] consul: New leader elected: Node b2dd851e-837b-7c02-2f4e-6117033684c7
2019/12/06 06:04:10 [INFO]  raft: Election won. Tally: 1
2019/12/06 06:04:10 [INFO]  raft: Node at 127.0.0.1:34450 [Leader] entering Leader state
TestDNS_Compression_Query - 2019/12/06 06:04:10.361941 [INFO] consul: cluster leadership acquired
TestDNS_Compression_Query - 2019/12/06 06:04:10.362482 [INFO] consul: New leader elected: Node d21fd331-f0c3-9273-d38c-c6e1e2d3fe23
2019/12/06 06:04:10 [INFO]  raft: Election won. Tally: 1
2019/12/06 06:04:10 [INFO]  raft: Node at 127.0.0.1:34456 [Leader] entering Leader state
TestDNS_PreparedQuery_AgentSource - 2019/12/06 06:04:10.529107 [INFO] consul: cluster leadership acquired
TestDNS_PreparedQuery_AgentSource - 2019/12/06 06:04:10.529612 [INFO] consul: New leader elected: Node f34f2ac3-3609-9c63-d098-b721ca8ca693
TestEventFire_token - 2019/12/06 06:04:10.617197 [ERR] acl: failed to apply acl token upgrade batch: raft is already shutdown
TestDNS_Compression_ReverseLookup - 2019/12/06 06:04:10.761425 [INFO] agent: Synced node info
TestDNS_Compression_Query - 2019/12/06 06:04:10.762495 [INFO] agent: Synced node info
TestDNS_Compression_Query - 2019/12/06 06:04:10.762599 [DEBUG] agent: Node info in sync
TestDNS_Compression_Query - 2019/12/06 06:04:10.815887 [DEBUG] agent: Node info in sync
TestDNS_Compression_Recurse - 2019/12/06 06:04:10.869814 [INFO] connect: initialized primary datacenter CA with provider "consul"
TestDNS_Compression_Recurse - 2019/12/06 06:04:10.870372 [DEBUG] consul: Skipping self join check for "Node ca277ab7-0bcd-e3f7-70b5-0b78d1d36c80" since the cluster is too small
TestDNS_Compression_Recurse - 2019/12/06 06:04:10.870551 [INFO] consul: member 'Node ca277ab7-0bcd-e3f7-70b5-0b78d1d36c80' joined, marking health alive
TestDNS_Compression_Recurse - 2019/12/06 06:04:11.042195 [DEBUG] dns: recurse RTT for {apple.com. 255 1} (653.682µs) Recursor queried: 127.0.0.1:51008
TestDNS_Compression_Recurse - 2019/12/06 06:04:11.042530 [DEBUG] dns: request for {apple.com. 255 1} (udp) (1.517035ms) from client 127.0.0.1:40590 (udp)
TestDNS_Compression_Recurse - 2019/12/06 06:04:11.044393 [DEBUG] dns: recurse RTT for {apple.com. 255 1} (933.355µs) Recursor queried: 127.0.0.1:51008
TestDNS_Compression_Recurse - 2019/12/06 06:04:11.044733 [DEBUG] dns: request for {apple.com. 255 1} (udp) (1.811375ms) from client 127.0.0.1:40590 (udp)
TestDNS_Compression_Recurse - 2019/12/06 06:04:11.044855 [INFO] agent: Requesting shutdown
TestDNS_Compression_Recurse - 2019/12/06 06:04:11.044938 [INFO] consul: shutting down server
TestDNS_Compression_Recurse - 2019/12/06 06:04:11.044995 [WARN] serf: Shutdown without a Leave
TestDNS_PreparedQuery_AgentSource - 2019/12/06 06:04:11.096479 [INFO] agent: Synced node info
TestDNS_Compression_ReverseLookup - 2019/12/06 06:04:11.097007 [DEBUG] dns: request for {2.0.0.127.in-addr.arpa. 255 1} (585.68µs) from client 127.0.0.1:33537 (udp)
TestDNS_Compression_ReverseLookup - 2019/12/06 06:04:11.097924 [INFO] agent: Requesting shutdown
TestDNS_Compression_ReverseLookup - 2019/12/06 06:04:11.098038 [INFO] consul: shutting down server
TestDNS_Compression_ReverseLookup - 2019/12/06 06:04:11.098105 [WARN] serf: Shutdown without a Leave
TestDNS_Compression_ReverseLookup - 2019/12/06 06:04:11.097941 [DEBUG] dns: request for {2.0.0.127.in-addr.arpa. 255 1} (439.676µs) from client 127.0.0.1:33537 (udp)
TestDNS_PreparedQuery_AgentSource - 2019/12/06 06:04:11.099789 [WARN] consul: endpoint injected; this should only be used for testing
TestDNS_PreparedQuery_AgentSource - 2019/12/06 06:04:11.101330 [DEBUG] dns: request for name foo.query.consul. type SRV class IN (took 498.345µs) from client 127.0.0.1:58713 (udp)
TestDNS_PreparedQuery_AgentSource - 2019/12/06 06:04:11.101408 [INFO] agent: Requesting shutdown
TestDNS_PreparedQuery_AgentSource - 2019/12/06 06:04:11.101488 [INFO] consul: shutting down server
TestDNS_PreparedQuery_AgentSource - 2019/12/06 06:04:11.101539 [WARN] serf: Shutdown without a Leave
TestDNS_Compression_Recurse - 2019/12/06 06:04:11.168685 [WARN] serf: Shutdown without a Leave
TestDNS_Compression_Recurse - 2019/12/06 06:04:11.235486 [INFO] manager: shutting down
TestDNS_Compression_Recurse - 2019/12/06 06:04:11.235958 [INFO] agent: consul server down
TestDNS_Compression_Recurse - 2019/12/06 06:04:11.236008 [INFO] agent: shutdown complete
TestDNS_Compression_Recurse - 2019/12/06 06:04:11.236067 [INFO] agent: Stopping DNS server 127.0.0.1:34433 (tcp)
TestDNS_Compression_Recurse - 2019/12/06 06:04:11.236290 [INFO] agent: Stopping DNS server 127.0.0.1:34433 (udp)
TestDNS_Compression_Recurse - 2019/12/06 06:04:11.236481 [INFO] agent: Stopping HTTP server 127.0.0.1:34434 (tcp)
TestDNS_Compression_Recurse - 2019/12/06 06:04:11.236696 [INFO] agent: Waiting for endpoints to shut down
TestDNS_PreparedQuery_AgentSource - 2019/12/06 06:04:11.236692 [WARN] serf: Shutdown without a Leave
TestDNS_Compression_Recurse - 2019/12/06 06:04:11.237079 [INFO] agent: Endpoints down
--- PASS: TestDNS_Compression_Recurse (4.07s)
=== CONT  TestDNS_InvalidQueries
TestDNS_Compression_ReverseLookup - 2019/12/06 06:04:11.239422 [WARN] serf: Shutdown without a Leave
TestDNS_PreparedQuery_AgentSource - 2019/12/06 06:04:11.310657 [INFO] manager: shutting down
TestDNS_Compression_ReverseLookup - 2019/12/06 06:04:11.310666 [INFO] manager: shutting down
TestDNS_PreparedQuery_AgentSource - 2019/12/06 06:04:11.311368 [INFO] agent: consul server down
TestDNS_PreparedQuery_AgentSource - 2019/12/06 06:04:11.311450 [INFO] agent: shutdown complete
TestDNS_PreparedQuery_AgentSource - 2019/12/06 06:04:11.311512 [INFO] agent: Stopping DNS server 127.0.0.1:34451 (tcp)
TestDNS_PreparedQuery_AgentSource - 2019/12/06 06:04:11.311680 [INFO] agent: Stopping DNS server 127.0.0.1:34451 (udp)
TestDNS_PreparedQuery_AgentSource - 2019/12/06 06:04:11.311861 [INFO] agent: Stopping HTTP server 127.0.0.1:34452 (tcp)
TestDNS_PreparedQuery_AgentSource - 2019/12/06 06:04:11.312073 [INFO] agent: Waiting for endpoints to shut down
TestDNS_PreparedQuery_AgentSource - 2019/12/06 06:04:11.312142 [INFO] agent: Endpoints down
--- PASS: TestDNS_PreparedQuery_AgentSource (2.55s)
=== CONT  TestDNS_PreparedQuery_AllowStale
TestDNS_PreparedQuery_AgentSource - 2019/12/06 06:04:11.312302 [ERR] consul: failed to establish leadership: raft is already shutdown
WARNING: bootstrap = true: do not enable unless necessary
TestDNS_InvalidQueries - 2019/12/06 06:04:11.372669 [WARN] agent: Node name "Node 67da43c6-7c91-1058-43b9-3e30bf667aab" will not be discoverable via DNS due to invalid characters. Valid characters include all alpha-numerics and dashes.
TestDNS_InvalidQueries - 2019/12/06 06:04:11.373230 [DEBUG] tlsutil: Update with version 1
TestDNS_InvalidQueries - 2019/12/06 06:04:11.375802 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
WARNING: bootstrap = true: do not enable unless necessary
TestDNS_PreparedQuery_AllowStale - 2019/12/06 06:04:11.461042 [WARN] agent: Node name "Node 40011720-c2e6-ee14-05f9-2df4d38ddc56" will not be discoverable via DNS due to invalid characters. Valid characters include all alpha-numerics and dashes.
TestDNS_PreparedQuery_AllowStale - 2019/12/06 06:04:11.461686 [DEBUG] tlsutil: Update with version 1
TestDNS_PreparedQuery_AllowStale - 2019/12/06 06:04:11.464371 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
TestDNS_Compression_ReverseLookup - 2019/12/06 06:04:11.493818 [ERR] consul: failed to establish leadership: leadership lost while committing log
TestDNS_Compression_ReverseLookup - 2019/12/06 06:04:11.494096 [INFO] agent: consul server down
TestDNS_Compression_ReverseLookup - 2019/12/06 06:04:11.494155 [INFO] agent: shutdown complete
TestDNS_Compression_ReverseLookup - 2019/12/06 06:04:11.494262 [INFO] agent: Stopping DNS server 127.0.0.1:34439 (tcp)
TestDNS_Compression_ReverseLookup - 2019/12/06 06:04:11.494421 [INFO] agent: Stopping DNS server 127.0.0.1:34439 (udp)
TestDNS_Compression_ReverseLookup - 2019/12/06 06:04:11.494594 [INFO] agent: Stopping HTTP server 127.0.0.1:34440 (tcp)
TestDNS_Compression_ReverseLookup - 2019/12/06 06:04:11.494812 [INFO] agent: Waiting for endpoints to shut down
TestDNS_Compression_ReverseLookup - 2019/12/06 06:04:11.494889 [INFO] agent: Endpoints down
--- PASS: TestDNS_Compression_ReverseLookup (3.18s)
=== CONT  TestDNS_AltDomains_Overlap
WARNING: bootstrap = true: do not enable unless necessary
TestDNS_AltDomains_Overlap - 2019/12/06 06:04:11.612840 [DEBUG] tlsutil: Update with version 1
TestEventFire_token - 2019/12/06 06:04:11.617250 [ERR] acl: failed to apply acl token upgrade batch: raft is already shutdown
TestDNS_AltDomains_Overlap - 2019/12/06 06:04:11.618833 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
TestDNS_Compression_Query - 2019/12/06 06:04:11.840060 [DEBUG] dns: request for name db.service.consul. type SRV class IN (took 951.022µs) from client 127.0.0.1:35386 (udp)
TestDNS_Compression_Query - 2019/12/06 06:04:11.841297 [DEBUG] dns: request for name db.service.consul. type SRV class IN (took 740.017µs) from client 127.0.0.1:35386 (udp)
TestDNS_Compression_Query - 2019/12/06 06:04:11.842874 [DEBUG] dns: request for name e85294ab-e9a9-36b7-ee35-616bd5f0011d.query.consul. type SRV class IN (took 854.686µs) from client 127.0.0.1:58235 (udp)
TestDNS_Compression_Query - 2019/12/06 06:04:11.852146 [INFO] agent: Requesting shutdown
TestDNS_Compression_Query - 2019/12/06 06:04:11.852249 [INFO] consul: shutting down server
TestDNS_Compression_Query - 2019/12/06 06:04:11.852307 [WARN] serf: Shutdown without a Leave
TestDNS_Compression_Query - 2019/12/06 06:04:11.854107 [DEBUG] dns: request for name e85294ab-e9a9-36b7-ee35-616bd5f0011d.query.consul. type SRV class IN (took 8.776204ms) from client 127.0.0.1:58235 (udp)
TestDNS_Compression_Query - 2019/12/06 06:04:11.993824 [WARN] serf: Shutdown without a Leave
TestDNS_Compression_Query - 2019/12/06 06:04:12.060533 [INFO] manager: shutting down
TestDNS_Compression_Query - 2019/12/06 06:04:12.136045 [INFO] agent: consul server down
TestDNS_Compression_Query - 2019/12/06 06:04:12.136172 [INFO] agent: shutdown complete
TestDNS_Compression_Query - 2019/12/06 06:04:12.136236 [INFO] agent: Stopping DNS server 127.0.0.1:34445 (tcp)
TestDNS_Compression_Query - 2019/12/06 06:04:12.136430 [INFO] agent: Stopping DNS server 127.0.0.1:34445 (udp)
TestDNS_Compression_Query - 2019/12/06 06:04:12.136609 [INFO] agent: Stopping HTTP server 127.0.0.1:34446 (tcp)
TestDNS_Compression_Query - 2019/12/06 06:04:12.136848 [INFO] agent: Waiting for endpoints to shut down
TestDNS_Compression_Query - 2019/12/06 06:04:12.136924 [INFO] agent: Endpoints down
--- PASS: TestDNS_Compression_Query (3.74s)
=== CONT  TestDNS_AltDomains_SOA
TestDNS_Compression_Query - 2019/12/06 06:04:12.140635 [ERR] connect: Apply failed leadership lost while committing log
TestDNS_Compression_Query - 2019/12/06 06:04:12.140721 [ERR] consul: failed to establish leadership: leadership lost while committing log
jones - 2019/12/06 06:04:12.203293 [DEBUG] agent: Skipping remote check "serfHealth" since it is managed automatically
jones - 2019/12/06 06:04:12.203393 [DEBUG] agent: Node info in sync
WARNING: bootstrap = true: do not enable unless necessary
TestDNS_AltDomains_SOA - 2019/12/06 06:04:12.212519 [DEBUG] tlsutil: Update with version 1
TestDNS_AltDomains_SOA - 2019/12/06 06:04:12.215091 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
jones - 2019/12/06 06:04:12.249005 [DEBUG] agent: Skipping remote check "serfHealth" since it is managed automatically
jones - 2019/12/06 06:04:12.249093 [DEBUG] agent: Node info in sync
2019/12/06 06:04:12 [INFO]  raft: Initial configuration (index=1): [{Suffrage:Voter ID:966cb2ef-af6d-2452-b6a7-527bedebfd9b Address:127.0.0.1:34474}]
2019/12/06 06:04:12 [INFO]  raft: Node at 127.0.0.1:34474 [Follower] entering Follower state (Leader: "")
2019/12/06 06:04:12 [INFO]  raft: Initial configuration (index=1): [{Suffrage:Voter ID:67da43c6-7c91-1058-43b9-3e30bf667aab Address:127.0.0.1:34462}]
2019/12/06 06:04:12 [INFO]  raft: Node at 127.0.0.1:34462 [Follower] entering Follower state (Leader: "")
2019/12/06 06:04:12 [INFO]  raft: Initial configuration (index=1): [{Suffrage:Voter ID:40011720-c2e6-ee14-05f9-2df4d38ddc56 Address:127.0.0.1:34468}]
TestDNS_AltDomains_Overlap - 2019/12/06 06:04:12.365652 [INFO] serf: EventMemberJoin: test-node.dc1 127.0.0.1
TestDNS_InvalidQueries - 2019/12/06 06:04:12.367256 [INFO] serf: EventMemberJoin: Node 67da43c6-7c91-1058-43b9-3e30bf667aab.dc1 127.0.0.1
TestDNS_PreparedQuery_AllowStale - 2019/12/06 06:04:12.368465 [INFO] serf: EventMemberJoin: Node 40011720-c2e6-ee14-05f9-2df4d38ddc56.dc1 127.0.0.1
TestDNS_AltDomains_Overlap - 2019/12/06 06:04:12.368971 [INFO] serf: EventMemberJoin: test-node 127.0.0.1
2019/12/06 06:04:12 [INFO]  raft: Node at 127.0.0.1:34468 [Follower] entering Follower state (Leader: "")
TestDNS_AltDomains_Overlap - 2019/12/06 06:04:12.370650 [INFO] consul: Adding LAN server test-node (Addr: tcp/127.0.0.1:34474) (DC: dc1)
TestDNS_AltDomains_Overlap - 2019/12/06 06:04:12.370666 [INFO] consul: Handled member-join event for server "test-node.dc1" in area "wan"
TestDNS_AltDomains_Overlap - 2019/12/06 06:04:12.371922 [INFO] agent: Started DNS server 127.0.0.1:34469 (udp)
TestDNS_AltDomains_Overlap - 2019/12/06 06:04:12.372354 [INFO] agent: Started DNS server 127.0.0.1:34469 (tcp)
TestDNS_InvalidQueries - 2019/12/06 06:04:12.374360 [INFO] serf: EventMemberJoin: Node 67da43c6-7c91-1058-43b9-3e30bf667aab 127.0.0.1
TestDNS_InvalidQueries - 2019/12/06 06:04:12.375516 [INFO] consul: Adding LAN server Node 67da43c6-7c91-1058-43b9-3e30bf667aab (Addr: tcp/127.0.0.1:34462) (DC: dc1)
TestDNS_InvalidQueries - 2019/12/06 06:04:12.375540 [INFO] consul: Handled member-join event for server "Node 67da43c6-7c91-1058-43b9-3e30bf667aab.dc1" in area "wan"
TestDNS_AltDomains_Overlap - 2019/12/06 06:04:12.376087 [INFO] agent: Started HTTP server on 127.0.0.1:34470 (tcp)
TestDNS_AltDomains_Overlap - 2019/12/06 06:04:12.376206 [INFO] agent: started state syncer
TestDNS_PreparedQuery_AllowStale - 2019/12/06 06:04:12.379022 [INFO] serf: EventMemberJoin: Node 40011720-c2e6-ee14-05f9-2df4d38ddc56 127.0.0.1
TestDNS_PreparedQuery_AllowStale - 2019/12/06 06:04:12.380648 [INFO] agent: Started DNS server 127.0.0.1:34463 (udp)
TestDNS_PreparedQuery_AllowStale - 2019/12/06 06:04:12.382822 [INFO] agent: Started DNS server 127.0.0.1:34463 (tcp)
TestDNS_InvalidQueries - 2019/12/06 06:04:12.381892 [INFO] agent: Started DNS server 127.0.0.1:34457 (tcp)
TestDNS_InvalidQueries - 2019/12/06 06:04:12.384412 [INFO] agent: Started DNS server 127.0.0.1:34457 (udp)
TestDNS_InvalidQueries - 2019/12/06 06:04:12.386779 [INFO] agent: Started HTTP server on 127.0.0.1:34458 (tcp)
TestDNS_InvalidQueries - 2019/12/06 06:04:12.386874 [INFO] agent: started state syncer
TestDNS_PreparedQuery_AllowStale - 2019/12/06 06:04:12.381807 [INFO] consul: Adding LAN server Node 40011720-c2e6-ee14-05f9-2df4d38ddc56 (Addr: tcp/127.0.0.1:34468) (DC: dc1)
TestDNS_PreparedQuery_AllowStale - 2019/12/06 06:04:12.381979 [INFO] consul: Handled member-join event for server "Node 40011720-c2e6-ee14-05f9-2df4d38ddc56.dc1" in area "wan"
TestDNS_PreparedQuery_AllowStale - 2019/12/06 06:04:12.389072 [INFO] agent: Started HTTP server on 127.0.0.1:34464 (tcp)
TestDNS_PreparedQuery_AllowStale - 2019/12/06 06:04:12.389243 [INFO] agent: started state syncer
2019/12/06 06:04:12 [WARN]  raft: Heartbeat timeout from "" reached, starting election
2019/12/06 06:04:12 [INFO]  raft: Node at 127.0.0.1:34468 [Candidate] entering Candidate state in term 2
2019/12/06 06:04:12 [WARN]  raft: Heartbeat timeout from "" reached, starting election
2019/12/06 06:04:12 [INFO]  raft: Node at 127.0.0.1:34462 [Candidate] entering Candidate state in term 2
2019/12/06 06:04:12 [WARN]  raft: Heartbeat timeout from "" reached, starting election
2019/12/06 06:04:12 [INFO]  raft: Node at 127.0.0.1:34474 [Candidate] entering Candidate state in term 2
TestEventFire_token - 2019/12/06 06:04:12.617258 [ERR] acl: failed to apply acl token upgrade batch: raft is already shutdown
2019/12/06 06:04:12 [INFO]  raft: Election won. Tally: 1
2019/12/06 06:04:12 [INFO]  raft: Node at 127.0.0.1:34474 [Leader] entering Leader state
2019/12/06 06:04:12 [INFO]  raft: Election won. Tally: 1
2019/12/06 06:04:12 [INFO]  raft: Node at 127.0.0.1:34468 [Leader] entering Leader state
2019/12/06 06:04:12 [INFO]  raft: Election won. Tally: 1
2019/12/06 06:04:12 [INFO]  raft: Node at 127.0.0.1:34462 [Leader] entering Leader state
TestDNS_AltDomains_Overlap - 2019/12/06 06:04:12.884712 [INFO] consul: cluster leadership acquired
TestDNS_AltDomains_Overlap - 2019/12/06 06:04:12.885073 [INFO] consul: New leader elected: test-node
TestDNS_InvalidQueries - 2019/12/06 06:04:12.888622 [INFO] consul: cluster leadership acquired
TestDNS_InvalidQueries - 2019/12/06 06:04:12.889037 [INFO] consul: New leader elected: Node 67da43c6-7c91-1058-43b9-3e30bf667aab
TestDNS_PreparedQuery_AllowStale - 2019/12/06 06:04:12.889344 [INFO] consul: cluster leadership acquired
TestDNS_PreparedQuery_AllowStale - 2019/12/06 06:04:12.889674 [INFO] consul: New leader elected: Node 40011720-c2e6-ee14-05f9-2df4d38ddc56
2019/12/06 06:04:13 [INFO]  raft: Initial configuration (index=1): [{Suffrage:Voter ID:ba0a08a0-596d-8ac3-1c8b-3cb49e4b32f7 Address:127.0.0.1:34480}]
2019/12/06 06:04:13 [INFO]  raft: Node at 127.0.0.1:34480 [Follower] entering Follower state (Leader: "")
TestDNS_AltDomains_SOA - 2019/12/06 06:04:13.022268 [INFO] serf: EventMemberJoin: test-node.dc1 127.0.0.1
TestDNS_AltDomains_SOA - 2019/12/06 06:04:13.031894 [INFO] serf: EventMemberJoin: test-node 127.0.0.1
TestDNS_AltDomains_SOA - 2019/12/06 06:04:13.033924 [INFO] consul: Adding LAN server test-node (Addr: tcp/127.0.0.1:34480) (DC: dc1)
TestDNS_AltDomains_SOA - 2019/12/06 06:04:13.035641 [INFO] consul: Handled member-join event for server "test-node.dc1" in area "wan"
TestDNS_AltDomains_SOA - 2019/12/06 06:04:13.038198 [INFO] agent: Started DNS server 127.0.0.1:34475 (tcp)
TestDNS_AltDomains_SOA - 2019/12/06 06:04:13.038446 [INFO] agent: Started DNS server 127.0.0.1:34475 (udp)
TestDNS_AltDomains_SOA - 2019/12/06 06:04:13.041426 [INFO] agent: Started HTTP server on 127.0.0.1:34476 (tcp)
TestDNS_AltDomains_SOA - 2019/12/06 06:04:13.041540 [INFO] agent: started state syncer
2019/12/06 06:04:13 [WARN]  raft: Heartbeat timeout from "" reached, starting election
2019/12/06 06:04:13 [INFO]  raft: Node at 127.0.0.1:34480 [Candidate] entering Candidate state in term 2
TestDNS_PreparedQuery_AllowStale - 2019/12/06 06:04:13.229591 [INFO] agent: Synced node info
TestDNS_PreparedQuery_AllowStale - 2019/12/06 06:04:13.235225 [WARN] consul: endpoint injected; this should only be used for testing
TestDNS_PreparedQuery_AllowStale - 2019/12/06 06:04:13.236619 [WARN] dns: Query results too stale, re-requesting
TestDNS_PreparedQuery_AllowStale - 2019/12/06 06:04:13.237039 [DEBUG] dns: request for name nope.query.consul. type SRV class IN (took 596.014µs) from client 127.0.0.1:35473 (udp)
TestDNS_PreparedQuery_AllowStale - 2019/12/06 06:04:13.237289 [INFO] agent: Requesting shutdown
TestDNS_PreparedQuery_AllowStale - 2019/12/06 06:04:13.237357 [INFO] consul: shutting down server
TestDNS_PreparedQuery_AllowStale - 2019/12/06 06:04:13.237402 [WARN] serf: Shutdown without a Leave
TestDNS_AltDomains_Overlap - 2019/12/06 06:04:13.303172 [INFO] agent: Synced node info
TestDNS_AltDomains_Overlap - 2019/12/06 06:04:13.303307 [DEBUG] agent: Node info in sync
TestDNS_PreparedQuery_AllowStale - 2019/12/06 06:04:13.307240 [WARN] serf: Shutdown without a Leave
TestDNS_AltDomains_Overlap - 2019/12/06 06:04:13.312312 [DEBUG] dns: request for name test-node.node.consul. type A class IN (took 599.68µs) from client 127.0.0.1:56746 (udp)
TestDNS_AltDomains_Overlap - 2019/12/06 06:04:13.313759 [DEBUG] dns: request for name test-node.node.test.consul. type A class IN (took 568.346µs) from client 127.0.0.1:53161 (udp)
TestDNS_AltDomains_Overlap - 2019/12/06 06:04:13.316245 [DEBUG] dns: request for name test-node.node.dc1.consul. type A class IN (took 686.683µs) from client 127.0.0.1:58977 (udp)
TestDNS_AltDomains_Overlap - 2019/12/06 06:04:13.317763 [DEBUG] dns: request for name test-node.node.dc1.test.consul. type A class IN (took 592.681µs) from client 127.0.0.1:39883 (udp)
TestDNS_AltDomains_Overlap - 2019/12/06 06:04:13.317996 [INFO] agent: Requesting shutdown
TestDNS_AltDomains_Overlap - 2019/12/06 06:04:13.318054 [INFO] consul: shutting down server
TestDNS_AltDomains_Overlap - 2019/12/06 06:04:13.318097 [WARN] serf: Shutdown without a Leave
TestDNS_PreparedQuery_AllowStale - 2019/12/06 06:04:13.463769 [INFO] manager: shutting down
TestDNS_AltDomains_Overlap - 2019/12/06 06:04:13.463867 [WARN] serf: Shutdown without a Leave
2019/12/06 06:04:13 [INFO]  raft: Election won. Tally: 1
2019/12/06 06:04:13 [INFO]  raft: Node at 127.0.0.1:34480 [Leader] entering Leader state
TestDNS_AltDomains_Overlap - 2019/12/06 06:04:13.612314 [INFO] manager: shutting down
TestDNS_AltDomains_SOA - 2019/12/06 06:04:13.612495 [INFO] consul: cluster leadership acquired
TestDNS_AltDomains_SOA - 2019/12/06 06:04:13.612864 [INFO] consul: New leader elected: test-node
TestDNS_InvalidQueries - 2019/12/06 06:04:13.614091 [INFO] agent: Synced node info
TestEventFire_token - 2019/12/06 06:04:13.619406 [ERR] acl: failed to apply acl token upgrade batch: raft is already shutdown
TestDNS_InvalidQueries - 2019/12/06 06:04:13.628572 [WARN] dns: QName invalid: 
TestDNS_InvalidQueries - 2019/12/06 06:04:13.628996 [DEBUG] dns: request for name consul. type SRV class IN (took 384.009µs) from client 127.0.0.1:59361 (udp)
TestDNS_InvalidQueries - 2019/12/06 06:04:13.630723 [WARN] dns: QName invalid: node.
TestDNS_InvalidQueries - 2019/12/06 06:04:13.631201 [DEBUG] dns: request for name node.consul. type SRV class IN (took 425.344µs) from client 127.0.0.1:43443 (udp)
TestDNS_InvalidQueries - 2019/12/06 06:04:13.632019 [WARN] dns: QName invalid: service.
TestDNS_InvalidQueries - 2019/12/06 06:04:13.632412 [DEBUG] dns: request for name service.consul. type SRV class IN (took 363.008µs) from client 127.0.0.1:41237 (udp)
TestDNS_InvalidQueries - 2019/12/06 06:04:13.632969 [WARN] dns: QName invalid: query.
TestDNS_InvalidQueries - 2019/12/06 06:04:13.633675 [DEBUG] dns: request for name query.consul. type SRV class IN (took 664.682µs) from client 127.0.0.1:46546 (udp)
TestDNS_InvalidQueries - 2019/12/06 06:04:13.634279 [WARN] dns: QName invalid: foo.node.dc1.extra.
TestDNS_InvalidQueries - 2019/12/06 06:04:13.634692 [DEBUG] dns: request for name foo.node.dc1.extra.consul. type SRV class IN (took 366.008µs) from client 127.0.0.1:41617 (udp)
TestDNS_InvalidQueries - 2019/12/06 06:04:13.635219 [WARN] dns: QName invalid: foo.service.dc1.extra.
TestDNS_InvalidQueries - 2019/12/06 06:04:13.635600 [DEBUG] dns: request for name foo.service.dc1.extra.consul. type SRV class IN (took 356.008µs) from client 127.0.0.1:54176 (udp)
TestDNS_InvalidQueries - 2019/12/06 06:04:13.636167 [WARN] dns: QName invalid: foo.query.dc1.extra.
TestDNS_InvalidQueries - 2019/12/06 06:04:13.636546 [INFO] agent: Requesting shutdown
TestDNS_InvalidQueries - 2019/12/06 06:04:13.636569 [DEBUG] dns: request for name foo.query.dc1.extra.consul. type SRV class IN (took 377.009µs) from client 127.0.0.1:54461 (udp)
TestDNS_InvalidQueries - 2019/12/06 06:04:13.636625 [INFO] consul: shutting down server
TestDNS_InvalidQueries - 2019/12/06 06:04:13.636705 [WARN] serf: Shutdown without a Leave
TestDNS_PreparedQuery_AllowStale - 2019/12/06 06:04:13.702443 [INFO] agent: consul server down
TestDNS_PreparedQuery_AllowStale - 2019/12/06 06:04:13.702536 [INFO] agent: shutdown complete
TestDNS_PreparedQuery_AllowStale - 2019/12/06 06:04:13.702600 [INFO] agent: Stopping DNS server 127.0.0.1:34463 (tcp)
TestDNS_PreparedQuery_AllowStale - 2019/12/06 06:04:13.702811 [INFO] agent: Stopping DNS server 127.0.0.1:34463 (udp)
TestDNS_PreparedQuery_AllowStale - 2019/12/06 06:04:13.703034 [INFO] agent: Stopping HTTP server 127.0.0.1:34464 (tcp)
TestDNS_PreparedQuery_AllowStale - 2019/12/06 06:04:13.703254 [INFO] agent: Waiting for endpoints to shut down
TestDNS_PreparedQuery_AllowStale - 2019/12/06 06:04:13.703335 [INFO] agent: Endpoints down
--- PASS: TestDNS_PreparedQuery_AllowStale (2.39s)
=== CONT  TestDNS_AltDomains_Service
TestDNS_AltDomains_Overlap - 2019/12/06 06:04:13.705254 [INFO] agent: consul server down
TestDNS_AltDomains_Overlap - 2019/12/06 06:04:13.705322 [INFO] agent: shutdown complete
TestDNS_AltDomains_Overlap - 2019/12/06 06:04:13.705377 [INFO] agent: Stopping DNS server 127.0.0.1:34469 (tcp)
TestDNS_AltDomains_Overlap - 2019/12/06 06:04:13.705505 [INFO] agent: Stopping DNS server 127.0.0.1:34469 (udp)
TestDNS_AltDomains_Overlap - 2019/12/06 06:04:13.705648 [INFO] agent: Stopping HTTP server 127.0.0.1:34470 (tcp)
TestDNS_AltDomains_Overlap - 2019/12/06 06:04:13.705828 [INFO] agent: Waiting for endpoints to shut down
TestDNS_AltDomains_Overlap - 2019/12/06 06:04:13.705890 [INFO] agent: Endpoints down
--- PASS: TestDNS_AltDomains_Overlap (2.21s)
=== CONT  TestDNS_NonExistingLookupEmptyAorAAAA
TestDNS_PreparedQuery_AllowStale - 2019/12/06 06:04:13.707527 [ERR] consul: failed to establish leadership: leadership lost while committing log
TestDNS_AltDomains_Overlap - 2019/12/06 06:04:13.707778 [ERR] consul: failed to establish leadership: leadership lost while committing log
TestDNS_InvalidQueries - 2019/12/06 06:04:13.768750 [WARN] serf: Shutdown without a Leave
WARNING: bootstrap = true: do not enable unless necessary
TestDNS_AltDomains_Service - 2019/12/06 06:04:13.774644 [WARN] agent: Node name "Node 4f0d99a0-e07b-a785-d9ca-8356ba546b15" will not be discoverable via DNS due to invalid characters. Valid characters include all alpha-numerics and dashes.
TestDNS_AltDomains_Service - 2019/12/06 06:04:13.775218 [DEBUG] tlsutil: Update with version 1
TestDNS_AltDomains_Service - 2019/12/06 06:04:13.777373 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
WARNING: bootstrap = true: do not enable unless necessary
TestDNS_NonExistingLookupEmptyAorAAAA - 2019/12/06 06:04:13.808314 [WARN] agent: Node name "Node 4e6349c5-e2dd-e0bb-aa79-b6c04a0c22cf" will not be discoverable via DNS due to invalid characters. Valid characters include all alpha-numerics and dashes.
TestDNS_NonExistingLookupEmptyAorAAAA - 2019/12/06 06:04:13.808869 [DEBUG] tlsutil: Update with version 1
TestDNS_NonExistingLookupEmptyAorAAAA - 2019/12/06 06:04:13.816527 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
TestDNS_InvalidQueries - 2019/12/06 06:04:13.854798 [INFO] manager: shutting down
jones - 2019/12/06 06:04:14.000062 [DEBUG] agent: Skipping remote check "serfHealth" since it is managed automatically
jones - 2019/12/06 06:04:14.000150 [DEBUG] agent: Node info in sync
TestDNS_AltDomains_SOA - 2019/12/06 06:04:14.103070 [INFO] agent: Synced node info
TestDNS_AltDomains_SOA - 2019/12/06 06:04:14.103190 [DEBUG] agent: Node info in sync
TestDNS_InvalidQueries - 2019/12/06 06:04:14.104075 [INFO] agent: consul server down
TestDNS_InvalidQueries - 2019/12/06 06:04:14.104130 [INFO] agent: shutdown complete
TestDNS_InvalidQueries - 2019/12/06 06:04:14.104270 [INFO] agent: Stopping DNS server 127.0.0.1:34457 (tcp)
TestDNS_InvalidQueries - 2019/12/06 06:04:14.104421 [INFO] agent: Stopping DNS server 127.0.0.1:34457 (udp)
TestDNS_InvalidQueries - 2019/12/06 06:04:14.104575 [INFO] agent: Stopping HTTP server 127.0.0.1:34458 (tcp)
TestDNS_InvalidQueries - 2019/12/06 06:04:14.105036 [INFO] agent: Waiting for endpoints to shut down
TestDNS_InvalidQueries - 2019/12/06 06:04:14.105123 [INFO] agent: Endpoints down
--- PASS: TestDNS_InvalidQueries (2.87s)
=== CONT  TestDNS_NonExistingLookup
TestDNS_InvalidQueries - 2019/12/06 06:04:14.105739 [ERR] consul: failed to establish leadership: leadership lost while committing log
TestDNS_InvalidQueries - 2019/12/06 06:04:14.105951 [ERR] consul: failed to transfer leadership attempt 0/3: raft is already shutdown
TestDNS_InvalidQueries - 2019/12/06 06:04:14.106016 [ERR] consul: failed to transfer leadership attempt 1/3: raft is already shutdown
TestDNS_InvalidQueries - 2019/12/06 06:04:14.106068 [ERR] consul: failed to transfer leadership attempt 2/3: raft is already shutdown
TestDNS_InvalidQueries - 2019/12/06 06:04:14.106116 [ERR] consul: failed to transfer leadership in 3 attempts
TestDNS_AltDomains_SOA - 2019/12/06 06:04:14.115207 [WARN] dns: no servers found
TestDNS_AltDomains_SOA - 2019/12/06 06:04:14.115625 [DEBUG] dns: request for name test-node.node.consul. type SOA class IN (took 908.355µs) from client 127.0.0.1:36529 (udp)
TestDNS_AltDomains_SOA - 2019/12/06 06:04:14.117125 [WARN] dns: no servers found
TestDNS_AltDomains_SOA - 2019/12/06 06:04:14.117485 [DEBUG] dns: request for name test-node.node.test-domain. type SOA class IN (took 919.022µs) from client 127.0.0.1:47229 (udp)
TestDNS_AltDomains_SOA - 2019/12/06 06:04:14.118046 [INFO] agent: Requesting shutdown
TestDNS_AltDomains_SOA - 2019/12/06 06:04:14.119330 [INFO] consul: shutting down server
TestDNS_AltDomains_SOA - 2019/12/06 06:04:14.120662 [WARN] serf: Shutdown without a Leave
WARNING: bootstrap = true: do not enable unless necessary
TestDNS_NonExistingLookup - 2019/12/06 06:04:14.206186 [WARN] agent: Node name "Node 4d4dc463-94e3-66eb-43f8-a526897b6cc9" will not be discoverable via DNS due to invalid characters. Valid characters include all alpha-numerics and dashes.
TestDNS_NonExistingLookup - 2019/12/06 06:04:14.207788 [DEBUG] tlsutil: Update with version 1
TestDNS_NonExistingLookup - 2019/12/06 06:04:14.211530 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
TestDNS_AltDomains_SOA - 2019/12/06 06:04:14.285415 [WARN] serf: Shutdown without a Leave
TestDNS_AltDomains_SOA - 2019/12/06 06:04:14.363833 [INFO] manager: shutting down
TestDNS_AltDomains_SOA - 2019/12/06 06:04:14.364419 [INFO] agent: consul server down
TestDNS_AltDomains_SOA - 2019/12/06 06:04:14.364480 [INFO] agent: shutdown complete
TestDNS_AltDomains_SOA - 2019/12/06 06:04:14.364540 [INFO] agent: Stopping DNS server 127.0.0.1:34475 (tcp)
TestDNS_AltDomains_SOA - 2019/12/06 06:04:14.364693 [INFO] agent: Stopping DNS server 127.0.0.1:34475 (udp)
TestDNS_AltDomains_SOA - 2019/12/06 06:04:14.364862 [INFO] agent: Stopping HTTP server 127.0.0.1:34476 (tcp)
TestDNS_AltDomains_SOA - 2019/12/06 06:04:14.365088 [INFO] agent: Waiting for endpoints to shut down
TestDNS_AltDomains_SOA - 2019/12/06 06:04:14.365163 [INFO] agent: Endpoints down
TestDNS_AltDomains_SOA - 2019/12/06 06:04:14.365218 [ERR] autopilot: failed to initialize config: leadership lost while committing log
--- PASS: TestDNS_AltDomains_SOA (2.23s)
=== CONT  TestDNS_AddressLookup
TestDNS_AltDomains_SOA - 2019/12/06 06:04:14.365512 [ERR] consul: failed to establish leadership: raft is already shutdown
WARNING: bootstrap = true: do not enable unless necessary
TestDNS_AddressLookup - 2019/12/06 06:04:14.466049 [WARN] agent: Node name "Node 24494521-5d12-e866-2622-1dde1898bb85" will not be discoverable via DNS due to invalid characters. Valid characters include all alpha-numerics and dashes.
TestDNS_AddressLookup - 2019/12/06 06:04:14.466498 [DEBUG] tlsutil: Update with version 1
TestDNS_AddressLookup - 2019/12/06 06:04:14.468667 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
TestEventFire_token - 2019/12/06 06:04:14.617620 [ERR] acl: failed to apply acl token upgrade batch: raft is already shutdown
2019/12/06 06:04:14 [INFO]  raft: Initial configuration (index=1): [{Suffrage:Voter ID:4f0d99a0-e07b-a785-d9ca-8356ba546b15 Address:127.0.0.1:34486}]
2019/12/06 06:04:14 [INFO]  raft: Node at 127.0.0.1:34486 [Follower] entering Follower state (Leader: "")
TestDNS_AltDomains_Service - 2019/12/06 06:04:14.888678 [INFO] serf: EventMemberJoin: Node 4f0d99a0-e07b-a785-d9ca-8356ba546b15.dc1 127.0.0.1
TestDNS_AltDomains_Service - 2019/12/06 06:04:14.894852 [INFO] serf: EventMemberJoin: Node 4f0d99a0-e07b-a785-d9ca-8356ba546b15 127.0.0.1
TestDNS_AltDomains_Service - 2019/12/06 06:04:14.896534 [INFO] consul: Adding LAN server Node 4f0d99a0-e07b-a785-d9ca-8356ba546b15 (Addr: tcp/127.0.0.1:34486) (DC: dc1)
TestDNS_AltDomains_Service - 2019/12/06 06:04:14.897813 [INFO] consul: Handled member-join event for server "Node 4f0d99a0-e07b-a785-d9ca-8356ba546b15.dc1" in area "wan"
TestDNS_AltDomains_Service - 2019/12/06 06:04:14.916780 [INFO] agent: Started DNS server 127.0.0.1:34481 (tcp)
TestDNS_AltDomains_Service - 2019/12/06 06:04:14.916890 [INFO] agent: Started DNS server 127.0.0.1:34481 (udp)
TestDNS_AltDomains_Service - 2019/12/06 06:04:14.939133 [INFO] agent: Started HTTP server on 127.0.0.1:34482 (tcp)
TestDNS_AltDomains_Service - 2019/12/06 06:04:14.941167 [INFO] agent: started state syncer
2019/12/06 06:04:14 [WARN]  raft: Heartbeat timeout from "" reached, starting election
2019/12/06 06:04:14 [INFO]  raft: Node at 127.0.0.1:34486 [Candidate] entering Candidate state in term 2
2019/12/06 06:04:15 [INFO]  raft: Initial configuration (index=1): [{Suffrage:Voter ID:4e6349c5-e2dd-e0bb-aa79-b6c04a0c22cf Address:127.0.0.1:34492}]
2019/12/06 06:04:15 [INFO]  raft: Node at 127.0.0.1:34492 [Follower] entering Follower state (Leader: "")
TestDNS_NonExistingLookupEmptyAorAAAA - 2019/12/06 06:04:15.085564 [INFO] serf: EventMemberJoin: Node 4e6349c5-e2dd-e0bb-aa79-b6c04a0c22cf.dc1 127.0.0.1
TestDNS_NonExistingLookupEmptyAorAAAA - 2019/12/06 06:04:15.090499 [INFO] serf: EventMemberJoin: Node 4e6349c5-e2dd-e0bb-aa79-b6c04a0c22cf 127.0.0.1
TestDNS_NonExistingLookupEmptyAorAAAA - 2019/12/06 06:04:15.092636 [INFO] consul: Handled member-join event for server "Node 4e6349c5-e2dd-e0bb-aa79-b6c04a0c22cf.dc1" in area "wan"
TestDNS_NonExistingLookupEmptyAorAAAA - 2019/12/06 06:04:15.093063 [INFO] consul: Adding LAN server Node 4e6349c5-e2dd-e0bb-aa79-b6c04a0c22cf (Addr: tcp/127.0.0.1:34492) (DC: dc1)
TestDNS_NonExistingLookupEmptyAorAAAA - 2019/12/06 06:04:15.093834 [INFO] agent: Started DNS server 127.0.0.1:34487 (tcp)
TestDNS_NonExistingLookupEmptyAorAAAA - 2019/12/06 06:04:15.094558 [INFO] agent: Started DNS server 127.0.0.1:34487 (udp)
TestDNS_NonExistingLookupEmptyAorAAAA - 2019/12/06 06:04:15.098672 [INFO] agent: Started HTTP server on 127.0.0.1:34488 (tcp)
TestDNS_NonExistingLookupEmptyAorAAAA - 2019/12/06 06:04:15.098769 [INFO] agent: started state syncer
2019/12/06 06:04:15 [WARN]  raft: Heartbeat timeout from "" reached, starting election
2019/12/06 06:04:15 [INFO]  raft: Node at 127.0.0.1:34492 [Candidate] entering Candidate state in term 2
2019/12/06 06:04:15 [INFO]  raft: Initial configuration (index=1): [{Suffrage:Voter ID:24494521-5d12-e866-2622-1dde1898bb85 Address:127.0.0.1:34504}]
2019/12/06 06:04:15 [INFO]  raft: Initial configuration (index=1): [{Suffrage:Voter ID:4d4dc463-94e3-66eb-43f8-a526897b6cc9 Address:127.0.0.1:34498}]
2019/12/06 06:04:15 [INFO]  raft: Node at 127.0.0.1:34498 [Follower] entering Follower state (Leader: "")
2019/12/06 06:04:15 [INFO]  raft: Node at 127.0.0.1:34504 [Follower] entering Follower state (Leader: "")
TestDNS_NonExistingLookup - 2019/12/06 06:04:15.448212 [INFO] serf: EventMemberJoin: Node 4d4dc463-94e3-66eb-43f8-a526897b6cc9.dc1 127.0.0.1
TestDNS_AddressLookup - 2019/12/06 06:04:15.450579 [INFO] serf: EventMemberJoin: Node 24494521-5d12-e866-2622-1dde1898bb85.dc1 127.0.0.1
TestDNS_NonExistingLookup - 2019/12/06 06:04:15.451339 [INFO] serf: EventMemberJoin: Node 4d4dc463-94e3-66eb-43f8-a526897b6cc9 127.0.0.1
TestDNS_NonExistingLookup - 2019/12/06 06:04:15.452506 [INFO] agent: Started DNS server 127.0.0.1:34493 (udp)
TestDNS_NonExistingLookup - 2019/12/06 06:04:15.453800 [INFO] consul: Handled member-join event for server "Node 4d4dc463-94e3-66eb-43f8-a526897b6cc9.dc1" in area "wan"
TestDNS_NonExistingLookup - 2019/12/06 06:04:15.453928 [INFO] consul: Adding LAN server Node 4d4dc463-94e3-66eb-43f8-a526897b6cc9 (Addr: tcp/127.0.0.1:34498) (DC: dc1)
TestDNS_AddressLookup - 2019/12/06 06:04:15.454359 [INFO] serf: EventMemberJoin: Node 24494521-5d12-e866-2622-1dde1898bb85 127.0.0.1
TestDNS_NonExistingLookup - 2019/12/06 06:04:15.454568 [INFO] agent: Started DNS server 127.0.0.1:34493 (tcp)
TestDNS_AddressLookup - 2019/12/06 06:04:15.455541 [INFO] agent: Started DNS server 127.0.0.1:34499 (udp)
TestDNS_AddressLookup - 2019/12/06 06:04:15.455926 [INFO] agent: Started DNS server 127.0.0.1:34499 (tcp)
TestDNS_NonExistingLookup - 2019/12/06 06:04:15.456786 [INFO] agent: Started HTTP server on 127.0.0.1:34494 (tcp)
TestDNS_NonExistingLookup - 2019/12/06 06:04:15.456857 [INFO] agent: started state syncer
TestDNS_AddressLookup - 2019/12/06 06:04:15.457808 [INFO] consul: Adding LAN server Node 24494521-5d12-e866-2622-1dde1898bb85 (Addr: tcp/127.0.0.1:34504) (DC: dc1)
TestDNS_AddressLookup - 2019/12/06 06:04:15.458118 [INFO] consul: Handled member-join event for server "Node 24494521-5d12-e866-2622-1dde1898bb85.dc1" in area "wan"
TestDNS_AddressLookup - 2019/12/06 06:04:15.458670 [INFO] agent: Started HTTP server on 127.0.0.1:34500 (tcp)
TestDNS_AddressLookup - 2019/12/06 06:04:15.458769 [INFO] agent: started state syncer
2019/12/06 06:04:15 [WARN]  raft: Heartbeat timeout from "" reached, starting election
2019/12/06 06:04:15 [INFO]  raft: Node at 127.0.0.1:34504 [Candidate] entering Candidate state in term 2
2019/12/06 06:04:15 [WARN]  raft: Heartbeat timeout from "" reached, starting election
2019/12/06 06:04:15 [INFO]  raft: Node at 127.0.0.1:34498 [Candidate] entering Candidate state in term 2
TestEventFire_token - 2019/12/06 06:04:15.617270 [ERR] acl: failed to apply acl token upgrade batch: raft is already shutdown
2019/12/06 06:04:15 [INFO]  raft: Election won. Tally: 1
2019/12/06 06:04:15 [INFO]  raft: Node at 127.0.0.1:34486 [Leader] entering Leader state
2019/12/06 06:04:15 [INFO]  raft: Election won. Tally: 1
2019/12/06 06:04:15 [INFO]  raft: Node at 127.0.0.1:34492 [Leader] entering Leader state
TestDNS_AltDomains_Service - 2019/12/06 06:04:15.705099 [INFO] consul: cluster leadership acquired
TestDNS_NonExistingLookupEmptyAorAAAA - 2019/12/06 06:04:15.705203 [INFO] consul: cluster leadership acquired
TestDNS_NonExistingLookupEmptyAorAAAA - 2019/12/06 06:04:15.705556 [INFO] consul: New leader elected: Node 4e6349c5-e2dd-e0bb-aa79-b6c04a0c22cf
TestDNS_AltDomains_Service - 2019/12/06 06:04:15.705556 [INFO] consul: New leader elected: Node 4f0d99a0-e07b-a785-d9ca-8356ba546b15
2019/12/06 06:04:16 [INFO]  raft: Election won. Tally: 1
2019/12/06 06:04:16 [INFO]  raft: Node at 127.0.0.1:34498 [Leader] entering Leader state
TestDNS_AltDomains_Service - 2019/12/06 06:04:16.003075 [INFO] agent: Synced node info
TestDNS_NonExistingLookupEmptyAorAAAA - 2019/12/06 06:04:16.003077 [INFO] agent: Synced node info
2019/12/06 06:04:16 [INFO]  raft: Election won. Tally: 1
2019/12/06 06:04:16 [INFO]  raft: Node at 127.0.0.1:34504 [Leader] entering Leader state
TestDNS_AddressLookup - 2019/12/06 06:04:16.008432 [INFO] consul: cluster leadership acquired
TestDNS_AddressLookup - 2019/12/06 06:04:16.009031 [INFO] consul: New leader elected: Node 24494521-5d12-e866-2622-1dde1898bb85
TestDNS_NonExistingLookup - 2019/12/06 06:04:16.010483 [INFO] consul: cluster leadership acquired
TestDNS_NonExistingLookup - 2019/12/06 06:04:16.010900 [INFO] consul: New leader elected: Node 4d4dc463-94e3-66eb-43f8-a526897b6cc9
TestDNS_NonExistingLookup - 2019/12/06 06:04:16.278005 [INFO] agent: Synced node info
TestDNS_AddressLookup - 2019/12/06 06:04:16.278005 [INFO] agent: Synced node info
TestDNS_AddressLookup - 2019/12/06 06:04:16.278218 [DEBUG] agent: Node info in sync
TestDNS_NonExistingLookup - 2019/12/06 06:04:16.283080 [WARN] dns: QName invalid: nonexisting.
TestDNS_NonExistingLookup - 2019/12/06 06:04:16.283559 [DEBUG] dns: request for name nonexisting.consul. type ANY class IN (took 441.01µs) from client 127.0.0.1:35037 (udp)
TestDNS_AltDomains_Service - 2019/12/06 06:04:16.283080 [DEBUG] dns: request for name db.service.consul. type SRV class IN (took 917.688µs) from client 127.0.0.1:46123 (udp)
TestDNS_NonExistingLookup - 2019/12/06 06:04:16.283871 [INFO] agent: Requesting shutdown
TestDNS_NonExistingLookup - 2019/12/06 06:04:16.283948 [INFO] consul: shutting down server
TestDNS_NonExistingLookup - 2019/12/06 06:04:16.283993 [WARN] serf: Shutdown without a Leave
TestDNS_AltDomains_Service - 2019/12/06 06:04:16.285022 [DEBUG] dns: request for name db.service.test-domain. type SRV class IN (took 1.015024ms) from client 127.0.0.1:45990 (udp)
TestDNS_AltDomains_Service - 2019/12/06 06:04:16.287717 [DEBUG] dns: request for name db.service.dc1.consul. type SRV class IN (took 997.356µs) from client 127.0.0.1:34703 (udp)
TestDNS_AltDomains_Service - 2019/12/06 06:04:16.289635 [DEBUG] dns: request for name db.service.dc1.test-domain. type SRV class IN (took 1.02669ms) from client 127.0.0.1:57009 (udp)
TestDNS_AltDomains_Service - 2019/12/06 06:04:16.290279 [INFO] agent: Requesting shutdown
TestDNS_AltDomains_Service - 2019/12/06 06:04:16.290361 [INFO] consul: shutting down server
TestDNS_AltDomains_Service - 2019/12/06 06:04:16.290468 [WARN] serf: Shutdown without a Leave
TestDNS_AddressLookup - 2019/12/06 06:04:16.291893 [DEBUG] dns: request for name 7f000001.addr.dc1.consul. type SRV class IN (took 319.674µs) from client 127.0.0.1:34695 (udp)
TestDNS_AddressLookup - 2019/12/06 06:04:16.292209 [INFO] agent: Requesting shutdown
TestDNS_AddressLookup - 2019/12/06 06:04:16.292291 [INFO] consul: shutting down server
TestDNS_AddressLookup - 2019/12/06 06:04:16.292341 [WARN] serf: Shutdown without a Leave
TestDNS_AddressLookup - 2019/12/06 06:04:16.410532 [WARN] serf: Shutdown without a Leave
TestDNS_NonExistingLookup - 2019/12/06 06:04:16.410533 [WARN] serf: Shutdown without a Leave
TestDNS_AltDomains_Service - 2019/12/06 06:04:16.411894 [WARN] serf: Shutdown without a Leave
TestDNS_AddressLookup - 2019/12/06 06:04:16.485600 [INFO] manager: shutting down
TestDNS_AddressLookup - 2019/12/06 06:04:16.485617 [ERR] autopilot: failed to initialize config: raft is already shutdown
TestDNS_AddressLookup - 2019/12/06 06:04:16.485997 [INFO] agent: consul server down
TestDNS_AddressLookup - 2019/12/06 06:04:16.486051 [INFO] agent: shutdown complete
TestDNS_AddressLookup - 2019/12/06 06:04:16.486112 [INFO] agent: Stopping DNS server 127.0.0.1:34499 (tcp)
TestDNS_AddressLookup - 2019/12/06 06:04:16.486060 [ERR] consul: failed to establish leadership: raft is already shutdown
TestDNS_AddressLookup - 2019/12/06 06:04:16.486260 [INFO] agent: Stopping DNS server 127.0.0.1:34499 (udp)
TestDNS_AddressLookup - 2019/12/06 06:04:16.486441 [INFO] agent: Stopping HTTP server 127.0.0.1:34500 (tcp)
TestDNS_AddressLookup - 2019/12/06 06:04:16.486674 [INFO] agent: Waiting for endpoints to shut down
TestDNS_AddressLookup - 2019/12/06 06:04:16.486760 [INFO] agent: Endpoints down
--- PASS: TestDNS_AddressLookup (2.12s)
=== CONT  TestDNS_ServiceLookup_FilterACL
=== RUN   TestDNS_ServiceLookup_FilterACL/ACLToken_==_root
TestDNS_NonExistingLookup - 2019/12/06 06:04:16.487870 [INFO] manager: shutting down
TestDNS_AltDomains_Service - 2019/12/06 06:04:16.488829 [INFO] manager: shutting down
TestDNS_NonExistingLookupEmptyAorAAAA - 2019/12/06 06:04:16.498513 [DEBUG] agent: Node info in sync
TestDNS_NonExistingLookupEmptyAorAAAA - 2019/12/06 06:04:16.498676 [DEBUG] agent: Node info in sync
TestDNS_AltDomains_Service - 2019/12/06 06:04:16.552648 [INFO] agent: consul server down
TestDNS_AltDomains_Service - 2019/12/06 06:04:16.552787 [INFO] agent: shutdown complete
TestDNS_AltDomains_Service - 2019/12/06 06:04:16.552863 [INFO] agent: Stopping DNS server 127.0.0.1:34481 (tcp)
TestDNS_AltDomains_Service - 2019/12/06 06:04:16.553094 [INFO] agent: Stopping DNS server 127.0.0.1:34481 (udp)
TestDNS_AltDomains_Service - 2019/12/06 06:04:16.553294 [INFO] agent: Stopping HTTP server 127.0.0.1:34482 (tcp)
TestDNS_AltDomains_Service - 2019/12/06 06:04:16.553515 [INFO] agent: Waiting for endpoints to shut down
TestDNS_AltDomains_Service - 2019/12/06 06:04:16.553580 [INFO] agent: Endpoints down
--- PASS: TestDNS_AltDomains_Service (2.85s)
=== CONT  TestDNS_ServiceLookup_SRV_RFC_TCP_Default
WARNING: The 'acl_datacenter' field is deprecated. Use the 'primary_datacenter' field instead.
WARNING: bootstrap = true: do not enable unless necessary
TestDNS_ServiceLookup_FilterACL/ACLToken_==_root - 2019/12/06 06:04:16.562732 [WARN] agent: Node name "Node 3dbc3089-e42b-6fa0-9a0f-2ecc8d413577" will not be discoverable via DNS due to invalid characters. Valid characters include all alpha-numerics and dashes.
TestDNS_ServiceLookup_FilterACL/ACLToken_==_root - 2019/12/06 06:04:16.563121 [DEBUG] tlsutil: Update with version 1
TestDNS_ServiceLookup_FilterACL/ACLToken_==_root - 2019/12/06 06:04:16.565327 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
TestDNS_AltDomains_Service - 2019/12/06 06:04:16.571527 [ERR] consul: failed to establish leadership: leadership lost while committing log
TestEventFire_token - 2019/12/06 06:04:16.617327 [ERR] acl: failed to apply acl token upgrade batch: raft is already shutdown
WARNING: bootstrap = true: do not enable unless necessary
TestDNS_ServiceLookup_SRV_RFC_TCP_Default - 2019/12/06 06:04:16.629525 [WARN] agent: Node name "Node 02b28716-31bd-c15f-32d3-49d8fe26a1f6" will not be discoverable via DNS due to invalid characters. Valid characters include all alpha-numerics and dashes.
TestDNS_ServiceLookup_SRV_RFC_TCP_Default - 2019/12/06 06:04:16.630180 [DEBUG] tlsutil: Update with version 1
TestDNS_ServiceLookup_SRV_RFC_TCP_Default - 2019/12/06 06:04:16.633379 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
TestDNS_NonExistingLookup - 2019/12/06 06:04:16.643952 [ERR] consul: failed to establish leadership: leadership lost while committing log
TestDNS_NonExistingLookup - 2019/12/06 06:04:16.644292 [INFO] agent: consul server down
TestDNS_NonExistingLookup - 2019/12/06 06:04:16.644883 [INFO] agent: shutdown complete
TestDNS_NonExistingLookup - 2019/12/06 06:04:16.644977 [INFO] agent: Stopping DNS server 127.0.0.1:34493 (tcp)
TestDNS_NonExistingLookup - 2019/12/06 06:04:16.645186 [INFO] agent: Stopping DNS server 127.0.0.1:34493 (udp)
TestDNS_NonExistingLookup - 2019/12/06 06:04:16.645376 [INFO] agent: Stopping HTTP server 127.0.0.1:34494 (tcp)
TestDNS_NonExistingLookup - 2019/12/06 06:04:16.645651 [INFO] agent: Waiting for endpoints to shut down
TestDNS_NonExistingLookup - 2019/12/06 06:04:16.645742 [INFO] agent: Endpoints down
--- PASS: TestDNS_NonExistingLookup (2.54s)
=== CONT  TestDNS_ServiceLookup_SRV_RFC
WARNING: bootstrap = true: do not enable unless necessary
TestDNS_ServiceLookup_SRV_RFC - 2019/12/06 06:04:16.724092 [WARN] agent: Node name "Node 0fa97565-6a08-21df-7c08-698d569c1785" will not be discoverable via DNS due to invalid characters. Valid characters include all alpha-numerics and dashes.
TestDNS_ServiceLookup_SRV_RFC - 2019/12/06 06:04:16.724981 [DEBUG] tlsutil: Update with version 1
TestDNS_ServiceLookup_SRV_RFC - 2019/12/06 06:04:16.727750 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
jones - 2019/12/06 06:04:17.078004 [DEBUG] agent: Skipping remote check "serfHealth" since it is managed automatically
jones - 2019/12/06 06:04:17.078098 [DEBUG] agent: Service "api-proxy-sidecar" in sync
jones - 2019/12/06 06:04:17.078140 [DEBUG] agent: Node info in sync
TestEventFire_token - 2019/12/06 06:04:17.617368 [ERR] acl: failed to apply acl token upgrade batch: raft is already shutdown
TestDNS_NonExistingLookupEmptyAorAAAA - 2019/12/06 06:04:17.708561 [DEBUG] dns: request for name webv4.service.consul. type AAAA class IN (took 936.022µs) from client 127.0.0.1:60222 (udp)
TestDNS_NonExistingLookupEmptyAorAAAA - 2019/12/06 06:04:17.710633 [DEBUG] dns: request for name webv4.query.consul. type AAAA class IN (took 988.69µs) from client 127.0.0.1:37125 (udp)
TestDNS_NonExistingLookupEmptyAorAAAA - 2019/12/06 06:04:17.726467 [DEBUG] dns: request for name webv6.query.consul. type A class IN (took 2.682395ms) from client 127.0.0.1:52800 (udp)
TestDNS_NonExistingLookupEmptyAorAAAA - 2019/12/06 06:04:17.726555 [INFO] agent: Requesting shutdown
TestDNS_NonExistingLookupEmptyAorAAAA - 2019/12/06 06:04:17.726637 [INFO] consul: shutting down server
TestDNS_NonExistingLookupEmptyAorAAAA - 2019/12/06 06:04:17.726689 [WARN] serf: Shutdown without a Leave
TestDNS_NonExistingLookupEmptyAorAAAA - 2019/12/06 06:04:17.728394 [DEBUG] dns: request for name webv6.service.consul. type A class IN (took 13.906322ms) from client 127.0.0.1:47237 (udp)
2019/12/06 06:04:17 [INFO]  raft: Initial configuration (index=1): [{Suffrage:Voter ID:3dbc3089-e42b-6fa0-9a0f-2ecc8d413577 Address:127.0.0.1:34510}]
2019/12/06 06:04:17 [INFO]  raft: Node at 127.0.0.1:34510 [Follower] entering Follower state (Leader: "")
TestDNS_NonExistingLookupEmptyAorAAAA - 2019/12/06 06:04:17.787523 [WARN] serf: Shutdown without a Leave
TestDNS_ServiceLookup_FilterACL/ACLToken_==_root - 2019/12/06 06:04:17.795458 [INFO] serf: EventMemberJoin: Node 3dbc3089-e42b-6fa0-9a0f-2ecc8d413577.dc1 127.0.0.1
TestDNS_ServiceLookup_FilterACL/ACLToken_==_root - 2019/12/06 06:04:17.803017 [INFO] serf: EventMemberJoin: Node 3dbc3089-e42b-6fa0-9a0f-2ecc8d413577 127.0.0.1
TestDNS_ServiceLookup_FilterACL/ACLToken_==_root - 2019/12/06 06:04:17.804836 [INFO] consul: Adding LAN server Node 3dbc3089-e42b-6fa0-9a0f-2ecc8d413577 (Addr: tcp/127.0.0.1:34510) (DC: dc1)
TestDNS_ServiceLookup_FilterACL/ACLToken_==_root - 2019/12/06 06:04:17.804972 [INFO] consul: Handled member-join event for server "Node 3dbc3089-e42b-6fa0-9a0f-2ecc8d413577.dc1" in area "wan"
TestDNS_ServiceLookup_FilterACL/ACLToken_==_root - 2019/12/06 06:04:17.807436 [INFO] agent: Started DNS server 127.0.0.1:34505 (tcp)
TestDNS_ServiceLookup_FilterACL/ACLToken_==_root - 2019/12/06 06:04:17.807524 [INFO] agent: Started DNS server 127.0.0.1:34505 (udp)
TestDNS_ServiceLookup_FilterACL/ACLToken_==_root - 2019/12/06 06:04:17.810058 [INFO] agent: Started HTTP server on 127.0.0.1:34506 (tcp)
TestDNS_ServiceLookup_FilterACL/ACLToken_==_root - 2019/12/06 06:04:17.810170 [INFO] agent: started state syncer
2019/12/06 06:04:17 [WARN]  raft: Heartbeat timeout from "" reached, starting election
2019/12/06 06:04:17 [INFO]  raft: Node at 127.0.0.1:34510 [Candidate] entering Candidate state in term 2
2019/12/06 06:04:17 [INFO]  raft: Initial configuration (index=1): [{Suffrage:Voter ID:02b28716-31bd-c15f-32d3-49d8fe26a1f6 Address:127.0.0.1:34516}]
TestDNS_NonExistingLookupEmptyAorAAAA - 2019/12/06 06:04:17.886669 [INFO] connect: initialized primary datacenter CA with provider "consul"
TestDNS_NonExistingLookupEmptyAorAAAA - 2019/12/06 06:04:17.887116 [DEBUG] consul: Skipping self join check for "Node 4e6349c5-e2dd-e0bb-aa79-b6c04a0c22cf" since the cluster is too small
TestDNS_NonExistingLookupEmptyAorAAAA - 2019/12/06 06:04:17.887306 [INFO] consul: member 'Node 4e6349c5-e2dd-e0bb-aa79-b6c04a0c22cf' joined, marking health alive
2019/12/06 06:04:17 [INFO]  raft: Node at 127.0.0.1:34516 [Follower] entering Follower state (Leader: "")
TestDNS_NonExistingLookupEmptyAorAAAA - 2019/12/06 06:04:17.891401 [INFO] manager: shutting down
TestDNS_ServiceLookup_SRV_RFC_TCP_Default - 2019/12/06 06:04:17.895065 [INFO] serf: EventMemberJoin: Node 02b28716-31bd-c15f-32d3-49d8fe26a1f6.dc1 127.0.0.1
TestDNS_ServiceLookup_SRV_RFC_TCP_Default - 2019/12/06 06:04:17.903008 [INFO] serf: EventMemberJoin: Node 02b28716-31bd-c15f-32d3-49d8fe26a1f6 127.0.0.1
TestDNS_ServiceLookup_SRV_RFC_TCP_Default - 2019/12/06 06:04:17.904873 [INFO] consul: Adding LAN server Node 02b28716-31bd-c15f-32d3-49d8fe26a1f6 (Addr: tcp/127.0.0.1:34516) (DC: dc1)
TestDNS_ServiceLookup_SRV_RFC_TCP_Default - 2019/12/06 06:04:17.905672 [INFO] consul: Handled member-join event for server "Node 02b28716-31bd-c15f-32d3-49d8fe26a1f6.dc1" in area "wan"
TestDNS_ServiceLookup_SRV_RFC_TCP_Default - 2019/12/06 06:04:17.907394 [INFO] agent: Started DNS server 127.0.0.1:34511 (tcp)
TestDNS_ServiceLookup_SRV_RFC_TCP_Default - 2019/12/06 06:04:17.907711 [INFO] agent: Started DNS server 127.0.0.1:34511 (udp)
TestDNS_ServiceLookup_SRV_RFC_TCP_Default - 2019/12/06 06:04:17.910719 [INFO] agent: Started HTTP server on 127.0.0.1:34512 (tcp)
TestDNS_ServiceLookup_SRV_RFC_TCP_Default - 2019/12/06 06:04:17.910869 [INFO] agent: started state syncer
2019/12/06 06:04:17 [WARN]  raft: Heartbeat timeout from "" reached, starting election
2019/12/06 06:04:17 [INFO]  raft: Node at 127.0.0.1:34516 [Candidate] entering Candidate state in term 2
2019/12/06 06:04:18 [INFO]  raft: Initial configuration (index=1): [{Suffrage:Voter ID:0fa97565-6a08-21df-7c08-698d569c1785 Address:127.0.0.1:34522}]
2019/12/06 06:04:18 [INFO]  raft: Node at 127.0.0.1:34522 [Follower] entering Follower state (Leader: "")
TestDNS_ServiceLookup_SRV_RFC - 2019/12/06 06:04:18.042138 [INFO] serf: EventMemberJoin: Node 0fa97565-6a08-21df-7c08-698d569c1785.dc1 127.0.0.1
TestDNS_ServiceLookup_SRV_RFC - 2019/12/06 06:04:18.045925 [INFO] serf: EventMemberJoin: Node 0fa97565-6a08-21df-7c08-698d569c1785 127.0.0.1
TestDNS_ServiceLookup_SRV_RFC - 2019/12/06 06:04:18.046737 [INFO] consul: Adding LAN server Node 0fa97565-6a08-21df-7c08-698d569c1785 (Addr: tcp/127.0.0.1:34522) (DC: dc1)
TestDNS_ServiceLookup_SRV_RFC - 2019/12/06 06:04:18.046777 [INFO] consul: Handled member-join event for server "Node 0fa97565-6a08-21df-7c08-698d569c1785.dc1" in area "wan"
TestDNS_ServiceLookup_SRV_RFC - 2019/12/06 06:04:18.047379 [INFO] agent: Started DNS server 127.0.0.1:34517 (tcp)
TestDNS_ServiceLookup_SRV_RFC - 2019/12/06 06:04:18.047465 [INFO] agent: Started DNS server 127.0.0.1:34517 (udp)
TestDNS_ServiceLookup_SRV_RFC - 2019/12/06 06:04:18.050095 [INFO] agent: Started HTTP server on 127.0.0.1:34518 (tcp)
TestDNS_ServiceLookup_SRV_RFC - 2019/12/06 06:04:18.050220 [INFO] agent: started state syncer
2019/12/06 06:04:18 [WARN]  raft: Heartbeat timeout from "" reached, starting election
2019/12/06 06:04:18 [INFO]  raft: Node at 127.0.0.1:34522 [Candidate] entering Candidate state in term 2
TestDNS_NonExistingLookupEmptyAorAAAA - 2019/12/06 06:04:18.104674 [INFO] agent: consul server down
TestDNS_NonExistingLookupEmptyAorAAAA - 2019/12/06 06:04:18.104769 [INFO] agent: shutdown complete
TestDNS_NonExistingLookupEmptyAorAAAA - 2019/12/06 06:04:18.104833 [INFO] agent: Stopping DNS server 127.0.0.1:34487 (tcp)
TestDNS_NonExistingLookupEmptyAorAAAA - 2019/12/06 06:04:18.104986 [INFO] agent: Stopping DNS server 127.0.0.1:34487 (udp)
TestDNS_NonExistingLookupEmptyAorAAAA - 2019/12/06 06:04:18.105157 [INFO] agent: Stopping HTTP server 127.0.0.1:34488 (tcp)
TestDNS_NonExistingLookupEmptyAorAAAA - 2019/12/06 06:04:18.105380 [INFO] agent: Waiting for endpoints to shut down
TestDNS_NonExistingLookupEmptyAorAAAA - 2019/12/06 06:04:18.105401 [ERR] consul: failed to reconcile member: {Node 4e6349c5-e2dd-e0bb-aa79-b6c04a0c22cf 127.0.0.1 34490 map[acls:0 bootstrap:1 build:1.5.2: dc:dc1 id:4e6349c5-e2dd-e0bb-aa79-b6c04a0c22cf port:34492 raft_vsn:3 role:consul segment: vsn:2 vsn_max:3 vsn_min:2 wan_join_port:34491] alive 1 5 2 2 5 4}: leadership lost while committing log
TestDNS_NonExistingLookupEmptyAorAAAA - 2019/12/06 06:04:18.105455 [INFO] agent: Endpoints down
--- PASS: TestDNS_NonExistingLookupEmptyAorAAAA (4.40s)
=== CONT  TestDNS_PreparedQuery_TTL
WARNING: bootstrap = true: do not enable unless necessary
TestDNS_PreparedQuery_TTL - 2019/12/06 06:04:18.185119 [WARN] agent: Node name "Node d756a408-a9d3-12fa-1b8f-66e982671adf" will not be discoverable via DNS due to invalid characters. Valid characters include all alpha-numerics and dashes.
TestDNS_PreparedQuery_TTL - 2019/12/06 06:04:18.188934 [DEBUG] tlsutil: Update with version 1
TestDNS_PreparedQuery_TTL - 2019/12/06 06:04:18.191230 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
2019/12/06 06:04:18 [INFO]  raft: Election won. Tally: 1
2019/12/06 06:04:18 [INFO]  raft: Node at 127.0.0.1:34510 [Leader] entering Leader state
TestDNS_ServiceLookup_FilterACL/ACLToken_==_root - 2019/12/06 06:04:18.330943 [INFO] consul: cluster leadership acquired
TestDNS_ServiceLookup_FilterACL/ACLToken_==_root - 2019/12/06 06:04:18.331462 [INFO] consul: New leader elected: Node 3dbc3089-e42b-6fa0-9a0f-2ecc8d413577
2019/12/06 06:04:18 [INFO]  raft: Election won. Tally: 1
2019/12/06 06:04:18 [INFO]  raft: Node at 127.0.0.1:34516 [Leader] entering Leader state
TestDNS_ServiceLookup_SRV_RFC_TCP_Default - 2019/12/06 06:04:18.467013 [INFO] consul: cluster leadership acquired
TestDNS_ServiceLookup_SRV_RFC_TCP_Default - 2019/12/06 06:04:18.467526 [INFO] consul: New leader elected: Node 02b28716-31bd-c15f-32d3-49d8fe26a1f6
TestDNS_ServiceLookup_FilterACL/ACLToken_==_root - 2019/12/06 06:04:18.557517 [INFO] acl: initializing acls
TestEventFire_token - 2019/12/06 06:04:18.617372 [ERR] acl: failed to apply acl token upgrade batch: raft is already shutdown
TestDNS_ServiceLookup_FilterACL/ACLToken_==_root - 2019/12/06 06:04:18.672825 [ERR] agent: failed to sync remote state: ACL not found
2019/12/06 06:04:18 [INFO]  raft: Election won. Tally: 1
2019/12/06 06:04:18 [INFO]  raft: Node at 127.0.0.1:34522 [Leader] entering Leader state
TestDNS_ServiceLookup_SRV_RFC - 2019/12/06 06:04:18.955425 [INFO] consul: cluster leadership acquired
TestDNS_ServiceLookup_SRV_RFC - 2019/12/06 06:04:18.956130 [INFO] consul: New leader elected: Node 0fa97565-6a08-21df-7c08-698d569c1785
TestDNS_ServiceLookup_FilterACL/ACLToken_==_root - 2019/12/06 06:04:19.171160 [INFO] consul: Created ACL 'global-management' policy
TestDNS_ServiceLookup_FilterACL/ACLToken_==_root - 2019/12/06 06:04:19.171288 [WARN] consul: Configuring a non-UUID master token is deprecated
TestDNS_ServiceLookup_FilterACL/ACLToken_==_root - 2019/12/06 06:04:19.173042 [INFO] acl: initializing acls
TestDNS_ServiceLookup_FilterACL/ACLToken_==_root - 2019/12/06 06:04:19.174737 [WARN] consul: Configuring a non-UUID master token is deprecated
TestDNS_ServiceLookup_SRV_RFC_TCP_Default - 2019/12/06 06:04:19.179646 [INFO] agent: Synced node info
TestDNS_ServiceLookup_SRV_RFC_TCP_Default - 2019/12/06 06:04:19.179779 [DEBUG] agent: Node info in sync
TestDNS_ServiceLookup_FilterACL/ACLToken_==_root - 2019/12/06 06:04:19.388988 [INFO] consul: Bootstrapped ACL master token from configuration
jones - 2019/12/06 06:04:19.432502 [DEBUG] agent: Skipping remote check "serfHealth" since it is managed automatically
jones - 2019/12/06 06:04:19.432595 [DEBUG] agent: Node info in sync
TestDNS_ServiceLookup_SRV_RFC - 2019/12/06 06:04:19.581086 [INFO] agent: Synced node info
TestDNS_ServiceLookup_SRV_RFC - 2019/12/06 06:04:19.581204 [DEBUG] agent: Node info in sync
TestDNS_ServiceLookup_SRV_RFC - 2019/12/06 06:04:19.596540 [DEBUG] agent: Node info in sync
TestEventFire_token - 2019/12/06 06:04:19.617230 [ERR] acl: failed to apply acl token upgrade batch: raft is already shutdown
TestDNS_ServiceLookup_SRV_RFC_TCP_Default - 2019/12/06 06:04:19.680759 [DEBUG] dns: request for name _db._tcp.service.dc1.consul. type SRV class IN (took 869.02µs) from client 127.0.0.1:59601 (udp)
TestDNS_ServiceLookup_SRV_RFC_TCP_Default - 2019/12/06 06:04:19.683214 [DEBUG] dns: request for name _db._tcp.service.consul. type SRV class IN (took 782.352µs) from client 127.0.0.1:45716 (udp)
TestDNS_ServiceLookup_SRV_RFC_TCP_Default - 2019/12/06 06:04:19.684987 [DEBUG] dns: request for name _db._tcp.dc1.consul. type SRV class IN (took 828.686µs) from client 127.0.0.1:60890 (udp)
TestDNS_ServiceLookup_SRV_RFC_TCP_Default - 2019/12/06 06:04:19.686629 [DEBUG] dns: request for name _db._tcp.consul. type SRV class IN (took 784.685µs) from client 127.0.0.1:45786 (udp)
TestDNS_ServiceLookup_SRV_RFC_TCP_Default - 2019/12/06 06:04:19.687125 [INFO] agent: Requesting shutdown
TestDNS_ServiceLookup_SRV_RFC_TCP_Default - 2019/12/06 06:04:19.687199 [INFO] consul: shutting down server
TestDNS_ServiceLookup_SRV_RFC_TCP_Default - 2019/12/06 06:04:19.687264 [WARN] serf: Shutdown without a Leave
2019/12/06 06:04:19 [INFO]  raft: Initial configuration (index=1): [{Suffrage:Voter ID:d756a408-a9d3-12fa-1b8f-66e982671adf Address:127.0.0.1:34528}]
TestDNS_PreparedQuery_TTL - 2019/12/06 06:04:19.798236 [INFO] serf: EventMemberJoin: Node d756a408-a9d3-12fa-1b8f-66e982671adf.dc1 127.0.0.1
2019/12/06 06:04:19 [INFO]  raft: Node at 127.0.0.1:34528 [Follower] entering Follower state (Leader: "")
TestDNS_PreparedQuery_TTL - 2019/12/06 06:04:19.809148 [INFO] serf: EventMemberJoin: Node d756a408-a9d3-12fa-1b8f-66e982671adf 127.0.0.1
TestDNS_PreparedQuery_TTL - 2019/12/06 06:04:19.811770 [INFO] consul: Adding LAN server Node d756a408-a9d3-12fa-1b8f-66e982671adf (Addr: tcp/127.0.0.1:34528) (DC: dc1)
TestDNS_PreparedQuery_TTL - 2019/12/06 06:04:19.812685 [INFO] consul: Handled member-join event for server "Node d756a408-a9d3-12fa-1b8f-66e982671adf.dc1" in area "wan"
TestDNS_PreparedQuery_TTL - 2019/12/06 06:04:19.818621 [INFO] agent: Started DNS server 127.0.0.1:34523 (tcp)
TestDNS_PreparedQuery_TTL - 2019/12/06 06:04:19.818740 [INFO] agent: Started DNS server 127.0.0.1:34523 (udp)
TestDNS_PreparedQuery_TTL - 2019/12/06 06:04:19.826249 [INFO] agent: Started HTTP server on 127.0.0.1:34524 (tcp)
TestDNS_PreparedQuery_TTL - 2019/12/06 06:04:19.826434 [INFO] agent: started state syncer
TestDNS_ServiceLookup_FilterACL/ACLToken_==_root - 2019/12/06 06:04:19.870245 [INFO] consul: Created ACL anonymous token from configuration
TestDNS_ServiceLookup_FilterACL/ACLToken_==_root - 2019/12/06 06:04:19.870364 [DEBUG] acl: transitioning out of legacy ACL mode
TestDNS_ServiceLookup_FilterACL/ACLToken_==_root - 2019/12/06 06:04:19.871145 [INFO] serf: EventMemberUpdate: Node 3dbc3089-e42b-6fa0-9a0f-2ecc8d413577
TestDNS_ServiceLookup_FilterACL/ACLToken_==_root - 2019/12/06 06:04:19.871823 [INFO] serf: EventMemberUpdate: Node 3dbc3089-e42b-6fa0-9a0f-2ecc8d413577.dc1
TestDNS_ServiceLookup_FilterACL/ACLToken_==_root - 2019/12/06 06:04:19.871964 [INFO] consul: Bootstrapped ACL master token from configuration
TestDNS_ServiceLookup_FilterACL/ACLToken_==_root - 2019/12/06 06:04:19.872748 [INFO] serf: EventMemberUpdate: Node 3dbc3089-e42b-6fa0-9a0f-2ecc8d413577
2019/12/06 06:04:19 [WARN]  raft: Heartbeat timeout from "" reached, starting election
2019/12/06 06:04:19 [INFO]  raft: Node at 127.0.0.1:34528 [Candidate] entering Candidate state in term 2
TestDNS_ServiceLookup_FilterACL/ACLToken_==_root - 2019/12/06 06:04:19.878467 [INFO] serf: EventMemberUpdate: Node 3dbc3089-e42b-6fa0-9a0f-2ecc8d413577.dc1
TestDNS_ServiceLookup_SRV_RFC_TCP_Default - 2019/12/06 06:04:19.968820 [WARN] serf: Shutdown without a Leave
TestDNS_ServiceLookup_SRV_RFC_TCP_Default - 2019/12/06 06:04:20.069023 [INFO] manager: shutting down
TestDNS_ServiceLookup_SRV_RFC - 2019/12/06 06:04:20.078383 [DEBUG] dns: request for name _db._master.service.dc1.consul. type SRV class IN (took 678.683µs) from client 127.0.0.1:35523 (udp)
TestDNS_ServiceLookup_SRV_RFC - 2019/12/06 06:04:20.080440 [DEBUG] dns: request for name _db._master.service.consul. type SRV class IN (took 662.015µs) from client 127.0.0.1:60317 (udp)
TestDNS_ServiceLookup_SRV_RFC - 2019/12/06 06:04:20.081898 [DEBUG] dns: request for name _db._master.dc1.consul. type SRV class IN (took 594.013µs) from client 127.0.0.1:60489 (udp)
TestDNS_ServiceLookup_SRV_RFC - 2019/12/06 06:04:20.084629 [DEBUG] dns: request for name _db._master.consul. type SRV class IN (took 755.018µs) from client 127.0.0.1:46669 (udp)
TestDNS_ServiceLookup_SRV_RFC - 2019/12/06 06:04:20.084719 [INFO] agent: Requesting shutdown
TestDNS_ServiceLookup_SRV_RFC - 2019/12/06 06:04:20.084784 [INFO] consul: shutting down server
TestDNS_ServiceLookup_SRV_RFC - 2019/12/06 06:04:20.084833 [WARN] serf: Shutdown without a Leave
TestDNS_ServiceLookup_SRV_RFC_TCP_Default - 2019/12/06 06:04:20.187990 [INFO] agent: consul server down
TestDNS_ServiceLookup_SRV_RFC_TCP_Default - 2019/12/06 06:04:20.188083 [INFO] agent: shutdown complete
TestDNS_ServiceLookup_SRV_RFC_TCP_Default - 2019/12/06 06:04:20.188149 [INFO] agent: Stopping DNS server 127.0.0.1:34511 (tcp)
TestDNS_ServiceLookup_SRV_RFC_TCP_Default - 2019/12/06 06:04:20.188301 [INFO] agent: Stopping DNS server 127.0.0.1:34511 (udp)
TestDNS_ServiceLookup_SRV_RFC_TCP_Default - 2019/12/06 06:04:20.188492 [INFO] agent: Stopping HTTP server 127.0.0.1:34512 (tcp)
TestDNS_ServiceLookup_SRV_RFC_TCP_Default - 2019/12/06 06:04:20.188759 [INFO] agent: Waiting for endpoints to shut down
TestDNS_ServiceLookup_SRV_RFC_TCP_Default - 2019/12/06 06:04:20.188889 [INFO] agent: Endpoints down
--- PASS: TestDNS_ServiceLookup_SRV_RFC_TCP_Default (3.64s)
=== CONT  TestDNS_ServiceLookup_TTL
TestDNS_ServiceLookup_SRV_RFC - 2019/12/06 06:04:20.193012 [WARN] serf: Shutdown without a Leave
TestDNS_ServiceLookup_SRV_RFC_TCP_Default - 2019/12/06 06:04:20.194488 [ERR] consul: failed to establish leadership: leadership lost while committing log
TestDNS_ServiceLookup_SRV_RFC_TCP_Default - 2019/12/06 06:04:20.194719 [ERR] consul: failed to transfer leadership attempt 0/3: raft is already shutdown
WARNING: bootstrap = true: do not enable unless necessary
TestDNS_ServiceLookup_TTL - 2019/12/06 06:04:20.318383 [WARN] agent: Node name "Node 12e02c79-7875-f78c-0512-39dcbc159216" will not be discoverable via DNS due to invalid characters. Valid characters include all alpha-numerics and dashes.
TestDNS_ServiceLookup_TTL - 2019/12/06 06:04:20.323283 [DEBUG] tlsutil: Update with version 1
TestDNS_ServiceLookup_TTL - 2019/12/06 06:04:20.334512 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
TestDNS_ServiceLookup_SRV_RFC - 2019/12/06 06:04:20.335704 [INFO] manager: shutting down
TestDNS_ServiceLookup_FilterACL/ACLToken_==_root - 2019/12/06 06:04:20.343228 [INFO] agent: Synced node info
TestDNS_ServiceLookup_FilterACL/ACLToken_==_root - 2019/12/06 06:04:20.343352 [DEBUG] agent: Node info in sync
TestDNS_ServiceLookup_FilterACL/ACLToken_==_root - 2019/12/06 06:04:20.347996 [DEBUG] consul: dropping node "Node 3dbc3089-e42b-6fa0-9a0f-2ecc8d413577" from result due to ACLs
TestDNS_ServiceLookup_SRV_RFC - 2019/12/06 06:04:20.443977 [ERR] consul: failed to establish leadership: leadership lost while committing log
TestDNS_ServiceLookup_SRV_RFC - 2019/12/06 06:04:20.444288 [ERR] consul: failed to transfer leadership attempt 0/3: raft is already shutdown
TestDNS_ServiceLookup_SRV_RFC - 2019/12/06 06:04:20.444365 [ERR] consul: failed to transfer leadership attempt 1/3: raft is already shutdown
TestDNS_ServiceLookup_SRV_RFC - 2019/12/06 06:04:20.444420 [ERR] consul: failed to transfer leadership attempt 2/3: raft is already shutdown
TestDNS_ServiceLookup_SRV_RFC - 2019/12/06 06:04:20.444473 [ERR] consul: failed to transfer leadership in 3 attempts
TestDNS_ServiceLookup_SRV_RFC - 2019/12/06 06:04:20.444315 [INFO] agent: consul server down
TestDNS_ServiceLookup_SRV_RFC - 2019/12/06 06:04:20.444568 [INFO] agent: shutdown complete
TestDNS_ServiceLookup_SRV_RFC - 2019/12/06 06:04:20.444619 [INFO] agent: Stopping DNS server 127.0.0.1:34517 (tcp)
TestDNS_ServiceLookup_SRV_RFC - 2019/12/06 06:04:20.444770 [INFO] agent: Stopping DNS server 127.0.0.1:34517 (udp)
TestDNS_ServiceLookup_SRV_RFC - 2019/12/06 06:04:20.444940 [INFO] agent: Stopping HTTP server 127.0.0.1:34518 (tcp)
TestDNS_ServiceLookup_SRV_RFC - 2019/12/06 06:04:20.445151 [INFO] agent: Waiting for endpoints to shut down
TestDNS_ServiceLookup_SRV_RFC - 2019/12/06 06:04:20.445217 [INFO] agent: Endpoints down
--- PASS: TestDNS_ServiceLookup_SRV_RFC (3.80s)
=== CONT  TestDNS_NodeLookup_TTL
WARNING: bootstrap = true: do not enable unless necessary
TestDNS_NodeLookup_TTL - 2019/12/06 06:04:20.561350 [WARN] agent: Node name "Node eaad51fb-37a6-8b97-23c4-bea148f9edce" will not be discoverable via DNS due to invalid characters. Valid characters include all alpha-numerics and dashes.
TestDNS_NodeLookup_TTL - 2019/12/06 06:04:20.561825 [DEBUG] tlsutil: Update with version 1
TestDNS_NodeLookup_TTL - 2019/12/06 06:04:20.563988 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
2019/12/06 06:04:20 [INFO]  raft: Election won. Tally: 1
2019/12/06 06:04:20 [INFO]  raft: Node at 127.0.0.1:34528 [Leader] entering Leader state
TestDNS_PreparedQuery_TTL - 2019/12/06 06:04:20.616383 [INFO] consul: cluster leadership acquired
TestDNS_PreparedQuery_TTL - 2019/12/06 06:04:20.616800 [INFO] consul: New leader elected: Node d756a408-a9d3-12fa-1b8f-66e982671adf
TestEventFire_token - 2019/12/06 06:04:20.617918 [ERR] acl: failed to apply acl token upgrade batch: raft is already shutdown
TestDNS_ServiceLookup_FilterACL/ACLToken_==_root - 2019/12/06 06:04:20.885307 [DEBUG] dns: request for name foo.service.consul. type A class IN (took 1.097026ms) from client 127.0.0.1:36435 (udp)
TestDNS_ServiceLookup_FilterACL/ACLToken_==_root - 2019/12/06 06:04:20.885406 [INFO] agent: Requesting shutdown
TestDNS_ServiceLookup_FilterACL/ACLToken_==_root - 2019/12/06 06:04:20.885483 [INFO] consul: shutting down server
TestDNS_ServiceLookup_FilterACL/ACLToken_==_root - 2019/12/06 06:04:20.885530 [WARN] serf: Shutdown without a Leave
TestDNS_ServiceLookup_FilterACL/ACLToken_==_root - 2019/12/06 06:04:21.070452 [WARN] serf: Shutdown without a Leave
TestDNS_PreparedQuery_TTL - 2019/12/06 06:04:21.203742 [INFO] agent: Synced node info
TestDNS_ServiceLookup_FilterACL/ACLToken_==_root - 2019/12/06 06:04:21.435626 [INFO] manager: shutting down
TestEventFire_token - 2019/12/06 06:04:21.617266 [ERR] acl: failed to apply acl token upgrade batch: raft is already shutdown
TestDNS_ServiceLookup_FilterACL/ACLToken_==_root - 2019/12/06 06:04:21.834083 [ERR] consul: failed to establish leadership: error generating CA root certificate: leadership lost while committing log
TestDNS_ServiceLookup_FilterACL/ACLToken_==_root - 2019/12/06 06:04:21.834458 [ERR] consul: failed to transfer leadership attempt 0/3: raft is already shutdown
TestDNS_ServiceLookup_FilterACL/ACLToken_==_root - 2019/12/06 06:04:21.834529 [ERR] consul: failed to transfer leadership attempt 1/3: raft is already shutdown
TestDNS_ServiceLookup_FilterACL/ACLToken_==_root - 2019/12/06 06:04:22.216011 [INFO] agent: consul server down
TestDNS_ServiceLookup_FilterACL/ACLToken_==_root - 2019/12/06 06:04:22.216105 [INFO] agent: shutdown complete
TestDNS_ServiceLookup_FilterACL/ACLToken_==_root - 2019/12/06 06:04:22.216171 [INFO] agent: Stopping DNS server 127.0.0.1:34505 (tcp)
TestDNS_ServiceLookup_FilterACL/ACLToken_==_root - 2019/12/06 06:04:22.216331 [INFO] agent: Stopping DNS server 127.0.0.1:34505 (udp)
TestDNS_ServiceLookup_FilterACL/ACLToken_==_root - 2019/12/06 06:04:22.216554 [INFO] agent: Stopping HTTP server 127.0.0.1:34506 (tcp)
TestDNS_ServiceLookup_FilterACL/ACLToken_==_root - 2019/12/06 06:04:22.216835 [INFO] agent: Waiting for endpoints to shut down
TestDNS_ServiceLookup_FilterACL/ACLToken_==_root - 2019/12/06 06:04:22.216911 [INFO] agent: Endpoints down
=== RUN   TestDNS_ServiceLookup_FilterACL/ACLToken_==_anonymous
WARNING: The 'acl_datacenter' field is deprecated. Use the 'primary_datacenter' field instead.
WARNING: bootstrap = true: do not enable unless necessary
TestDNS_ServiceLookup_FilterACL/ACLToken_==_anonymous - 2019/12/06 06:04:22.310892 [WARN] agent: Node name "Node d7091dd6-564b-5152-4d76-12f3b2b975f9" will not be discoverable via DNS due to invalid characters. Valid characters include all alpha-numerics and dashes.
TestDNS_ServiceLookup_FilterACL/ACLToken_==_anonymous - 2019/12/06 06:04:22.311337 [DEBUG] tlsutil: Update with version 1
TestDNS_ServiceLookup_FilterACL/ACLToken_==_anonymous - 2019/12/06 06:04:22.313985 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
TestDNS_PreparedQuery_TTL - 2019/12/06 06:04:22.498235 [DEBUG] agent: Node info in sync
TestDNS_PreparedQuery_TTL - 2019/12/06 06:04:22.498375 [DEBUG] agent: Node info in sync
TestEventFire_token - 2019/12/06 06:04:22.617502 [ERR] acl: failed to apply acl token upgrade batch: raft is already shutdown
TestEventFire_token - 2019/12/06 06:04:23.617879 [ERR] acl: failed to apply acl token upgrade batch: raft is already shutdown
2019/12/06 06:04:23 [INFO]  raft: Initial configuration (index=1): [{Suffrage:Voter ID:12e02c79-7875-f78c-0512-39dcbc159216 Address:127.0.0.1:34534}]
TestDNS_ServiceLookup_TTL - 2019/12/06 06:04:23.946029 [INFO] serf: EventMemberJoin: Node 12e02c79-7875-f78c-0512-39dcbc159216.dc1 127.0.0.1
2019/12/06 06:04:23 [INFO]  raft: Node at 127.0.0.1:34534 [Follower] entering Follower state (Leader: "")
TestDNS_ServiceLookup_TTL - 2019/12/06 06:04:23.950666 [INFO] serf: EventMemberJoin: Node 12e02c79-7875-f78c-0512-39dcbc159216 127.0.0.1
TestDNS_ServiceLookup_TTL - 2019/12/06 06:04:23.951465 [INFO] consul: Adding LAN server Node 12e02c79-7875-f78c-0512-39dcbc159216 (Addr: tcp/127.0.0.1:34534) (DC: dc1)
TestDNS_ServiceLookup_TTL - 2019/12/06 06:04:23.951744 [INFO] consul: Handled member-join event for server "Node 12e02c79-7875-f78c-0512-39dcbc159216.dc1" in area "wan"
TestDNS_ServiceLookup_TTL - 2019/12/06 06:04:23.952705 [INFO] agent: Started DNS server 127.0.0.1:34529 (udp)
TestDNS_ServiceLookup_TTL - 2019/12/06 06:04:23.952775 [INFO] agent: Started DNS server 127.0.0.1:34529 (tcp)
TestDNS_ServiceLookup_TTL - 2019/12/06 06:04:23.955441 [INFO] agent: Started HTTP server on 127.0.0.1:34530 (tcp)
TestDNS_ServiceLookup_TTL - 2019/12/06 06:04:23.955562 [INFO] agent: started state syncer
2019/12/06 06:04:23 [WARN]  raft: Heartbeat timeout from "" reached, starting election
2019/12/06 06:04:23 [INFO]  raft: Node at 127.0.0.1:34534 [Candidate] entering Candidate state in term 2
2019/12/06 06:04:24 [INFO]  raft: Initial configuration (index=1): [{Suffrage:Voter ID:eaad51fb-37a6-8b97-23c4-bea148f9edce Address:127.0.0.1:34540}]
2019/12/06 06:04:24 [INFO]  raft: Node at 127.0.0.1:34540 [Follower] entering Follower state (Leader: "")
TestDNS_NodeLookup_TTL - 2019/12/06 06:04:24.092143 [INFO] serf: EventMemberJoin: Node eaad51fb-37a6-8b97-23c4-bea148f9edce.dc1 127.0.0.1
TestDNS_NodeLookup_TTL - 2019/12/06 06:04:24.103346 [INFO] serf: EventMemberJoin: Node eaad51fb-37a6-8b97-23c4-bea148f9edce 127.0.0.1
TestDNS_NodeLookup_TTL - 2019/12/06 06:04:24.105887 [INFO] consul: Adding LAN server Node eaad51fb-37a6-8b97-23c4-bea148f9edce (Addr: tcp/127.0.0.1:34540) (DC: dc1)
TestDNS_NodeLookup_TTL - 2019/12/06 06:04:24.106037 [INFO] consul: Handled member-join event for server "Node eaad51fb-37a6-8b97-23c4-bea148f9edce.dc1" in area "wan"
TestDNS_NodeLookup_TTL - 2019/12/06 06:04:24.106854 [DEBUG] dns: recursor enabled
TestDNS_NodeLookup_TTL - 2019/12/06 06:04:24.106978 [DEBUG] dns: recursor enabled
TestDNS_NodeLookup_TTL - 2019/12/06 06:04:24.107870 [INFO] agent: Started DNS server 127.0.0.1:34535 (tcp)
TestDNS_NodeLookup_TTL - 2019/12/06 06:04:24.107962 [INFO] agent: Started DNS server 127.0.0.1:34535 (udp)
TestDNS_NodeLookup_TTL - 2019/12/06 06:04:24.110790 [INFO] agent: Started HTTP server on 127.0.0.1:34536 (tcp)
TestDNS_NodeLookup_TTL - 2019/12/06 06:04:24.110927 [INFO] agent: started state syncer
2019/12/06 06:04:24 [WARN]  raft: Heartbeat timeout from "" reached, starting election
2019/12/06 06:04:24 [INFO]  raft: Node at 127.0.0.1:34540 [Candidate] entering Candidate state in term 2
TestEventFire_token - 2019/12/06 06:04:24.619113 [ERR] acl: failed to apply acl token upgrade batch: raft is already shutdown
2019/12/06 06:04:24 [INFO]  raft: Election won. Tally: 1
2019/12/06 06:04:24 [INFO]  raft: Node at 127.0.0.1:34534 [Leader] entering Leader state
TestDNS_ServiceLookup_TTL - 2019/12/06 06:04:24.736113 [INFO] consul: cluster leadership acquired
TestDNS_ServiceLookup_TTL - 2019/12/06 06:04:24.736798 [INFO] consul: New leader elected: Node 12e02c79-7875-f78c-0512-39dcbc159216
2019/12/06 06:04:24 [INFO]  raft: Election won. Tally: 1
2019/12/06 06:04:24 [INFO]  raft: Node at 127.0.0.1:34540 [Leader] entering Leader state
TestDNS_NodeLookup_TTL - 2019/12/06 06:04:24.902880 [INFO] consul: cluster leadership acquired
TestDNS_NodeLookup_TTL - 2019/12/06 06:04:24.903377 [INFO] consul: New leader elected: Node eaad51fb-37a6-8b97-23c4-bea148f9edce
2019/12/06 06:04:25 [INFO]  raft: Initial configuration (index=1): [{Suffrage:Voter ID:d7091dd6-564b-5152-4d76-12f3b2b975f9 Address:127.0.0.1:34546}]
2019/12/06 06:04:25 [INFO]  raft: Node at 127.0.0.1:34546 [Follower] entering Follower state (Leader: "")
TestDNS_ServiceLookup_FilterACL/ACLToken_==_anonymous - 2019/12/06 06:04:25.081966 [INFO] serf: EventMemberJoin: Node d7091dd6-564b-5152-4d76-12f3b2b975f9.dc1 127.0.0.1
TestDNS_ServiceLookup_FilterACL/ACLToken_==_anonymous - 2019/12/06 06:04:25.085347 [INFO] serf: EventMemberJoin: Node d7091dd6-564b-5152-4d76-12f3b2b975f9 127.0.0.1
TestDNS_ServiceLookup_FilterACL/ACLToken_==_anonymous - 2019/12/06 06:04:25.086217 [INFO] consul: Adding LAN server Node d7091dd6-564b-5152-4d76-12f3b2b975f9 (Addr: tcp/127.0.0.1:34546) (DC: dc1)
TestDNS_ServiceLookup_FilterACL/ACLToken_==_anonymous - 2019/12/06 06:04:25.086549 [INFO] consul: Handled member-join event for server "Node d7091dd6-564b-5152-4d76-12f3b2b975f9.dc1" in area "wan"
TestDNS_ServiceLookup_FilterACL/ACLToken_==_anonymous - 2019/12/06 06:04:25.086662 [INFO] agent: Started DNS server 127.0.0.1:34541 (udp)
TestDNS_ServiceLookup_FilterACL/ACLToken_==_anonymous - 2019/12/06 06:04:25.087049 [INFO] agent: Started DNS server 127.0.0.1:34541 (tcp)
TestDNS_ServiceLookup_FilterACL/ACLToken_==_anonymous - 2019/12/06 06:04:25.089751 [INFO] agent: Started HTTP server on 127.0.0.1:34542 (tcp)
TestDNS_ServiceLookup_FilterACL/ACLToken_==_anonymous - 2019/12/06 06:04:25.090160 [INFO] agent: started state syncer
2019/12/06 06:04:25 [WARN]  raft: Heartbeat timeout from "" reached, starting election
2019/12/06 06:04:25 [INFO]  raft: Node at 127.0.0.1:34546 [Candidate] entering Candidate state in term 2
TestDNS_ServiceLookup_TTL - 2019/12/06 06:04:25.156082 [INFO] agent: Synced node info
TestDNS_NodeLookup_TTL - 2019/12/06 06:04:25.236811 [INFO] agent: Synced node info
TestDNS_NodeLookup_TTL - 2019/12/06 06:04:25.236929 [DEBUG] agent: Node info in sync
TestDNS_NodeLookup_TTL - 2019/12/06 06:04:25.285174 [DEBUG] agent: Node info in sync
TestDNS_PreparedQuery_TTL - 2019/12/06 06:04:25.441004 [INFO] connect: initialized primary datacenter CA with provider "consul"
TestEventFire_token - 2019/12/06 06:04:25.617250 [ERR] acl: failed to apply acl token upgrade batch: raft is already shutdown
TestDNS_PreparedQuery_TTL - 2019/12/06 06:04:25.649638 [DEBUG] consul: Skipping self join check for "Node d756a408-a9d3-12fa-1b8f-66e982671adf" since the cluster is too small
TestDNS_PreparedQuery_TTL - 2019/12/06 06:04:25.650009 [INFO] consul: member 'Node d756a408-a9d3-12fa-1b8f-66e982671adf' joined, marking health alive
2019/12/06 06:04:25 [INFO]  raft: Election won. Tally: 1
2019/12/06 06:04:25 [INFO]  raft: Node at 127.0.0.1:34546 [Leader] entering Leader state
TestDNS_ServiceLookup_FilterACL/ACLToken_==_anonymous - 2019/12/06 06:04:25.800136 [INFO] consul: cluster leadership acquired
TestDNS_ServiceLookup_FilterACL/ACLToken_==_anonymous - 2019/12/06 06:04:25.800684 [INFO] consul: New leader elected: Node d7091dd6-564b-5152-4d76-12f3b2b975f9
TestDNS_ServiceLookup_FilterACL/ACLToken_==_anonymous - 2019/12/06 06:04:25.842061 [INFO] acl: initializing acls
TestDNS_ServiceLookup_FilterACL/ACLToken_==_anonymous - 2019/12/06 06:04:25.947864 [ERR] agent: failed to sync remote state: ACL not found
TestDNS_NodeLookup_TTL - 2019/12/06 06:04:25.974303 [DEBUG] dns: request for name foo.node.consul. type ANY class IN (took 490.012µs) from client 127.0.0.1:34409 (udp)
TestDNS_ServiceLookup_FilterACL/ACLToken_==_anonymous - 2019/12/06 06:04:26.129122 [INFO] consul: Created ACL 'global-management' policy
TestDNS_ServiceLookup_FilterACL/ACLToken_==_anonymous - 2019/12/06 06:04:26.129278 [WARN] consul: Configuring a non-UUID master token is deprecated
TestDNS_ServiceLookup_FilterACL/ACLToken_==_anonymous - 2019/12/06 06:04:26.134565 [INFO] acl: initializing acls
TestDNS_ServiceLookup_FilterACL/ACLToken_==_anonymous - 2019/12/06 06:04:26.134714 [WARN] consul: Configuring a non-UUID master token is deprecated
TestDNS_PreparedQuery_TTL - 2019/12/06 06:04:26.195354 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
TestDNS_NodeLookup_TTL - 2019/12/06 06:04:26.280685 [DEBUG] dns: request for name bar.node.consul. type ANY class IN (took 539.679µs) from client 127.0.0.1:39198 (udp)
TestDNS_ServiceLookup_TTL - 2019/12/06 06:04:26.486460 [DEBUG] agent: Node info in sync
TestDNS_ServiceLookup_TTL - 2019/12/06 06:04:26.488617 [DEBUG] agent: Node info in sync
TestEventFire_token - 2019/12/06 06:04:26.617153 [ERR] acl: failed to apply acl token upgrade batch: raft is already shutdown
TestDNS_ServiceLookup_FilterACL/ACLToken_==_anonymous - 2019/12/06 06:04:26.696466 [INFO] consul: Bootstrapped ACL master token from configuration
TestDNS_ServiceLookup_FilterACL/ACLToken_==_anonymous - 2019/12/06 06:04:26.703694 [INFO] consul: Bootstrapped ACL master token from configuration
=== RUN   TestDNS_ServiceLookup_TTL/db.service.consul.
=== RUN   TestDNS_ServiceLookup_TTL/dblb.service.consul.
TestDNS_ServiceLookup_TTL - 2019/12/06 06:04:26.710416 [DEBUG] dns: request for name dblb.service.consul. type SRV class IN (took 843.353µs) from client 127.0.0.1:51452 (udp)
=== RUN   TestDNS_ServiceLookup_TTL/dk.service.consul.
TestDNS_ServiceLookup_TTL - 2019/12/06 06:04:26.710863 [DEBUG] dns: request for name db.service.consul. type SRV class IN (took 3.03907ms) from client 127.0.0.1:38790 (udp)
=== RUN   TestDNS_ServiceLookup_TTL/api.service.consul.
TestDNS_ServiceLookup_TTL - 2019/12/06 06:04:26.712591 [DEBUG] dns: request for name dk.service.consul. type SRV class IN (took 1.019357ms) from client 127.0.0.1:43018 (udp)
TestDNS_ServiceLookup_TTL - 2019/12/06 06:04:26.714819 [DEBUG] dns: request for name api.service.consul. type SRV class IN (took 902.687µs) from client 127.0.0.1:48039 (udp)
TestDNS_ServiceLookup_TTL - 2019/12/06 06:04:26.715071 [INFO] agent: Requesting shutdown
TestDNS_ServiceLookup_TTL - 2019/12/06 06:04:26.715899 [INFO] consul: shutting down server
TestDNS_ServiceLookup_TTL - 2019/12/06 06:04:26.716059 [WARN] serf: Shutdown without a Leave
TestDNS_ServiceLookup_TTL - 2019/12/06 06:04:26.869897 [WARN] serf: Shutdown without a Leave
TestDNS_ServiceLookup_TTL - 2019/12/06 06:04:26.977405 [INFO] manager: shutting down
TestDNS_ServiceLookup_TTL - 2019/12/06 06:04:26.977695 [ERR] connect: Apply failed leadership lost while committing log
TestDNS_ServiceLookup_TTL - 2019/12/06 06:04:26.977763 [ERR] consul: failed to establish leadership: leadership lost while committing log
TestDNS_ServiceLookup_TTL - 2019/12/06 06:04:26.978020 [INFO] agent: consul server down
TestDNS_ServiceLookup_TTL - 2019/12/06 06:04:26.978073 [INFO] agent: shutdown complete
TestDNS_ServiceLookup_TTL - 2019/12/06 06:04:26.978130 [INFO] agent: Stopping DNS server 127.0.0.1:34529 (tcp)
TestDNS_ServiceLookup_TTL - 2019/12/06 06:04:26.978295 [INFO] agent: Stopping DNS server 127.0.0.1:34529 (udp)
TestDNS_ServiceLookup_TTL - 2019/12/06 06:04:26.978562 [INFO] agent: Stopping HTTP server 127.0.0.1:34530 (tcp)
TestDNS_NodeLookup_TTL - 2019/12/06 06:04:26.978627 [INFO] connect: initialized primary datacenter CA with provider "consul"
TestDNS_ServiceLookup_TTL - 2019/12/06 06:04:26.978802 [INFO] agent: Waiting for endpoints to shut down
TestDNS_ServiceLookup_TTL - 2019/12/06 06:04:26.978883 [INFO] agent: Endpoints down
--- PASS: TestDNS_ServiceLookup_TTL (6.79s)
    --- PASS: TestDNS_ServiceLookup_TTL/db.service.consul. (0.00s)
    --- PASS: TestDNS_ServiceLookup_TTL/dblb.service.consul. (0.00s)
    --- PASS: TestDNS_ServiceLookup_TTL/dk.service.consul. (0.00s)
    --- PASS: TestDNS_ServiceLookup_TTL/api.service.consul. (0.00s)
=== CONT  TestDNS_ServiceLookup_AnswerLimits
TestDNS_NodeLookup_TTL - 2019/12/06 06:04:26.979158 [DEBUG] consul: Skipping self join check for "Node eaad51fb-37a6-8b97-23c4-bea148f9edce" since the cluster is too small
=== RUN   TestDNS_ServiceLookup_AnswerLimits/A_lookup_{0_0_0_0_0_0_0_0_0_0_0}
=== PAUSE TestDNS_ServiceLookup_AnswerLimits/A_lookup_{0_0_0_0_0_0_0_0_0_0_0}
TestDNS_NodeLookup_TTL - 2019/12/06 06:04:26.979367 [INFO] consul: member 'Node eaad51fb-37a6-8b97-23c4-bea148f9edce' joined, marking health alive
=== RUN   TestDNS_ServiceLookup_AnswerLimits/AAAA_lookup_{0_0_0_0_0_0_0_0_0_0_0}
=== PAUSE TestDNS_ServiceLookup_AnswerLimits/AAAA_lookup_{0_0_0_0_0_0_0_0_0_0_0}
=== RUN   TestDNS_ServiceLookup_AnswerLimits/ANY_lookup_{0_0_0_0_0_0_0_0_0_0_0}
=== PAUSE TestDNS_ServiceLookup_AnswerLimits/ANY_lookup_{0_0_0_0_0_0_0_0_0_0_0}
=== RUN   TestDNS_ServiceLookup_AnswerLimits/A_lookup_{1_1_1_1_1_1_1_1_1_1_1}
=== PAUSE TestDNS_ServiceLookup_AnswerLimits/A_lookup_{1_1_1_1_1_1_1_1_1_1_1}
=== RUN   TestDNS_ServiceLookup_AnswerLimits/AAAA_lookup_{1_1_1_1_1_1_1_1_1_1_1}
=== PAUSE TestDNS_ServiceLookup_AnswerLimits/AAAA_lookup_{1_1_1_1_1_1_1_1_1_1_1}
=== RUN   TestDNS_ServiceLookup_AnswerLimits/ANY_lookup_{1_1_1_1_1_1_1_1_1_1_1}
=== PAUSE TestDNS_ServiceLookup_AnswerLimits/ANY_lookup_{1_1_1_1_1_1_1_1_1_1_1}
=== RUN   TestDNS_ServiceLookup_AnswerLimits/A_lookup_{2_2_2_2_2_2_2_2_2_2_2}
=== PAUSE TestDNS_ServiceLookup_AnswerLimits/A_lookup_{2_2_2_2_2_2_2_2_2_2_2}
=== RUN   TestDNS_ServiceLookup_AnswerLimits/AAAA_lookup_{2_2_2_2_2_2_2_2_2_2_2}
=== PAUSE TestDNS_ServiceLookup_AnswerLimits/AAAA_lookup_{2_2_2_2_2_2_2_2_2_2_2}
=== RUN   TestDNS_ServiceLookup_AnswerLimits/ANY_lookup_{2_2_2_2_2_2_2_2_2_2_2}
=== PAUSE TestDNS_ServiceLookup_AnswerLimits/ANY_lookup_{2_2_2_2_2_2_2_2_2_2_2}
=== RUN   TestDNS_ServiceLookup_AnswerLimits/A_lookup_{3_3_3_3_3_3_3_3_3_3_3}
=== PAUSE TestDNS_ServiceLookup_AnswerLimits/A_lookup_{3_3_3_3_3_3_3_3_3_3_3}
=== RUN   TestDNS_ServiceLookup_AnswerLimits/AAAA_lookup_{3_3_3_3_3_3_3_3_3_3_3}
=== PAUSE TestDNS_ServiceLookup_AnswerLimits/AAAA_lookup_{3_3_3_3_3_3_3_3_3_3_3}
=== RUN   TestDNS_ServiceLookup_AnswerLimits/ANY_lookup_{3_3_3_3_3_3_3_3_3_3_3}
=== PAUSE TestDNS_ServiceLookup_AnswerLimits/ANY_lookup_{3_3_3_3_3_3_3_3_3_3_3}
=== RUN   TestDNS_ServiceLookup_AnswerLimits/A_lookup_{4_4_4_4_4_4_4_4_4_4_4}
=== PAUSE TestDNS_ServiceLookup_AnswerLimits/A_lookup_{4_4_4_4_4_4_4_4_4_4_4}
=== RUN   TestDNS_ServiceLookup_AnswerLimits/AAAA_lookup_{4_4_4_4_4_4_4_4_4_4_4}
=== PAUSE TestDNS_ServiceLookup_AnswerLimits/AAAA_lookup_{4_4_4_4_4_4_4_4_4_4_4}
=== RUN   TestDNS_ServiceLookup_AnswerLimits/ANY_lookup_{4_4_4_4_4_4_4_4_4_4_4}
=== PAUSE TestDNS_ServiceLookup_AnswerLimits/ANY_lookup_{4_4_4_4_4_4_4_4_4_4_4}
=== RUN   TestDNS_ServiceLookup_AnswerLimits/A_lookup_{5_5_5_5_5_5_5_5_5_5_5}
=== PAUSE TestDNS_ServiceLookup_AnswerLimits/A_lookup_{5_5_5_5_5_5_5_5_5_5_5}
=== RUN   TestDNS_ServiceLookup_AnswerLimits/AAAA_lookup_{5_5_5_5_5_5_5_5_5_5_5}
=== PAUSE TestDNS_ServiceLookup_AnswerLimits/AAAA_lookup_{5_5_5_5_5_5_5_5_5_5_5}
=== RUN   TestDNS_ServiceLookup_AnswerLimits/ANY_lookup_{5_5_5_5_5_5_5_5_5_5_5}
=== PAUSE TestDNS_ServiceLookup_AnswerLimits/ANY_lookup_{5_5_5_5_5_5_5_5_5_5_5}
=== RUN   TestDNS_ServiceLookup_AnswerLimits/A_lookup_{6_6_6_6_6_6_6_5_6_6_-5}
=== PAUSE TestDNS_ServiceLookup_AnswerLimits/A_lookup_{6_6_6_6_6_6_6_5_6_6_-5}
=== RUN   TestDNS_ServiceLookup_AnswerLimits/AAAA_lookup_{6_6_6_6_6_6_6_5_6_6_-5}
=== PAUSE TestDNS_ServiceLookup_AnswerLimits/AAAA_lookup_{6_6_6_6_6_6_6_5_6_6_-5}
=== RUN   TestDNS_ServiceLookup_AnswerLimits/ANY_lookup_{6_6_6_6_6_6_6_5_6_6_-5}
=== PAUSE TestDNS_ServiceLookup_AnswerLimits/ANY_lookup_{6_6_6_6_6_6_6_5_6_6_-5}
=== RUN   TestDNS_ServiceLookup_AnswerLimits/A_lookup_{7_7_7_7_6_7_7_5_7_7_-5}
=== PAUSE TestDNS_ServiceLookup_AnswerLimits/A_lookup_{7_7_7_7_6_7_7_5_7_7_-5}
=== RUN   TestDNS_ServiceLookup_AnswerLimits/AAAA_lookup_{7_7_7_7_6_7_7_5_7_7_-5}
=== PAUSE TestDNS_ServiceLookup_AnswerLimits/AAAA_lookup_{7_7_7_7_6_7_7_5_7_7_-5}
=== RUN   TestDNS_ServiceLookup_AnswerLimits/ANY_lookup_{7_7_7_7_6_7_7_5_7_7_-5}
=== PAUSE TestDNS_ServiceLookup_AnswerLimits/ANY_lookup_{7_7_7_7_6_7_7_5_7_7_-5}
TestDNS_NodeLookup_TTL - 2019/12/06 06:04:26.983563 [DEBUG] dns: cname recurse RTT for www.google.com. (556.013µs)
=== RUN   TestDNS_ServiceLookup_AnswerLimits/A_lookup_{8_8_8_8_6_8_8_5_8_8_-5}
=== PAUSE TestDNS_ServiceLookup_AnswerLimits/A_lookup_{8_8_8_8_6_8_8_5_8_8_-5}
=== RUN   TestDNS_ServiceLookup_AnswerLimits/AAAA_lookup_{8_8_8_8_6_8_8_5_8_8_-5}
TestDNS_NodeLookup_TTL - 2019/12/06 06:04:26.983828 [DEBUG] dns: request for name google.node.consul. type ANY class IN (took 1.490701ms) from client 127.0.0.1:56338 (udp)
=== PAUSE TestDNS_ServiceLookup_AnswerLimits/AAAA_lookup_{8_8_8_8_6_8_8_5_8_8_-5}
=== RUN   TestDNS_ServiceLookup_AnswerLimits/ANY_lookup_{8_8_8_8_6_8_8_5_8_8_-5}
=== PAUSE TestDNS_ServiceLookup_AnswerLimits/ANY_lookup_{8_8_8_8_6_8_8_5_8_8_-5}
TestDNS_NodeLookup_TTL - 2019/12/06 06:04:26.984016 [INFO] agent: Requesting shutdown
TestDNS_NodeLookup_TTL - 2019/12/06 06:04:26.984070 [INFO] consul: shutting down server
TestDNS_NodeLookup_TTL - 2019/12/06 06:04:26.984115 [WARN] serf: Shutdown without a Leave
=== RUN   TestDNS_ServiceLookup_AnswerLimits/A_lookup_{9_9_8_8_6_8_8_5_8_8_-5}
=== PAUSE TestDNS_ServiceLookup_AnswerLimits/A_lookup_{9_9_8_8_6_8_8_5_8_8_-5}
=== RUN   TestDNS_ServiceLookup_AnswerLimits/AAAA_lookup_{9_9_8_8_6_8_8_5_8_8_-5}
=== PAUSE TestDNS_ServiceLookup_AnswerLimits/AAAA_lookup_{9_9_8_8_6_8_8_5_8_8_-5}
=== RUN   TestDNS_ServiceLookup_AnswerLimits/ANY_lookup_{9_9_8_8_6_8_8_5_8_8_-5}
=== PAUSE TestDNS_ServiceLookup_AnswerLimits/ANY_lookup_{9_9_8_8_6_8_8_5_8_8_-5}
=== RUN   TestDNS_ServiceLookup_AnswerLimits/A_lookup_{20_20_8_8_6_8_8_5_8_-5_-5}
=== PAUSE TestDNS_ServiceLookup_AnswerLimits/A_lookup_{20_20_8_8_6_8_8_5_8_-5_-5}
=== RUN   TestDNS_ServiceLookup_AnswerLimits/AAAA_lookup_{20_20_8_8_6_8_8_5_8_-5_-5}
=== PAUSE TestDNS_ServiceLookup_AnswerLimits/AAAA_lookup_{20_20_8_8_6_8_8_5_8_-5_-5}
=== RUN   TestDNS_ServiceLookup_AnswerLimits/ANY_lookup_{20_20_8_8_6_8_8_5_8_-5_-5}
=== PAUSE TestDNS_ServiceLookup_AnswerLimits/ANY_lookup_{20_20_8_8_6_8_8_5_8_-5_-5}
=== RUN   TestDNS_ServiceLookup_AnswerLimits/A_lookup_{30_30_8_8_6_8_8_5_8_-5_-5}
=== PAUSE TestDNS_ServiceLookup_AnswerLimits/A_lookup_{30_30_8_8_6_8_8_5_8_-5_-5}
=== RUN   TestDNS_ServiceLookup_AnswerLimits/AAAA_lookup_{30_30_8_8_6_8_8_5_8_-5_-5}
=== PAUSE TestDNS_ServiceLookup_AnswerLimits/AAAA_lookup_{30_30_8_8_6_8_8_5_8_-5_-5}
=== RUN   TestDNS_ServiceLookup_AnswerLimits/ANY_lookup_{30_30_8_8_6_8_8_5_8_-5_-5}
=== PAUSE TestDNS_ServiceLookup_AnswerLimits/ANY_lookup_{30_30_8_8_6_8_8_5_8_-5_-5}
=== CONT  TestDNS_ServiceLookup_LargeResponses
WARNING: bootstrap = true: do not enable unless necessary
TestDNS_ServiceLookup_LargeResponses - 2019/12/06 06:04:27.045112 [WARN] agent: Node name "Node 352eeab4-3b28-201f-2e74-b33c71d6411b" will not be discoverable via DNS due to invalid characters. Valid characters include all alpha-numerics and dashes.
TestDNS_ServiceLookup_LargeResponses - 2019/12/06 06:04:27.045575 [DEBUG] tlsutil: Update with version 1
TestDNS_ServiceLookup_LargeResponses - 2019/12/06 06:04:27.047680 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
TestDNS_ServiceLookup_FilterACL/ACLToken_==_anonymous - 2019/12/06 06:04:27.162152 [INFO] consul: Created ACL anonymous token from configuration
TestDNS_NodeLookup_TTL - 2019/12/06 06:04:27.162776 [WARN] serf: Shutdown without a Leave
TestDNS_ServiceLookup_FilterACL/ACLToken_==_anonymous - 2019/12/06 06:04:27.162975 [INFO] consul: Created ACL anonymous token from configuration
TestDNS_ServiceLookup_FilterACL/ACLToken_==_anonymous - 2019/12/06 06:04:27.163035 [DEBUG] acl: transitioning out of legacy ACL mode
TestDNS_ServiceLookup_FilterACL/ACLToken_==_anonymous - 2019/12/06 06:04:27.163113 [INFO] serf: EventMemberUpdate: Node d7091dd6-564b-5152-4d76-12f3b2b975f9
TestDNS_ServiceLookup_FilterACL/ACLToken_==_anonymous - 2019/12/06 06:04:27.163797 [INFO] serf: EventMemberUpdate: Node d7091dd6-564b-5152-4d76-12f3b2b975f9.dc1
TestDNS_ServiceLookup_FilterACL/ACLToken_==_anonymous - 2019/12/06 06:04:27.164780 [INFO] serf: EventMemberUpdate: Node d7091dd6-564b-5152-4d76-12f3b2b975f9
TestDNS_ServiceLookup_FilterACL/ACLToken_==_anonymous - 2019/12/06 06:04:27.166390 [INFO] serf: EventMemberUpdate: Node d7091dd6-564b-5152-4d76-12f3b2b975f9.dc1
TestDNS_NodeLookup_TTL - 2019/12/06 06:04:27.252374 [INFO] manager: shutting down
TestDNS_NodeLookup_TTL - 2019/12/06 06:04:27.254726 [INFO] agent: consul server down
TestDNS_NodeLookup_TTL - 2019/12/06 06:04:27.254803 [INFO] agent: shutdown complete
TestDNS_NodeLookup_TTL - 2019/12/06 06:04:27.254896 [INFO] agent: Stopping DNS server 127.0.0.1:34535 (tcp)
TestDNS_NodeLookup_TTL - 2019/12/06 06:04:27.255204 [INFO] agent: Stopping DNS server 127.0.0.1:34535 (udp)
TestDNS_NodeLookup_TTL - 2019/12/06 06:04:27.255472 [INFO] agent: Stopping HTTP server 127.0.0.1:34536 (tcp)
TestDNS_NodeLookup_TTL - 2019/12/06 06:04:27.255742 [INFO] agent: Waiting for endpoints to shut down
TestDNS_NodeLookup_TTL - 2019/12/06 06:04:27.255814 [INFO] agent: Endpoints down
--- PASS: TestDNS_NodeLookup_TTL (6.81s)
=== CONT  TestDNS_ServiceLookup_Truncate
WARNING: bootstrap = true: do not enable unless necessary
TestDNS_ServiceLookup_Truncate - 2019/12/06 06:04:27.317482 [WARN] agent: Node name "Node ff4daf28-ba84-1801-6765-41559e590faa" will not be discoverable via DNS due to invalid characters. Valid characters include all alpha-numerics and dashes.
TestDNS_ServiceLookup_Truncate - 2019/12/06 06:04:27.317956 [DEBUG] tlsutil: Update with version 1
TestDNS_ServiceLookup_Truncate - 2019/12/06 06:04:27.320262 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
TestEventFire_token - 2019/12/06 06:04:27.617427 [ERR] acl: failed to apply acl token upgrade batch: raft is already shutdown
=== RUN   TestDNS_PreparedQuery_TTL/db.query.consul.
TestDNS_PreparedQuery_TTL - 2019/12/06 06:04:27.840101 [DEBUG] dns: request for name db.query.consul. type SRV class IN (took 1.328031ms) from client 127.0.0.1:54690 (udp)
=== RUN   TestDNS_PreparedQuery_TTL/db-ttl.query.consul.
TestDNS_PreparedQuery_TTL - 2019/12/06 06:04:27.843476 [DEBUG] dns: request for name db-ttl.query.consul. type SRV class IN (took 1.218695ms) from client 127.0.0.1:41184 (udp)
=== RUN   TestDNS_PreparedQuery_TTL/dblb.query.consul.
TestDNS_PreparedQuery_TTL - 2019/12/06 06:04:27.846865 [DEBUG] dns: request for name dblb.query.consul. type SRV class IN (took 950.022µs) from client 127.0.0.1:40015 (udp)
=== RUN   TestDNS_PreparedQuery_TTL/dblb-ttl.query.consul.
TestDNS_PreparedQuery_TTL - 2019/12/06 06:04:27.848937 [DEBUG] dns: request for name dblb-ttl.query.consul. type SRV class IN (took 943.688µs) from client 127.0.0.1:39030 (udp)
=== RUN   TestDNS_PreparedQuery_TTL/dk.query.consul.
TestDNS_PreparedQuery_TTL - 2019/12/06 06:04:27.852367 [DEBUG] dns: request for name dk.query.consul. type SRV class IN (took 890.021µs) from client 127.0.0.1:47765 (udp)
=== RUN   TestDNS_PreparedQuery_TTL/dk-ttl.query.consul.
TestDNS_PreparedQuery_TTL - 2019/12/06 06:04:27.854555 [DEBUG] dns: request for name dk-ttl.query.consul. type SRV class IN (took 982.69µs) from client 127.0.0.1:54110 (udp)
=== RUN   TestDNS_PreparedQuery_TTL/api.query.consul.
TestDNS_PreparedQuery_TTL - 2019/12/06 06:04:27.857026 [DEBUG] dns: request for name api.query.consul. type SRV class IN (took 1.16436ms) from client 127.0.0.1:40606 (udp)
=== RUN   TestDNS_PreparedQuery_TTL/api-ttl.query.consul.
TestDNS_PreparedQuery_TTL - 2019/12/06 06:04:27.861051 [DEBUG] dns: request for name api-ttl.query.consul. type SRV class IN (took 2.955735ms) from client 127.0.0.1:59630 (udp)
TestDNS_PreparedQuery_TTL - 2019/12/06 06:04:27.861464 [INFO] agent: Requesting shutdown
TestDNS_PreparedQuery_TTL - 2019/12/06 06:04:27.861548 [INFO] consul: shutting down server
TestDNS_PreparedQuery_TTL - 2019/12/06 06:04:27.861590 [WARN] serf: Shutdown without a Leave
TestDNS_PreparedQuery_TTL - 2019/12/06 06:04:28.360615 [WARN] serf: Shutdown without a Leave
TestDNS_PreparedQuery_TTL - 2019/12/06 06:04:28.361267 [INFO] manager: shutting down
TestDNS_PreparedQuery_TTL - 2019/12/06 06:04:28.362114 [INFO] agent: consul server down
TestDNS_PreparedQuery_TTL - 2019/12/06 06:04:28.362175 [INFO] agent: shutdown complete
TestDNS_PreparedQuery_TTL - 2019/12/06 06:04:28.362230 [INFO] agent: Stopping DNS server 127.0.0.1:34523 (tcp)
TestDNS_PreparedQuery_TTL - 2019/12/06 06:04:28.362375 [INFO] agent: Stopping DNS server 127.0.0.1:34523 (udp)
TestDNS_PreparedQuery_TTL - 2019/12/06 06:04:28.362518 [INFO] agent: Stopping HTTP server 127.0.0.1:34524 (tcp)
TestDNS_PreparedQuery_TTL - 2019/12/06 06:04:28.362711 [INFO] agent: Waiting for endpoints to shut down
TestDNS_PreparedQuery_TTL - 2019/12/06 06:04:28.362778 [INFO] agent: Endpoints down
--- PASS: TestDNS_PreparedQuery_TTL (10.26s)
    --- PASS: TestDNS_PreparedQuery_TTL/db.query.consul. (0.00s)
    --- PASS: TestDNS_PreparedQuery_TTL/db-ttl.query.consul. (0.00s)
    --- PASS: TestDNS_PreparedQuery_TTL/dblb.query.consul. (0.00s)
    --- PASS: TestDNS_PreparedQuery_TTL/dblb-ttl.query.consul. (0.00s)
    --- PASS: TestDNS_PreparedQuery_TTL/dk.query.consul. (0.00s)
    --- PASS: TestDNS_PreparedQuery_TTL/dk-ttl.query.consul. (0.00s)
    --- PASS: TestDNS_PreparedQuery_TTL/api.query.consul. (0.00s)
    --- PASS: TestDNS_PreparedQuery_TTL/api-ttl.query.consul. (0.00s)
=== CONT  TestBinarySearch
=== RUN   TestBinarySearch/binarySearch_12
=== RUN   TestBinarySearch/binarySearch_256
TestDNS_ServiceLookup_FilterACL/ACLToken_==_anonymous - 2019/12/06 06:04:28.499045 [WARN] agent: Node info update blocked by ACLs
TestDNS_ServiceLookup_FilterACL/ACLToken_==_anonymous - 2019/12/06 06:04:28.499232 [DEBUG] agent: Node info in sync
=== RUN   TestBinarySearch/binarySearch_512
TestEventFire_token - 2019/12/06 06:04:28.617388 [ERR] acl: failed to apply acl token upgrade batch: raft is already shutdown
2019/12/06 06:04:28 [INFO]  raft: Initial configuration (index=1): [{Suffrage:Voter ID:352eeab4-3b28-201f-2e74-b33c71d6411b Address:127.0.0.1:34552}]
TestDNS_ServiceLookup_LargeResponses - 2019/12/06 06:04:28.635978 [INFO] serf: EventMemberJoin: Node 352eeab4-3b28-201f-2e74-b33c71d6411b.dc1 127.0.0.1
TestDNS_ServiceLookup_LargeResponses - 2019/12/06 06:04:28.639798 [INFO] serf: EventMemberJoin: Node 352eeab4-3b28-201f-2e74-b33c71d6411b 127.0.0.1
TestDNS_ServiceLookup_LargeResponses - 2019/12/06 06:04:28.641349 [INFO] agent: Started DNS server 127.0.0.1:34547 (udp)
2019/12/06 06:04:28 [INFO]  raft: Node at 127.0.0.1:34552 [Follower] entering Follower state (Leader: "")
TestDNS_ServiceLookup_LargeResponses - 2019/12/06 06:04:28.654675 [INFO] agent: Started DNS server 127.0.0.1:34547 (tcp)
TestDNS_ServiceLookup_LargeResponses - 2019/12/06 06:04:28.648191 [INFO] consul: Handled member-join event for server "Node 352eeab4-3b28-201f-2e74-b33c71d6411b.dc1" in area "wan"
TestDNS_ServiceLookup_LargeResponses - 2019/12/06 06:04:28.648868 [INFO] consul: Adding LAN server Node 352eeab4-3b28-201f-2e74-b33c71d6411b (Addr: tcp/127.0.0.1:34552) (DC: dc1)
TestDNS_ServiceLookup_LargeResponses - 2019/12/06 06:04:28.660838 [INFO] agent: Started HTTP server on 127.0.0.1:34548 (tcp)
TestDNS_ServiceLookup_LargeResponses - 2019/12/06 06:04:28.660953 [INFO] agent: started state syncer
=== RUN   TestBinarySearch/binarySearch_8192
2019/12/06 06:04:28 [WARN]  raft: Heartbeat timeout from "" reached, starting election
2019/12/06 06:04:28 [INFO]  raft: Node at 127.0.0.1:34552 [Candidate] entering Candidate state in term 2
2019/12/06 06:04:28 [INFO]  raft: Initial configuration (index=1): [{Suffrage:Voter ID:ff4daf28-ba84-1801-6765-41559e590faa Address:127.0.0.1:34558}]
TestDNS_ServiceLookup_Truncate - 2019/12/06 06:04:28.815934 [INFO] serf: EventMemberJoin: Node ff4daf28-ba84-1801-6765-41559e590faa.dc1 127.0.0.1
TestDNS_ServiceLookup_Truncate - 2019/12/06 06:04:28.819436 [INFO] serf: EventMemberJoin: Node ff4daf28-ba84-1801-6765-41559e590faa 127.0.0.1
TestDNS_ServiceLookup_Truncate - 2019/12/06 06:04:28.820777 [INFO] agent: Started DNS server 127.0.0.1:34553 (udp)
TestDNS_ServiceLookup_Truncate - 2019/12/06 06:04:28.826242 [INFO] consul: Adding LAN server Node ff4daf28-ba84-1801-6765-41559e590faa (Addr: tcp/127.0.0.1:34558) (DC: dc1)
TestDNS_ServiceLookup_Truncate - 2019/12/06 06:04:28.826683 [INFO] consul: Handled member-join event for server "Node ff4daf28-ba84-1801-6765-41559e590faa.dc1" in area "wan"
TestDNS_ServiceLookup_Truncate - 2019/12/06 06:04:28.827211 [INFO] agent: Started DNS server 127.0.0.1:34553 (tcp)
TestDNS_ServiceLookup_Truncate - 2019/12/06 06:04:28.830242 [INFO] agent: Started HTTP server on 127.0.0.1:34554 (tcp)
TestDNS_ServiceLookup_Truncate - 2019/12/06 06:04:28.830348 [INFO] agent: started state syncer
2019/12/06 06:04:28 [INFO]  raft: Node at 127.0.0.1:34558 [Follower] entering Follower state (Leader: "")
=== RUN   TestBinarySearch/binarySearch_65535
2019/12/06 06:04:28 [WARN]  raft: Heartbeat timeout from "" reached, starting election
2019/12/06 06:04:28 [INFO]  raft: Node at 127.0.0.1:34558 [Candidate] entering Candidate state in term 2
TestDNS_ServiceLookup_FilterACL/ACLToken_==_anonymous - 2019/12/06 06:04:28.903148 [INFO] connect: initialized primary datacenter CA with provider "consul"
TestDNS_ServiceLookup_FilterACL/ACLToken_==_anonymous - 2019/12/06 06:04:28.903601 [DEBUG] consul: Skipping self join check for "Node d7091dd6-564b-5152-4d76-12f3b2b975f9" since the cluster is too small
TestDNS_ServiceLookup_FilterACL/ACLToken_==_anonymous - 2019/12/06 06:04:28.903689 [INFO] consul: member 'Node d7091dd6-564b-5152-4d76-12f3b2b975f9' joined, marking health alive
=== RUN   TestBinarySearch/binarySearch_12#01
TestDNS_ServiceLookup_FilterACL/ACLToken_==_anonymous - 2019/12/06 06:04:29.104975 [DEBUG] consul: Skipping self join check for "Node d7091dd6-564b-5152-4d76-12f3b2b975f9" since the cluster is too small
TestDNS_ServiceLookup_FilterACL/ACLToken_==_anonymous - 2019/12/06 06:04:29.105522 [DEBUG] consul: Skipping self join check for "Node d7091dd6-564b-5152-4d76-12f3b2b975f9" since the cluster is too small
TestDNS_ServiceLookup_FilterACL/ACLToken_==_anonymous - 2019/12/06 06:04:29.121647 [DEBUG] consul: dropping node "Node d7091dd6-564b-5152-4d76-12f3b2b975f9" from result due to ACLs
=== RUN   TestBinarySearch/binarySearch_256#01
=== RUN   TestBinarySearch/binarySearch_512#01
=== RUN   TestBinarySearch/binarySearch_8192#01
=== RUN   TestBinarySearch/binarySearch_65535#01
TestDNS_ServiceLookup_FilterACL/ACLToken_==_anonymous - 2019/12/06 06:04:29.356132 [DEBUG] consul: dropping node "foo" from result due to ACLs
TestDNS_ServiceLookup_FilterACL/ACLToken_==_anonymous - 2019/12/06 06:04:29.356560 [DEBUG] dns: request for name foo.service.consul. type A class IN (took 1.041025ms) from client 127.0.0.1:49859 (udp)
TestDNS_ServiceLookup_FilterACL/ACLToken_==_anonymous - 2019/12/06 06:04:29.356652 [INFO] agent: Requesting shutdown
TestDNS_ServiceLookup_FilterACL/ACLToken_==_anonymous - 2019/12/06 06:04:29.356735 [INFO] consul: shutting down server
TestDNS_ServiceLookup_FilterACL/ACLToken_==_anonymous - 2019/12/06 06:04:29.356786 [WARN] serf: Shutdown without a Leave
=== CONT  TestDNS_ServiceLookup_Randomize
--- PASS: TestBinarySearch (1.05s)
    --- PASS: TestBinarySearch/binarySearch_12 (0.06s)
    --- PASS: TestBinarySearch/binarySearch_256 (0.08s)
    --- PASS: TestBinarySearch/binarySearch_512 (0.14s)
    --- PASS: TestBinarySearch/binarySearch_8192 (0.15s)
    --- PASS: TestBinarySearch/binarySearch_65535 (0.19s)
    --- PASS: TestBinarySearch/binarySearch_12#01 (0.08s)
    --- PASS: TestBinarySearch/binarySearch_256#01 (0.06s)
    --- PASS: TestBinarySearch/binarySearch_512#01 (0.06s)
    --- PASS: TestBinarySearch/binarySearch_8192#01 (0.06s)
    --- PASS: TestBinarySearch/binarySearch_65535#01 (0.10s)
WARNING: bootstrap = true: do not enable unless necessary
TestDNS_ServiceLookup_Randomize - 2019/12/06 06:04:29.485675 [WARN] agent: Node name "Node 22414b97-8848-e501-2cee-491e363047f9" will not be discoverable via DNS due to invalid characters. Valid characters include all alpha-numerics and dashes.
TestDNS_ServiceLookup_Randomize - 2019/12/06 06:04:29.486202 [DEBUG] tlsutil: Update with version 1
TestDNS_ServiceLookup_Randomize - 2019/12/06 06:04:29.489089 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
TestDNS_ServiceLookup_FilterACL/ACLToken_==_anonymous - 2019/12/06 06:04:29.535616 [WARN] serf: Shutdown without a Leave
2019/12/06 06:04:29 [INFO]  raft: Election won. Tally: 1
2019/12/06 06:04:29 [INFO]  raft: Node at 127.0.0.1:34558 [Leader] entering Leader state
2019/12/06 06:04:29 [INFO]  raft: Election won. Tally: 1
2019/12/06 06:04:29 [INFO]  raft: Node at 127.0.0.1:34552 [Leader] entering Leader state
TestDNS_ServiceLookup_LargeResponses - 2019/12/06 06:04:29.539569 [INFO] consul: cluster leadership acquired
TestDNS_ServiceLookup_LargeResponses - 2019/12/06 06:04:29.539978 [INFO] consul: New leader elected: Node 352eeab4-3b28-201f-2e74-b33c71d6411b
TestDNS_ServiceLookup_Truncate - 2019/12/06 06:04:29.540222 [INFO] consul: cluster leadership acquired
TestDNS_ServiceLookup_Truncate - 2019/12/06 06:04:29.540544 [INFO] consul: New leader elected: Node ff4daf28-ba84-1801-6765-41559e590faa
TestDNS_ServiceLookup_FilterACL/ACLToken_==_anonymous - 2019/12/06 06:04:29.611564 [INFO] manager: shutting down
TestDNS_ServiceLookup_FilterACL/ACLToken_==_anonymous - 2019/12/06 06:04:29.612027 [INFO] agent: consul server down
TestDNS_ServiceLookup_FilterACL/ACLToken_==_anonymous - 2019/12/06 06:04:29.612072 [INFO] agent: shutdown complete
TestDNS_ServiceLookup_FilterACL/ACLToken_==_anonymous - 2019/12/06 06:04:29.612123 [INFO] agent: Stopping DNS server 127.0.0.1:34541 (tcp)
TestDNS_ServiceLookup_FilterACL/ACLToken_==_anonymous - 2019/12/06 06:04:29.612247 [INFO] agent: Stopping DNS server 127.0.0.1:34541 (udp)
TestDNS_ServiceLookup_FilterACL/ACLToken_==_anonymous - 2019/12/06 06:04:29.612417 [INFO] agent: Stopping HTTP server 127.0.0.1:34542 (tcp)
TestDNS_ServiceLookup_FilterACL/ACLToken_==_anonymous - 2019/12/06 06:04:29.612590 [INFO] agent: Waiting for endpoints to shut down
TestDNS_ServiceLookup_FilterACL/ACLToken_==_anonymous - 2019/12/06 06:04:29.612656 [INFO] agent: Endpoints down
--- PASS: TestDNS_ServiceLookup_FilterACL (13.13s)
    --- PASS: TestDNS_ServiceLookup_FilterACL/ACLToken_==_root (5.73s)
    --- PASS: TestDNS_ServiceLookup_FilterACL/ACLToken_==_anonymous (7.40s)
=== CONT  TestDNS_ServiceLookup_OnlyPassing
TestEventFire_token - 2019/12/06 06:04:29.620505 [ERR] acl: failed to apply acl token upgrade batch: raft is already shutdown
WARNING: bootstrap = true: do not enable unless necessary
TestDNS_ServiceLookup_OnlyPassing - 2019/12/06 06:04:29.701668 [WARN] agent: Node name "Node cc8ab5af-5b35-abb5-4643-8034c5594ec7" will not be discoverable via DNS due to invalid characters. Valid characters include all alpha-numerics and dashes.
TestDNS_ServiceLookup_OnlyPassing - 2019/12/06 06:04:29.702115 [DEBUG] tlsutil: Update with version 1
TestDNS_ServiceLookup_OnlyPassing - 2019/12/06 06:04:29.704232 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
TestDNS_ServiceLookup_Truncate - 2019/12/06 06:04:30.095134 [INFO] agent: Synced node info
TestDNS_ServiceLookup_LargeResponses - 2019/12/06 06:04:30.101534 [INFO] agent: Synced node info
TestEventFire_token - 2019/12/06 06:04:30.617028 [ERR] acl: failed to apply acl token upgrade batch: raft is already shutdown
2019/12/06 06:04:31 [INFO]  raft: Initial configuration (index=1): [{Suffrage:Voter ID:22414b97-8848-e501-2cee-491e363047f9 Address:127.0.0.1:34564}]
2019/12/06 06:04:31 [INFO]  raft: Node at 127.0.0.1:34564 [Follower] entering Follower state (Leader: "")
TestDNS_ServiceLookup_Randomize - 2019/12/06 06:04:31.083670 [INFO] serf: EventMemberJoin: Node 22414b97-8848-e501-2cee-491e363047f9.dc1 127.0.0.1
TestDNS_ServiceLookup_Randomize - 2019/12/06 06:04:31.088139 [INFO] serf: EventMemberJoin: Node 22414b97-8848-e501-2cee-491e363047f9 127.0.0.1
TestDNS_ServiceLookup_Randomize - 2019/12/06 06:04:31.089167 [INFO] consul: Adding LAN server Node 22414b97-8848-e501-2cee-491e363047f9 (Addr: tcp/127.0.0.1:34564) (DC: dc1)
TestDNS_ServiceLookup_Randomize - 2019/12/06 06:04:31.089969 [INFO] consul: Handled member-join event for server "Node 22414b97-8848-e501-2cee-491e363047f9.dc1" in area "wan"
TestDNS_ServiceLookup_Randomize - 2019/12/06 06:04:31.090443 [INFO] agent: Started DNS server 127.0.0.1:34559 (tcp)
TestDNS_ServiceLookup_Randomize - 2019/12/06 06:04:31.090551 [INFO] agent: Started DNS server 127.0.0.1:34559 (udp)
TestDNS_ServiceLookup_Randomize - 2019/12/06 06:04:31.093359 [INFO] agent: Started HTTP server on 127.0.0.1:34560 (tcp)
TestDNS_ServiceLookup_Randomize - 2019/12/06 06:04:31.093456 [INFO] agent: started state syncer
2019/12/06 06:04:31 [WARN]  raft: Heartbeat timeout from "" reached, starting election
2019/12/06 06:04:31 [INFO]  raft: Node at 127.0.0.1:34564 [Candidate] entering Candidate state in term 2
2019/12/06 06:04:31 [INFO]  raft: Initial configuration (index=1): [{Suffrage:Voter ID:cc8ab5af-5b35-abb5-4643-8034c5594ec7 Address:127.0.0.1:34570}]
2019/12/06 06:04:31 [INFO]  raft: Node at 127.0.0.1:34570 [Follower] entering Follower state (Leader: "")
TestDNS_ServiceLookup_OnlyPassing - 2019/12/06 06:04:31.216982 [INFO] serf: EventMemberJoin: Node cc8ab5af-5b35-abb5-4643-8034c5594ec7.dc1 127.0.0.1
TestDNS_ServiceLookup_OnlyPassing - 2019/12/06 06:04:31.221366 [INFO] serf: EventMemberJoin: Node cc8ab5af-5b35-abb5-4643-8034c5594ec7 127.0.0.1
TestDNS_ServiceLookup_OnlyPassing - 2019/12/06 06:04:31.222350 [INFO] consul: Adding LAN server Node cc8ab5af-5b35-abb5-4643-8034c5594ec7 (Addr: tcp/127.0.0.1:34570) (DC: dc1)
TestDNS_ServiceLookup_OnlyPassing - 2019/12/06 06:04:31.222689 [INFO] consul: Handled member-join event for server "Node cc8ab5af-5b35-abb5-4643-8034c5594ec7.dc1" in area "wan"
TestDNS_ServiceLookup_OnlyPassing - 2019/12/06 06:04:31.223145 [INFO] agent: Started DNS server 127.0.0.1:34565 (tcp)
TestDNS_ServiceLookup_OnlyPassing - 2019/12/06 06:04:31.223206 [INFO] agent: Started DNS server 127.0.0.1:34565 (udp)
TestDNS_ServiceLookup_OnlyPassing - 2019/12/06 06:04:31.226267 [INFO] agent: Started HTTP server on 127.0.0.1:34566 (tcp)
TestDNS_ServiceLookup_OnlyPassing - 2019/12/06 06:04:31.226378 [INFO] agent: started state syncer
2019/12/06 06:04:31 [WARN]  raft: Heartbeat timeout from "" reached, starting election
2019/12/06 06:04:31 [INFO]  raft: Node at 127.0.0.1:34570 [Candidate] entering Candidate state in term 2
TestEventFire_token - 2019/12/06 06:04:31.617393 [ERR] acl: failed to apply acl token upgrade batch: raft is already shutdown
TestDNS_ServiceLookup_LargeResponses - 2019/12/06 06:04:31.876247 [DEBUG] agent: Node info in sync
TestDNS_ServiceLookup_LargeResponses - 2019/12/06 06:04:31.876401 [DEBUG] agent: Node info in sync
2019/12/06 06:04:31 [INFO]  raft: Election won. Tally: 1
2019/12/06 06:04:31 [INFO]  raft: Node at 127.0.0.1:34564 [Leader] entering Leader state
TestDNS_ServiceLookup_Randomize - 2019/12/06 06:04:31.913790 [INFO] consul: cluster leadership acquired
TestDNS_ServiceLookup_Randomize - 2019/12/06 06:04:31.914338 [INFO] consul: New leader elected: Node 22414b97-8848-e501-2cee-491e363047f9
2019/12/06 06:04:32 [INFO]  raft: Election won. Tally: 1
2019/12/06 06:04:32 [INFO]  raft: Node at 127.0.0.1:34570 [Leader] entering Leader state
TestDNS_ServiceLookup_OnlyPassing - 2019/12/06 06:04:32.037628 [INFO] consul: cluster leadership acquired
TestDNS_ServiceLookup_OnlyPassing - 2019/12/06 06:04:32.038185 [INFO] consul: New leader elected: Node cc8ab5af-5b35-abb5-4643-8034c5594ec7
TestDNS_ServiceLookup_Randomize - 2019/12/06 06:04:32.428580 [INFO] agent: Synced node info
TestDNS_ServiceLookup_Truncate - 2019/12/06 06:04:32.565800 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
TestEventFire_token - 2019/12/06 06:04:32.617152 [ERR] acl: failed to apply acl token upgrade batch: raft is already shutdown
TestDNS_ServiceLookup_LargeResponses - 2019/12/06 06:04:32.752963 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
TestDNS_ServiceLookup_LargeResponses - 2019/12/06 06:04:32.771094 [DEBUG] dns: request for name _this-is-a-very-very-very-very-very-long-name-for-a-service._master.service.consul. type SRV class IN (took 1.373365ms) from client 127.0.0.1:34004 (udp)
TestDNS_ServiceLookup_LargeResponses - 2019/12/06 06:04:32.783696 [DEBUG] dns: request for name this-is-a-very-very-very-very-very-long-name-for-a-service.query.consul. type SRV class IN (took 3.859423ms) from client 127.0.0.1:60669 (udp)
TestDNS_ServiceLookup_LargeResponses - 2019/12/06 06:04:32.793689 [INFO] agent: Requesting shutdown
TestDNS_ServiceLookup_LargeResponses - 2019/12/06 06:04:32.793784 [INFO] consul: shutting down server
TestDNS_ServiceLookup_LargeResponses - 2019/12/06 06:04:32.793831 [WARN] serf: Shutdown without a Leave
TestDNS_ServiceLookup_Truncate - 2019/12/06 06:04:32.794887 [DEBUG] agent: Node info in sync
TestDNS_ServiceLookup_Truncate - 2019/12/06 06:04:32.795001 [DEBUG] agent: Node info in sync
jones - 2019/12/06 06:04:33.193617 [DEBUG] manager: Rebalanced 1 servers, next active server is Node a4d824d2-6dd5-669c-d5c6-f0df82645ad2.dc1 (Addr: tcp/127.0.0.1:34192) (DC: dc1)
TestDNS_ServiceLookup_OnlyPassing - 2019/12/06 06:04:33.253385 [INFO] agent: Synced node info
TestDNS_ServiceLookup_LargeResponses - 2019/12/06 06:04:33.259343 [WARN] serf: Shutdown without a Leave
TestDNS_ServiceLookup_OnlyPassing - 2019/12/06 06:04:33.328311 [DEBUG] agent: Node info in sync
TestDNS_ServiceLookup_OnlyPassing - 2019/12/06 06:04:33.328458 [DEBUG] agent: Node info in sync
TestDNS_ServiceLookup_LargeResponses - 2019/12/06 06:04:33.445062 [INFO] manager: shutting down
TestDNS_ServiceLookup_LargeResponses - 2019/12/06 06:04:33.446281 [INFO] agent: consul server down
TestDNS_ServiceLookup_LargeResponses - 2019/12/06 06:04:33.446341 [INFO] agent: shutdown complete
TestDNS_ServiceLookup_LargeResponses - 2019/12/06 06:04:33.446393 [INFO] agent: Stopping DNS server 127.0.0.1:34547 (tcp)
TestDNS_ServiceLookup_LargeResponses - 2019/12/06 06:04:33.446551 [INFO] agent: Stopping DNS server 127.0.0.1:34547 (udp)
TestDNS_ServiceLookup_LargeResponses - 2019/12/06 06:04:33.446708 [INFO] agent: Stopping HTTP server 127.0.0.1:34548 (tcp)
TestDNS_ServiceLookup_LargeResponses - 2019/12/06 06:04:33.446914 [INFO] agent: Waiting for endpoints to shut down
TestDNS_ServiceLookup_LargeResponses - 2019/12/06 06:04:33.446997 [INFO] agent: Endpoints down
--- PASS: TestDNS_ServiceLookup_LargeResponses (6.46s)
=== CONT  TestDNS_ServiceLookup_OnlyFailing
TestDNS_ServiceLookup_LargeResponses - 2019/12/06 06:04:33.449843 [ERR] connect: Apply failed leadership lost while committing log
TestDNS_ServiceLookup_LargeResponses - 2019/12/06 06:04:33.449936 [ERR] consul: failed to establish leadership: leadership lost while committing log
TestDNS_ServiceLookup_LargeResponses - 2019/12/06 06:04:33.450122 [WARN] consul: error getting server health from "Node 352eeab4-3b28-201f-2e74-b33c71d6411b": rpc error making call: EOF
WARNING: bootstrap = true: do not enable unless necessary
TestDNS_ServiceLookup_OnlyFailing - 2019/12/06 06:04:33.522871 [WARN] agent: Node name "Node 8317341f-085d-d821-3a84-0931b734325f" will not be discoverable via DNS due to invalid characters. Valid characters include all alpha-numerics and dashes.
TestDNS_ServiceLookup_OnlyFailing - 2019/12/06 06:04:33.523431 [DEBUG] tlsutil: Update with version 1
TestDNS_ServiceLookup_OnlyFailing - 2019/12/06 06:04:33.525788 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
TestEventFire_token - 2019/12/06 06:04:33.619539 [ERR] acl: failed to apply acl token upgrade batch: raft is already shutdown
TestDNS_ServiceLookup_Truncate - 2019/12/06 06:04:33.620858 [INFO] connect: initialized primary datacenter CA with provider "consul"
TestDNS_ServiceLookup_LargeResponses - 2019/12/06 06:04:33.752875 [WARN] consul: error getting server health from "Node 352eeab4-3b28-201f-2e74-b33c71d6411b": context deadline exceeded
TestDNS_ServiceLookup_Truncate - 2019/12/06 06:04:33.964569 [DEBUG] consul: Skipping self join check for "Node ff4daf28-ba84-1801-6765-41559e590faa" since the cluster is too small
TestDNS_ServiceLookup_Truncate - 2019/12/06 06:04:33.964778 [INFO] consul: member 'Node ff4daf28-ba84-1801-6765-41559e590faa' joined, marking health alive
jones - 2019/12/06 06:04:34.444702 [DEBUG] consul: Skipping self join check for "Node a4d824d2-6dd5-669c-d5c6-f0df82645ad2" since the cluster is too small
TestEventFire_token - 2019/12/06 06:04:34.618976 [ERR] acl: failed to apply acl token upgrade batch: raft is already shutdown
TestDNS_ServiceLookup_Randomize - 2019/12/06 06:04:34.887046 [DEBUG] agent: Node info in sync
TestDNS_ServiceLookup_Randomize - 2019/12/06 06:04:34.887176 [DEBUG] agent: Node info in sync
TestDNS_ServiceLookup_Randomize - 2019/12/06 06:04:35.159813 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
2019/12/06 06:04:35 [INFO]  raft: Initial configuration (index=1): [{Suffrage:Voter ID:8317341f-085d-d821-3a84-0931b734325f Address:127.0.0.1:34576}]
2019/12/06 06:04:35 [INFO]  raft: Node at 127.0.0.1:34576 [Follower] entering Follower state (Leader: "")
TestDNS_ServiceLookup_OnlyFailing - 2019/12/06 06:04:35.349836 [INFO] serf: EventMemberJoin: Node 8317341f-085d-d821-3a84-0931b734325f.dc1 127.0.0.1
TestDNS_ServiceLookup_OnlyFailing - 2019/12/06 06:04:35.354085 [INFO] serf: EventMemberJoin: Node 8317341f-085d-d821-3a84-0931b734325f 127.0.0.1
TestDNS_ServiceLookup_OnlyFailing - 2019/12/06 06:04:35.363103 [INFO] consul: Adding LAN server Node 8317341f-085d-d821-3a84-0931b734325f (Addr: tcp/127.0.0.1:34576) (DC: dc1)
TestDNS_ServiceLookup_OnlyFailing - 2019/12/06 06:04:35.364072 [INFO] consul: Handled member-join event for server "Node 8317341f-085d-d821-3a84-0931b734325f.dc1" in area "wan"
TestDNS_ServiceLookup_OnlyFailing - 2019/12/06 06:04:35.364425 [INFO] agent: Started DNS server 127.0.0.1:34571 (udp)
TestDNS_ServiceLookup_OnlyFailing - 2019/12/06 06:04:35.364945 [INFO] agent: Started DNS server 127.0.0.1:34571 (tcp)
TestDNS_ServiceLookup_OnlyFailing - 2019/12/06 06:04:35.374642 [INFO] agent: Started HTTP server on 127.0.0.1:34572 (tcp)
TestDNS_ServiceLookup_OnlyFailing - 2019/12/06 06:04:35.375177 [INFO] agent: started state syncer
2019/12/06 06:04:35 [WARN]  raft: Heartbeat timeout from "" reached, starting election
2019/12/06 06:04:35 [INFO]  raft: Node at 127.0.0.1:34576 [Candidate] entering Candidate state in term 2
TestEventFire_token - 2019/12/06 06:04:35.617428 [ERR] acl: failed to apply acl token upgrade batch: raft is already shutdown
jones - 2019/12/06 06:04:35.800300 [DEBUG] manager: Rebalanced 1 servers, next active server is Node 69e40561-243a-545f-340c-f7edd80028d7.dc1 (Addr: tcp/127.0.0.1:34198) (DC: dc1)
TestDNS_ServiceLookup_Randomize - 2019/12/06 06:04:36.390231 [INFO] connect: initialized primary datacenter CA with provider "consul"
2019/12/06 06:04:36 [INFO]  raft: Election won. Tally: 1
2019/12/06 06:04:36 [INFO]  raft: Node at 127.0.0.1:34576 [Leader] entering Leader state
TestDNS_ServiceLookup_OnlyPassing - 2019/12/06 06:04:36.496781 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
TestDNS_ServiceLookup_OnlyFailing - 2019/12/06 06:04:36.497537 [INFO] consul: cluster leadership acquired
TestDNS_ServiceLookup_OnlyFailing - 2019/12/06 06:04:36.497935 [INFO] consul: New leader elected: Node 8317341f-085d-d821-3a84-0931b734325f
TestEventFire_token - 2019/12/06 06:04:36.617446 [ERR] acl: failed to apply acl token upgrade batch: raft is already shutdown
TestDNS_ServiceLookup_Randomize - 2019/12/06 06:04:36.719300 [DEBUG] consul: Skipping self join check for "Node 22414b97-8848-e501-2cee-491e363047f9" since the cluster is too small
TestDNS_ServiceLookup_Randomize - 2019/12/06 06:04:36.719542 [INFO] consul: member 'Node 22414b97-8848-e501-2cee-491e363047f9' joined, marking health alive
TestDNS_ServiceLookup_OnlyPassing - 2019/12/06 06:04:36.892702 [DEBUG] dns: request for name db.service.consul. type ANY class IN (took 869.353µs) from client 127.0.0.1:55089 (udp)
TestDNS_ServiceLookup_OnlyPassing - 2019/12/06 06:04:36.897284 [DEBUG] tlsutil: Update with version 2
TestDNS_ServiceLookup_OnlyPassing - 2019/12/06 06:04:36.898212 [DEBUG] agent: Node info in sync
TestDNS_ServiceLookup_OnlyPassing - 2019/12/06 06:04:36.898322 [DEBUG] agent: Node info in sync
TestDNS_ServiceLookup_OnlyPassing - 2019/12/06 06:04:36.900285 [DEBUG] dns: request for name db.service.consul. type ANY class IN (took 1.679706ms) from client 127.0.0.1:48306 (udp)
TestDNS_ServiceLookup_OnlyPassing - 2019/12/06 06:04:36.900793 [INFO] agent: Requesting shutdown
TestDNS_ServiceLookup_OnlyPassing - 2019/12/06 06:04:36.900872 [INFO] consul: shutting down server
TestDNS_ServiceLookup_OnlyPassing - 2019/12/06 06:04:36.900916 [WARN] serf: Shutdown without a Leave
TestDNS_ServiceLookup_OnlyPassing - 2019/12/06 06:04:36.902655 [DEBUG] dns: request for name 30dfd845-e334-8a9b-58d2-1e7bee21232d.query.consul. type ANY class IN (took 6.128808ms) from client 127.0.0.1:56135 (udp)
TestDNS_ServiceLookup_OnlyPassing - 2019/12/06 06:04:37.010785 [WARN] serf: Shutdown without a Leave
jones - 2019/12/06 06:04:37.144634 [DEBUG] consul: Skipping self join check for "Node 69e40561-243a-545f-340c-f7edd80028d7" since the cluster is too small
TestDNS_ServiceLookup_OnlyFailing - 2019/12/06 06:04:37.145935 [INFO] agent: Synced node info
TestDNS_ServiceLookup_OnlyFailing - 2019/12/06 06:04:37.146058 [DEBUG] agent: Node info in sync
TestDNS_ServiceLookup_OnlyPassing - 2019/12/06 06:04:37.147875 [INFO] manager: shutting down
TestDNS_ServiceLookup_OnlyPassing - 2019/12/06 06:04:37.148744 [INFO] agent: consul server down
TestDNS_ServiceLookup_OnlyPassing - 2019/12/06 06:04:37.148805 [INFO] agent: shutdown complete
TestDNS_ServiceLookup_OnlyPassing - 2019/12/06 06:04:37.148865 [INFO] agent: Stopping DNS server 127.0.0.1:34565 (tcp)
TestDNS_ServiceLookup_OnlyPassing - 2019/12/06 06:04:37.149009 [INFO] agent: Stopping DNS server 127.0.0.1:34565 (udp)
TestDNS_ServiceLookup_OnlyPassing - 2019/12/06 06:04:37.149152 [INFO] agent: Stopping HTTP server 127.0.0.1:34566 (tcp)
TestDNS_ServiceLookup_OnlyPassing - 2019/12/06 06:04:37.149411 [INFO] agent: Waiting for endpoints to shut down
TestDNS_ServiceLookup_OnlyPassing - 2019/12/06 06:04:37.149487 [INFO] agent: Endpoints down
--- PASS: TestDNS_ServiceLookup_OnlyPassing (7.54s)
=== CONT  TestDNS_ServiceLookup_FilterCritical
TestDNS_ServiceLookup_OnlyPassing - 2019/12/06 06:04:37.150159 [ERR] connect: Apply failed leadership lost while committing log
TestDNS_ServiceLookup_OnlyPassing - 2019/12/06 06:04:37.150218 [ERR] consul: failed to establish leadership: leadership lost while committing log
WARNING: bootstrap = true: do not enable unless necessary
TestDNS_ServiceLookup_FilterCritical - 2019/12/06 06:04:37.216484 [WARN] agent: Node name "Node 24d9a103-6e66-4f4e-0a08-1b964e1e1d97" will not be discoverable via DNS due to invalid characters. Valid characters include all alpha-numerics and dashes.
TestDNS_ServiceLookup_FilterCritical - 2019/12/06 06:04:37.217009 [DEBUG] tlsutil: Update with version 1
TestDNS_ServiceLookup_FilterCritical - 2019/12/06 06:04:37.219712 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
TestEventFire_token - 2019/12/06 06:04:37.617191 [ERR] acl: failed to apply acl token upgrade batch: raft is already shutdown
TestDNS_ServiceLookup_OnlyFailing - 2019/12/06 06:04:37.799219 [DEBUG] agent: Node info in sync
TestEventFire_token - 2019/12/06 06:04:38.617198 [ERR] acl: failed to apply acl token upgrade batch: raft is already shutdown
2019/12/06 06:04:38 [INFO]  raft: Initial configuration (index=1): [{Suffrage:Voter ID:24d9a103-6e66-4f4e-0a08-1b964e1e1d97 Address:127.0.0.1:34582}]
TestDNS_ServiceLookup_FilterCritical - 2019/12/06 06:04:38.823131 [INFO] serf: EventMemberJoin: Node 24d9a103-6e66-4f4e-0a08-1b964e1e1d97.dc1 127.0.0.1
TestDNS_ServiceLookup_FilterCritical - 2019/12/06 06:04:38.826415 [INFO] serf: EventMemberJoin: Node 24d9a103-6e66-4f4e-0a08-1b964e1e1d97 127.0.0.1
TestDNS_ServiceLookup_FilterCritical - 2019/12/06 06:04:38.827643 [INFO] agent: Started DNS server 127.0.0.1:34577 (udp)
2019/12/06 06:04:38 [INFO]  raft: Node at 127.0.0.1:34582 [Follower] entering Follower state (Leader: "")
TestDNS_ServiceLookup_FilterCritical - 2019/12/06 06:04:38.830070 [INFO] consul: Handled member-join event for server "Node 24d9a103-6e66-4f4e-0a08-1b964e1e1d97.dc1" in area "wan"
TestDNS_ServiceLookup_FilterCritical - 2019/12/06 06:04:38.830185 [INFO] consul: Adding LAN server Node 24d9a103-6e66-4f4e-0a08-1b964e1e1d97 (Addr: tcp/127.0.0.1:34582) (DC: dc1)
TestDNS_ServiceLookup_FilterCritical - 2019/12/06 06:04:38.830605 [INFO] agent: Started DNS server 127.0.0.1:34577 (tcp)
TestDNS_ServiceLookup_FilterCritical - 2019/12/06 06:04:38.832898 [INFO] agent: Started HTTP server on 127.0.0.1:34578 (tcp)
TestDNS_ServiceLookup_FilterCritical - 2019/12/06 06:04:38.833006 [INFO] agent: started state syncer
2019/12/06 06:04:38 [WARN]  raft: Heartbeat timeout from "" reached, starting election
2019/12/06 06:04:38 [INFO]  raft: Node at 127.0.0.1:34582 [Candidate] entering Candidate state in term 2
jones - 2019/12/06 06:04:39.570667 [DEBUG] manager: Rebalanced 1 servers, next active server is Node e7ff374a-e051-0dd1-c5b8-1cb2efc19141.dc1 (Addr: tcp/127.0.0.1:34204) (DC: dc1)
TestEventFire_token - 2019/12/06 06:04:39.617387 [ERR] acl: failed to apply acl token upgrade batch: raft is already shutdown
TestDNS_ServiceLookup_OnlyFailing - 2019/12/06 06:04:39.628060 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
2019/12/06 06:04:39 [INFO]  raft: Election won. Tally: 1
2019/12/06 06:04:39 [INFO]  raft: Node at 127.0.0.1:34582 [Leader] entering Leader state
TestDNS_ServiceLookup_FilterCritical - 2019/12/06 06:04:39.757449 [INFO] consul: cluster leadership acquired
TestDNS_ServiceLookup_FilterCritical - 2019/12/06 06:04:39.757950 [INFO] consul: New leader elected: Node 24d9a103-6e66-4f4e-0a08-1b964e1e1d97
TestDNS_ServiceLookup_FilterCritical - 2019/12/06 06:04:40.611667 [INFO] agent: Synced node info
TestDNS_ServiceLookup_FilterCritical - 2019/12/06 06:04:40.611779 [DEBUG] agent: Node info in sync
jones - 2019/12/06 06:04:40.613169 [DEBUG] consul: Skipping self join check for "Node e7ff374a-e051-0dd1-c5b8-1cb2efc19141" since the cluster is too small
TestDNS_ServiceLookup_OnlyFailing - 2019/12/06 06:04:40.616650 [INFO] connect: initialized primary datacenter CA with provider "consul"
TestDNS_ServiceLookup_OnlyFailing - 2019/12/06 06:04:40.617100 [DEBUG] consul: Skipping self join check for "Node 8317341f-085d-d821-3a84-0931b734325f" since the cluster is too small
TestDNS_ServiceLookup_OnlyFailing - 2019/12/06 06:04:40.617257 [INFO] consul: member 'Node 8317341f-085d-d821-3a84-0931b734325f' joined, marking health alive
TestDNS_ServiceLookup_OnlyFailing - 2019/12/06 06:04:40.620364 [DEBUG] dns: request for name db.service.consul. type ANY class IN (took 1.28203ms) from client 127.0.0.1:48777 (udp)
TestDNS_ServiceLookup_OnlyFailing - 2019/12/06 06:04:40.624458 [DEBUG] dns: request for name d9acb90f-75f3-6235-217c-be0912c062f8.query.consul. type ANY class IN (took 2.03938ms) from client 127.0.0.1:60256 (udp)
TestDNS_ServiceLookup_OnlyFailing - 2019/12/06 06:04:40.624831 [INFO] agent: Requesting shutdown
TestDNS_ServiceLookup_OnlyFailing - 2019/12/06 06:04:40.624945 [INFO] consul: shutting down server
TestDNS_ServiceLookup_OnlyFailing - 2019/12/06 06:04:40.625000 [WARN] serf: Shutdown without a Leave
TestEventFire_token - 2019/12/06 06:04:40.626133 [ERR] acl: failed to apply acl token upgrade batch: raft is already shutdown
TestDNS_ServiceLookup_OnlyFailing - 2019/12/06 06:04:40.852536 [WARN] serf: Shutdown without a Leave
TestDNS_ServiceLookup_OnlyFailing - 2019/12/06 06:04:40.957309 [INFO] manager: shutting down
TestDNS_ServiceLookup_OnlyFailing - 2019/12/06 06:04:40.958069 [INFO] agent: consul server down
TestDNS_ServiceLookup_OnlyFailing - 2019/12/06 06:04:40.958131 [INFO] agent: shutdown complete
TestDNS_ServiceLookup_OnlyFailing - 2019/12/06 06:04:40.958189 [INFO] agent: Stopping DNS server 127.0.0.1:34571 (tcp)
TestDNS_ServiceLookup_OnlyFailing - 2019/12/06 06:04:40.958428 [INFO] agent: Stopping DNS server 127.0.0.1:34571 (udp)
TestDNS_ServiceLookup_OnlyFailing - 2019/12/06 06:04:40.958634 [INFO] agent: Stopping HTTP server 127.0.0.1:34572 (tcp)
TestDNS_ServiceLookup_OnlyFailing - 2019/12/06 06:04:40.958893 [INFO] agent: Waiting for endpoints to shut down
TestDNS_ServiceLookup_OnlyFailing - 2019/12/06 06:04:40.959254 [INFO] agent: Endpoints down
--- PASS: TestDNS_ServiceLookup_OnlyFailing (7.51s)
=== CONT  TestDNS_RecursorTimeout
WARNING: bootstrap = true: do not enable unless necessary
TestDNS_RecursorTimeout - 2019/12/06 06:04:41.039385 [WARN] agent: Node name "Node 1bde78f5-bbbb-be9a-3b5a-a26fa86fea52" will not be discoverable via DNS due to invalid characters. Valid characters include all alpha-numerics and dashes.
TestDNS_RecursorTimeout - 2019/12/06 06:04:41.039829 [DEBUG] tlsutil: Update with version 1
TestDNS_RecursorTimeout - 2019/12/06 06:04:41.042030 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
TestDNS_ServiceLookup_FilterCritical - 2019/12/06 06:04:41.241270 [DEBUG] agent: Node info in sync
jones - 2019/12/06 06:04:41.486408 [DEBUG] manager: Rebalanced 1 servers, next active server is Node 7d555e7b-b226-ede2-fc13-7639f5dd2636.dc1 (Addr: tcp/127.0.0.1:34210) (DC: dc1)
TestEventFire_token - 2019/12/06 06:04:41.617571 [ERR] acl: failed to apply acl token upgrade batch: raft is already shutdown
jones - 2019/12/06 06:04:42.369854 [DEBUG] consul: Skipping self join check for "Node 7d555e7b-b226-ede2-fc13-7639f5dd2636" since the cluster is too small
jones - 2019/12/06 06:04:42.477212 [DEBUG] agent: Skipping remote check "serfHealth" since it is managed automatically
jones - 2019/12/06 06:04:42.477299 [DEBUG] agent: Node info in sync
2019/12/06 06:04:42 [INFO]  raft: Initial configuration (index=1): [{Suffrage:Voter ID:1bde78f5-bbbb-be9a-3b5a-a26fa86fea52 Address:127.0.0.1:34588}]
2019/12/06 06:04:42 [INFO]  raft: Node at 127.0.0.1:34588 [Follower] entering Follower state (Leader: "")
TestDNS_RecursorTimeout - 2019/12/06 06:04:42.599786 [INFO] serf: EventMemberJoin: Node 1bde78f5-bbbb-be9a-3b5a-a26fa86fea52.dc1 127.0.0.1
TestDNS_RecursorTimeout - 2019/12/06 06:04:42.606237 [INFO] serf: EventMemberJoin: Node 1bde78f5-bbbb-be9a-3b5a-a26fa86fea52 127.0.0.1
TestDNS_RecursorTimeout - 2019/12/06 06:04:42.607364 [INFO] consul: Handled member-join event for server "Node 1bde78f5-bbbb-be9a-3b5a-a26fa86fea52.dc1" in area "wan"
TestDNS_RecursorTimeout - 2019/12/06 06:04:42.607970 [INFO] consul: Adding LAN server Node 1bde78f5-bbbb-be9a-3b5a-a26fa86fea52 (Addr: tcp/127.0.0.1:34588) (DC: dc1)
TestDNS_RecursorTimeout - 2019/12/06 06:04:42.608053 [DEBUG] dns: recursor enabled
TestDNS_RecursorTimeout - 2019/12/06 06:04:42.608333 [DEBUG] dns: recursor enabled
TestDNS_RecursorTimeout - 2019/12/06 06:04:42.608524 [INFO] agent: Started DNS server 127.0.0.1:34583 (udp)
TestDNS_RecursorTimeout - 2019/12/06 06:04:42.608974 [INFO] agent: Started DNS server 127.0.0.1:34583 (tcp)
TestDNS_RecursorTimeout - 2019/12/06 06:04:42.612146 [INFO] agent: Started HTTP server on 127.0.0.1:34584 (tcp)
TestDNS_RecursorTimeout - 2019/12/06 06:04:42.612406 [INFO] agent: started state syncer
TestEventFire_token - 2019/12/06 06:04:42.617436 [ERR] acl: failed to apply acl token upgrade batch: raft is already shutdown
2019/12/06 06:04:42 [WARN]  raft: Heartbeat timeout from "" reached, starting election
2019/12/06 06:04:42 [INFO]  raft: Node at 127.0.0.1:34588 [Candidate] entering Candidate state in term 2
2019/12/06 06:04:43 [INFO]  raft: Election won. Tally: 1
2019/12/06 06:04:43 [INFO]  raft: Node at 127.0.0.1:34588 [Leader] entering Leader state
TestDNS_RecursorTimeout - 2019/12/06 06:04:43.248886 [INFO] consul: cluster leadership acquired
TestDNS_RecursorTimeout - 2019/12/06 06:04:43.249435 [INFO] consul: New leader elected: Node 1bde78f5-bbbb-be9a-3b5a-a26fa86fea52
TestDNS_ServiceLookup_FilterCritical - 2019/12/06 06:04:43.399551 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
TestEventFire_token - 2019/12/06 06:04:43.617163 [ERR] acl: failed to apply acl token upgrade batch: raft is already shutdown
jones - 2019/12/06 06:04:43.944088 [DEBUG] manager: Rebalanced 1 servers, next active server is Node a587e71c-195b-52ca-e2b1-0bac5467c444.dc1 (Addr: tcp/127.0.0.1:34216) (DC: dc1)
TestDNS_ServiceLookup_FilterCritical - 2019/12/06 06:04:44.399669 [WARN] consul: error getting server health from "Node 24d9a103-6e66-4f4e-0a08-1b964e1e1d97": context deadline exceeded
TestEventFire_token - 2019/12/06 06:04:44.617258 [ERR] acl: failed to apply acl token upgrade batch: raft is already shutdown
TestDNS_RecursorTimeout - 2019/12/06 06:04:44.853367 [INFO] agent: Synced node info
TestDNS_RecursorTimeout - 2019/12/06 06:04:44.853485 [DEBUG] agent: Node info in sync
TestDNS_ServiceLookup_FilterCritical - 2019/12/06 06:04:44.858254 [INFO] connect: initialized primary datacenter CA with provider "consul"
TestDNS_ServiceLookup_FilterCritical - 2019/12/06 06:04:44.858667 [DEBUG] consul: Skipping self join check for "Node 24d9a103-6e66-4f4e-0a08-1b964e1e1d97" since the cluster is too small
TestDNS_ServiceLookup_FilterCritical - 2019/12/06 06:04:44.858821 [INFO] consul: member 'Node 24d9a103-6e66-4f4e-0a08-1b964e1e1d97' joined, marking health alive
TestDNS_ServiceLookup_FilterCritical - 2019/12/06 06:04:44.869125 [DEBUG] dns: request for name db.service.consul. type ANY class IN (took 985.023µs) from client 127.0.0.1:56094 (udp)
TestDNS_ServiceLookup_FilterCritical - 2019/12/06 06:04:44.871583 [DEBUG] dns: request for name b1d80e53-a8b0-7215-fa05-14d97a063a8c.query.consul. type ANY class IN (took 1.035357ms) from client 127.0.0.1:38638 (udp)
TestDNS_ServiceLookup_FilterCritical - 2019/12/06 06:04:44.874303 [INFO] agent: Requesting shutdown
TestDNS_ServiceLookup_FilterCritical - 2019/12/06 06:04:44.874648 [INFO] consul: shutting down server
TestDNS_ServiceLookup_FilterCritical - 2019/12/06 06:04:44.874850 [WARN] serf: Shutdown without a Leave
TestDNS_ServiceLookup_FilterCritical - 2019/12/06 06:04:44.939436 [WARN] serf: Shutdown without a Leave
TestDNS_ServiceLookup_FilterCritical - 2019/12/06 06:04:45.054966 [INFO] manager: shutting down
TestDNS_ServiceLookup_FilterCritical - 2019/12/06 06:04:45.057916 [INFO] agent: consul server down
TestDNS_ServiceLookup_FilterCritical - 2019/12/06 06:04:45.057987 [INFO] agent: shutdown complete
TestDNS_ServiceLookup_FilterCritical - 2019/12/06 06:04:45.058045 [INFO] agent: Stopping DNS server 127.0.0.1:34577 (tcp)
TestDNS_ServiceLookup_FilterCritical - 2019/12/06 06:04:45.058201 [INFO] agent: Stopping DNS server 127.0.0.1:34577 (udp)
TestDNS_ServiceLookup_FilterCritical - 2019/12/06 06:04:45.058374 [INFO] agent: Stopping HTTP server 127.0.0.1:34578 (tcp)
TestDNS_ServiceLookup_FilterCritical - 2019/12/06 06:04:45.058602 [INFO] agent: Waiting for endpoints to shut down
TestDNS_ServiceLookup_FilterCritical - 2019/12/06 06:04:45.058676 [INFO] agent: Endpoints down
--- PASS: TestDNS_ServiceLookup_FilterCritical (7.91s)
=== CONT  TestDNS_Recurse_Truncation
WARNING: bootstrap = true: do not enable unless necessary
TestDNS_Recurse_Truncation - 2019/12/06 06:04:45.122390 [WARN] agent: Node name "Node a0c1eddc-8f1d-6e89-62b5-f2ab548327d2" will not be discoverable via DNS due to invalid characters. Valid characters include all alpha-numerics and dashes.
TestDNS_Recurse_Truncation - 2019/12/06 06:04:45.122862 [DEBUG] tlsutil: Update with version 1
TestDNS_Recurse_Truncation - 2019/12/06 06:04:45.125097 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
jones - 2019/12/06 06:04:45.136613 [DEBUG] consul: Skipping self join check for "Node a587e71c-195b-52ca-e2b1-0bac5467c444" since the cluster is too small
TestEventFire_token - 2019/12/06 06:04:45.617288 [ERR] acl: failed to apply acl token upgrade batch: raft is already shutdown
TestDNS_RecursorTimeout - 2019/12/06 06:04:46.150754 [INFO] connect: initialized primary datacenter CA with provider "consul"
TestDNS_RecursorTimeout - 2019/12/06 06:04:46.151175 [DEBUG] consul: Skipping self join check for "Node 1bde78f5-bbbb-be9a-3b5a-a26fa86fea52" since the cluster is too small
TestDNS_RecursorTimeout - 2019/12/06 06:04:46.151328 [INFO] consul: member 'Node 1bde78f5-bbbb-be9a-3b5a-a26fa86fea52' joined, marking health alive
jones - 2019/12/06 06:04:46.200535 [DEBUG] agent: Skipping remote check "serfHealth" since it is managed automatically
jones - 2019/12/06 06:04:46.200765 [DEBUG] agent: Service "web1-sidecar-proxy" in sync
jones - 2019/12/06 06:04:46.200881 [DEBUG] agent: Node info in sync
2019/12/06 06:04:46 [INFO]  raft: Initial configuration (index=1): [{Suffrage:Voter ID:a0c1eddc-8f1d-6e89-62b5-f2ab548327d2 Address:127.0.0.1:34594}]
2019/12/06 06:04:46 [INFO]  raft: Node at 127.0.0.1:34594 [Follower] entering Follower state (Leader: "")
TestDNS_Recurse_Truncation - 2019/12/06 06:04:46.423413 [INFO] serf: EventMemberJoin: Node a0c1eddc-8f1d-6e89-62b5-f2ab548327d2.dc1 127.0.0.1
TestDNS_Recurse_Truncation - 2019/12/06 06:04:46.426652 [INFO] serf: EventMemberJoin: Node a0c1eddc-8f1d-6e89-62b5-f2ab548327d2 127.0.0.1
TestDNS_Recurse_Truncation - 2019/12/06 06:04:46.427177 [INFO] consul: Handled member-join event for server "Node a0c1eddc-8f1d-6e89-62b5-f2ab548327d2.dc1" in area "wan"
TestDNS_Recurse_Truncation - 2019/12/06 06:04:46.427462 [INFO] consul: Adding LAN server Node a0c1eddc-8f1d-6e89-62b5-f2ab548327d2 (Addr: tcp/127.0.0.1:34594) (DC: dc1)
TestDNS_Recurse_Truncation - 2019/12/06 06:04:46.427560 [DEBUG] dns: recursor enabled
TestDNS_Recurse_Truncation - 2019/12/06 06:04:46.427560 [DEBUG] dns: recursor enabled
TestDNS_Recurse_Truncation - 2019/12/06 06:04:46.427893 [INFO] agent: Started DNS server 127.0.0.1:34589 (tcp)
TestDNS_Recurse_Truncation - 2019/12/06 06:04:46.428294 [INFO] agent: Started DNS server 127.0.0.1:34589 (udp)
TestDNS_Recurse_Truncation - 2019/12/06 06:04:46.430598 [INFO] agent: Started HTTP server on 127.0.0.1:34590 (tcp)
TestDNS_Recurse_Truncation - 2019/12/06 06:04:46.430703 [INFO] agent: started state syncer
2019/12/06 06:04:46 [WARN]  raft: Heartbeat timeout from "" reached, starting election
2019/12/06 06:04:46 [INFO]  raft: Node at 127.0.0.1:34594 [Candidate] entering Candidate state in term 2
TestEventFire_token - 2019/12/06 06:04:46.617214 [ERR] acl: failed to apply acl token upgrade batch: raft is already shutdown
jones - 2019/12/06 06:04:46.645313 [DEBUG] manager: Rebalanced 1 servers, next active server is Node 5b00a3f9-83b7-6ff9-5316-c7daa24e44b4.dc1 (Addr: tcp/127.0.0.1:34222) (DC: dc1)
TestDNS_RecursorTimeout - 2019/12/06 06:04:46.818222 [DEBUG] agent: Skipping remote check "serfHealth" since it is managed automatically
TestDNS_RecursorTimeout - 2019/12/06 06:04:46.818314 [DEBUG] agent: Node info in sync
TestDNS_RecursorTimeout - 2019/12/06 06:04:47.070223 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
2019/12/06 06:04:47 [INFO]  raft: Election won. Tally: 1
2019/12/06 06:04:47 [INFO]  raft: Node at 127.0.0.1:34594 [Leader] entering Leader state
TestDNS_Recurse_Truncation - 2019/12/06 06:04:47.182707 [INFO] consul: cluster leadership acquired
TestDNS_Recurse_Truncation - 2019/12/06 06:04:47.183076 [INFO] consul: New leader elected: Node a0c1eddc-8f1d-6e89-62b5-f2ab548327d2
jones - 2019/12/06 06:04:47.453647 [DEBUG] consul: Skipping self join check for "Node 5b00a3f9-83b7-6ff9-5316-c7daa24e44b4" since the cluster is too small
TestDNS_Recurse_Truncation - 2019/12/06 06:04:47.545400 [INFO] agent: Synced node info
TestDNS_Recurse_Truncation - 2019/12/06 06:04:47.545520 [DEBUG] agent: Node info in sync
TestDNS_Recurse_Truncation - 2019/12/06 06:04:47.571219 [DEBUG] dns: recurse RTT for {apple.com. 255 1} (6.833491ms) Recursor queried: 127.0.0.1:34146
TestDNS_Recurse_Truncation - 2019/12/06 06:04:47.571510 [DEBUG] dns: request for {apple.com. 255 1} (udp) (7.699512ms) from client 127.0.0.1:39819 (udp)
TestDNS_Recurse_Truncation - 2019/12/06 06:04:47.571685 [INFO] agent: Requesting shutdown
TestDNS_Recurse_Truncation - 2019/12/06 06:04:47.571749 [INFO] consul: shutting down server
TestDNS_Recurse_Truncation - 2019/12/06 06:04:47.571797 [WARN] serf: Shutdown without a Leave
TestEventFire_token - 2019/12/06 06:04:47.618177 [ERR] acl: failed to apply acl token upgrade batch: raft is already shutdown
TestDNS_Recurse_Truncation - 2019/12/06 06:04:47.658529 [WARN] serf: Shutdown without a Leave
TestDNS_Recurse_Truncation - 2019/12/06 06:04:47.754628 [INFO] manager: shutting down
TestDNS_Recurse_Truncation - 2019/12/06 06:04:47.757900 [INFO] agent: consul server down
TestDNS_Recurse_Truncation - 2019/12/06 06:04:47.757975 [INFO] agent: shutdown complete
TestDNS_Recurse_Truncation - 2019/12/06 06:04:47.758045 [INFO] agent: Stopping DNS server 127.0.0.1:34589 (tcp)
TestDNS_Recurse_Truncation - 2019/12/06 06:04:47.758225 [INFO] agent: Stopping DNS server 127.0.0.1:34589 (udp)
TestDNS_Recurse_Truncation - 2019/12/06 06:04:47.758392 [INFO] agent: Stopping HTTP server 127.0.0.1:34590 (tcp)
TestDNS_Recurse_Truncation - 2019/12/06 06:04:47.758664 [INFO] agent: Waiting for endpoints to shut down
TestDNS_Recurse_Truncation - 2019/12/06 06:04:47.758751 [INFO] agent: Endpoints down
--- PASS: TestDNS_Recurse_Truncation (2.70s)
=== CONT  TestDNS_Recurse
TestDNS_Recurse_Truncation - 2019/12/06 06:04:47.760132 [ERR] autopilot: failed to initialize config: leadership lost while committing log
TestDNS_Recurse_Truncation - 2019/12/06 06:04:47.760489 [ERR] consul: failed to establish leadership: raft is already shutdown
WARNING: bootstrap = true: do not enable unless necessary
TestDNS_Recurse - 2019/12/06 06:04:47.823237 [WARN] agent: Node name "Node a3a3174f-7783-9695-41aa-87767727b372" will not be discoverable via DNS due to invalid characters. Valid characters include all alpha-numerics and dashes.
TestDNS_Recurse - 2019/12/06 06:04:47.823645 [DEBUG] tlsutil: Update with version 1
TestDNS_Recurse - 2019/12/06 06:04:47.826485 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
TestDNS_RecursorTimeout - 2019/12/06 06:04:47.866833 [ERR] dns: recurse failed: read udp 127.0.0.1:43919->127.0.0.1:51737: i/o timeout
TestDNS_RecursorTimeout - 2019/12/06 06:04:47.867104 [ERR] dns: all resolvers failed for {apple.com. 255 1} from client 127.0.0.1:37938 (udp)
TestDNS_RecursorTimeout - 2019/12/06 06:04:47.867616 [INFO] agent: Requesting shutdown
TestDNS_RecursorTimeout - 2019/12/06 06:04:47.867694 [INFO] consul: shutting down server
TestDNS_RecursorTimeout - 2019/12/06 06:04:47.867742 [WARN] serf: Shutdown without a Leave
TestDNS_RecursorTimeout - 2019/12/06 06:04:47.868718 [DEBUG] dns: request for {apple.com. 255 1} (udp) (3.002608231s) from client 127.0.0.1:37938 (udp)
TestDNS_RecursorTimeout - 2019/12/06 06:04:48.027621 [WARN] serf: Shutdown without a Leave
TestDNS_RecursorTimeout - 2019/12/06 06:04:48.119352 [INFO] manager: shutting down
TestDNS_RecursorTimeout - 2019/12/06 06:04:48.120048 [INFO] agent: consul server down
TestDNS_RecursorTimeout - 2019/12/06 06:04:48.120104 [INFO] agent: shutdown complete
TestDNS_RecursorTimeout - 2019/12/06 06:04:48.120166 [INFO] agent: Stopping DNS server 127.0.0.1:34583 (tcp)
TestDNS_RecursorTimeout - 2019/12/06 06:04:48.120310 [INFO] agent: Stopping DNS server 127.0.0.1:34583 (udp)
TestDNS_RecursorTimeout - 2019/12/06 06:04:48.120470 [INFO] agent: Stopping HTTP server 127.0.0.1:34584 (tcp)
TestDNS_RecursorTimeout - 2019/12/06 06:04:48.120665 [INFO] agent: Waiting for endpoints to shut down
TestDNS_RecursorTimeout - 2019/12/06 06:04:48.120734 [INFO] agent: Endpoints down
--- PASS: TestDNS_RecursorTimeout (7.16s)
=== CONT  TestDNS_ServiceLookup_Dedup_SRV
WARNING: bootstrap = true: do not enable unless necessary
TestDNS_ServiceLookup_Dedup_SRV - 2019/12/06 06:04:48.174867 [WARN] agent: Node name "Node 6cee0cfe-e034-446d-bcd5-c77e223f3206" will not be discoverable via DNS due to invalid characters. Valid characters include all alpha-numerics and dashes.
TestDNS_ServiceLookup_Dedup_SRV - 2019/12/06 06:04:48.175298 [DEBUG] tlsutil: Update with version 1
TestDNS_ServiceLookup_Dedup_SRV - 2019/12/06 06:04:48.177341 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
TestEventFire_token - 2019/12/06 06:04:48.617186 [ERR] acl: failed to apply acl token upgrade batch: raft is already shutdown
2019/12/06 06:04:48 [INFO]  raft: Initial configuration (index=1): [{Suffrage:Voter ID:a3a3174f-7783-9695-41aa-87767727b372 Address:127.0.0.1:34600}]
2019/12/06 06:04:48 [INFO]  raft: Node at 127.0.0.1:34600 [Follower] entering Follower state (Leader: "")
TestDNS_Recurse - 2019/12/06 06:04:48.844482 [INFO] serf: EventMemberJoin: Node a3a3174f-7783-9695-41aa-87767727b372.dc1 127.0.0.1
TestDNS_Recurse - 2019/12/06 06:04:48.860452 [INFO] serf: EventMemberJoin: Node a3a3174f-7783-9695-41aa-87767727b372 127.0.0.1
TestDNS_Recurse - 2019/12/06 06:04:48.861354 [INFO] consul: Adding LAN server Node a3a3174f-7783-9695-41aa-87767727b372 (Addr: tcp/127.0.0.1:34600) (DC: dc1)
TestDNS_Recurse - 2019/12/06 06:04:48.861414 [DEBUG] dns: recursor enabled
TestDNS_Recurse - 2019/12/06 06:04:48.861687 [INFO] consul: Handled member-join event for server "Node a3a3174f-7783-9695-41aa-87767727b372.dc1" in area "wan"
TestDNS_Recurse - 2019/12/06 06:04:48.861916 [INFO] agent: Started DNS server 127.0.0.1:34595 (udp)
TestDNS_Recurse - 2019/12/06 06:04:48.861995 [DEBUG] dns: recursor enabled
TestDNS_Recurse - 2019/12/06 06:04:48.862275 [INFO] agent: Started DNS server 127.0.0.1:34595 (tcp)
TestDNS_Recurse - 2019/12/06 06:04:48.864685 [INFO] agent: Started HTTP server on 127.0.0.1:34596 (tcp)
TestDNS_Recurse - 2019/12/06 06:04:48.864782 [INFO] agent: started state syncer
2019/12/06 06:04:48 [WARN]  raft: Heartbeat timeout from "" reached, starting election
2019/12/06 06:04:48 [INFO]  raft: Node at 127.0.0.1:34600 [Candidate] entering Candidate state in term 2
2019/12/06 06:04:49 [INFO]  raft: Initial configuration (index=1): [{Suffrage:Voter ID:6cee0cfe-e034-446d-bcd5-c77e223f3206 Address:127.0.0.1:34606}]
2019/12/06 06:04:49 [INFO]  raft: Node at 127.0.0.1:34606 [Follower] entering Follower state (Leader: "")
TestDNS_ServiceLookup_Dedup_SRV - 2019/12/06 06:04:49.265865 [INFO] serf: EventMemberJoin: Node 6cee0cfe-e034-446d-bcd5-c77e223f3206.dc1 127.0.0.1
TestDNS_ServiceLookup_Dedup_SRV - 2019/12/06 06:04:49.269017 [INFO] serf: EventMemberJoin: Node 6cee0cfe-e034-446d-bcd5-c77e223f3206 127.0.0.1
TestDNS_ServiceLookup_Dedup_SRV - 2019/12/06 06:04:49.270574 [INFO] consul: Adding LAN server Node 6cee0cfe-e034-446d-bcd5-c77e223f3206 (Addr: tcp/127.0.0.1:34606) (DC: dc1)
TestDNS_ServiceLookup_Dedup_SRV - 2019/12/06 06:04:49.270800 [INFO] consul: Handled member-join event for server "Node 6cee0cfe-e034-446d-bcd5-c77e223f3206.dc1" in area "wan"
TestDNS_ServiceLookup_Dedup_SRV - 2019/12/06 06:04:49.271219 [INFO] agent: Started DNS server 127.0.0.1:34601 (udp)
TestDNS_ServiceLookup_Dedup_SRV - 2019/12/06 06:04:49.271386 [INFO] agent: Started DNS server 127.0.0.1:34601 (tcp)
TestDNS_ServiceLookup_Dedup_SRV - 2019/12/06 06:04:49.273834 [INFO] agent: Started HTTP server on 127.0.0.1:34602 (tcp)
TestDNS_ServiceLookup_Dedup_SRV - 2019/12/06 06:04:49.273940 [INFO] agent: started state syncer
2019/12/06 06:04:49 [WARN]  raft: Heartbeat timeout from "" reached, starting election
2019/12/06 06:04:49 [INFO]  raft: Node at 127.0.0.1:34606 [Candidate] entering Candidate state in term 2
2019/12/06 06:04:49 [INFO]  raft: Election won. Tally: 1
2019/12/06 06:04:49 [INFO]  raft: Node at 127.0.0.1:34600 [Leader] entering Leader state
TestDNS_Recurse - 2019/12/06 06:04:49.505284 [INFO] consul: cluster leadership acquired
TestDNS_Recurse - 2019/12/06 06:04:49.505769 [INFO] consul: New leader elected: Node a3a3174f-7783-9695-41aa-87767727b372
TestEventFire_token - 2019/12/06 06:04:49.617240 [ERR] acl: failed to apply acl token upgrade batch: raft is already shutdown
2019/12/06 06:04:49 [INFO]  raft: Election won. Tally: 1
2019/12/06 06:04:49 [INFO]  raft: Node at 127.0.0.1:34606 [Leader] entering Leader state
TestDNS_ServiceLookup_Dedup_SRV - 2019/12/06 06:04:49.854660 [INFO] consul: cluster leadership acquired
TestDNS_ServiceLookup_Dedup_SRV - 2019/12/06 06:04:49.855041 [INFO] consul: New leader elected: Node 6cee0cfe-e034-446d-bcd5-c77e223f3206
jones - 2019/12/06 06:04:50.154467 [DEBUG] manager: Rebalanced 1 servers, next active server is Node 1cd382dc-1b57-ab4c-91ac-41ec3d3abb1e.dc1 (Addr: tcp/127.0.0.1:34228) (DC: dc1)
TestDNS_ServiceLookup_Dedup_SRV - 2019/12/06 06:04:50.186848 [INFO] agent: Synced node info
TestDNS_ServiceLookup_Dedup_SRV - 2019/12/06 06:04:50.186989 [DEBUG] agent: Node info in sync
TestDNS_Recurse - 2019/12/06 06:04:50.190683 [INFO] agent: Synced node info
TestDNS_Recurse - 2019/12/06 06:04:50.190805 [DEBUG] agent: Node info in sync
TestDNS_Recurse - 2019/12/06 06:04:50.201170 [DEBUG] dns: recurse RTT for {apple.com. 255 1} (507.678µs) Recursor queried: 127.0.0.1:38545
TestDNS_Recurse - 2019/12/06 06:04:50.201437 [DEBUG] dns: request for {apple.com. 255 1} (udp) (1.242695ms) from client 127.0.0.1:39402 (udp)
TestDNS_Recurse - 2019/12/06 06:04:50.201624 [INFO] agent: Requesting shutdown
TestDNS_Recurse - 2019/12/06 06:04:50.201706 [INFO] consul: shutting down server
TestDNS_Recurse - 2019/12/06 06:04:50.201762 [WARN] serf: Shutdown without a Leave
TestDNS_Recurse - 2019/12/06 06:04:50.374340 [WARN] serf: Shutdown without a Leave
TestDNS_Recurse - 2019/12/06 06:04:50.478672 [INFO] manager: shutting down
TestDNS_Recurse - 2019/12/06 06:04:50.479650 [ERR] consul: failed to establish leadership: leadership lost while committing log
TestDNS_Recurse - 2019/12/06 06:04:50.479747 [INFO] agent: consul server down
TestDNS_Recurse - 2019/12/06 06:04:50.479791 [INFO] agent: shutdown complete
TestDNS_Recurse - 2019/12/06 06:04:50.479841 [INFO] agent: Stopping DNS server 127.0.0.1:34595 (tcp)
TestDNS_Recurse - 2019/12/06 06:04:50.479979 [INFO] agent: Stopping DNS server 127.0.0.1:34595 (udp)
TestDNS_Recurse - 2019/12/06 06:04:50.480141 [INFO] agent: Stopping HTTP server 127.0.0.1:34596 (tcp)
TestDNS_Recurse - 2019/12/06 06:04:50.480335 [INFO] agent: Waiting for endpoints to shut down
TestDNS_Recurse - 2019/12/06 06:04:50.480406 [INFO] agent: Endpoints down
--- PASS: TestDNS_Recurse (2.72s)
=== CONT  TestDNS_ServiceLookup_PreparedQueryNamePeriod
TestDNS_ServiceLookup_Dedup_SRV - 2019/12/06 06:04:50.549132 [DEBUG] agent: Node info in sync
WARNING: bootstrap = true: do not enable unless necessary
TestDNS_ServiceLookup_PreparedQueryNamePeriod - 2019/12/06 06:04:50.601206 [WARN] agent: Node name "Node 2a79edf6-ba54-3bad-8dae-2ea7843f4ef4" will not be discoverable via DNS due to invalid characters. Valid characters include all alpha-numerics and dashes.
TestDNS_ServiceLookup_PreparedQueryNamePeriod - 2019/12/06 06:04:50.601853 [DEBUG] tlsutil: Update with version 1
TestDNS_ServiceLookup_PreparedQueryNamePeriod - 2019/12/06 06:04:50.605064 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
TestEventFire_token - 2019/12/06 06:04:50.617533 [ERR] acl: failed to apply acl token upgrade batch: raft is already shutdown
jones - 2019/12/06 06:04:50.903546 [DEBUG] agent: Skipping remote check "serfHealth" since it is managed automatically
jones - 2019/12/06 06:04:50.903636 [DEBUG] agent: Node info in sync
jones - 2019/12/06 06:04:51.191129 [DEBUG] consul: Skipping self join check for "Node 1cd382dc-1b57-ab4c-91ac-41ec3d3abb1e" since the cluster is too small
2019/12/06 06:04:51 [INFO]  raft: Initial configuration (index=1): [{Suffrage:Voter ID:2a79edf6-ba54-3bad-8dae-2ea7843f4ef4 Address:127.0.0.1:34612}]
2019/12/06 06:04:51 [INFO]  raft: Node at 127.0.0.1:34612 [Follower] entering Follower state (Leader: "")
TestDNS_ServiceLookup_PreparedQueryNamePeriod - 2019/12/06 06:04:51.588363 [INFO] serf: EventMemberJoin: Node 2a79edf6-ba54-3bad-8dae-2ea7843f4ef4.dc1 127.0.0.1
TestDNS_ServiceLookup_PreparedQueryNamePeriod - 2019/12/06 06:04:51.598988 [INFO] serf: EventMemberJoin: Node 2a79edf6-ba54-3bad-8dae-2ea7843f4ef4 127.0.0.1
TestDNS_ServiceLookup_PreparedQueryNamePeriod - 2019/12/06 06:04:51.602287 [INFO] consul: Adding LAN server Node 2a79edf6-ba54-3bad-8dae-2ea7843f4ef4 (Addr: tcp/127.0.0.1:34612) (DC: dc1)
TestDNS_ServiceLookup_PreparedQueryNamePeriod - 2019/12/06 06:04:51.603141 [INFO] consul: Handled member-join event for server "Node 2a79edf6-ba54-3bad-8dae-2ea7843f4ef4.dc1" in area "wan"
TestDNS_ServiceLookup_PreparedQueryNamePeriod - 2019/12/06 06:04:51.604566 [INFO] agent: Started DNS server 127.0.0.1:34607 (tcp)
TestDNS_ServiceLookup_PreparedQueryNamePeriod - 2019/12/06 06:04:51.604671 [INFO] agent: Started DNS server 127.0.0.1:34607 (udp)
TestDNS_ServiceLookup_PreparedQueryNamePeriod - 2019/12/06 06:04:51.608591 [INFO] agent: Started HTTP server on 127.0.0.1:34608 (tcp)
TestDNS_ServiceLookup_PreparedQueryNamePeriod - 2019/12/06 06:04:51.608733 [INFO] agent: started state syncer
TestEventFire_token - 2019/12/06 06:04:51.617706 [ERR] acl: failed to apply acl token upgrade batch: raft is already shutdown
2019/12/06 06:04:51 [WARN]  raft: Heartbeat timeout from "" reached, starting election
2019/12/06 06:04:51 [INFO]  raft: Node at 127.0.0.1:34612 [Candidate] entering Candidate state in term 2
TestDNS_ServiceLookup_Dedup_SRV - 2019/12/06 06:04:51.666728 [DEBUG] dns: request for name db.service.consul. type SRV class IN (took 1.13636ms) from client 127.0.0.1:59727 (udp)
TestDNS_ServiceLookup_Dedup_SRV - 2019/12/06 06:04:51.669258 [DEBUG] dns: request for name a6eb1503-8f18-d56d-0cab-9489849db203.query.consul. type SRV class IN (took 1.366699ms) from client 127.0.0.1:58651 (udp)
TestDNS_ServiceLookup_Dedup_SRV - 2019/12/06 06:04:51.669751 [INFO] agent: Requesting shutdown
TestDNS_ServiceLookup_Dedup_SRV - 2019/12/06 06:04:51.669833 [INFO] consul: shutting down server
TestDNS_ServiceLookup_Dedup_SRV - 2019/12/06 06:04:51.669881 [WARN] serf: Shutdown without a Leave
TestDNS_ServiceLookup_Dedup_SRV - 2019/12/06 06:04:51.811128 [WARN] serf: Shutdown without a Leave
TestDNS_ServiceLookup_Dedup_SRV - 2019/12/06 06:04:51.886109 [INFO] manager: shutting down
TestDNS_ServiceLookup_Dedup_SRV - 2019/12/06 06:04:51.889625 [ERR] consul: failed to establish leadership: error generating CA root certificate: leadership lost while committing log
TestDNS_ServiceLookup_Dedup_SRV - 2019/12/06 06:04:51.889828 [INFO] agent: consul server down
TestDNS_ServiceLookup_Dedup_SRV - 2019/12/06 06:04:51.889882 [INFO] agent: shutdown complete
TestDNS_ServiceLookup_Dedup_SRV - 2019/12/06 06:04:51.889941 [INFO] agent: Stopping DNS server 127.0.0.1:34601 (tcp)
TestDNS_ServiceLookup_Dedup_SRV - 2019/12/06 06:04:51.890085 [INFO] agent: Stopping DNS server 127.0.0.1:34601 (udp)
TestDNS_ServiceLookup_Dedup_SRV - 2019/12/06 06:04:51.890260 [INFO] agent: Stopping HTTP server 127.0.0.1:34602 (tcp)
TestDNS_ServiceLookup_Dedup_SRV - 2019/12/06 06:04:51.890486 [INFO] agent: Waiting for endpoints to shut down
TestDNS_ServiceLookup_Dedup_SRV - 2019/12/06 06:04:51.890553 [INFO] agent: Endpoints down
--- PASS: TestDNS_ServiceLookup_Dedup_SRV (3.77s)
=== CONT  TestDNS_PreparedQueryNearIP
WARNING: bootstrap = true: do not enable unless necessary
TestDNS_PreparedQueryNearIP - 2019/12/06 06:04:52.026315 [WARN] agent: Node name "Node ccd6c0f3-de8a-ccd1-08f2-6651b4d05e8e" will not be discoverable via DNS due to invalid characters. Valid characters include all alpha-numerics and dashes.
TestDNS_PreparedQueryNearIP - 2019/12/06 06:04:52.027080 [DEBUG] tlsutil: Update with version 1
TestDNS_PreparedQueryNearIP - 2019/12/06 06:04:52.031148 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
2019/12/06 06:04:52 [INFO]  raft: Election won. Tally: 1
2019/12/06 06:04:52 [INFO]  raft: Node at 127.0.0.1:34612 [Leader] entering Leader state
TestDNS_ServiceLookup_PreparedQueryNamePeriod - 2019/12/06 06:04:52.134334 [INFO] consul: cluster leadership acquired
TestDNS_ServiceLookup_PreparedQueryNamePeriod - 2019/12/06 06:04:52.134793 [INFO] consul: New leader elected: Node 2a79edf6-ba54-3bad-8dae-2ea7843f4ef4
TestDNS_ServiceLookup_PreparedQueryNamePeriod - 2019/12/06 06:04:52.453575 [INFO] agent: Synced node info
TestDNS_ServiceLookup_PreparedQueryNamePeriod - 2019/12/06 06:04:52.453733 [DEBUG] agent: Node info in sync
TestEventFire_token - 2019/12/06 06:04:52.617210 [ERR] acl: failed to apply acl token upgrade batch: raft is already shutdown
jones - 2019/12/06 06:04:52.620023 [DEBUG] manager: Rebalanced 1 servers, next active server is Node 05379951-a361-1a11-bcaf-fc25c0b4fc85.dc1 (Addr: tcp/127.0.0.1:34234) (DC: dc1)
TestDNS_ServiceLookup_PreparedQueryNamePeriod - 2019/12/06 06:04:52.666122 [DEBUG] agent: Node info in sync
2019/12/06 06:04:52 [INFO]  raft: Initial configuration (index=1): [{Suffrage:Voter ID:ccd6c0f3-de8a-ccd1-08f2-6651b4d05e8e Address:127.0.0.1:34618}]
2019/12/06 06:04:52 [INFO]  raft: Node at 127.0.0.1:34618 [Follower] entering Follower state (Leader: "")
TestDNS_PreparedQueryNearIP - 2019/12/06 06:04:52.891306 [INFO] serf: EventMemberJoin: Node ccd6c0f3-de8a-ccd1-08f2-6651b4d05e8e.dc1 127.0.0.1
TestDNS_PreparedQueryNearIP - 2019/12/06 06:04:52.894818 [INFO] serf: EventMemberJoin: Node ccd6c0f3-de8a-ccd1-08f2-6651b4d05e8e 127.0.0.1
TestDNS_PreparedQueryNearIP - 2019/12/06 06:04:52.896344 [INFO] consul: Adding LAN server Node ccd6c0f3-de8a-ccd1-08f2-6651b4d05e8e (Addr: tcp/127.0.0.1:34618) (DC: dc1)
TestDNS_PreparedQueryNearIP - 2019/12/06 06:04:52.896425 [INFO] agent: Started DNS server 127.0.0.1:34613 (udp)
TestDNS_PreparedQueryNearIP - 2019/12/06 06:04:52.896621 [INFO] consul: Handled member-join event for server "Node ccd6c0f3-de8a-ccd1-08f2-6651b4d05e8e.dc1" in area "wan"
TestDNS_PreparedQueryNearIP - 2019/12/06 06:04:52.896821 [INFO] agent: Started DNS server 127.0.0.1:34613 (tcp)
TestDNS_PreparedQueryNearIP - 2019/12/06 06:04:52.908186 [INFO] agent: Started HTTP server on 127.0.0.1:34614 (tcp)
TestDNS_PreparedQueryNearIP - 2019/12/06 06:04:52.911125 [INFO] agent: started state syncer
2019/12/06 06:04:52 [WARN]  raft: Heartbeat timeout from "" reached, starting election
2019/12/06 06:04:52 [INFO]  raft: Node at 127.0.0.1:34618 [Candidate] entering Candidate state in term 2
TestDNS_ServiceLookup_PreparedQueryNamePeriod - 2019/12/06 06:04:53.277349 [DEBUG] dns: request for name some.query.we.like.query.consul. type SRV class IN (took 948.355µs) from client 127.0.0.1:46823 (udp)
TestDNS_ServiceLookup_PreparedQueryNamePeriod - 2019/12/06 06:04:53.277734 [INFO] agent: Requesting shutdown
TestDNS_ServiceLookup_PreparedQueryNamePeriod - 2019/12/06 06:04:53.277823 [INFO] consul: shutting down server
TestDNS_ServiceLookup_PreparedQueryNamePeriod - 2019/12/06 06:04:53.277879 [WARN] serf: Shutdown without a Leave
TestDNS_ServiceLookup_PreparedQueryNamePeriod - 2019/12/06 06:04:53.345776 [WARN] serf: Shutdown without a Leave
TestDNS_ServiceLookup_PreparedQueryNamePeriod - 2019/12/06 06:04:53.419539 [INFO] manager: shutting down
TestDNS_ServiceLookup_PreparedQueryNamePeriod - 2019/12/06 06:04:53.424555 [INFO] agent: consul server down
TestDNS_ServiceLookup_PreparedQueryNamePeriod - 2019/12/06 06:04:53.424625 [INFO] agent: shutdown complete
TestDNS_ServiceLookup_PreparedQueryNamePeriod - 2019/12/06 06:04:53.424688 [INFO] agent: Stopping DNS server 127.0.0.1:34607 (tcp)
TestDNS_ServiceLookup_PreparedQueryNamePeriod - 2019/12/06 06:04:53.424846 [INFO] agent: Stopping DNS server 127.0.0.1:34607 (udp)
TestDNS_ServiceLookup_PreparedQueryNamePeriod - 2019/12/06 06:04:53.425011 [INFO] agent: Stopping HTTP server 127.0.0.1:34608 (tcp)
TestDNS_ServiceLookup_PreparedQueryNamePeriod - 2019/12/06 06:04:53.425214 [INFO] agent: Waiting for endpoints to shut down
TestDNS_ServiceLookup_PreparedQueryNamePeriod - 2019/12/06 06:04:53.425280 [INFO] agent: Endpoints down
--- PASS: TestDNS_ServiceLookup_PreparedQueryNamePeriod (2.94s)
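The test above resolves a prepared query whose name contains periods ("some.query.we.like") as some.query.we.like.query.consul. A minimal sketch of registering such a query with the Go API client, assuming a local agent and a "db" service (both assumptions, not taken from the log):

package main

import (
	"log"

	"github.com/hashicorp/consul/api"
)

func main() {
	// Connect to the local agent (address and token come from the default config).
	client, err := api.NewClient(api.DefaultConfig())
	if err != nil {
		log.Fatal(err)
	}

	// A prepared query whose name contains periods; it remains addressable
	// over DNS as <name>.query.consul.
	pq := &api.PreparedQueryDefinition{
		Name:    "some.query.we.like",
		Service: api.ServiceQuery{Service: "db"},
	}
	id, _, err := client.PreparedQuery().Create(pq, nil)
	if err != nil {
		log.Fatal(err)
	}
	log.Printf("prepared query %s registered; try an SRV lookup for some.query.we.like.query.consul", id)
}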
=== CONT  TestDNS_PreparedQueryNearIPEDNS
TestDNS_ServiceLookup_PreparedQueryNamePeriod - 2019/12/06 06:04:53.430621 [ERR] consul: failed to establish leadership: error configuring provider: leadership lost while committing log
TestDNS_ServiceLookup_PreparedQueryNamePeriod - 2019/12/06 06:04:53.430848 [ERR] consul: failed to transfer leadership attempt 0/3: raft is already shutdown
WARNING: bootstrap = true: do not enable unless necessary
TestDNS_PreparedQueryNearIPEDNS - 2019/12/06 06:04:53.507335 [WARN] agent: Node name "Node 59afbec1-cede-c10a-7b90-761723dd3612" will not be discoverable via DNS due to invalid characters. Valid characters include all alpha-numerics and dashes.
TestDNS_PreparedQueryNearIPEDNS - 2019/12/06 06:04:53.507971 [DEBUG] tlsutil: Update with version 1
TestDNS_PreparedQueryNearIPEDNS - 2019/12/06 06:04:53.511089 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
2019/12/06 06:04:53 [INFO]  raft: Election won. Tally: 1
2019/12/06 06:04:53 [INFO]  raft: Node at 127.0.0.1:34618 [Leader] entering Leader state
TestDNS_PreparedQueryNearIP - 2019/12/06 06:04:53.612932 [INFO] consul: cluster leadership acquired
TestDNS_PreparedQueryNearIP - 2019/12/06 06:04:53.613456 [INFO] consul: New leader elected: Node ccd6c0f3-de8a-ccd1-08f2-6651b4d05e8e
TestEventFire_token - 2019/12/06 06:04:53.617291 [ERR] acl: failed to apply acl token upgrade batch: raft is already shutdown
jones - 2019/12/06 06:04:53.690734 [DEBUG] consul: Skipping self join check for "Node 05379951-a361-1a11-bcaf-fc25c0b4fc85" since the cluster is too small
TestDNS_PreparedQueryNearIP - 2019/12/06 06:04:54.057824 [INFO] agent: Synced node info
2019/12/06 06:04:54 [INFO]  raft: Initial configuration (index=1): [{Suffrage:Voter ID:59afbec1-cede-c10a-7b90-761723dd3612 Address:127.0.0.1:34624}]
TestDNS_PreparedQueryNearIPEDNS - 2019/12/06 06:04:54.473020 [INFO] serf: EventMemberJoin: Node 59afbec1-cede-c10a-7b90-761723dd3612.dc1 127.0.0.1
2019/12/06 06:04:54 [INFO]  raft: Node at 127.0.0.1:34624 [Follower] entering Follower state (Leader: "")
TestDNS_PreparedQueryNearIPEDNS - 2019/12/06 06:04:54.476415 [INFO] serf: EventMemberJoin: Node 59afbec1-cede-c10a-7b90-761723dd3612 127.0.0.1
TestDNS_PreparedQueryNearIPEDNS - 2019/12/06 06:04:54.490554 [INFO] agent: Started DNS server 127.0.0.1:34619 (udp)
TestDNS_PreparedQueryNearIPEDNS - 2019/12/06 06:04:54.490853 [INFO] consul: Adding LAN server Node 59afbec1-cede-c10a-7b90-761723dd3612 (Addr: tcp/127.0.0.1:34624) (DC: dc1)
TestDNS_PreparedQueryNearIPEDNS - 2019/12/06 06:04:54.491087 [INFO] consul: Handled member-join event for server "Node 59afbec1-cede-c10a-7b90-761723dd3612.dc1" in area "wan"
TestDNS_PreparedQueryNearIPEDNS - 2019/12/06 06:04:54.491234 [INFO] agent: Started DNS server 127.0.0.1:34619 (tcp)
TestDNS_PreparedQueryNearIPEDNS - 2019/12/06 06:04:54.493611 [INFO] agent: Started HTTP server on 127.0.0.1:34620 (tcp)
TestDNS_PreparedQueryNearIPEDNS - 2019/12/06 06:04:54.493727 [INFO] agent: started state syncer
2019/12/06 06:04:54 [WARN]  raft: Heartbeat timeout from "" reached, starting election
2019/12/06 06:04:54 [INFO]  raft: Node at 127.0.0.1:34624 [Candidate] entering Candidate state in term 2
TestEventFire_token - 2019/12/06 06:04:54.617367 [ERR] acl: failed to apply acl token upgrade batch: raft is already shutdown
2019/12/06 06:04:55 [INFO]  raft: Election won. Tally: 1
2019/12/06 06:04:55 [INFO]  raft: Node at 127.0.0.1:34624 [Leader] entering Leader state
TestDNS_PreparedQueryNearIPEDNS - 2019/12/06 06:04:55.053272 [INFO] consul: cluster leadership acquired
TestDNS_PreparedQueryNearIPEDNS - 2019/12/06 06:04:55.053717 [INFO] consul: New leader elected: Node 59afbec1-cede-c10a-7b90-761723dd3612
jones - 2019/12/06 06:04:55.099826 [DEBUG] manager: Rebalanced 1 servers, next active server is Node 48766921-a98a-d876-447b-181b21701746.dc1 (Addr: tcp/127.0.0.1:34240) (DC: dc1)
Added 3 service nodes
TestDNS_PreparedQueryNearIPEDNS - 2019/12/06 06:04:55.587875 [INFO] agent: Synced node info
TestEventFire_token - 2019/12/06 06:04:55.617372 [ERR] acl: failed to apply acl token upgrade batch: raft is already shutdown
TestDNS_PreparedQueryNearIP - 2019/12/06 06:04:55.980495 [INFO] connect: initialized primary datacenter CA with provider "consul"
TestDNS_PreparedQueryNearIPEDNS - 2019/12/06 06:04:56.197397 [DEBUG] agent: Node info in sync
TestDNS_PreparedQueryNearIPEDNS - 2019/12/06 06:04:56.197528 [DEBUG] agent: Node info in sync
TestDNS_PreparedQueryNearIP - 2019/12/06 06:04:56.212112 [DEBUG] consul: Skipping self join check for "Node ccd6c0f3-de8a-ccd1-08f2-6651b4d05e8e" since the cluster is too small
TestDNS_PreparedQueryNearIP - 2019/12/06 06:04:56.212307 [INFO] consul: member 'Node ccd6c0f3-de8a-ccd1-08f2-6651b4d05e8e' joined, marking health alive
jones - 2019/12/06 06:04:56.215903 [DEBUG] consul: Skipping self join check for "Node 48766921-a98a-d876-447b-181b21701746" since the cluster is too small
TestDNS_PreparedQueryNearIP - 2019/12/06 06:04:56.219156 [DEBUG] dns: request for name some.query.we.like.query.consul. type A class IN (took 3.344744ms) from client 127.0.0.1:59420 (udp)
TestDNS_PreparedQueryNearIP - 2019/12/06 06:04:56.246716 [DEBUG] dns: request for name some.query.we.like.query.consul. type A class IN (took 1.099358ms) from client 127.0.0.1:36779 (udp)
TestDNS_PreparedQueryNearIP - 2019/12/06 06:04:56.274317 [DEBUG] dns: request for name some.query.we.like.query.consul. type A class IN (took 1.124026ms) from client 127.0.0.1:58452 (udp)
TestDNS_PreparedQueryNearIP - 2019/12/06 06:04:56.301832 [DEBUG] dns: request for name some.query.we.like.query.consul. type A class IN (took 1.217029ms) from client 127.0.0.1:53952 (udp)
TestDNS_PreparedQueryNearIP - 2019/12/06 06:04:56.330221 [DEBUG] dns: request for name some.query.we.like.query.consul. type A class IN (took 1.314697ms) from client 127.0.0.1:52047 (udp)
TestDNS_PreparedQueryNearIP - 2019/12/06 06:04:56.358091 [DEBUG] dns: request for name some.query.we.like.query.consul. type A class IN (took 1.115359ms) from client 127.0.0.1:54196 (udp)
TestDNS_PreparedQueryNearIP - 2019/12/06 06:04:56.385770 [DEBUG] dns: request for name some.query.we.like.query.consul. type A class IN (took 1.072358ms) from client 127.0.0.1:44444 (udp)
TestDNS_PreparedQueryNearIP - 2019/12/06 06:04:56.386114 [INFO] agent: Requesting shutdown
TestDNS_PreparedQueryNearIP - 2019/12/06 06:04:56.386225 [INFO] consul: shutting down server
TestDNS_PreparedQueryNearIP - 2019/12/06 06:04:56.386272 [WARN] serf: Shutdown without a Leave
TestDNS_PreparedQueryNearIP - 2019/12/06 06:04:56.415263 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
TestDNS_PreparedQueryNearIP - 2019/12/06 06:04:56.421250 [WARN] consul: error getting server health from "Node ccd6c0f3-de8a-ccd1-08f2-6651b4d05e8e": rpc error making call: EOF
TestDNS_PreparedQueryNearIP - 2019/12/06 06:04:56.525728 [WARN] serf: Shutdown without a Leave
TestEventFire_token - 2019/12/06 06:04:56.617237 [ERR] acl: failed to apply acl token upgrade batch: raft is already shutdown
TestDNS_PreparedQueryNearIP - 2019/12/06 06:04:56.644651 [INFO] manager: shutting down
TestDNS_PreparedQueryNearIP - 2019/12/06 06:04:56.645223 [INFO] agent: consul server down
TestDNS_PreparedQueryNearIP - 2019/12/06 06:04:56.645284 [INFO] agent: shutdown complete
TestDNS_PreparedQueryNearIP - 2019/12/06 06:04:56.645353 [INFO] agent: Stopping DNS server 127.0.0.1:34613 (tcp)
TestDNS_PreparedQueryNearIP - 2019/12/06 06:04:56.645523 [INFO] agent: Stopping DNS server 127.0.0.1:34613 (udp)
TestDNS_PreparedQueryNearIP - 2019/12/06 06:04:56.645814 [INFO] agent: Stopping HTTP server 127.0.0.1:34614 (tcp)
TestDNS_PreparedQueryNearIP - 2019/12/06 06:04:56.646273 [INFO] agent: Waiting for endpoints to shut down
TestDNS_PreparedQueryNearIP - 2019/12/06 06:04:56.646411 [INFO] agent: Endpoints down
--- PASS: TestDNS_PreparedQueryNearIP (4.76s)
=== CONT  TestDNS_ServiceLookup_TagPeriod
WARNING: bootstrap = true: do not enable unless necessary
TestDNS_ServiceLookup_TagPeriod - 2019/12/06 06:04:56.771922 [WARN] agent: Node name "Node 56d0af7b-9131-b89d-5b72-38ef2eb94bd3" will not be discoverable via DNS due to invalid characters. Valid characters include all alpha-numerics and dashes.
TestDNS_ServiceLookup_TagPeriod - 2019/12/06 06:04:56.773027 [DEBUG] tlsutil: Update with version 1
TestDNS_ServiceLookup_TagPeriod - 2019/12/06 06:04:56.776203 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
Added 3 service nodes
TestDNS_PreparedQueryNearIP - 2019/12/06 06:04:57.415367 [WARN] consul: error getting server health from "Node ccd6c0f3-de8a-ccd1-08f2-6651b4d05e8e": context deadline exceeded
TestDNS_PreparedQueryNearIPEDNS - 2019/12/06 06:04:57.560968 [DEBUG] dns: request for name some.query.we.like.query.consul. type A class IN (took 1.281696ms) from client 127.0.0.1:55642 (udp)
TestDNS_PreparedQueryNearIPEDNS - 2019/12/06 06:04:57.589081 [DEBUG] dns: request for name some.query.we.like.query.consul. type A class IN (took 1.155027ms) from client 127.0.0.1:35013 (udp)
TestDNS_PreparedQueryNearIPEDNS - 2019/12/06 06:04:57.589643 [INFO] agent: Requesting shutdown
TestDNS_PreparedQueryNearIPEDNS - 2019/12/06 06:04:57.590892 [INFO] consul: shutting down server
TestDNS_PreparedQueryNearIPEDNS - 2019/12/06 06:04:57.591151 [WARN] serf: Shutdown without a Leave
TestEventFire_token - 2019/12/06 06:04:57.617224 [ERR] acl: failed to apply acl token upgrade batch: raft is already shutdown
TestDNS_PreparedQueryNearIPEDNS - 2019/12/06 06:04:57.744480 [WARN] serf: Shutdown without a Leave
TestDNS_PreparedQueryNearIPEDNS - 2019/12/06 06:04:57.745210 [INFO] connect: initialized primary datacenter CA with provider "consul"
TestDNS_PreparedQueryNearIPEDNS - 2019/12/06 06:04:57.745627 [DEBUG] consul: Skipping self join check for "Node 59afbec1-cede-c10a-7b90-761723dd3612" since the cluster is too small
TestDNS_PreparedQueryNearIPEDNS - 2019/12/06 06:04:57.745769 [INFO] consul: member 'Node 59afbec1-cede-c10a-7b90-761723dd3612' joined, marking health alive
2019/12/06 06:04:57 [INFO]  raft: Initial configuration (index=1): [{Suffrage:Voter ID:56d0af7b-9131-b89d-5b72-38ef2eb94bd3 Address:127.0.0.1:34630}]
2019/12/06 06:04:57 [INFO]  raft: Node at 127.0.0.1:34630 [Follower] entering Follower state (Leader: "")
TestDNS_PreparedQueryNearIPEDNS - 2019/12/06 06:04:57.830601 [INFO] manager: shutting down
TestDNS_ServiceLookup_TagPeriod - 2019/12/06 06:04:57.829155 [INFO] serf: EventMemberJoin: Node 56d0af7b-9131-b89d-5b72-38ef2eb94bd3.dc1 127.0.0.1
TestDNS_ServiceLookup_TagPeriod - 2019/12/06 06:04:57.845970 [INFO] serf: EventMemberJoin: Node 56d0af7b-9131-b89d-5b72-38ef2eb94bd3 127.0.0.1
TestDNS_ServiceLookup_TagPeriod - 2019/12/06 06:04:57.847251 [INFO] consul: Adding LAN server Node 56d0af7b-9131-b89d-5b72-38ef2eb94bd3 (Addr: tcp/127.0.0.1:34630) (DC: dc1)
TestDNS_ServiceLookup_TagPeriod - 2019/12/06 06:04:57.847642 [INFO] consul: Handled member-join event for server "Node 56d0af7b-9131-b89d-5b72-38ef2eb94bd3.dc1" in area "wan"
TestDNS_ServiceLookup_TagPeriod - 2019/12/06 06:04:57.850592 [INFO] agent: Started DNS server 127.0.0.1:34625 (tcp)
TestDNS_ServiceLookup_TagPeriod - 2019/12/06 06:04:57.850811 [INFO] agent: Started DNS server 127.0.0.1:34625 (udp)
TestDNS_ServiceLookup_TagPeriod - 2019/12/06 06:04:57.853412 [INFO] agent: Started HTTP server on 127.0.0.1:34626 (tcp)
TestDNS_ServiceLookup_TagPeriod - 2019/12/06 06:04:57.853519 [INFO] agent: started state syncer
2019/12/06 06:04:57 [WARN]  raft: Heartbeat timeout from "" reached, starting election
2019/12/06 06:04:57 [INFO]  raft: Node at 127.0.0.1:34630 [Candidate] entering Candidate state in term 2
TestDNS_PreparedQueryNearIPEDNS - 2019/12/06 06:04:57.934155 [ERR] consul: failed to reconcile member: {Node 59afbec1-cede-c10a-7b90-761723dd3612 127.0.0.1 34622 map[acls:0 bootstrap:1 build:1.5.2: dc:dc1 id:59afbec1-cede-c10a-7b90-761723dd3612 port:34624 raft_vsn:3 role:consul segment: vsn:2 vsn_max:3 vsn_min:2 wan_join_port:34623] alive 1 5 2 2 5 4}: leadership lost while committing log
TestDNS_PreparedQueryNearIPEDNS - 2019/12/06 06:04:57.934261 [INFO] agent: consul server down
TestDNS_PreparedQueryNearIPEDNS - 2019/12/06 06:04:57.934878 [INFO] agent: shutdown complete
TestDNS_PreparedQueryNearIPEDNS - 2019/12/06 06:04:57.935097 [INFO] agent: Stopping DNS server 127.0.0.1:34619 (tcp)
TestDNS_PreparedQueryNearIPEDNS - 2019/12/06 06:04:57.935543 [INFO] agent: Stopping DNS server 127.0.0.1:34619 (udp)
TestDNS_PreparedQueryNearIPEDNS - 2019/12/06 06:04:57.935946 [INFO] agent: Stopping HTTP server 127.0.0.1:34620 (tcp)
TestDNS_PreparedQueryNearIPEDNS - 2019/12/06 06:04:57.936387 [INFO] agent: Waiting for endpoints to shut down
TestDNS_PreparedQueryNearIPEDNS - 2019/12/06 06:04:57.936584 [INFO] agent: Endpoints down
--- PASS: TestDNS_PreparedQueryNearIPEDNS (4.51s)
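TestDNS_PreparedQueryNearIPEDNS above sorts prepared-query results by distance to an address supplied via the EDNS client-subnet option. A minimal sketch of attaching that option to a query with github.com/miekg/dns; the source address is illustrative and 127.0.0.1:34619 is the per-run agent DNS port from the log.

package main

import (
	"fmt"
	"log"
	"net"

	"github.com/miekg/dns"
)

func main() {
	m := new(dns.Msg)
	m.SetQuestion("some.query.we.like.query.consul.", dns.TypeA)

	// Attach an EDNS0 client-subnet option so the server can sort results
	// by proximity to 198.18.0.9/32 (placeholder address).
	opt := new(dns.OPT)
	opt.Hdr.Name = "."
	opt.Hdr.Rrtype = dns.TypeOPT
	opt.SetUDPSize(4096)
	subnet := &dns.EDNS0_SUBNET{
		Code:          dns.EDNS0SUBNET,
		Family:        1, // IPv4
		SourceNetmask: 32,
		Address:       net.ParseIP("198.18.0.9").To4(),
	}
	opt.Option = append(opt.Option, subnet)
	m.Extra = append(m.Extra, opt)

	c := new(dns.Client)
	in, _, err := c.Exchange(m, "127.0.0.1:34619")
	if err != nil {
		log.Fatal(err)
	}
	for _, rr := range in.Answer {
		fmt.Println(rr)
	}
}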
=== CONT  TestDNS_CaseInsensitiveServiceLookup
WARNING: bootstrap = true: do not enable unless necessary
TestDNS_CaseInsensitiveServiceLookup - 2019/12/06 06:04:58.016566 [WARN] agent: Node name "Node 07e54604-c009-3ba8-7ab0-55022edf1eb6" will not be discoverable via DNS due to invalid characters. Valid characters include all alpha-numerics and dashes.
TestDNS_CaseInsensitiveServiceLookup - 2019/12/06 06:04:58.017076 [DEBUG] tlsutil: Update with version 1
TestDNS_CaseInsensitiveServiceLookup - 2019/12/06 06:04:58.020850 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
TestDNS_ServiceLookup_Truncate - 2019/12/06 06:04:58.369946 [DEBUG] dns: request for name web.service.consul. type ANY class IN (took 13.80032ms) from client 127.0.0.1:44863 (udp)
TestDNS_ServiceLookup_Truncate - 2019/12/06 06:04:58.377665 [DEBUG] dns: request for name a5e719f3-e050-024d-21a5-135192b57b4d.query.consul. type ANY class IN (took 7.149833ms) from client 127.0.0.1:51039 (udp)
TestDNS_ServiceLookup_Truncate - 2019/12/06 06:04:58.377998 [INFO] agent: Requesting shutdown
TestDNS_ServiceLookup_Truncate - 2019/12/06 06:04:58.378241 [INFO] consul: shutting down server
TestDNS_ServiceLookup_Truncate - 2019/12/06 06:04:58.378662 [WARN] serf: Shutdown without a Leave
2019/12/06 06:04:58 [INFO]  raft: Election won. Tally: 1
2019/12/06 06:04:58 [INFO]  raft: Node at 127.0.0.1:34630 [Leader] entering Leader state
TestDNS_ServiceLookup_TagPeriod - 2019/12/06 06:04:58.445672 [INFO] consul: cluster leadership acquired
TestDNS_ServiceLookup_TagPeriod - 2019/12/06 06:04:58.446072 [INFO] consul: New leader elected: Node 56d0af7b-9131-b89d-5b72-38ef2eb94bd3
TestDNS_ServiceLookup_Truncate - 2019/12/06 06:04:58.521661 [WARN] serf: Shutdown without a Leave
TestEventFire_token - 2019/12/06 06:04:58.618836 [ERR] acl: failed to apply acl token upgrade batch: raft is already shutdown
TestDNS_ServiceLookup_Truncate - 2019/12/06 06:04:58.622615 [INFO] manager: shutting down
TestDNS_ServiceLookup_Truncate - 2019/12/06 06:04:58.623497 [INFO] agent: consul server down
TestDNS_ServiceLookup_Truncate - 2019/12/06 06:04:58.623580 [INFO] agent: shutdown complete
TestDNS_ServiceLookup_Truncate - 2019/12/06 06:04:58.623645 [INFO] agent: Stopping DNS server 127.0.0.1:34553 (tcp)
TestDNS_ServiceLookup_Truncate - 2019/12/06 06:04:58.623838 [INFO] agent: Stopping DNS server 127.0.0.1:34553 (udp)
TestDNS_ServiceLookup_Truncate - 2019/12/06 06:04:58.624043 [INFO] agent: Stopping HTTP server 127.0.0.1:34554 (tcp)
TestDNS_ServiceLookup_Truncate - 2019/12/06 06:04:58.624373 [INFO] agent: Waiting for endpoints to shut down
TestDNS_ServiceLookup_Truncate - 2019/12/06 06:04:58.624463 [INFO] agent: Endpoints down
--- PASS: TestDNS_ServiceLookup_Truncate (31.37s)
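The truncation test above returns more records than fit in a plain UDP response, so the agent sets the TC bit. A minimal sketch of the usual client-side handling, retrying the same question over TCP when the UDP answer comes back truncated; the service name and the address 127.0.0.1:34553 are taken from the log above and differ per run.

package main

import (
	"fmt"
	"log"

	"github.com/miekg/dns"
)

// lookup sends the query over UDP first and falls back to TCP if the
// response has the TC (truncated) bit set.
func lookup(addr, name string) (*dns.Msg, error) {
	m := new(dns.Msg)
	m.SetQuestion(name, dns.TypeANY)

	udp := new(dns.Client) // UDP transport by default
	in, _, err := udp.Exchange(m, addr)
	if err != nil {
		return nil, err
	}
	if in.Truncated {
		tcp := &dns.Client{Net: "tcp"}
		in, _, err = tcp.Exchange(m, addr)
	}
	return in, err
}

func main() {
	in, err := lookup("127.0.0.1:34553", "web.service.consul.")
	if err != nil {
		log.Fatal(err)
	}
	fmt.Printf("answers: %d, truncated: %v\n", len(in.Answer), in.Truncated)
}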
=== CONT  TestDNS_ServiceLookup_ServiceAddressIPV6
WARNING: bootstrap = true: do not enable unless necessary
TestDNS_ServiceLookup_ServiceAddressIPV6 - 2019/12/06 06:04:58.833216 [WARN] agent: Node name "Node 249f8df1-729e-7236-075f-ddeeb60f7a2a" will not be discoverable via DNS due to invalid characters. Valid characters include all alpha-numerics and dashes.
TestDNS_ServiceLookup_ServiceAddressIPV6 - 2019/12/06 06:04:58.833793 [DEBUG] tlsutil: Update with version 1
TestDNS_ServiceLookup_ServiceAddressIPV6 - 2019/12/06 06:04:58.843744 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
2019/12/06 06:04:58 [INFO]  raft: Initial configuration (index=1): [{Suffrage:Voter ID:07e54604-c009-3ba8-7ab0-55022edf1eb6 Address:127.0.0.1:34636}]
TestDNS_CaseInsensitiveServiceLookup - 2019/12/06 06:04:58.991067 [INFO] serf: EventMemberJoin: Node 07e54604-c009-3ba8-7ab0-55022edf1eb6.dc1 127.0.0.1
2019/12/06 06:04:58 [INFO]  raft: Node at 127.0.0.1:34636 [Follower] entering Follower state (Leader: "")
TestDNS_CaseInsensitiveServiceLookup - 2019/12/06 06:04:58.994782 [INFO] serf: EventMemberJoin: Node 07e54604-c009-3ba8-7ab0-55022edf1eb6 127.0.0.1
TestDNS_CaseInsensitiveServiceLookup - 2019/12/06 06:04:58.995881 [INFO] consul: Adding LAN server Node 07e54604-c009-3ba8-7ab0-55022edf1eb6 (Addr: tcp/127.0.0.1:34636) (DC: dc1)
TestDNS_CaseInsensitiveServiceLookup - 2019/12/06 06:04:58.996049 [INFO] consul: Handled member-join event for server "Node 07e54604-c009-3ba8-7ab0-55022edf1eb6.dc1" in area "wan"
TestDNS_CaseInsensitiveServiceLookup - 2019/12/06 06:04:58.997307 [INFO] agent: Started DNS server 127.0.0.1:34631 (tcp)
TestDNS_CaseInsensitiveServiceLookup - 2019/12/06 06:04:58.997620 [INFO] agent: Started DNS server 127.0.0.1:34631 (udp)
TestDNS_CaseInsensitiveServiceLookup - 2019/12/06 06:04:59.000107 [INFO] agent: Started HTTP server on 127.0.0.1:34632 (tcp)
TestDNS_CaseInsensitiveServiceLookup - 2019/12/06 06:04:59.000213 [INFO] agent: started state syncer
2019/12/06 06:04:59 [WARN]  raft: Heartbeat timeout from "" reached, starting election
2019/12/06 06:04:59 [INFO]  raft: Node at 127.0.0.1:34636 [Candidate] entering Candidate state in term 2
TestDNS_ServiceLookup_TagPeriod - 2019/12/06 06:04:59.172477 [INFO] agent: Synced node info
TestEventFire_token - 2019/12/06 06:04:59.617232 [ERR] acl: failed to apply acl token upgrade batch: raft is already shutdown
2019/12/06 06:04:59 [INFO]  raft: Election won. Tally: 1
2019/12/06 06:04:59 [INFO]  raft: Node at 127.0.0.1:34636 [Leader] entering Leader state
TestDNS_CaseInsensitiveServiceLookup - 2019/12/06 06:04:59.651564 [INFO] consul: cluster leadership acquired
TestDNS_CaseInsensitiveServiceLookup - 2019/12/06 06:04:59.652111 [INFO] consul: New leader elected: Node 07e54604-c009-3ba8-7ab0-55022edf1eb6
TestDNS_ServiceLookup_TagPeriod - 2019/12/06 06:04:59.652624 [DEBUG] dns: request for name v1.master2.db.service.consul. type SRV class IN (took 645.015µs) from client 127.0.0.1:40644 (udp)
TestDNS_ServiceLookup_TagPeriod - 2019/12/06 06:04:59.654520 [DEBUG] dns: request for name v1.master.db.service.consul. type SRV class IN (took 744.684µs) from client 127.0.0.1:59121 (udp)
TestDNS_ServiceLookup_TagPeriod - 2019/12/06 06:04:59.654611 [INFO] agent: Requesting shutdown
TestDNS_ServiceLookup_TagPeriod - 2019/12/06 06:04:59.654700 [INFO] consul: shutting down server
TestDNS_ServiceLookup_TagPeriod - 2019/12/06 06:04:59.654750 [WARN] serf: Shutdown without a Leave
TestDNS_ServiceLookup_TagPeriod - 2019/12/06 06:04:59.727862 [WARN] serf: Shutdown without a Leave
TestDNS_ServiceLookup_TagPeriod - 2019/12/06 06:04:59.802948 [INFO] manager: shutting down
TestDNS_ServiceLookup_Randomize - 2019/12/06 06:04:59.817689 [DEBUG] dns: request for name web.service.consul. type ANY class IN (took 7.656844ms) from client 127.0.0.1:54526 (udp)
TestDNS_ServiceLookup_Randomize - 2019/12/06 06:04:59.827244 [DEBUG] dns: request for name web.service.consul. type ANY class IN (took 6.728489ms) from client 127.0.0.1:50992 (udp)
TestDNS_ServiceLookup_Randomize - 2019/12/06 06:04:59.839367 [DEBUG] dns: request for name web.service.consul. type ANY class IN (took 11.143925ms) from client 127.0.0.1:59093 (udp)
TestDNS_ServiceLookup_Randomize - 2019/12/06 06:04:59.858744 [DEBUG] dns: request for name web.service.consul. type ANY class IN (took 7.449173ms) from client 127.0.0.1:59058 (udp)
TestDNS_ServiceLookup_Randomize - 2019/12/06 06:04:59.875393 [DEBUG] dns: request for name web.service.consul. type ANY class IN (took 15.783366ms) from client 127.0.0.1:36629 (udp)
TestDNS_ServiceLookup_TagPeriod - 2019/12/06 06:04:59.894990 [INFO] agent: consul server down
TestDNS_ServiceLookup_TagPeriod - 2019/12/06 06:04:59.895102 [INFO] agent: shutdown complete
TestDNS_ServiceLookup_TagPeriod - 2019/12/06 06:04:59.895171 [INFO] agent: Stopping DNS server 127.0.0.1:34625 (tcp)
TestDNS_ServiceLookup_TagPeriod - 2019/12/06 06:04:59.895346 [INFO] agent: Stopping DNS server 127.0.0.1:34625 (udp)
TestDNS_ServiceLookup_TagPeriod - 2019/12/06 06:04:59.895546 [INFO] agent: Stopping HTTP server 127.0.0.1:34626 (tcp)
TestDNS_ServiceLookup_TagPeriod - 2019/12/06 06:04:59.895789 [INFO] agent: Waiting for endpoints to shut down
TestDNS_ServiceLookup_TagPeriod - 2019/12/06 06:04:59.895877 [INFO] agent: Endpoints down
2019/12/06 06:04:59 [INFO]  raft: Initial configuration (index=1): [{Suffrage:Voter ID:249f8df1-729e-7236-075f-ddeeb60f7a2a Address:127.0.0.1:34642}]
--- PASS: TestDNS_ServiceLookup_TagPeriod (3.25s)
=== CONT  TestDNS_ServiceLookup_ServiceAddress_CNAME
TestDNS_ServiceLookup_TagPeriod - 2019/12/06 06:04:59.896602 [ERR] consul: failed to establish leadership: leadership lost while committing log
2019/12/06 06:04:59 [INFO]  raft: Node at 127.0.0.1:34642 [Follower] entering Follower state (Leader: "")
TestDNS_ServiceLookup_Randomize - 2019/12/06 06:04:59.898908 [DEBUG] dns: request for name web.service.consul. type ANY class IN (took 17.270734ms) from client 127.0.0.1:54779 (udp)
TestDNS_ServiceLookup_ServiceAddressIPV6 - 2019/12/06 06:04:59.915325 [INFO] serf: EventMemberJoin: Node 249f8df1-729e-7236-075f-ddeeb60f7a2a.dc1 127.0.0.1
TestDNS_ServiceLookup_Randomize - 2019/12/06 06:04:59.940282 [DEBUG] dns: request for name web.service.consul. type ANY class IN (took 5.309789ms) from client 127.0.0.1:35017 (udp)
TestDNS_ServiceLookup_Randomize - 2019/12/06 06:04:59.941714 [DEBUG] dns: request for name web.service.consul. type ANY class IN (took 27.224297ms) from client 127.0.0.1:50548 (udp)
TestDNS_ServiceLookup_ServiceAddressIPV6 - 2019/12/06 06:04:59.943764 [INFO] serf: EventMemberJoin: Node 249f8df1-729e-7236-075f-ddeeb60f7a2a 127.0.0.1
TestDNS_ServiceLookup_ServiceAddressIPV6 - 2019/12/06 06:04:59.944946 [INFO] consul: Adding LAN server Node 249f8df1-729e-7236-075f-ddeeb60f7a2a (Addr: tcp/127.0.0.1:34642) (DC: dc1)
TestDNS_ServiceLookup_ServiceAddressIPV6 - 2019/12/06 06:04:59.945283 [INFO] consul: Handled member-join event for server "Node 249f8df1-729e-7236-075f-ddeeb60f7a2a.dc1" in area "wan"
TestDNS_ServiceLookup_ServiceAddressIPV6 - 2019/12/06 06:04:59.946234 [INFO] agent: Started DNS server 127.0.0.1:34637 (udp)
TestDNS_ServiceLookup_ServiceAddressIPV6 - 2019/12/06 06:04:59.951691 [INFO] agent: Started DNS server 127.0.0.1:34637 (tcp)
TestDNS_ServiceLookup_ServiceAddressIPV6 - 2019/12/06 06:04:59.954316 [INFO] agent: Started HTTP server on 127.0.0.1:34638 (tcp)
TestDNS_ServiceLookup_ServiceAddressIPV6 - 2019/12/06 06:04:59.962421 [INFO] agent: started state syncer
TestDNS_ServiceLookup_Randomize - 2019/12/06 06:04:59.961371 [DEBUG] dns: request for name web.service.consul. type ANY class IN (took 18.726433ms) from client 127.0.0.1:51070 (udp)
TestDNS_ServiceLookup_Randomize - 2019/12/06 06:04:59.963795 [DEBUG] dns: request for name web.service.consul. type ANY class IN (took 7.45584ms) from client 127.0.0.1:42497 (udp)
2019/12/06 06:04:59 [WARN]  raft: Heartbeat timeout from "" reached, starting election
2019/12/06 06:04:59 [INFO]  raft: Node at 127.0.0.1:34642 [Candidate] entering Candidate state in term 2
TestDNS_CaseInsensitiveServiceLookup - 2019/12/06 06:04:59.974584 [INFO] agent: Synced node info
TestDNS_CaseInsensitiveServiceLookup - 2019/12/06 06:04:59.974932 [DEBUG] agent: Node info in sync
TestDNS_ServiceLookup_Randomize - 2019/12/06 06:04:59.988778 [DEBUG] dns: request for name 5d53e562-6c42-ef67-a3d5-ac5187eecb8f.query.consul. type ANY class IN (took 9.572555ms) from client 127.0.0.1:59184 (udp)
TestDNS_ServiceLookup_Randomize - 2019/12/06 06:05:00.023010 [DEBUG] dns: request for name 5d53e562-6c42-ef67-a3d5-ac5187eecb8f.query.consul. type ANY class IN (took 32.985431ms) from client 127.0.0.1:60721 (udp)
TestDNS_ServiceLookup_Randomize - 2019/12/06 06:05:00.040824 [DEBUG] dns: request for name 5d53e562-6c42-ef67-a3d5-ac5187eecb8f.query.consul. type ANY class IN (took 13.753652ms) from client 127.0.0.1:35510 (udp)
TestDNS_ServiceLookup_Randomize - 2019/12/06 06:05:00.046944 [DEBUG] dns: request for name 5d53e562-6c42-ef67-a3d5-ac5187eecb8f.query.consul. type ANY class IN (took 5.332456ms) from client 127.0.0.1:58789 (udp)
TestDNS_ServiceLookup_Randomize - 2019/12/06 06:05:00.053567 [DEBUG] dns: request for name 5d53e562-6c42-ef67-a3d5-ac5187eecb8f.query.consul. type ANY class IN (took 5.790801ms) from client 127.0.0.1:32802 (udp)
WARNING: bootstrap = true: do not enable unless necessary
TestDNS_ServiceLookup_ServiceAddress_CNAME - 2019/12/06 06:05:00.064309 [WARN] agent: Node name "Node 1051d9e9-1694-3468-c163-d890e68dd8a9" will not be discoverable via DNS due to invalid characters. Valid characters include all alpha-numerics and dashes.
TestDNS_ServiceLookup_ServiceAddress_CNAME - 2019/12/06 06:05:00.064732 [DEBUG] tlsutil: Update with version 1
TestDNS_ServiceLookup_Randomize - 2019/12/06 06:05:00.065788 [DEBUG] dns: request for name 5d53e562-6c42-ef67-a3d5-ac5187eecb8f.query.consul. type ANY class IN (took 7.650511ms) from client 127.0.0.1:33842 (udp)
TestDNS_ServiceLookup_ServiceAddress_CNAME - 2019/12/06 06:05:00.070312 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
TestDNS_ServiceLookup_Randomize - 2019/12/06 06:05:00.078921 [DEBUG] dns: request for name 5d53e562-6c42-ef67-a3d5-ac5187eecb8f.query.consul. type ANY class IN (took 7.75418ms) from client 127.0.0.1:43508 (udp)
TestDNS_ServiceLookup_Randomize - 2019/12/06 06:05:00.095839 [DEBUG] dns: request for name 5d53e562-6c42-ef67-a3d5-ac5187eecb8f.query.consul. type ANY class IN (took 10.444908ms) from client 127.0.0.1:50646 (udp)
TestDNS_ServiceLookup_Randomize - 2019/12/06 06:05:00.096455 [DEBUG] dns: request for name 5d53e562-6c42-ef67-a3d5-ac5187eecb8f.query.consul. type ANY class IN (took 5.486127ms) from client 127.0.0.1:52859 (udp)
TestDNS_ServiceLookup_Randomize - 2019/12/06 06:05:00.110228 [DEBUG] dns: request for name 5d53e562-6c42-ef67-a3d5-ac5187eecb8f.query.consul. type ANY class IN (took 5.699132ms) from client 127.0.0.1:34025 (udp)
TestDNS_ServiceLookup_Randomize - 2019/12/06 06:05:00.110352 [INFO] agent: Requesting shutdown
TestDNS_ServiceLookup_Randomize - 2019/12/06 06:05:00.110454 [INFO] consul: shutting down server
TestDNS_ServiceLookup_Randomize - 2019/12/06 06:05:00.110510 [WARN] serf: Shutdown without a Leave
TestDNS_ServiceLookup_Randomize - 2019/12/06 06:05:00.261123 [WARN] serf: Shutdown without a Leave
TestDNS_ServiceLookup_Randomize - 2019/12/06 06:05:00.333827 [INFO] manager: shutting down
TestDNS_ServiceLookup_Randomize - 2019/12/06 06:05:00.334574 [INFO] agent: consul server down
TestDNS_ServiceLookup_Randomize - 2019/12/06 06:05:00.334630 [INFO] agent: shutdown complete
TestDNS_ServiceLookup_Randomize - 2019/12/06 06:05:00.334688 [INFO] agent: Stopping DNS server 127.0.0.1:34559 (tcp)
TestDNS_ServiceLookup_Randomize - 2019/12/06 06:05:00.334832 [INFO] agent: Stopping DNS server 127.0.0.1:34559 (udp)
TestDNS_ServiceLookup_Randomize - 2019/12/06 06:05:00.335010 [INFO] agent: Stopping HTTP server 127.0.0.1:34560 (tcp)
TestDNS_ServiceLookup_Randomize - 2019/12/06 06:05:00.335203 [INFO] agent: Waiting for endpoints to shut down
TestDNS_ServiceLookup_Randomize - 2019/12/06 06:05:00.335272 [INFO] agent: Endpoints down
--- PASS: TestDNS_ServiceLookup_Randomize (30.92s)
=== CONT  TestDNS_ServiceLookup_ServiceAddress_A
WARNING: bootstrap = true: do not enable unless necessary
TestDNS_ServiceLookup_ServiceAddress_A - 2019/12/06 06:05:00.474593 [WARN] agent: Node name "Node e7983b72-77d0-0138-14e7-effbe70be7a0" will not be discoverable via DNS due to invalid characters. Valid characters include all alpha-numerics and dashes.
TestDNS_ServiceLookup_ServiceAddress_A - 2019/12/06 06:05:00.475314 [DEBUG] tlsutil: Update with version 1
TestDNS_ServiceLookup_ServiceAddress_A - 2019/12/06 06:05:00.477480 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
TestEventFire_token - 2019/12/06 06:05:00.617243 [ERR] acl: failed to apply acl token upgrade batch: raft is already shutdown
2019/12/06 06:05:00 [INFO]  raft: Election won. Tally: 1
2019/12/06 06:05:00 [INFO]  raft: Node at 127.0.0.1:34642 [Leader] entering Leader state
TestDNS_ServiceLookup_ServiceAddressIPV6 - 2019/12/06 06:05:00.666927 [INFO] consul: cluster leadership acquired
TestDNS_ServiceLookup_ServiceAddressIPV6 - 2019/12/06 06:05:00.667432 [INFO] consul: New leader elected: Node 249f8df1-729e-7236-075f-ddeeb60f7a2a
TestDNS_CaseInsensitiveServiceLookup - 2019/12/06 06:05:00.959029 [DEBUG] dns: request for name master.db.service.consul. type SRV class IN (took 751.351µs) from client 127.0.0.1:43490 (udp)
TestDNS_CaseInsensitiveServiceLookup - 2019/12/06 06:05:00.966817 [DEBUG] dns: request for name mASTER.dB.service.consul. type SRV class IN (took 804.352µs) from client 127.0.0.1:40594 (udp)
TestDNS_CaseInsensitiveServiceLookup - 2019/12/06 06:05:00.969298 [DEBUG] dns: request for name MASTER.dB.service.consul. type SRV class IN (took 682.016µs) from client 127.0.0.1:48641 (udp)
TestDNS_CaseInsensitiveServiceLookup - 2019/12/06 06:05:00.971151 [DEBUG] dns: request for name db.service.consul. type SRV class IN (took 885.354µs) from client 127.0.0.1:45710 (udp)
TestDNS_CaseInsensitiveServiceLookup - 2019/12/06 06:05:00.975986 [DEBUG] dns: request for name DB.service.consul. type SRV class IN (took 2.243718ms) from client 127.0.0.1:33172 (udp)
TestDNS_CaseInsensitiveServiceLookup - 2019/12/06 06:05:00.977757 [DEBUG] dns: request for name Db.service.consul. type SRV class IN (took 889.687µs) from client 127.0.0.1:37772 (udp)
TestDNS_CaseInsensitiveServiceLookup - 2019/12/06 06:05:00.981999 [DEBUG] dns: request for name somequery.query.consul. type SRV class IN (took 1.267362ms) from client 127.0.0.1:39576 (udp)
TestDNS_CaseInsensitiveServiceLookup - 2019/12/06 06:05:00.984156 [DEBUG] dns: request for name SomeQuery.query.consul. type SRV class IN (took 1.036357ms) from client 127.0.0.1:38140 (udp)
TestDNS_CaseInsensitiveServiceLookup - 2019/12/06 06:05:00.986004 [DEBUG] dns: request for name SOMEQUERY.query.consul. type SRV class IN (took 820.019µs) from client 127.0.0.1:52274 (udp)
TestDNS_CaseInsensitiveServiceLookup - 2019/12/06 06:05:00.986331 [INFO] agent: Requesting shutdown
TestDNS_CaseInsensitiveServiceLookup - 2019/12/06 06:05:00.986576 [INFO] consul: shutting down server
TestDNS_CaseInsensitiveServiceLookup - 2019/12/06 06:05:00.986813 [WARN] serf: Shutdown without a Leave
TestDNS_CaseInsensitiveServiceLookup - 2019/12/06 06:05:01.187419 [WARN] serf: Shutdown without a Leave
TestDNS_CaseInsensitiveServiceLookup - 2019/12/06 06:05:01.321208 [INFO] manager: shutting down
TestDNS_CaseInsensitiveServiceLookup - 2019/12/06 06:05:01.324738 [INFO] agent: consul server down
TestDNS_CaseInsensitiveServiceLookup - 2019/12/06 06:05:01.324823 [INFO] agent: shutdown complete
TestDNS_CaseInsensitiveServiceLookup - 2019/12/06 06:05:01.324886 [INFO] agent: Stopping DNS server 127.0.0.1:34631 (tcp)
TestDNS_CaseInsensitiveServiceLookup - 2019/12/06 06:05:01.324902 [ERR] consul: failed to establish leadership: error configuring provider: leadership lost while committing log
TestDNS_CaseInsensitiveServiceLookup - 2019/12/06 06:05:01.325062 [INFO] agent: Stopping DNS server 127.0.0.1:34631 (udp)
TestDNS_CaseInsensitiveServiceLookup - 2019/12/06 06:05:01.325251 [INFO] agent: Stopping HTTP server 127.0.0.1:34632 (tcp)
TestDNS_CaseInsensitiveServiceLookup - 2019/12/06 06:05:01.325478 [INFO] agent: Waiting for endpoints to shut down
TestDNS_CaseInsensitiveServiceLookup - 2019/12/06 06:05:01.325543 [INFO] agent: Endpoints down
--- PASS: TestDNS_CaseInsensitiveServiceLookup (3.39s)
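The case-insensitivity test above issues the same SRV question with varying case (db, DB, Db, mASTER.dB, ...) and expects identical answers. A minimal sketch of that check with github.com/miekg/dns; 127.0.0.1:34631 is the per-run DNS port from the log.

package main

import (
	"fmt"
	"log"

	"github.com/miekg/dns"
)

func main() {
	variants := []string{
		"master.db.service.consul.",
		"mASTER.dB.service.consul.",
		"MASTER.dB.service.consul.",
	}
	c := new(dns.Client)
	for _, name := range variants {
		m := new(dns.Msg)
		m.SetQuestion(name, dns.TypeSRV)
		in, _, err := c.Exchange(m, "127.0.0.1:34631")
		if err != nil {
			log.Fatal(err)
		}
		// Each spelling should yield the same record set.
		fmt.Printf("%-35s -> %d answers\n", name, len(in.Answer))
	}
}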
=== CONT  TestDNS_ExternalServiceToConsulCNAMENestedLookup
TestDNS_ServiceLookup_ServiceAddressIPV6 - 2019/12/06 06:05:01.330790 [INFO] agent: Synced node info
TestDNS_ServiceLookup_ServiceAddressIPV6 - 2019/12/06 06:05:01.330938 [DEBUG] agent: Node info in sync
WARNING: bootstrap = true: do not enable unless necessary
TestDNS_ExternalServiceToConsulCNAMENestedLookup - 2019/12/06 06:05:01.509683 [DEBUG] tlsutil: Update with version 1
TestDNS_ExternalServiceToConsulCNAMENestedLookup - 2019/12/06 06:05:01.516541 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
TestEventFire_token - 2019/12/06 06:05:01.617337 [ERR] acl: failed to apply acl token upgrade batch: raft is already shutdown
2019/12/06 06:05:02 [INFO]  raft: Initial configuration (index=1): [{Suffrage:Voter ID:1051d9e9-1694-3468-c163-d890e68dd8a9 Address:127.0.0.1:34648}]
2019/12/06 06:05:02 [INFO]  raft: Node at 127.0.0.1:34648 [Follower] entering Follower state (Leader: "")
TestDNS_ServiceLookup_ServiceAddress_CNAME - 2019/12/06 06:05:02.282480 [INFO] serf: EventMemberJoin: Node 1051d9e9-1694-3468-c163-d890e68dd8a9.dc1 127.0.0.1
TestDNS_ServiceLookup_ServiceAddress_CNAME - 2019/12/06 06:05:02.286262 [INFO] serf: EventMemberJoin: Node 1051d9e9-1694-3468-c163-d890e68dd8a9 127.0.0.1
TestDNS_ServiceLookup_ServiceAddress_CNAME - 2019/12/06 06:05:02.286862 [INFO] consul: Handled member-join event for server "Node 1051d9e9-1694-3468-c163-d890e68dd8a9.dc1" in area "wan"
TestDNS_ServiceLookup_ServiceAddress_CNAME - 2019/12/06 06:05:02.287060 [INFO] consul: Adding LAN server Node 1051d9e9-1694-3468-c163-d890e68dd8a9 (Addr: tcp/127.0.0.1:34648) (DC: dc1)
TestDNS_ServiceLookup_ServiceAddress_CNAME - 2019/12/06 06:05:02.287517 [INFO] agent: Started DNS server 127.0.0.1:34643 (tcp)
TestDNS_ServiceLookup_ServiceAddress_CNAME - 2019/12/06 06:05:02.289694 [INFO] agent: Started DNS server 127.0.0.1:34643 (udp)
TestDNS_ServiceLookup_ServiceAddress_CNAME - 2019/12/06 06:05:02.292728 [INFO] agent: Started HTTP server on 127.0.0.1:34644 (tcp)
TestDNS_ServiceLookup_ServiceAddress_CNAME - 2019/12/06 06:05:02.292835 [INFO] agent: started state syncer
2019/12/06 06:05:02 [WARN]  raft: Heartbeat timeout from "" reached, starting election
2019/12/06 06:05:02 [INFO]  raft: Node at 127.0.0.1:34648 [Candidate] entering Candidate state in term 2
TestDNS_ServiceLookup_ServiceAddressIPV6 - 2019/12/06 06:05:02.381957 [DEBUG] agent: Node info in sync
TestEventFire_token - 2019/12/06 06:05:02.617279 [ERR] acl: failed to apply acl token upgrade batch: raft is already shutdown
2019/12/06 06:05:02 [INFO]  raft: Initial configuration (index=1): [{Suffrage:Voter ID:e7983b72-77d0-0138-14e7-effbe70be7a0 Address:127.0.0.1:34654}]
2019/12/06 06:05:02 [INFO]  raft: Node at 127.0.0.1:34654 [Follower] entering Follower state (Leader: "")
TestDNS_ServiceLookup_ServiceAddress_A - 2019/12/06 06:05:02.737850 [INFO] serf: EventMemberJoin: Node e7983b72-77d0-0138-14e7-effbe70be7a0.dc1 127.0.0.1
TestDNS_ServiceLookup_ServiceAddress_A - 2019/12/06 06:05:02.749005 [INFO] serf: EventMemberJoin: Node e7983b72-77d0-0138-14e7-effbe70be7a0 127.0.0.1
TestDNS_ServiceLookup_ServiceAddress_A - 2019/12/06 06:05:02.750216 [INFO] consul: Adding LAN server Node e7983b72-77d0-0138-14e7-effbe70be7a0 (Addr: tcp/127.0.0.1:34654) (DC: dc1)
TestDNS_ServiceLookup_ServiceAddress_A - 2019/12/06 06:05:02.750500 [INFO] consul: Handled member-join event for server "Node e7983b72-77d0-0138-14e7-effbe70be7a0.dc1" in area "wan"
TestDNS_ServiceLookup_ServiceAddress_A - 2019/12/06 06:05:02.752668 [INFO] agent: Started DNS server 127.0.0.1:34649 (tcp)
TestDNS_ServiceLookup_ServiceAddress_A - 2019/12/06 06:05:02.752820 [INFO] agent: Started DNS server 127.0.0.1:34649 (udp)
TestDNS_ServiceLookup_ServiceAddress_A - 2019/12/06 06:05:02.758169 [INFO] agent: Started HTTP server on 127.0.0.1:34650 (tcp)
TestDNS_ServiceLookup_ServiceAddress_A - 2019/12/06 06:05:02.758322 [INFO] agent: started state syncer
2019/12/06 06:05:02 [WARN]  raft: Heartbeat timeout from "" reached, starting election
2019/12/06 06:05:02 [INFO]  raft: Node at 127.0.0.1:34654 [Candidate] entering Candidate state in term 2
2019/12/06 06:05:02 [INFO]  raft: Election won. Tally: 1
2019/12/06 06:05:02 [INFO]  raft: Node at 127.0.0.1:34648 [Leader] entering Leader state
TestDNS_ServiceLookup_ServiceAddress_CNAME - 2019/12/06 06:05:02.907262 [INFO] consul: cluster leadership acquired
TestDNS_ServiceLookup_ServiceAddress_CNAME - 2019/12/06 06:05:02.907903 [INFO] consul: New leader elected: Node 1051d9e9-1694-3468-c163-d890e68dd8a9
TestDNS_ServiceLookup_ServiceAddressIPV6 - 2019/12/06 06:05:03.028113 [DEBUG] dns: request for name db.service.consul. type SRV class IN (took 1.075692ms) from client 127.0.0.1:36824 (udp)
TestDNS_ServiceLookup_ServiceAddressIPV6 - 2019/12/06 06:05:03.030774 [INFO] agent: Requesting shutdown
TestDNS_ServiceLookup_ServiceAddressIPV6 - 2019/12/06 06:05:03.030856 [DEBUG] dns: request for name 16996e23-305c-4231-9d81-bbf8c1284d0c.query.consul. type SRV class IN (took 1.497701ms) from client 127.0.0.1:36996 (udp)
TestDNS_ServiceLookup_ServiceAddressIPV6 - 2019/12/06 06:05:03.030921 [INFO] consul: shutting down server
TestDNS_ServiceLookup_ServiceAddressIPV6 - 2019/12/06 06:05:03.030995 [WARN] serf: Shutdown without a Leave
TestDNS_ServiceLookup_ServiceAddressIPV6 - 2019/12/06 06:05:03.211412 [WARN] serf: Shutdown without a Leave
TestDNS_ServiceLookup_ServiceAddressIPV6 - 2019/12/06 06:05:03.394716 [INFO] manager: shutting down
TestDNS_ServiceLookup_ServiceAddressIPV6 - 2019/12/06 06:05:03.395618 [INFO] agent: consul server down
TestDNS_ServiceLookup_ServiceAddressIPV6 - 2019/12/06 06:05:03.395686 [INFO] agent: shutdown complete
TestDNS_ServiceLookup_ServiceAddressIPV6 - 2019/12/06 06:05:03.395749 [INFO] agent: Stopping DNS server 127.0.0.1:34637 (tcp)
TestDNS_ServiceLookup_ServiceAddressIPV6 - 2019/12/06 06:05:03.395942 [INFO] agent: Stopping DNS server 127.0.0.1:34637 (udp)
TestDNS_ServiceLookup_ServiceAddressIPV6 - 2019/12/06 06:05:03.396140 [INFO] agent: Stopping HTTP server 127.0.0.1:34638 (tcp)
TestDNS_ServiceLookup_ServiceAddressIPV6 - 2019/12/06 06:05:03.396246 [ERR] consul: failed to establish leadership: error configuring provider: raft is already shutdown
TestDNS_ServiceLookup_ServiceAddressIPV6 - 2019/12/06 06:05:03.396372 [INFO] agent: Waiting for endpoints to shut down
TestDNS_ServiceLookup_ServiceAddressIPV6 - 2019/12/06 06:05:03.396446 [INFO] agent: Endpoints down
--- PASS: TestDNS_ServiceLookup_ServiceAddressIPV6 (4.77s)
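TestDNS_ServiceLookup_ServiceAddressIPV6 above registers a service whose service address is an IPv6 literal and checks the DNS answers for it. A minimal sketch of the equivalent registration through the Go API client; the service name, port and address are illustrative, not taken from the log.

package main

import (
	"log"

	"github.com/hashicorp/consul/api"
)

func main() {
	client, err := api.NewClient(api.DefaultConfig())
	if err != nil {
		log.Fatal(err)
	}

	// Register a service with an explicit IPv6 service address; DNS lookups
	// for db.service.consul should then return AAAA records for it.
	reg := &api.AgentServiceRegistration{
		Name:    "db",
		Port:    12345,
		Address: "2607:20:4005:808::200e",
	}
	if err := client.Agent().ServiceRegister(reg); err != nil {
		log.Fatal(err)
	}
	log.Println("registered db with an IPv6 service address")
}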
=== CONT  TestDNS_NSRecords_IPV6
WARNING: bootstrap = true: do not enable unless necessary
TestDNS_NSRecords_IPV6 - 2019/12/06 06:05:03.536033 [DEBUG] tlsutil: Update with version 1
TestDNS_NSRecords_IPV6 - 2019/12/06 06:05:03.538710 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
TestDNS_ServiceLookup_ServiceAddress_CNAME - 2019/12/06 06:05:03.545468 [INFO] agent: Synced node info
TestDNS_ServiceLookup_ServiceAddress_CNAME - 2019/12/06 06:05:03.545727 [DEBUG] agent: Node info in sync
2019/12/06 06:05:03 [INFO]  raft: Initial configuration (index=1): [{Suffrage:Voter ID:1e0f64ea-3c5a-11b5-50b5-c59715ed3f21 Address:127.0.0.1:34660}]
TestDNS_ExternalServiceToConsulCNAMENestedLookup - 2019/12/06 06:05:03.553256 [INFO] serf: EventMemberJoin: test-node.dc1 127.0.0.1
2019/12/06 06:05:03 [INFO]  raft: Node at 127.0.0.1:34660 [Follower] entering Follower state (Leader: "")
TestDNS_ExternalServiceToConsulCNAMENestedLookup - 2019/12/06 06:05:03.558522 [INFO] serf: EventMemberJoin: test-node 127.0.0.1
TestDNS_ExternalServiceToConsulCNAMENestedLookup - 2019/12/06 06:05:03.559401 [INFO] consul: Handled member-join event for server "test-node.dc1" in area "wan"
TestDNS_ExternalServiceToConsulCNAMENestedLookup - 2019/12/06 06:05:03.559693 [INFO] consul: Adding LAN server test-node (Addr: tcp/127.0.0.1:34660) (DC: dc1)
TestDNS_ExternalServiceToConsulCNAMENestedLookup - 2019/12/06 06:05:03.559964 [INFO] agent: Started DNS server 127.0.0.1:34655 (udp)
TestDNS_ExternalServiceToConsulCNAMENestedLookup - 2019/12/06 06:05:03.563465 [INFO] agent: Started DNS server 127.0.0.1:34655 (tcp)
TestDNS_ExternalServiceToConsulCNAMENestedLookup - 2019/12/06 06:05:03.566369 [INFO] agent: Started HTTP server on 127.0.0.1:34656 (tcp)
TestDNS_ExternalServiceToConsulCNAMENestedLookup - 2019/12/06 06:05:03.566497 [INFO] agent: started state syncer
2019/12/06 06:05:03 [WARN]  raft: Heartbeat timeout from "" reached, starting election
2019/12/06 06:05:03 [INFO]  raft: Node at 127.0.0.1:34660 [Candidate] entering Candidate state in term 2
TestEventFire_token - 2019/12/06 06:05:03.617232 [ERR] acl: failed to apply acl token upgrade batch: raft is already shutdown
2019/12/06 06:05:03 [INFO]  raft: Election won. Tally: 1
2019/12/06 06:05:03 [INFO]  raft: Node at 127.0.0.1:34654 [Leader] entering Leader state
TestDNS_ServiceLookup_ServiceAddress_A - 2019/12/06 06:05:03.786880 [INFO] consul: cluster leadership acquired
TestDNS_ServiceLookup_ServiceAddress_A - 2019/12/06 06:05:03.787368 [INFO] consul: New leader elected: Node e7983b72-77d0-0138-14e7-effbe70be7a0
TestDNS_ServiceLookup_ServiceAddress_CNAME - 2019/12/06 06:05:03.929787 [DEBUG] agent: Node info in sync
TestDNS_ServiceLookup_ServiceAddress_A - 2019/12/06 06:05:04.379022 [INFO] agent: Synced node info
TestDNS_ServiceLookup_ServiceAddress_A - 2019/12/06 06:05:04.382938 [DEBUG] agent: Node info in sync
2019/12/06 06:05:04 [INFO]  raft: Election won. Tally: 1
2019/12/06 06:05:04 [INFO]  raft: Node at 127.0.0.1:34660 [Leader] entering Leader state
TestDNS_ExternalServiceToConsulCNAMENestedLookup - 2019/12/06 06:05:04.606932 [INFO] consul: cluster leadership acquired
TestDNS_ExternalServiceToConsulCNAMENestedLookup - 2019/12/06 06:05:04.607310 [INFO] consul: New leader elected: test-node
TestEventFire_token - 2019/12/06 06:05:04.617234 [ERR] acl: failed to apply acl token upgrade batch: raft is already shutdown
TestDNS_ServiceLookup_ServiceAddress_CNAME - 2019/12/06 06:05:04.889553 [DEBUG] dns: request for name db.service.consul. type SRV class IN (took 1.106692ms) from client 127.0.0.1:44181 (udp)
TestDNS_ServiceLookup_ServiceAddress_CNAME - 2019/12/06 06:05:04.892121 [DEBUG] dns: request for name 7c279c86-6acf-9079-5cb8-5e15118c13f1.query.consul. type SRV class IN (took 1.119026ms) from client 127.0.0.1:55801 (udp)
TestDNS_ServiceLookup_ServiceAddress_CNAME - 2019/12/06 06:05:04.892688 [INFO] agent: Requesting shutdown
TestDNS_ServiceLookup_ServiceAddress_CNAME - 2019/12/06 06:05:04.892763 [INFO] consul: shutting down server
TestDNS_ServiceLookup_ServiceAddress_CNAME - 2019/12/06 06:05:04.892810 [WARN] serf: Shutdown without a Leave
2019/12/06 06:05:05 [INFO]  raft: Initial configuration (index=1): [{Suffrage:Voter ID:02fc3eff-b756-f2df-475f-4b889b137bc5 Address:[::1]:34666}]
2019/12/06 06:05:05 [INFO]  raft: Node at [::1]:34666 [Follower] entering Follower state (Leader: "")
TestDNS_NSRecords_IPV6 - 2019/12/06 06:05:05.015073 [INFO] serf: EventMemberJoin: server1.dc1 ::1
TestDNS_NSRecords_IPV6 - 2019/12/06 06:05:05.018360 [INFO] serf: EventMemberJoin: server1 ::1
TestDNS_NSRecords_IPV6 - 2019/12/06 06:05:05.019659 [INFO] consul: Adding LAN server server1 (Addr: tcp/[::1]:34666) (DC: dc1)
TestDNS_NSRecords_IPV6 - 2019/12/06 06:05:05.019767 [INFO] agent: Started DNS server 127.0.0.1:34661 (udp)
TestDNS_NSRecords_IPV6 - 2019/12/06 06:05:05.019696 [INFO] consul: Handled member-join event for server "server1.dc1" in area "wan"
TestDNS_NSRecords_IPV6 - 2019/12/06 06:05:05.020056 [INFO] agent: Started DNS server 127.0.0.1:34661 (tcp)
TestDNS_NSRecords_IPV6 - 2019/12/06 06:05:05.022374 [INFO] agent: Started HTTP server on 127.0.0.1:34662 (tcp)
TestDNS_NSRecords_IPV6 - 2019/12/06 06:05:05.022473 [INFO] agent: started state syncer
2019/12/06 06:05:05 [WARN]  raft: Heartbeat timeout from "" reached, starting election
2019/12/06 06:05:05 [INFO]  raft: Node at [::1]:34666 [Candidate] entering Candidate state in term 2
TestDNS_ServiceLookup_ServiceAddress_CNAME - 2019/12/06 06:05:05.088068 [WARN] serf: Shutdown without a Leave
TestDNS_ServiceLookup_ServiceAddress_CNAME - 2019/12/06 06:05:05.204280 [INFO] manager: shutting down
TestDNS_ServiceLookup_ServiceAddress_CNAME - 2019/12/06 06:05:05.206238 [INFO] agent: consul server down
TestDNS_ServiceLookup_ServiceAddress_CNAME - 2019/12/06 06:05:05.206285 [INFO] agent: shutdown complete
TestDNS_ServiceLookup_ServiceAddress_CNAME - 2019/12/06 06:05:05.206334 [INFO] agent: Stopping DNS server 127.0.0.1:34643 (tcp)
TestDNS_ServiceLookup_ServiceAddress_CNAME - 2019/12/06 06:05:05.206450 [INFO] agent: Stopping DNS server 127.0.0.1:34643 (udp)
TestDNS_ServiceLookup_ServiceAddress_CNAME - 2019/12/06 06:05:05.206580 [INFO] agent: Stopping HTTP server 127.0.0.1:34644 (tcp)
TestDNS_ServiceLookup_ServiceAddress_CNAME - 2019/12/06 06:05:05.206748 [INFO] agent: Waiting for endpoints to shut down
TestDNS_ServiceLookup_ServiceAddress_CNAME - 2019/12/06 06:05:05.206803 [INFO] agent: Endpoints down
--- PASS: TestDNS_ServiceLookup_ServiceAddress_CNAME (5.31s)
=== CONT  TestDNS_ExternalServiceToConsulCNAMELookup
TestDNS_ServiceLookup_ServiceAddress_CNAME - 2019/12/06 06:05:05.211298 [ERR] consul: failed to establish leadership: error configuring provider: leadership lost while committing log
WARNING: bootstrap = true: do not enable unless necessary
TestDNS_ExternalServiceToConsulCNAMELookup - 2019/12/06 06:05:05.285396 [WARN] agent: Node name "test node" will not be discoverable via DNS due to invalid characters. Valid characters include all alpha-numerics and dashes.
TestDNS_ExternalServiceToConsulCNAMELookup - 2019/12/06 06:05:05.286167 [DEBUG] tlsutil: Update with version 1
TestDNS_ExternalServiceToConsulCNAMELookup - 2019/12/06 06:05:05.288647 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
TestDNS_ServiceLookup_ServiceAddress_A - 2019/12/06 06:05:05.469030 [DEBUG] dns: request for name db.service.consul. type SRV class IN (took 888.021µs) from client 127.0.0.1:54692 (udp)
TestDNS_ServiceLookup_ServiceAddress_A - 2019/12/06 06:05:05.471103 [DEBUG] dns: request for name d392caf3-7b54-5c29-8b2c-90f46dc1979a.query.consul. type SRV class IN (took 1.092026ms) from client 127.0.0.1:59631 (udp)
TestDNS_ServiceLookup_ServiceAddress_A - 2019/12/06 06:05:05.471355 [INFO] agent: Requesting shutdown
TestDNS_ServiceLookup_ServiceAddress_A - 2019/12/06 06:05:05.471468 [INFO] consul: shutting down server
TestDNS_ServiceLookup_ServiceAddress_A - 2019/12/06 06:05:05.471540 [WARN] serf: Shutdown without a Leave
TestEventFire_token - 2019/12/06 06:05:05.617248 [ERR] acl: failed to apply acl token upgrade batch: raft is already shutdown
TestDNS_ServiceLookup_ServiceAddress_A - 2019/12/06 06:05:05.655041 [WARN] serf: Shutdown without a Leave
2019/12/06 06:05:05 [INFO]  raft: Election won. Tally: 1
2019/12/06 06:05:05 [INFO]  raft: Node at [::1]:34666 [Leader] entering Leader state
TestDNS_ServiceLookup_ServiceAddress_A - 2019/12/06 06:05:05.854524 [INFO] manager: shutting down
TestDNS_ExternalServiceToConsulCNAMENestedLookup - 2019/12/06 06:05:05.856184 [INFO] agent: Synced node info
TestDNS_ExternalServiceToConsulCNAMENestedLookup - 2019/12/06 06:05:05.856292 [DEBUG] agent: Node info in sync
TestDNS_NSRecords_IPV6 - 2019/12/06 06:05:05.856393 [INFO] consul: cluster leadership acquired
TestDNS_NSRecords_IPV6 - 2019/12/06 06:05:05.856803 [INFO] consul: New leader elected: server1
TestDNS_ExternalServiceToConsulCNAMENestedLookup - 2019/12/06 06:05:05.985735 [DEBUG] agent: Node info in sync
TestDNS_ServiceLookup_ServiceAddress_A - 2019/12/06 06:05:06.081206 [INFO] agent: consul server down
TestDNS_ServiceLookup_ServiceAddress_A - 2019/12/06 06:05:06.081285 [INFO] agent: shutdown complete
TestDNS_ServiceLookup_ServiceAddress_A - 2019/12/06 06:05:06.081348 [INFO] agent: Stopping DNS server 127.0.0.1:34649 (tcp)
TestDNS_ServiceLookup_ServiceAddress_A - 2019/12/06 06:05:06.081499 [INFO] agent: Stopping DNS server 127.0.0.1:34649 (udp)
TestDNS_ServiceLookup_ServiceAddress_A - 2019/12/06 06:05:06.081675 [INFO] agent: Stopping HTTP server 127.0.0.1:34650 (tcp)
TestDNS_ServiceLookup_ServiceAddress_A - 2019/12/06 06:05:06.081937 [INFO] agent: Waiting for endpoints to shut down
TestDNS_ServiceLookup_ServiceAddress_A - 2019/12/06 06:05:06.082016 [INFO] agent: Endpoints down
--- PASS: TestDNS_ServiceLookup_ServiceAddress_A (5.74s)
=== CONT  TestDNS_InifiniteRecursion
TestDNS_ServiceLookup_ServiceAddress_A - 2019/12/06 06:05:06.084446 [ERR] consul: failed to establish leadership: error configuring provider: leadership lost while committing log
TestDNS_ServiceLookup_ServiceAddress_A - 2019/12/06 06:05:06.084732 [ERR] consul: failed to transfer leadership attempt 0/3: raft is already shutdown
WARNING: bootstrap = true: do not enable unless necessary
TestDNS_InifiniteRecursion - 2019/12/06 06:05:06.177219 [WARN] agent: Node name "test node" will not be discoverable via DNS due to invalid characters. Valid characters include all alpha-numerics and dashes.
TestDNS_InifiniteRecursion - 2019/12/06 06:05:06.177899 [DEBUG] tlsutil: Update with version 1
TestDNS_InifiniteRecursion - 2019/12/06 06:05:06.180649 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
TestDNS_NSRecords_IPV6 - 2019/12/06 06:05:06.429040 [INFO] agent: Synced node info
TestDNS_NSRecords_IPV6 - 2019/12/06 06:05:06.429163 [DEBUG] agent: Node info in sync
TestEventFire_token - 2019/12/06 06:05:06.617189 [ERR] acl: failed to apply acl token upgrade batch: raft is already shutdown
2019/12/06 06:05:07 [INFO]  raft: Initial configuration (index=1): [{Suffrage:Voter ID:9f5d540b-0206-cfa6-7a04-7204dffdb9e4 Address:127.0.0.1:34672}]
TestDNS_ExternalServiceToConsulCNAMELookup - 2019/12/06 06:05:07.123460 [INFO] serf: EventMemberJoin: test node.dc1 127.0.0.1
2019/12/06 06:05:07 [INFO]  raft: Node at 127.0.0.1:34672 [Follower] entering Follower state (Leader: "")
TestDNS_ExternalServiceToConsulCNAMELookup - 2019/12/06 06:05:07.129505 [INFO] serf: EventMemberJoin: test node 127.0.0.1
TestDNS_ExternalServiceToConsulCNAMELookup - 2019/12/06 06:05:07.130278 [INFO] consul: Handled member-join event for server "test node.dc1" in area "wan"
TestDNS_ExternalServiceToConsulCNAMELookup - 2019/12/06 06:05:07.130556 [INFO] consul: Adding LAN server test node (Addr: tcp/127.0.0.1:34672) (DC: dc1)
TestDNS_ExternalServiceToConsulCNAMELookup - 2019/12/06 06:05:07.131282 [INFO] agent: Started DNS server 127.0.0.1:34667 (tcp)
TestDNS_ExternalServiceToConsulCNAMELookup - 2019/12/06 06:05:07.131360 [INFO] agent: Started DNS server 127.0.0.1:34667 (udp)
TestDNS_ExternalServiceToConsulCNAMELookup - 2019/12/06 06:05:07.133775 [INFO] agent: Started HTTP server on 127.0.0.1:34668 (tcp)
TestDNS_ExternalServiceToConsulCNAMELookup - 2019/12/06 06:05:07.133910 [INFO] agent: started state syncer
2019/12/06 06:05:07 [WARN]  raft: Heartbeat timeout from "" reached, starting election
2019/12/06 06:05:07 [INFO]  raft: Node at 127.0.0.1:34672 [Candidate] entering Candidate state in term 2
2019/12/06 06:05:07 [INFO]  raft: Initial configuration (index=1): [{Suffrage:Voter ID:a33495e5-4d85-1d2d-7269-63d72ce6df03 Address:127.0.0.1:34678}]
TestDNS_InifiniteRecursion - 2019/12/06 06:05:07.490510 [INFO] serf: EventMemberJoin: test node.dc1 127.0.0.1
2019/12/06 06:05:07 [INFO]  raft: Node at 127.0.0.1:34678 [Follower] entering Follower state (Leader: "")
TestDNS_InifiniteRecursion - 2019/12/06 06:05:07.495698 [INFO] serf: EventMemberJoin: test node 127.0.0.1
TestDNS_InifiniteRecursion - 2019/12/06 06:05:07.497177 [INFO] agent: Started DNS server 127.0.0.1:34673 (udp)
TestDNS_InifiniteRecursion - 2019/12/06 06:05:07.498302 [INFO] consul: Adding LAN server test node (Addr: tcp/127.0.0.1:34678) (DC: dc1)
TestDNS_InifiniteRecursion - 2019/12/06 06:05:07.498544 [INFO] consul: Handled member-join event for server "test node.dc1" in area "wan"
TestDNS_InifiniteRecursion - 2019/12/06 06:05:07.499102 [INFO] agent: Started DNS server 127.0.0.1:34673 (tcp)
TestDNS_InifiniteRecursion - 2019/12/06 06:05:07.502604 [INFO] agent: Started HTTP server on 127.0.0.1:34674 (tcp)
TestDNS_InifiniteRecursion - 2019/12/06 06:05:07.502752 [INFO] agent: started state syncer
2019/12/06 06:05:07 [WARN]  raft: Heartbeat timeout from "" reached, starting election
2019/12/06 06:05:07 [INFO]  raft: Node at 127.0.0.1:34678 [Candidate] entering Candidate state in term 2
TestEventFire_token - 2019/12/06 06:05:07.617457 [ERR] acl: failed to apply acl token upgrade batch: raft is already shutdown
2019/12/06 06:05:08 [INFO]  raft: Election won. Tally: 1
2019/12/06 06:05:08 [INFO]  raft: Node at 127.0.0.1:34672 [Leader] entering Leader state
TestDNS_ExternalServiceToConsulCNAMENestedLookup - 2019/12/06 06:05:08.178550 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
TestDNS_ExternalServiceToConsulCNAMELookup - 2019/12/06 06:05:08.181814 [INFO] consul: cluster leadership acquired
TestDNS_ExternalServiceToConsulCNAMELookup - 2019/12/06 06:05:08.182284 [INFO] consul: New leader elected: test node
TestDNS_ExternalServiceToConsulCNAMENestedLookup - 2019/12/06 06:05:08.185202 [DEBUG] dns: request for name alias2.service.consul. type SRV class IN (took 1.569037ms) from client 127.0.0.1:39717 (udp)
TestDNS_ExternalServiceToConsulCNAMENestedLookup - 2019/12/06 06:05:08.185998 [INFO] agent: Requesting shutdown
TestDNS_ExternalServiceToConsulCNAMENestedLookup - 2019/12/06 06:05:08.186080 [INFO] consul: shutting down server
TestDNS_ExternalServiceToConsulCNAMENestedLookup - 2019/12/06 06:05:08.186126 [WARN] serf: Shutdown without a Leave
TestDNS_ExternalServiceToConsulCNAMENestedLookup - 2019/12/06 06:05:08.189641 [WARN] consul: error getting server health from "test-node": rpc error making call: EOF
TestDNS_NSRecords_IPV6 - 2019/12/06 06:05:08.312996 [INFO] connect: initialized primary datacenter CA with provider "consul"
TestDNS_NSRecords_IPV6 - 2019/12/06 06:05:08.313487 [DEBUG] consul: Skipping self join check for "server1" since the cluster is too small
TestDNS_NSRecords_IPV6 - 2019/12/06 06:05:08.313654 [INFO] consul: member 'server1' joined, marking health alive
TestDNS_ExternalServiceToConsulCNAMENestedLookup - 2019/12/06 06:05:08.313712 [WARN] serf: Shutdown without a Leave
TestDNS_ExternalServiceToConsulCNAMENestedLookup - 2019/12/06 06:05:08.506623 [INFO] connect: initialized primary datacenter CA with provider "consul"
TestDNS_ExternalServiceToConsulCNAMENestedLookup - 2019/12/06 06:05:08.507183 [DEBUG] consul: Skipping self join check for "test-node" since the cluster is too small
TestDNS_ExternalServiceToConsulCNAMENestedLookup - 2019/12/06 06:05:08.507346 [INFO] consul: member 'test-node' joined, marking health alive
TestDNS_ExternalServiceToConsulCNAMENestedLookup - 2019/12/06 06:05:08.507662 [INFO] manager: shutting down
TestDNS_ExternalServiceToConsulCNAMENestedLookup - 2019/12/06 06:05:08.507973 [ERR] consul: failed to reconcile member: {test-node 127.0.0.1 34658 map[acls:0 bootstrap:1 build:1.5.2: dc:dc1 id:1e0f64ea-3c5a-11b5-50b5-c59715ed3f21 port:34660 raft_vsn:3 role:consul segment: vsn:2 vsn_max:3 vsn_min:2 wan_join_port:34659] alive 1 5 2 2 5 4}: raft is already shutdown
TestDNS_ExternalServiceToConsulCNAMENestedLookup - 2019/12/06 06:05:08.508088 [INFO] agent: consul server down
TestDNS_ExternalServiceToConsulCNAMENestedLookup - 2019/12/06 06:05:08.508132 [INFO] agent: shutdown complete
TestDNS_ExternalServiceToConsulCNAMENestedLookup - 2019/12/06 06:05:08.508203 [INFO] agent: Stopping DNS server 127.0.0.1:34655 (tcp)
TestDNS_ExternalServiceToConsulCNAMENestedLookup - 2019/12/06 06:05:08.508368 [INFO] agent: Stopping DNS server 127.0.0.1:34655 (udp)
TestDNS_ExternalServiceToConsulCNAMENestedLookup - 2019/12/06 06:05:08.508533 [INFO] agent: Stopping HTTP server 127.0.0.1:34656 (tcp)
TestDNS_ExternalServiceToConsulCNAMENestedLookup - 2019/12/06 06:05:08.508745 [INFO] agent: Waiting for endpoints to shut down
TestDNS_ExternalServiceToConsulCNAMENestedLookup - 2019/12/06 06:05:08.508828 [INFO] agent: Endpoints down
--- PASS: TestDNS_ExternalServiceToConsulCNAMENestedLookup (7.18s)
=== CONT  TestDNS_ExternalServiceLookup
TestEventFire_token - 2019/12/06 06:05:08.617180 [ERR] acl: failed to apply acl token upgrade batch: raft is already shutdown
WARNING: bootstrap = true: do not enable unless necessary
TestDNS_ExternalServiceLookup - 2019/12/06 06:05:08.625108 [WARN] agent: Node name "Node e96ace92-ca4b-285b-e5c4-b4406b0deaab" will not be discoverable via DNS due to invalid characters. Valid characters include all alpha-numerics and dashes.
TestDNS_ExternalServiceLookup - 2019/12/06 06:05:08.627277 [DEBUG] tlsutil: Update with version 1
TestDNS_ExternalServiceLookup - 2019/12/06 06:05:08.636328 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
2019/12/06 06:05:08 [INFO]  raft: Election won. Tally: 1
2019/12/06 06:05:08 [INFO]  raft: Node at 127.0.0.1:34678 [Leader] entering Leader state
TestDNS_InifiniteRecursion - 2019/12/06 06:05:08.744400 [INFO] consul: cluster leadership acquired
TestDNS_InifiniteRecursion - 2019/12/06 06:05:08.744856 [INFO] consul: New leader elected: test node
TestDNS_NSRecords_IPV6 - 2019/12/06 06:05:08.760041 [DEBUG] dns: request for name server1.node.dc1.consul. type NS class IN (took 897.354µs) from client 127.0.0.1:41260 (udp)
TestDNS_NSRecords_IPV6 - 2019/12/06 06:05:08.760292 [INFO] agent: Requesting shutdown
TestDNS_NSRecords_IPV6 - 2019/12/06 06:05:08.760363 [INFO] consul: shutting down server
TestDNS_NSRecords_IPV6 - 2019/12/06 06:05:08.760411 [WARN] serf: Shutdown without a Leave
TestDNS_NSRecords_IPV6 - 2019/12/06 06:05:08.929413 [WARN] serf: Shutdown without a Leave
TestDNS_ExternalServiceToConsulCNAMELookup - 2019/12/06 06:05:08.930780 [INFO] agent: Synced node info
TestDNS_ExternalServiceToConsulCNAMELookup - 2019/12/06 06:05:08.930918 [DEBUG] agent: Node info in sync
TestDNS_NSRecords_IPV6 - 2019/12/06 06:05:09.053129 [INFO] manager: shutting down
TestDNS_NSRecords_IPV6 - 2019/12/06 06:05:09.053704 [INFO] agent: consul server down
TestDNS_NSRecords_IPV6 - 2019/12/06 06:05:09.053766 [INFO] agent: shutdown complete
TestDNS_NSRecords_IPV6 - 2019/12/06 06:05:09.053828 [INFO] agent: Stopping DNS server 127.0.0.1:34661 (tcp)
TestDNS_NSRecords_IPV6 - 2019/12/06 06:05:09.054014 [INFO] agent: Stopping DNS server 127.0.0.1:34661 (udp)
TestDNS_NSRecords_IPV6 - 2019/12/06 06:05:09.054276 [INFO] agent: Stopping HTTP server 127.0.0.1:34662 (tcp)
TestDNS_NSRecords_IPV6 - 2019/12/06 06:05:09.054510 [INFO] agent: Waiting for endpoints to shut down
TestDNS_NSRecords_IPV6 - 2019/12/06 06:05:09.054587 [INFO] agent: Endpoints down
--- PASS: TestDNS_NSRecords_IPV6 (5.66s)
=== CONT  TestDNS_ConnectServiceLookup
WARNING: bootstrap = true: do not enable unless necessary
TestDNS_ExternalServiceToConsulCNAMENestedLookup - 2019/12/06 06:05:09.180032 [WARN] consul: error getting server health from "test-node": context deadline exceeded
TestDNS_ConnectServiceLookup - 2019/12/06 06:05:09.187934 [WARN] agent: Node name "Node e3c9ac80-e074-e8e9-0848-08b4eb7e806a" will not be discoverable via DNS due to invalid characters. Valid characters include all alpha-numerics and dashes.
TestDNS_ConnectServiceLookup - 2019/12/06 06:05:09.189974 [DEBUG] tlsutil: Update with version 1
TestDNS_ConnectServiceLookup - 2019/12/06 06:05:09.233164 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
TestDNS_InifiniteRecursion - 2019/12/06 06:05:09.425031 [INFO] agent: Synced node info
TestDNS_InifiniteRecursion - 2019/12/06 06:05:09.425172 [DEBUG] agent: Node info in sync
TestDNS_InifiniteRecursion - 2019/12/06 06:05:09.446036 [DEBUG] agent: Node info in sync
TestEventFire_token - 2019/12/06 06:05:09.617238 [ERR] acl: failed to apply acl token upgrade batch: raft is already shutdown
TestDNS_InifiniteRecursion - 2019/12/06 06:05:09.982838 [ERR] dns: Infinite recursion detected for web.service.consul., won't perform any CNAME resolution.
TestDNS_InifiniteRecursion - 2019/12/06 06:05:09.983252 [DEBUG] dns: request for name web.service.consul. type A class IN (took 1.670039ms) from client 127.0.0.1:59291 (udp)
TestDNS_InifiniteRecursion - 2019/12/06 06:05:09.983530 [INFO] agent: Requesting shutdown
TestDNS_InifiniteRecursion - 2019/12/06 06:05:09.983592 [INFO] consul: shutting down server
TestDNS_InifiniteRecursion - 2019/12/06 06:05:09.983634 [WARN] serf: Shutdown without a Leave
TestDNS_InifiniteRecursion - 2019/12/06 06:05:10.078066 [WARN] serf: Shutdown without a Leave
TestDNS_ExternalServiceToConsulCNAMELookup - 2019/12/06 06:05:10.091195 [DEBUG] agent: Node info in sync
TestDNS_InifiniteRecursion - 2019/12/06 06:05:10.228058 [INFO] manager: shutting down
2019/12/06 06:05:10 [INFO]  raft: Initial configuration (index=1): [{Suffrage:Voter ID:e96ace92-ca4b-285b-e5c4-b4406b0deaab Address:127.0.0.1:34684}]
2019/12/06 06:05:10 [INFO]  raft: Node at 127.0.0.1:34684 [Follower] entering Follower state (Leader: "")
TestDNS_ExternalServiceToConsulCNAMELookup - 2019/12/06 06:05:10.239958 [DEBUG] dns: request for name alias.service.consul. type SRV class IN (took 1.289697ms) from client 127.0.0.1:48842 (udp)
TestDNS_ExternalServiceLookup - 2019/12/06 06:05:10.239913 [INFO] serf: EventMemberJoin: Node e96ace92-ca4b-285b-e5c4-b4406b0deaab.dc1 127.0.0.1
TestDNS_ExternalServiceToConsulCNAMELookup - 2019/12/06 06:05:10.243121 [DEBUG] dns: request for name alias.service.CoNsUl. type SRV class IN (took 1.098359ms) from client 127.0.0.1:51760 (udp)
TestDNS_ExternalServiceToConsulCNAMELookup - 2019/12/06 06:05:10.243245 [INFO] agent: Requesting shutdown
TestDNS_ExternalServiceToConsulCNAMELookup - 2019/12/06 06:05:10.243331 [INFO] consul: shutting down server
TestDNS_ExternalServiceToConsulCNAMELookup - 2019/12/06 06:05:10.243383 [WARN] serf: Shutdown without a Leave
TestDNS_ExternalServiceLookup - 2019/12/06 06:05:10.251712 [INFO] serf: EventMemberJoin: Node e96ace92-ca4b-285b-e5c4-b4406b0deaab 127.0.0.1
TestDNS_ExternalServiceLookup - 2019/12/06 06:05:10.252631 [INFO] consul: Adding LAN server Node e96ace92-ca4b-285b-e5c4-b4406b0deaab (Addr: tcp/127.0.0.1:34684) (DC: dc1)
TestDNS_ExternalServiceLookup - 2019/12/06 06:05:10.252642 [INFO] consul: Handled member-join event for server "Node e96ace92-ca4b-285b-e5c4-b4406b0deaab.dc1" in area "wan"
TestDNS_ExternalServiceLookup - 2019/12/06 06:05:10.253356 [INFO] agent: Started DNS server 127.0.0.1:34679 (tcp)
TestDNS_ExternalServiceLookup - 2019/12/06 06:05:10.253430 [INFO] agent: Started DNS server 127.0.0.1:34679 (udp)
TestDNS_ExternalServiceLookup - 2019/12/06 06:05:10.255846 [INFO] agent: Started HTTP server on 127.0.0.1:34680 (tcp)
TestDNS_ExternalServiceLookup - 2019/12/06 06:05:10.255931 [INFO] agent: started state syncer
2019/12/06 06:05:10 [WARN]  raft: Heartbeat timeout from "" reached, starting election
2019/12/06 06:05:10 [INFO]  raft: Node at 127.0.0.1:34684 [Candidate] entering Candidate state in term 2
TestDNS_InifiniteRecursion - 2019/12/06 06:05:10.478089 [ERR] consul: failed to establish leadership: leadership lost while committing log
TestDNS_ExternalServiceToConsulCNAMELookup - 2019/12/06 06:05:10.478125 [WARN] serf: Shutdown without a Leave
TestDNS_InifiniteRecursion - 2019/12/06 06:05:10.478350 [INFO] agent: consul server down
TestDNS_InifiniteRecursion - 2019/12/06 06:05:10.478400 [INFO] agent: shutdown complete
TestDNS_InifiniteRecursion - 2019/12/06 06:05:10.478458 [INFO] agent: Stopping DNS server 127.0.0.1:34673 (tcp)
TestDNS_InifiniteRecursion - 2019/12/06 06:05:10.478615 [INFO] agent: Stopping DNS server 127.0.0.1:34673 (udp)
TestDNS_InifiniteRecursion - 2019/12/06 06:05:10.478825 [INFO] agent: Stopping HTTP server 127.0.0.1:34674 (tcp)
TestDNS_InifiniteRecursion - 2019/12/06 06:05:10.479224 [INFO] agent: Waiting for endpoints to shut down
TestDNS_InifiniteRecursion - 2019/12/06 06:05:10.481913 [INFO] agent: Endpoints down
--- PASS: TestDNS_InifiniteRecursion (4.40s)
=== CONT  TestDNS_ServiceLookupWithInternalServiceAddress
TestDNS_ExternalServiceToConsulCNAMELookup - 2019/12/06 06:05:10.603109 [INFO] manager: shutting down
TestDNS_ExternalServiceToConsulCNAMELookup - 2019/12/06 06:05:10.605090 [INFO] agent: consul server down
TestDNS_ExternalServiceToConsulCNAMELookup - 2019/12/06 06:05:10.605150 [INFO] agent: shutdown complete
TestDNS_ExternalServiceToConsulCNAMELookup - 2019/12/06 06:05:10.605232 [INFO] agent: Stopping DNS server 127.0.0.1:34667 (tcp)
TestDNS_ExternalServiceToConsulCNAMELookup - 2019/12/06 06:05:10.605394 [INFO] agent: Stopping DNS server 127.0.0.1:34667 (udp)
TestDNS_ExternalServiceToConsulCNAMELookup - 2019/12/06 06:05:10.605564 [INFO] agent: Stopping HTTP server 127.0.0.1:34668 (tcp)
TestDNS_ExternalServiceToConsulCNAMELookup - 2019/12/06 06:05:10.605776 [INFO] agent: Waiting for endpoints to shut down
TestDNS_ExternalServiceToConsulCNAMELookup - 2019/12/06 06:05:10.605857 [INFO] agent: Endpoints down
--- PASS: TestDNS_ExternalServiceToConsulCNAMELookup (5.40s)
=== CONT  TestDNS_ServiceLookup
TestDNS_ExternalServiceToConsulCNAMELookup - 2019/12/06 06:05:10.619739 [ERR] consul: failed to establish leadership: error configuring provider: leadership lost while committing log
TestEventFire_token - 2019/12/06 06:05:10.620212 [ERR] acl: failed to apply acl token upgrade batch: raft is already shutdown
TestDNS_ExternalServiceToConsulCNAMELookup - 2019/12/06 06:05:10.621685 [ERR] consul: failed to transfer leadership attempt 0/3: raft is already shutdown
WARNING: bootstrap = true: do not enable unless necessary
TestDNS_ServiceLookupWithInternalServiceAddress - 2019/12/06 06:05:10.647263 [WARN] agent: Node name "my.test-node" will not be discoverable via DNS due to invalid characters. Valid characters include all alpha-numerics and dashes.
TestDNS_ServiceLookupWithInternalServiceAddress - 2019/12/06 06:05:10.647966 [DEBUG] tlsutil: Update with version 1
TestDNS_ServiceLookupWithInternalServiceAddress - 2019/12/06 06:05:10.661678 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
2019/12/06 06:05:10 [INFO]  raft: Initial configuration (index=1): [{Suffrage:Voter ID:e3c9ac80-e074-e8e9-0848-08b4eb7e806a Address:127.0.0.1:34690}]
2019/12/06 06:05:10 [INFO]  raft: Node at 127.0.0.1:34690 [Follower] entering Follower state (Leader: "")
TestDNS_ConnectServiceLookup - 2019/12/06 06:05:10.733542 [INFO] serf: EventMemberJoin: Node e3c9ac80-e074-e8e9-0848-08b4eb7e806a.dc1 127.0.0.1
TestDNS_ConnectServiceLookup - 2019/12/06 06:05:10.737589 [INFO] serf: EventMemberJoin: Node e3c9ac80-e074-e8e9-0848-08b4eb7e806a 127.0.0.1
TestDNS_ConnectServiceLookup - 2019/12/06 06:05:10.738608 [INFO] consul: Handled member-join event for server "Node e3c9ac80-e074-e8e9-0848-08b4eb7e806a.dc1" in area "wan"
TestDNS_ConnectServiceLookup - 2019/12/06 06:05:10.738967 [INFO] consul: Adding LAN server Node e3c9ac80-e074-e8e9-0848-08b4eb7e806a (Addr: tcp/127.0.0.1:34690) (DC: dc1)
TestDNS_ConnectServiceLookup - 2019/12/06 06:05:10.739617 [INFO] agent: Started DNS server 127.0.0.1:34685 (udp)
TestDNS_ConnectServiceLookup - 2019/12/06 06:05:10.739826 [INFO] agent: Started DNS server 127.0.0.1:34685 (tcp)
TestDNS_ConnectServiceLookup - 2019/12/06 06:05:10.742333 [INFO] agent: Started HTTP server on 127.0.0.1:34686 (tcp)
TestDNS_ConnectServiceLookup - 2019/12/06 06:05:10.755135 [INFO] agent: started state syncer
2019/12/06 06:05:10 [WARN]  raft: Heartbeat timeout from "" reached, starting election
2019/12/06 06:05:10 [INFO]  raft: Node at 127.0.0.1:34690 [Candidate] entering Candidate state in term 2
WARNING: bootstrap = true: do not enable unless necessary
TestDNS_ServiceLookup - 2019/12/06 06:05:10.912243 [WARN] agent: Node name "Node 2d69f2f2-0854-e035-1abd-2b02b41cf809" will not be discoverable via DNS due to invalid characters. Valid characters include all alpha-numerics and dashes.
TestDNS_ServiceLookup - 2019/12/06 06:05:10.912705 [DEBUG] tlsutil: Update with version 1
TestDNS_ServiceLookup - 2019/12/06 06:05:10.919539 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
2019/12/06 06:05:11 [INFO]  raft: Election won. Tally: 1
2019/12/06 06:05:11 [INFO]  raft: Node at 127.0.0.1:34684 [Leader] entering Leader state
TestDNS_ExternalServiceLookup - 2019/12/06 06:05:11.072125 [INFO] consul: cluster leadership acquired
TestDNS_ExternalServiceLookup - 2019/12/06 06:05:11.072632 [INFO] consul: New leader elected: Node e96ace92-ca4b-285b-e5c4-b4406b0deaab
TestEventFire_token - 2019/12/06 06:05:11.617213 [ERR] acl: failed to apply acl token upgrade batch: raft is already shutdown
2019/12/06 06:05:11 [INFO]  raft: Election won. Tally: 1
2019/12/06 06:05:11 [INFO]  raft: Node at 127.0.0.1:34690 [Leader] entering Leader state
TestDNS_ExternalServiceLookup - 2019/12/06 06:05:11.645563 [INFO] agent: Synced node info
TestDNS_ConnectServiceLookup - 2019/12/06 06:05:11.646800 [INFO] consul: cluster leadership acquired
TestDNS_ConnectServiceLookup - 2019/12/06 06:05:11.647215 [INFO] consul: New leader elected: Node e3c9ac80-e074-e8e9-0848-08b4eb7e806a
2019/12/06 06:05:12 [INFO]  raft: Initial configuration (index=1): [{Suffrage:Voter ID:a9d36a1f-814e-4033-197d-b0a759ad7d50 Address:127.0.0.1:34696}]
TestDNS_ExternalServiceLookup - 2019/12/06 06:05:12.147938 [DEBUG] dns: request for name db.service.consul. type SRV class IN (took 941.355µs) from client 127.0.0.1:36466 (udp)
TestDNS_ExternalServiceLookup - 2019/12/06 06:05:12.148720 [INFO] agent: Requesting shutdown
TestDNS_ExternalServiceLookup - 2019/12/06 06:05:12.148810 [INFO] consul: shutting down server
TestDNS_ExternalServiceLookup - 2019/12/06 06:05:12.148859 [WARN] serf: Shutdown without a Leave
2019/12/06 06:05:12 [INFO]  raft: Node at 127.0.0.1:34696 [Follower] entering Follower state (Leader: "")
TestDNS_ServiceLookupWithInternalServiceAddress - 2019/12/06 06:05:12.148730 [INFO] serf: EventMemberJoin: my.test-node.dc1 127.0.0.1
TestDNS_ConnectServiceLookup - 2019/12/06 06:05:12.150527 [INFO] agent: Synced node info
TestDNS_ServiceLookupWithInternalServiceAddress - 2019/12/06 06:05:12.153483 [INFO] serf: EventMemberJoin: my.test-node 127.0.0.1
TestDNS_ServiceLookupWithInternalServiceAddress - 2019/12/06 06:05:12.155206 [INFO] agent: Started DNS server 127.0.0.1:34691 (udp)
TestDNS_ServiceLookupWithInternalServiceAddress - 2019/12/06 06:05:12.155455 [INFO] consul: Handled member-join event for server "my.test-node.dc1" in area "wan"
TestDNS_ServiceLookupWithInternalServiceAddress - 2019/12/06 06:05:12.155620 [INFO] consul: Adding LAN server my.test-node (Addr: tcp/127.0.0.1:34696) (DC: dc1)
TestDNS_ServiceLookupWithInternalServiceAddress - 2019/12/06 06:05:12.156071 [INFO] agent: Started DNS server 127.0.0.1:34691 (tcp)
TestDNS_ServiceLookupWithInternalServiceAddress - 2019/12/06 06:05:12.158961 [INFO] agent: Started HTTP server on 127.0.0.1:34692 (tcp)
TestDNS_ServiceLookupWithInternalServiceAddress - 2019/12/06 06:05:12.159055 [INFO] agent: started state syncer
2019/12/06 06:05:12 [WARN]  raft: Heartbeat timeout from "" reached, starting election
2019/12/06 06:05:12 [INFO]  raft: Node at 127.0.0.1:34696 [Candidate] entering Candidate state in term 2
TestDNS_ConnectServiceLookup - 2019/12/06 06:05:12.328363 [DEBUG] agent: Node info in sync
TestDNS_ConnectServiceLookup - 2019/12/06 06:05:12.328493 [DEBUG] agent: Node info in sync
TestDNS_ExternalServiceLookup - 2019/12/06 06:05:12.388087 [WARN] serf: Shutdown without a Leave
2019/12/06 06:05:12 [INFO]  raft: Initial configuration (index=1): [{Suffrage:Voter ID:2d69f2f2-0854-e035-1abd-2b02b41cf809 Address:127.0.0.1:34702}]
2019/12/06 06:05:12 [INFO]  raft: Node at 127.0.0.1:34702 [Follower] entering Follower state (Leader: "")
TestDNS_ServiceLookup - 2019/12/06 06:05:12.393670 [INFO] serf: EventMemberJoin: Node 2d69f2f2-0854-e035-1abd-2b02b41cf809.dc1 127.0.0.1
TestDNS_ServiceLookup - 2019/12/06 06:05:12.401955 [INFO] serf: EventMemberJoin: Node 2d69f2f2-0854-e035-1abd-2b02b41cf809 127.0.0.1
TestDNS_ServiceLookup - 2019/12/06 06:05:12.402817 [INFO] consul: Handled member-join event for server "Node 2d69f2f2-0854-e035-1abd-2b02b41cf809.dc1" in area "wan"
TestDNS_ServiceLookup - 2019/12/06 06:05:12.403059 [INFO] consul: Adding LAN server Node 2d69f2f2-0854-e035-1abd-2b02b41cf809 (Addr: tcp/127.0.0.1:34702) (DC: dc1)
TestDNS_ServiceLookup - 2019/12/06 06:05:12.406028 [INFO] agent: Started DNS server 127.0.0.1:34697 (tcp)
TestDNS_ServiceLookup - 2019/12/06 06:05:12.406899 [INFO] agent: Started DNS server 127.0.0.1:34697 (udp)
TestDNS_ServiceLookup - 2019/12/06 06:05:12.421706 [INFO] agent: Started HTTP server on 127.0.0.1:34698 (tcp)
TestDNS_ServiceLookup - 2019/12/06 06:05:12.421855 [INFO] agent: started state syncer
2019/12/06 06:05:12 [WARN]  raft: Heartbeat timeout from "" reached, starting election
2019/12/06 06:05:12 [INFO]  raft: Node at 127.0.0.1:34702 [Candidate] entering Candidate state in term 2
TestDNS_ExternalServiceLookup - 2019/12/06 06:05:12.533521 [INFO] manager: shutting down
TestEventFire_token - 2019/12/06 06:05:12.617309 [ERR] acl: failed to apply acl token upgrade batch: raft is already shutdown
TestDNS_ExternalServiceLookup - 2019/12/06 06:05:12.903571 [INFO] agent: consul server down
TestDNS_ExternalServiceLookup - 2019/12/06 06:05:12.903668 [INFO] agent: shutdown complete
TestDNS_ExternalServiceLookup - 2019/12/06 06:05:12.903848 [INFO] agent: Stopping DNS server 127.0.0.1:34679 (tcp)
TestDNS_ExternalServiceLookup - 2019/12/06 06:05:12.904071 [INFO] agent: Stopping DNS server 127.0.0.1:34679 (udp)
TestDNS_ExternalServiceLookup - 2019/12/06 06:05:12.904320 [INFO] agent: Stopping HTTP server 127.0.0.1:34680 (tcp)
TestDNS_ExternalServiceLookup - 2019/12/06 06:05:12.904545 [INFO] agent: Waiting for endpoints to shut down
TestDNS_ExternalServiceLookup - 2019/12/06 06:05:12.904621 [INFO] agent: Endpoints down
--- PASS: TestDNS_ExternalServiceLookup (4.40s)
=== CONT  TestDNS_ServiceLookupMultiAddrNoCNAME
TestDNS_ExternalServiceLookup - 2019/12/06 06:05:12.908769 [ERR] consul: failed to establish leadership: error configuring provider: leadership lost while committing log
WARNING: bootstrap = true: do not enable unless necessary
TestDNS_ServiceLookupMultiAddrNoCNAME - 2019/12/06 06:05:13.058383 [WARN] agent: Node name "Node f54264f3-27b7-33ca-0b7c-03fd8de5e5aa" will not be discoverable via DNS due to invalid characters. Valid characters include all alpha-numerics and dashes.
TestDNS_ServiceLookupMultiAddrNoCNAME - 2019/12/06 06:05:13.059023 [DEBUG] tlsutil: Update with version 1
TestDNS_ServiceLookupMultiAddrNoCNAME - 2019/12/06 06:05:13.061450 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
2019/12/06 06:05:13 [INFO]  raft: Election won. Tally: 1
2019/12/06 06:05:13 [INFO]  raft: Node at 127.0.0.1:34696 [Leader] entering Leader state
TestDNS_ServiceLookupWithInternalServiceAddress - 2019/12/06 06:05:13.137562 [INFO] consul: cluster leadership acquired
TestDNS_ServiceLookupWithInternalServiceAddress - 2019/12/06 06:05:13.138264 [INFO] consul: New leader elected: my.test-node
2019/12/06 06:05:13 [INFO]  raft: Election won. Tally: 1
2019/12/06 06:05:13 [INFO]  raft: Node at 127.0.0.1:34702 [Leader] entering Leader state
TestDNS_ServiceLookup - 2019/12/06 06:05:13.379353 [INFO] consul: cluster leadership acquired
TestDNS_ServiceLookup - 2019/12/06 06:05:13.379795 [INFO] consul: New leader elected: Node 2d69f2f2-0854-e035-1abd-2b02b41cf809
TestDNS_ConnectServiceLookup - 2019/12/06 06:05:13.382842 [DEBUG] dns: request for name db.connect.consul. type SRV class IN (took 927.021µs) from client 127.0.0.1:56982 (udp)
TestDNS_ConnectServiceLookup - 2019/12/06 06:05:13.383485 [INFO] agent: Requesting shutdown
TestDNS_ConnectServiceLookup - 2019/12/06 06:05:13.383556 [INFO] consul: shutting down server
TestDNS_ConnectServiceLookup - 2019/12/06 06:05:13.383607 [WARN] serf: Shutdown without a Leave
TestDNS_ConnectServiceLookup - 2019/12/06 06:05:13.512210 [WARN] serf: Shutdown without a Leave
TestEventFire_token - 2019/12/06 06:05:13.617322 [ERR] acl: failed to apply acl token upgrade batch: raft is already shutdown
TestDNS_ConnectServiceLookup - 2019/12/06 06:05:13.631739 [INFO] manager: shutting down
TestDNS_ConnectServiceLookup - 2019/12/06 06:05:13.632214 [INFO] agent: consul server down
TestDNS_ConnectServiceLookup - 2019/12/06 06:05:13.632265 [INFO] agent: shutdown complete
TestDNS_ConnectServiceLookup - 2019/12/06 06:05:13.632320 [INFO] agent: Stopping DNS server 127.0.0.1:34685 (tcp)
TestDNS_ConnectServiceLookup - 2019/12/06 06:05:13.632455 [INFO] agent: Stopping DNS server 127.0.0.1:34685 (udp)
TestDNS_ConnectServiceLookup - 2019/12/06 06:05:13.632603 [INFO] agent: Stopping HTTP server 127.0.0.1:34686 (tcp)
TestDNS_ConnectServiceLookup - 2019/12/06 06:05:13.632806 [INFO] agent: Waiting for endpoints to shut down
TestDNS_ConnectServiceLookup - 2019/12/06 06:05:13.632881 [INFO] agent: Endpoints down
--- PASS: TestDNS_ConnectServiceLookup (4.58s)
=== CONT  TestDNS_ServiceLookupPreferNoCNAME
TestDNS_ConnectServiceLookup - 2019/12/06 06:05:13.653012 [ERR] consul: failed to establish leadership: error generating CA root certificate: raft is already shutdown
TestDNS_ConnectServiceLookup - 2019/12/06 06:05:13.653380 [ERR] consul: failed to transfer leadership attempt 0/3: raft is already shutdown
TestDNS_ConnectServiceLookup - 2019/12/06 06:05:13.653453 [ERR] consul: failed to transfer leadership attempt 1/3: raft is already shutdown
WARNING: bootstrap = true: do not enable unless necessary
TestDNS_ServiceLookupPreferNoCNAME - 2019/12/06 06:05:13.740806 [WARN] agent: Node name "Node 7920d7b3-1544-0d8d-5667-ccb97f586f74" will not be discoverable via DNS due to invalid characters. Valid characters include all alpha-numerics and dashes.
TestDNS_ServiceLookupPreferNoCNAME - 2019/12/06 06:05:13.741527 [DEBUG] tlsutil: Update with version 1
TestDNS_ServiceLookupPreferNoCNAME - 2019/12/06 06:05:13.745412 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
TestDNS_ServiceLookupWithInternalServiceAddress - 2019/12/06 06:05:13.754088 [INFO] agent: Synced node info
TestDNS_ServiceLookupWithInternalServiceAddress - 2019/12/06 06:05:13.754265 [DEBUG] agent: Node info in sync
TestDNS_ServiceLookup - 2019/12/06 06:05:13.887426 [INFO] agent: Synced node info
jones - 2019/12/06 06:05:14.110084 [DEBUG] agent: Skipping remote check "serfHealth" since it is managed automatically
jones - 2019/12/06 06:05:14.110166 [DEBUG] agent: Node info in sync
TestDNS_ServiceLookupWithInternalServiceAddress - 2019/12/06 06:05:14.297956 [DEBUG] dns: request for name db.service.consul. type SRV class IN (took 1.271696ms) from client 127.0.0.1:36540 (udp)
TestDNS_ServiceLookupWithInternalServiceAddress - 2019/12/06 06:05:14.299466 [INFO] agent: Requesting shutdown
TestDNS_ServiceLookupWithInternalServiceAddress - 2019/12/06 06:05:14.299559 [INFO] consul: shutting down server
TestDNS_ServiceLookupWithInternalServiceAddress - 2019/12/06 06:05:14.299609 [WARN] serf: Shutdown without a Leave
TestDNS_ServiceLookupWithInternalServiceAddress - 2019/12/06 06:05:14.419704 [WARN] serf: Shutdown without a Leave
2019/12/06 06:05:14 [INFO]  raft: Initial configuration (index=1): [{Suffrage:Voter ID:f54264f3-27b7-33ca-0b7c-03fd8de5e5aa Address:127.0.0.1:34708}]
2019/12/06 06:05:14 [INFO]  raft: Node at 127.0.0.1:34708 [Follower] entering Follower state (Leader: "")
TestDNS_ServiceLookupMultiAddrNoCNAME - 2019/12/06 06:05:14.423649 [INFO] serf: EventMemberJoin: Node f54264f3-27b7-33ca-0b7c-03fd8de5e5aa.dc1 127.0.0.1
TestDNS_ServiceLookupMultiAddrNoCNAME - 2019/12/06 06:05:14.430858 [INFO] serf: EventMemberJoin: Node f54264f3-27b7-33ca-0b7c-03fd8de5e5aa 127.0.0.1
TestDNS_ServiceLookupMultiAddrNoCNAME - 2019/12/06 06:05:14.431693 [INFO] consul: Adding LAN server Node f54264f3-27b7-33ca-0b7c-03fd8de5e5aa (Addr: tcp/127.0.0.1:34708) (DC: dc1)
TestDNS_ServiceLookupMultiAddrNoCNAME - 2019/12/06 06:05:14.431793 [INFO] consul: Handled member-join event for server "Node f54264f3-27b7-33ca-0b7c-03fd8de5e5aa.dc1" in area "wan"
TestDNS_ServiceLookupMultiAddrNoCNAME - 2019/12/06 06:05:14.437222 [INFO] agent: Started DNS server 127.0.0.1:34703 (udp)
TestDNS_ServiceLookupMultiAddrNoCNAME - 2019/12/06 06:05:14.437313 [INFO] agent: Started DNS server 127.0.0.1:34703 (tcp)
TestDNS_ServiceLookupMultiAddrNoCNAME - 2019/12/06 06:05:14.439768 [INFO] agent: Started HTTP server on 127.0.0.1:34704 (tcp)
TestDNS_ServiceLookupMultiAddrNoCNAME - 2019/12/06 06:05:14.439876 [INFO] agent: started state syncer
2019/12/06 06:05:14 [WARN]  raft: Heartbeat timeout from "" reached, starting election
2019/12/06 06:05:14 [INFO]  raft: Node at 127.0.0.1:34708 [Candidate] entering Candidate state in term 2
TestDNS_ServiceLookupWithInternalServiceAddress - 2019/12/06 06:05:14.573847 [INFO] manager: shutting down
TestEventFire_token - 2019/12/06 06:05:14.617271 [ERR] acl: failed to apply acl token upgrade batch: raft is already shutdown
TestDNS_ServiceLookup - 2019/12/06 06:05:14.721856 [DEBUG] agent: Node info in sync
TestDNS_ServiceLookup - 2019/12/06 06:05:14.721989 [DEBUG] agent: Node info in sync
TestDNS_ServiceLookupWithInternalServiceAddress - 2019/12/06 06:05:14.761894 [INFO] agent: consul server down
TestDNS_ServiceLookupWithInternalServiceAddress - 2019/12/06 06:05:14.762014 [INFO] agent: shutdown complete
TestDNS_ServiceLookupWithInternalServiceAddress - 2019/12/06 06:05:14.762082 [INFO] agent: Stopping DNS server 127.0.0.1:34691 (tcp)
TestDNS_ServiceLookupWithInternalServiceAddress - 2019/12/06 06:05:14.762278 [INFO] agent: Stopping DNS server 127.0.0.1:34691 (udp)
TestDNS_ServiceLookupWithInternalServiceAddress - 2019/12/06 06:05:14.762352 [ERR] consul: failed to establish leadership: leadership lost while committing log
TestDNS_ServiceLookupWithInternalServiceAddress - 2019/12/06 06:05:14.762468 [INFO] agent: Stopping HTTP server 127.0.0.1:34692 (tcp)
TestDNS_ServiceLookupWithInternalServiceAddress - 2019/12/06 06:05:14.762539 [ERR] consul: failed to transfer leadership attempt 0/3: raft is already shutdown
TestDNS_ServiceLookupWithInternalServiceAddress - 2019/12/06 06:05:14.762604 [ERR] consul: failed to transfer leadership attempt 1/3: raft is already shutdown
TestDNS_ServiceLookupWithInternalServiceAddress - 2019/12/06 06:05:14.762685 [INFO] agent: Waiting for endpoints to shut down
TestDNS_ServiceLookupWithInternalServiceAddress - 2019/12/06 06:05:14.762753 [INFO] agent: Endpoints down
--- PASS: TestDNS_ServiceLookupWithInternalServiceAddress (4.28s)
=== CONT  TestDNS_ServiceReverseLookupNodeAddress
WARNING: bootstrap = true: do not enable unless necessary
TestDNS_ServiceReverseLookupNodeAddress - 2019/12/06 06:05:14.888444 [WARN] agent: Node name "Node 65a7e3e5-8e34-6e9b-9aea-b87b416f0c93" will not be discoverable via DNS due to invalid characters. Valid characters include all alpha-numerics and dashes.
TestDNS_ServiceReverseLookupNodeAddress - 2019/12/06 06:05:14.888965 [DEBUG] tlsutil: Update with version 1
TestDNS_ServiceReverseLookupNodeAddress - 2019/12/06 06:05:14.891232 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
2019/12/06 06:05:15 [INFO]  raft: Initial configuration (index=1): [{Suffrage:Voter ID:7920d7b3-1544-0d8d-5667-ccb97f586f74 Address:127.0.0.1:34714}]
2019/12/06 06:05:15 [INFO]  raft: Node at 127.0.0.1:34714 [Follower] entering Follower state (Leader: "")
TestDNS_ServiceLookup - 2019/12/06 06:05:15.181611 [DEBUG] dns: request for name db.service.consul. type SRV class IN (took 1.278696ms) from client 127.0.0.1:39543 (udp)
TestDNS_ServiceLookupPreferNoCNAME - 2019/12/06 06:05:15.183086 [INFO] serf: EventMemberJoin: Node 7920d7b3-1544-0d8d-5667-ccb97f586f74.dc1 127.0.0.1
TestDNS_ServiceLookup - 2019/12/06 06:05:15.184605 [DEBUG] dns: request for name 4ca9550a-bc3f-6e46-567d-03d14af5ede6.query.consul. type SRV class IN (took 1.282696ms) from client 127.0.0.1:52928 (udp)
TestDNS_ServiceLookup - 2019/12/06 06:05:15.186775 [DEBUG] dns: request for name nodb.service.consul. type SRV class IN (took 1.12836ms) from client 127.0.0.1:51730 (udp)
TestDNS_ServiceLookupPreferNoCNAME - 2019/12/06 06:05:15.186986 [INFO] serf: EventMemberJoin: Node 7920d7b3-1544-0d8d-5667-ccb97f586f74 127.0.0.1
TestDNS_ServiceLookupPreferNoCNAME - 2019/12/06 06:05:15.187555 [INFO] consul: Adding LAN server Node 7920d7b3-1544-0d8d-5667-ccb97f586f74 (Addr: tcp/127.0.0.1:34714) (DC: dc1)
TestDNS_ServiceLookupPreferNoCNAME - 2019/12/06 06:05:15.187757 [INFO] consul: Handled member-join event for server "Node 7920d7b3-1544-0d8d-5667-ccb97f586f74.dc1" in area "wan"
TestDNS_ServiceLookupPreferNoCNAME - 2019/12/06 06:05:15.188269 [INFO] agent: Started DNS server 127.0.0.1:34709 (udp)
TestDNS_ServiceLookupPreferNoCNAME - 2019/12/06 06:05:15.188358 [INFO] agent: Started DNS server 127.0.0.1:34709 (tcp)
TestDNS_ServiceLookupPreferNoCNAME - 2019/12/06 06:05:15.191027 [INFO] agent: Started HTTP server on 127.0.0.1:34710 (tcp)
TestDNS_ServiceLookupPreferNoCNAME - 2019/12/06 06:05:15.191137 [INFO] agent: started state syncer
TestDNS_ServiceLookup - 2019/12/06 06:05:15.195433 [INFO] agent: Requesting shutdown
TestDNS_ServiceLookup - 2019/12/06 06:05:15.195530 [INFO] consul: shutting down server
TestDNS_ServiceLookup - 2019/12/06 06:05:15.195580 [WARN] serf: Shutdown without a Leave
TestDNS_ServiceLookup - 2019/12/06 06:05:15.195440 [DEBUG] dns: request for name nope.query.consul. type SRV class IN (took 1.170027ms) from client 127.0.0.1:46082 (udp)
2019/12/06 06:05:15 [WARN]  raft: Heartbeat timeout from "" reached, starting election
2019/12/06 06:05:15 [INFO]  raft: Node at 127.0.0.1:34714 [Candidate] entering Candidate state in term 2
TestDNS_ServiceLookup - 2019/12/06 06:05:15.436482 [WARN] serf: Shutdown without a Leave
2019/12/06 06:05:15 [INFO]  raft: Election won. Tally: 1
2019/12/06 06:05:15 [INFO]  raft: Node at 127.0.0.1:34708 [Leader] entering Leader state
TestDNS_ServiceLookupMultiAddrNoCNAME - 2019/12/06 06:05:15.438645 [INFO] consul: cluster leadership acquired
TestDNS_ServiceLookupMultiAddrNoCNAME - 2019/12/06 06:05:15.439155 [INFO] consul: New leader elected: Node f54264f3-27b7-33ca-0b7c-03fd8de5e5aa
TestDNS_ServiceLookup - 2019/12/06 06:05:15.561624 [INFO] manager: shutting down
TestDNS_ServiceLookup - 2019/12/06 06:05:15.563410 [INFO] agent: consul server down
TestDNS_ServiceLookup - 2019/12/06 06:05:15.563497 [INFO] agent: shutdown complete
TestDNS_ServiceLookup - 2019/12/06 06:05:15.563601 [INFO] agent: Stopping DNS server 127.0.0.1:34697 (tcp)
TestDNS_ServiceLookup - 2019/12/06 06:05:15.563799 [INFO] agent: Stopping DNS server 127.0.0.1:34697 (udp)
TestDNS_ServiceLookup - 2019/12/06 06:05:15.564001 [INFO] agent: Stopping HTTP server 127.0.0.1:34698 (tcp)
TestDNS_ServiceLookup - 2019/12/06 06:05:15.564287 [INFO] agent: Waiting for endpoints to shut down
TestDNS_ServiceLookup - 2019/12/06 06:05:15.564371 [INFO] agent: Endpoints down
--- PASS: TestDNS_ServiceLookup (4.96s)
=== CONT  TestDNS_SOA_Settings
TestDNS_ServiceLookup - 2019/12/06 06:05:15.579159 [ERR] consul: failed to establish leadership: error configuring provider: leadership lost while committing log
TestDNS_ServiceLookup - 2019/12/06 06:05:15.579692 [ERR] consul: failed to transfer leadership attempt 0/3: raft is already shutdown
TestDNS_ServiceLookup - 2019/12/06 06:05:15.579763 [ERR] consul: failed to transfer leadership attempt 1/3: raft is already shutdown
TestEventFire_token - 2019/12/06 06:05:15.617245 [ERR] acl: failed to apply acl token upgrade batch: raft is already shutdown
WARNING: bootstrap = true: do not enable unless necessary
TestDNS_SOA_Settings - 2019/12/06 06:05:15.653745 [WARN] agent: Node name "Node 18683c76-3d85-51de-4e5c-0caa13c1ad61" will not be discoverable via DNS due to invalid characters. Valid characters include all alpha-numerics and dashes.
TestDNS_SOA_Settings - 2019/12/06 06:05:15.654472 [DEBUG] tlsutil: Update with version 1
TestDNS_SOA_Settings - 2019/12/06 06:05:15.657239 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
TestDNS_ServiceLookupMultiAddrNoCNAME - 2019/12/06 06:05:15.954454 [INFO] agent: Synced node info
TestDNS_ServiceLookupMultiAddrNoCNAME - 2019/12/06 06:05:15.954607 [DEBUG] agent: Node info in sync
TestDNS_ServiceLookupMultiAddrNoCNAME - 2019/12/06 06:05:16.113427 [DEBUG] agent: Node info in sync
2019/12/06 06:05:16 [INFO]  raft: Election won. Tally: 1
2019/12/06 06:05:16 [INFO]  raft: Node at 127.0.0.1:34714 [Leader] entering Leader state
TestDNS_ServiceLookupPreferNoCNAME - 2019/12/06 06:05:16.212339 [INFO] consul: cluster leadership acquired
TestDNS_ServiceLookupPreferNoCNAME - 2019/12/06 06:05:16.212830 [INFO] consul: New leader elected: Node 7920d7b3-1544-0d8d-5667-ccb97f586f74
TestEventFire_token - 2019/12/06 06:05:16.617464 [ERR] acl: failed to apply acl token upgrade batch: raft is already shutdown
2019/12/06 06:05:16 [INFO]  raft: Initial configuration (index=1): [{Suffrage:Voter ID:65a7e3e5-8e34-6e9b-9aea-b87b416f0c93 Address:127.0.0.1:34720}]
2019/12/06 06:05:16 [INFO]  raft: Node at 127.0.0.1:34720 [Follower] entering Follower state (Leader: "")
TestDNS_ServiceReverseLookupNodeAddress - 2019/12/06 06:05:16.634746 [INFO] serf: EventMemberJoin: Node 65a7e3e5-8e34-6e9b-9aea-b87b416f0c93.dc1 127.0.0.1
TestDNS_ServiceReverseLookupNodeAddress - 2019/12/06 06:05:16.638818 [INFO] serf: EventMemberJoin: Node 65a7e3e5-8e34-6e9b-9aea-b87b416f0c93 127.0.0.1
TestDNS_ServiceReverseLookupNodeAddress - 2019/12/06 06:05:16.639905 [INFO] consul: Handled member-join event for server "Node 65a7e3e5-8e34-6e9b-9aea-b87b416f0c93.dc1" in area "wan"
TestDNS_ServiceReverseLookupNodeAddress - 2019/12/06 06:05:16.640312 [INFO] consul: Adding LAN server Node 65a7e3e5-8e34-6e9b-9aea-b87b416f0c93 (Addr: tcp/127.0.0.1:34720) (DC: dc1)
TestDNS_ServiceReverseLookupNodeAddress - 2019/12/06 06:05:16.647083 [INFO] agent: Started DNS server 127.0.0.1:34715 (tcp)
TestDNS_ServiceReverseLookupNodeAddress - 2019/12/06 06:05:16.647926 [INFO] agent: Started DNS server 127.0.0.1:34715 (udp)
TestDNS_ServiceReverseLookupNodeAddress - 2019/12/06 06:05:16.650350 [INFO] agent: Started HTTP server on 127.0.0.1:34716 (tcp)
TestDNS_ServiceReverseLookupNodeAddress - 2019/12/06 06:05:16.650499 [INFO] agent: started state syncer
2019/12/06 06:05:16 [WARN]  raft: Heartbeat timeout from "" reached, starting election
2019/12/06 06:05:16 [INFO]  raft: Node at 127.0.0.1:34720 [Candidate] entering Candidate state in term 2
TestDNS_ServiceLookupPreferNoCNAME - 2019/12/06 06:05:16.759011 [INFO] agent: Synced node info
2019/12/06 06:05:17 [INFO]  raft: Initial configuration (index=1): [{Suffrage:Voter ID:18683c76-3d85-51de-4e5c-0caa13c1ad61 Address:127.0.0.1:34726}]
TestDNS_SOA_Settings - 2019/12/06 06:05:17.542140 [INFO] serf: EventMemberJoin: Node 18683c76-3d85-51de-4e5c-0caa13c1ad61.dc1 127.0.0.1
2019/12/06 06:05:17 [INFO]  raft: Node at 127.0.0.1:34726 [Follower] entering Follower state (Leader: "")
TestDNS_SOA_Settings - 2019/12/06 06:05:17.548809 [INFO] serf: EventMemberJoin: Node 18683c76-3d85-51de-4e5c-0caa13c1ad61 127.0.0.1
TestDNS_SOA_Settings - 2019/12/06 06:05:17.551744 [INFO] consul: Adding LAN server Node 18683c76-3d85-51de-4e5c-0caa13c1ad61 (Addr: tcp/127.0.0.1:34726) (DC: dc1)
TestDNS_SOA_Settings - 2019/12/06 06:05:17.553396 [INFO] consul: Handled member-join event for server "Node 18683c76-3d85-51de-4e5c-0caa13c1ad61.dc1" in area "wan"
TestDNS_SOA_Settings - 2019/12/06 06:05:17.564649 [INFO] agent: Started DNS server 127.0.0.1:34721 (tcp)
TestDNS_SOA_Settings - 2019/12/06 06:05:17.565185 [INFO] agent: Started DNS server 127.0.0.1:34721 (udp)
TestDNS_SOA_Settings - 2019/12/06 06:05:17.569087 [INFO] agent: Started HTTP server on 127.0.0.1:34722 (tcp)
TestDNS_SOA_Settings - 2019/12/06 06:05:17.569282 [INFO] agent: started state syncer
2019/12/06 06:05:17 [WARN]  raft: Heartbeat timeout from "" reached, starting election
2019/12/06 06:05:17 [INFO]  raft: Node at 127.0.0.1:34726 [Candidate] entering Candidate state in term 2
TestEventFire_token - 2019/12/06 06:05:17.617438 [ERR] acl: failed to apply acl token upgrade batch: raft is already shutdown
2019/12/06 06:05:18 [INFO]  raft: Election won. Tally: 1
2019/12/06 06:05:18 [INFO]  raft: Node at 127.0.0.1:34720 [Leader] entering Leader state
TestDNS_ServiceReverseLookupNodeAddress - 2019/12/06 06:05:18.034971 [INFO] consul: cluster leadership acquired
TestDNS_ServiceReverseLookupNodeAddress - 2019/12/06 06:05:18.035449 [INFO] consul: New leader elected: Node 65a7e3e5-8e34-6e9b-9aea-b87b416f0c93
TestDNS_ServiceLookupMultiAddrNoCNAME - 2019/12/06 06:05:18.401238 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
TestDNS_ServiceLookupMultiAddrNoCNAME - 2019/12/06 06:05:18.405146 [DEBUG] dns: request for name db.service.consul. type ANY class IN (took 1.254029ms) from client 127.0.0.1:42337 (udp)
TestDNS_ServiceLookupMultiAddrNoCNAME - 2019/12/06 06:05:18.405739 [INFO] agent: Requesting shutdown
TestDNS_ServiceLookupMultiAddrNoCNAME - 2019/12/06 06:05:18.405941 [INFO] consul: shutting down server
TestDNS_ServiceLookupMultiAddrNoCNAME - 2019/12/06 06:05:18.406074 [WARN] serf: Shutdown without a Leave
TestDNS_ServiceLookupMultiAddrNoCNAME - 2019/12/06 06:05:18.528215 [WARN] serf: Shutdown without a Leave
2019/12/06 06:05:18 [INFO]  raft: Election won. Tally: 1
2019/12/06 06:05:18 [INFO]  raft: Node at 127.0.0.1:34726 [Leader] entering Leader state
TestDNS_SOA_Settings - 2019/12/06 06:05:18.530619 [INFO] consul: cluster leadership acquired
TestDNS_SOA_Settings - 2019/12/06 06:05:18.531094 [INFO] consul: New leader elected: Node 18683c76-3d85-51de-4e5c-0caa13c1ad61
TestEventFire_token - 2019/12/06 06:05:18.617365 [ERR] acl: failed to apply acl token upgrade batch: raft is already shutdown
TestDNS_ServiceLookupMultiAddrNoCNAME - 2019/12/06 06:05:18.628382 [INFO] manager: shutting down
TestDNS_ServiceLookupMultiAddrNoCNAME - 2019/12/06 06:05:18.632504 [ERR] connect: Apply failed leadership lost while committing log
TestDNS_ServiceLookupMultiAddrNoCNAME - 2019/12/06 06:05:18.632593 [ERR] consul: failed to establish leadership: leadership lost while committing log
TestDNS_ServiceLookupMultiAddrNoCNAME - 2019/12/06 06:05:18.632957 [INFO] agent: consul server down
TestDNS_ServiceLookupMultiAddrNoCNAME - 2019/12/06 06:05:18.633028 [INFO] agent: shutdown complete
TestDNS_ServiceLookupMultiAddrNoCNAME - 2019/12/06 06:05:18.633086 [INFO] agent: Stopping DNS server 127.0.0.1:34703 (tcp)
TestDNS_ServiceLookupMultiAddrNoCNAME - 2019/12/06 06:05:18.633257 [INFO] agent: Stopping DNS server 127.0.0.1:34703 (udp)
TestDNS_ServiceLookupMultiAddrNoCNAME - 2019/12/06 06:05:18.633415 [INFO] agent: Stopping HTTP server 127.0.0.1:34704 (tcp)
TestDNS_ServiceLookupMultiAddrNoCNAME - 2019/12/06 06:05:18.633424 [WARN] consul: error getting server health from "Node f54264f3-27b7-33ca-0b7c-03fd8de5e5aa": rpc error making call: EOF
TestDNS_ServiceLookupMultiAddrNoCNAME - 2019/12/06 06:05:18.633629 [INFO] agent: Waiting for endpoints to shut down
TestDNS_ServiceLookupMultiAddrNoCNAME - 2019/12/06 06:05:18.633698 [INFO] agent: Endpoints down
--- PASS: TestDNS_ServiceLookupMultiAddrNoCNAME (5.73s)
=== CONT  TestDNS_ServiceReverseLookup_CustomDomain
TestDNS_ServiceReverseLookupNodeAddress - 2019/12/06 06:05:18.729237 [INFO] agent: Synced node info
TestDNS_ServiceReverseLookupNodeAddress - 2019/12/06 06:05:18.729388 [DEBUG] agent: Node info in sync
WARNING: bootstrap = true: do not enable unless necessary
TestDNS_ServiceReverseLookup_CustomDomain - 2019/12/06 06:05:18.838451 [WARN] agent: Node name "Node 75454929-503b-f801-cb13-4afdf4c26c74" will not be discoverable via DNS due to invalid characters. Valid characters include all alpha-numerics and dashes.
TestDNS_ServiceReverseLookup_CustomDomain - 2019/12/06 06:05:18.839166 [DEBUG] tlsutil: Update with version 1
TestDNS_ServiceReverseLookup_CustomDomain - 2019/12/06 06:05:18.864638 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
TestDNS_ServiceLookupPreferNoCNAME - 2019/12/06 06:05:19.040313 [DEBUG] dns: request for name db.service.consul. type ANY class IN (took 1.117692ms) from client 127.0.0.1:49580 (udp)
TestDNS_ServiceLookupPreferNoCNAME - 2019/12/06 06:05:19.040791 [INFO] agent: Requesting shutdown
TestDNS_ServiceLookupPreferNoCNAME - 2019/12/06 06:05:19.040859 [INFO] consul: shutting down server
TestDNS_ServiceLookupPreferNoCNAME - 2019/12/06 06:05:19.040901 [WARN] serf: Shutdown without a Leave
TestDNS_ServiceLookupPreferNoCNAME - 2019/12/06 06:05:19.174501 [WARN] serf: Shutdown without a Leave
TestDNS_ServiceLookupMultiAddrNoCNAME - 2019/12/06 06:05:19.691682 [WARN] consul: error getting server health from "Node f54264f3-27b7-33ca-0b7c-03fd8de5e5aa": context deadline exceeded
TestDNS_ServiceLookupMultiAddrNoCNAME - 2019/12/06 06:05:19.691874 [ERR] consul: failed to transfer leadership attempt 0/3: raft is already shutdown
TestEventFire_token - 2019/12/06 06:05:19.692666 [ERR] acl: failed to apply acl token upgrade batch: raft is already shutdown
TestDNS_ServiceLookupPreferNoCNAME - 2019/12/06 06:05:19.707626 [INFO] manager: shutting down
TestDNS_ServiceLookupPreferNoCNAME - 2019/12/06 06:05:19.710848 [ERR] autopilot: Error updating cluster health: error getting Raft configuration raft is already shutdown
TestDNS_ServiceLookupPreferNoCNAME - 2019/12/06 06:05:19.711978 [ERR] agent: failed to sync remote state: No cluster leader
TestDNS_ServiceLookupPreferNoCNAME - 2019/12/06 06:05:19.803378 [ERR] consul: failed to establish leadership: error generating CA root certificate: leadership lost while committing log
TestDNS_ServiceLookupPreferNoCNAME - 2019/12/06 06:05:19.803648 [INFO] agent: consul server down
TestDNS_ServiceLookupPreferNoCNAME - 2019/12/06 06:05:19.803713 [INFO] agent: shutdown complete
TestDNS_ServiceLookupPreferNoCNAME - 2019/12/06 06:05:19.803790 [INFO] agent: Stopping DNS server 127.0.0.1:34709 (tcp)
TestDNS_ServiceLookupPreferNoCNAME - 2019/12/06 06:05:19.803987 [INFO] agent: Stopping DNS server 127.0.0.1:34709 (udp)
TestDNS_ServiceLookupPreferNoCNAME - 2019/12/06 06:05:19.804273 [INFO] agent: Stopping HTTP server 127.0.0.1:34710 (tcp)
TestDNS_ServiceLookupPreferNoCNAME - 2019/12/06 06:05:19.804574 [INFO] agent: Waiting for endpoints to shut down
TestDNS_ServiceLookupPreferNoCNAME - 2019/12/06 06:05:19.804658 [INFO] agent: Endpoints down
--- PASS: TestDNS_ServiceLookupPreferNoCNAME (6.17s)
=== CONT  TestDNS_ServiceReverseLookup_IPV6
TestDNS_SOA_Settings - 2019/12/06 06:05:19.806057 [INFO] agent: Synced node info
TestDNS_SOA_Settings - 2019/12/06 06:05:19.812183 [DEBUG] dns: request for name nofoo.node.dc1.consul. type ANY class IN (took 675.349µs) from client 127.0.0.1:46892 (udp)
TestDNS_SOA_Settings - 2019/12/06 06:05:19.812619 [INFO] agent: Requesting shutdown
TestDNS_SOA_Settings - 2019/12/06 06:05:19.812698 [INFO] consul: shutting down server
TestDNS_SOA_Settings - 2019/12/06 06:05:19.812746 [WARN] serf: Shutdown without a Leave
WARNING: bootstrap = true: do not enable unless necessary
TestDNS_ServiceReverseLookup_IPV6 - 2019/12/06 06:05:19.892163 [WARN] agent: Node name "Node 77702ae4-b5db-0c18-d29b-18da6d5692f7" will not be discoverable via DNS due to invalid characters. Valid characters include all alpha-numerics and dashes.
TestDNS_ServiceReverseLookup_IPV6 - 2019/12/06 06:05:19.892743 [DEBUG] tlsutil: Update with version 1
TestDNS_ServiceReverseLookup_IPV6 - 2019/12/06 06:05:19.895853 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
TestDNS_ServiceReverseLookupNodeAddress - 2019/12/06 06:05:19.916882 [DEBUG] dns: request for {1.0.0.127.in-addr.arpa. 255 1} (502.678µs) from client 127.0.0.1:33615 (udp)
TestDNS_ServiceReverseLookupNodeAddress - 2019/12/06 06:05:19.916964 [INFO] agent: Requesting shutdown
TestDNS_ServiceReverseLookupNodeAddress - 2019/12/06 06:05:19.917041 [INFO] consul: shutting down server
TestDNS_ServiceReverseLookupNodeAddress - 2019/12/06 06:05:19.917086 [WARN] serf: Shutdown without a Leave
TestDNS_SOA_Settings - 2019/12/06 06:05:20.003153 [WARN] serf: Shutdown without a Leave
TestDNS_ServiceReverseLookupNodeAddress - 2019/12/06 06:05:20.005124 [WARN] serf: Shutdown without a Leave
TestDNS_ServiceReverseLookupNodeAddress - 2019/12/06 06:05:20.048517 [DEBUG] agent: Node info in sync
TestDNS_SOA_Settings - 2019/12/06 06:05:20.136674 [INFO] manager: shutting down
TestDNS_SOA_Settings - 2019/12/06 06:05:20.137596 [INFO] agent: consul server down
TestDNS_SOA_Settings - 2019/12/06 06:05:20.137665 [INFO] agent: shutdown complete
TestDNS_SOA_Settings - 2019/12/06 06:05:20.137734 [INFO] agent: Stopping DNS server 127.0.0.1:34721 (tcp)
TestDNS_SOA_Settings - 2019/12/06 06:05:20.137927 [INFO] agent: Stopping DNS server 127.0.0.1:34721 (udp)
TestDNS_SOA_Settings - 2019/12/06 06:05:20.138123 [INFO] agent: Stopping HTTP server 127.0.0.1:34722 (tcp)
TestDNS_SOA_Settings - 2019/12/06 06:05:20.138355 [INFO] agent: Waiting for endpoints to shut down
TestDNS_SOA_Settings - 2019/12/06 06:05:20.138434 [INFO] agent: Endpoints down
TestDNS_SOA_Settings - 2019/12/06 06:05:20.138809 [ERR] consul: failed to establish leadership: raft is already shutdown
TestDNS_SOA_Settings - 2019/12/06 06:05:20.139007 [ERR] consul: failed to transfer leadership attempt 0/3: raft is already shutdown
TestDNS_SOA_Settings - 2019/12/06 06:05:20.139067 [ERR] consul: failed to transfer leadership attempt 1/3: raft is already shutdown
TestDNS_SOA_Settings - 2019/12/06 06:05:20.139121 [ERR] consul: failed to transfer leadership attempt 2/3: raft is already shutdown
TestDNS_SOA_Settings - 2019/12/06 06:05:20.139164 [ERR] consul: failed to transfer leadership in 3 attempts
TestDNS_ServiceReverseLookupNodeAddress - 2019/12/06 06:05:20.139523 [INFO] manager: shutting down
WARNING: bootstrap = true: do not enable unless necessary
TestDNS_SOA_Settings - 2019/12/06 06:05:20.215370 [WARN] agent: Node name "Node 98dcbb2e-de6e-44bd-53f3-e35cb351c7b8" will not be discoverable via DNS due to invalid characters. Valid characters include all alpha-numerics and dashes.
TestDNS_SOA_Settings - 2019/12/06 06:05:20.216193 [DEBUG] tlsutil: Update with version 1
TestDNS_SOA_Settings - 2019/12/06 06:05:20.219024 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
TestDNS_ServiceReverseLookupNodeAddress - 2019/12/06 06:05:20.299863 [INFO] agent: consul server down
TestDNS_ServiceReverseLookupNodeAddress - 2019/12/06 06:05:20.299951 [INFO] agent: shutdown complete
TestDNS_ServiceReverseLookupNodeAddress - 2019/12/06 06:05:20.300037 [INFO] agent: Stopping DNS server 127.0.0.1:34715 (tcp)
TestDNS_ServiceReverseLookupNodeAddress - 2019/12/06 06:05:20.300230 [INFO] agent: Stopping DNS server 127.0.0.1:34715 (udp)
TestDNS_ServiceReverseLookupNodeAddress - 2019/12/06 06:05:20.300432 [INFO] agent: Stopping HTTP server 127.0.0.1:34716 (tcp)
TestDNS_ServiceReverseLookupNodeAddress - 2019/12/06 06:05:20.300649 [INFO] agent: Waiting for endpoints to shut down
TestDNS_ServiceReverseLookupNodeAddress - 2019/12/06 06:05:20.300723 [INFO] agent: Endpoints down
--- PASS: TestDNS_ServiceReverseLookupNodeAddress (5.54s)
=== CONT  TestDNS_ReverseLookup_IPV6
TestDNS_ServiceReverseLookupNodeAddress - 2019/12/06 06:05:20.301288 [ERR] consul: failed to establish leadership: leadership lost while committing log
TestDNS_ServiceReverseLookupNodeAddress - 2019/12/06 06:05:20.301461 [ERR] consul: failed to transfer leadership attempt 0/3: raft is already shutdown
WARNING: bootstrap = true: do not enable unless necessary
TestDNS_ReverseLookup_IPV6 - 2019/12/06 06:05:20.362495 [WARN] agent: Node name "Node 236e77c8-9d58-fbbf-e428-480adea6e91c" will not be discoverable via DNS due to invalid characters. Valid characters include all alpha-numerics and dashes.
TestDNS_ReverseLookup_IPV6 - 2019/12/06 06:05:20.362936 [DEBUG] tlsutil: Update with version 1
TestDNS_ReverseLookup_IPV6 - 2019/12/06 06:05:20.365255 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
TestEventFire_token - 2019/12/06 06:05:20.617361 [ERR] acl: failed to apply acl token upgrade batch: raft is already shutdown
jones - 2019/12/06 06:05:20.961534 [DEBUG] agent: Skipping remote check "serfHealth" since it is managed automatically
jones - 2019/12/06 06:05:20.961606 [DEBUG] agent: Service "api-proxy-sidecar" in sync
jones - 2019/12/06 06:05:20.961641 [DEBUG] agent: Node info in sync
2019/12/06 06:05:21 [INFO]  raft: Initial configuration (index=1): [{Suffrage:Voter ID:75454929-503b-f801-cb13-4afdf4c26c74 Address:127.0.0.1:34732}]
TestDNS_ServiceReverseLookup_CustomDomain - 2019/12/06 06:05:21.302949 [INFO] serf: EventMemberJoin: Node 75454929-503b-f801-cb13-4afdf4c26c74.dc1 127.0.0.1
TestDNS_ServiceReverseLookup_CustomDomain - 2019/12/06 06:05:21.310992 [INFO] serf: EventMemberJoin: Node 75454929-503b-f801-cb13-4afdf4c26c74 127.0.0.1
TestDNS_ServiceReverseLookup_CustomDomain - 2019/12/06 06:05:21.312646 [INFO] agent: Started DNS server 127.0.0.1:34727 (udp)
TestDNS_ServiceReverseLookup_CustomDomain - 2019/12/06 06:05:21.313036 [INFO] consul: Handled member-join event for server "Node 75454929-503b-f801-cb13-4afdf4c26c74.dc1" in area "wan"
TestDNS_ServiceReverseLookup_CustomDomain - 2019/12/06 06:05:21.313062 [INFO] agent: Started DNS server 127.0.0.1:34727 (tcp)
2019/12/06 06:05:21 [INFO]  raft: Node at 127.0.0.1:34732 [Follower] entering Follower state (Leader: "")
TestDNS_ServiceReverseLookup_CustomDomain - 2019/12/06 06:05:21.313544 [INFO] consul: Adding LAN server Node 75454929-503b-f801-cb13-4afdf4c26c74 (Addr: tcp/127.0.0.1:34732) (DC: dc1)
TestDNS_ServiceReverseLookup_CustomDomain - 2019/12/06 06:05:21.315826 [INFO] agent: Started HTTP server on 127.0.0.1:34728 (tcp)
TestDNS_ServiceReverseLookup_CustomDomain - 2019/12/06 06:05:21.316196 [INFO] agent: started state syncer
2019/12/06 06:05:21 [WARN]  raft: Heartbeat timeout from "" reached, starting election
2019/12/06 06:05:21 [INFO]  raft: Node at 127.0.0.1:34732 [Candidate] entering Candidate state in term 2
TestEventFire_token - 2019/12/06 06:05:21.617107 [ERR] acl: failed to apply acl token upgrade batch: raft is already shutdown
2019/12/06 06:05:21 [INFO]  raft: Initial configuration (index=1): [{Suffrage:Voter ID:77702ae4-b5db-0c18-d29b-18da6d5692f7 Address:127.0.0.1:34738}]
TestDNS_ServiceReverseLookup_IPV6 - 2019/12/06 06:05:21.642583 [INFO] serf: EventMemberJoin: Node 77702ae4-b5db-0c18-d29b-18da6d5692f7.dc1 127.0.0.1
2019/12/06 06:05:21 [INFO]  raft: Node at 127.0.0.1:34738 [Follower] entering Follower state (Leader: "")
TestDNS_ServiceReverseLookup_IPV6 - 2019/12/06 06:05:21.647365 [INFO] serf: EventMemberJoin: Node 77702ae4-b5db-0c18-d29b-18da6d5692f7 127.0.0.1
TestDNS_ServiceReverseLookup_IPV6 - 2019/12/06 06:05:21.648685 [INFO] consul: Handled member-join event for server "Node 77702ae4-b5db-0c18-d29b-18da6d5692f7.dc1" in area "wan"
TestDNS_ServiceReverseLookup_IPV6 - 2019/12/06 06:05:21.648867 [INFO] consul: Adding LAN server Node 77702ae4-b5db-0c18-d29b-18da6d5692f7 (Addr: tcp/127.0.0.1:34738) (DC: dc1)
TestDNS_ServiceReverseLookup_IPV6 - 2019/12/06 06:05:21.649360 [INFO] agent: Started DNS server 127.0.0.1:34733 (tcp)
TestDNS_ServiceReverseLookup_IPV6 - 2019/12/06 06:05:21.650046 [INFO] agent: Started DNS server 127.0.0.1:34733 (udp)
TestDNS_ServiceReverseLookup_IPV6 - 2019/12/06 06:05:21.661702 [INFO] agent: Started HTTP server on 127.0.0.1:34734 (tcp)
TestDNS_ServiceReverseLookup_IPV6 - 2019/12/06 06:05:21.661816 [INFO] agent: started state syncer
2019/12/06 06:05:21 [WARN]  raft: Heartbeat timeout from "" reached, starting election
2019/12/06 06:05:21 [INFO]  raft: Node at 127.0.0.1:34738 [Candidate] entering Candidate state in term 2
TestEventFire_token - 2019/12/06 06:05:22.617279 [ERR] acl: failed to apply acl token upgrade batch: raft is already shutdown
2019/12/06 06:05:23 [INFO]  raft: Initial configuration (index=1): [{Suffrage:Voter ID:236e77c8-9d58-fbbf-e428-480adea6e91c Address:127.0.0.1:34750}]
2019/12/06 06:05:23 [INFO]  raft: Node at 127.0.0.1:34750 [Follower] entering Follower state (Leader: "")
TestDNS_ReverseLookup_IPV6 - 2019/12/06 06:05:23.175392 [INFO] serf: EventMemberJoin: Node 236e77c8-9d58-fbbf-e428-480adea6e91c.dc1 127.0.0.1
TestDNS_ReverseLookup_IPV6 - 2019/12/06 06:05:23.190512 [INFO] serf: EventMemberJoin: Node 236e77c8-9d58-fbbf-e428-480adea6e91c 127.0.0.1
TestDNS_ReverseLookup_IPV6 - 2019/12/06 06:05:23.192330 [INFO] consul: Adding LAN server Node 236e77c8-9d58-fbbf-e428-480adea6e91c (Addr: tcp/127.0.0.1:34750) (DC: dc1)
TestDNS_ReverseLookup_IPV6 - 2019/12/06 06:05:23.193979 [INFO] consul: Handled member-join event for server "Node 236e77c8-9d58-fbbf-e428-480adea6e91c.dc1" in area "wan"
TestDNS_ReverseLookup_IPV6 - 2019/12/06 06:05:23.196328 [INFO] agent: Started DNS server 127.0.0.1:34745 (tcp)
TestDNS_ReverseLookup_IPV6 - 2019/12/06 06:05:23.198298 [INFO] agent: Started DNS server 127.0.0.1:34745 (udp)
TestDNS_ReverseLookup_IPV6 - 2019/12/06 06:05:23.204901 [INFO] agent: Started HTTP server on 127.0.0.1:34746 (tcp)
TestDNS_ReverseLookup_IPV6 - 2019/12/06 06:05:23.205024 [INFO] agent: started state syncer
2019/12/06 06:05:23 [WARN]  raft: Heartbeat timeout from "" reached, starting election
2019/12/06 06:05:23 [INFO]  raft: Node at 127.0.0.1:34750 [Candidate] entering Candidate state in term 2
2019/12/06 06:05:23 [INFO]  raft: Election won. Tally: 1
2019/12/06 06:05:23 [INFO]  raft: Node at 127.0.0.1:34732 [Leader] entering Leader state
TestDNS_ServiceReverseLookup_CustomDomain - 2019/12/06 06:05:23.299618 [INFO] consul: cluster leadership acquired
TestDNS_ServiceReverseLookup_CustomDomain - 2019/12/06 06:05:23.300187 [INFO] consul: New leader elected: Node 75454929-503b-f801-cb13-4afdf4c26c74
2019/12/06 06:05:23 [INFO]  raft: Initial configuration (index=1): [{Suffrage:Voter ID:98dcbb2e-de6e-44bd-53f3-e35cb351c7b8 Address:127.0.0.1:34744}]
2019/12/06 06:05:23 [INFO]  raft: Node at 127.0.0.1:34744 [Follower] entering Follower state (Leader: "")
TestDNS_SOA_Settings - 2019/12/06 06:05:23.306826 [INFO] serf: EventMemberJoin: Node 98dcbb2e-de6e-44bd-53f3-e35cb351c7b8.dc1 127.0.0.1
TestDNS_SOA_Settings - 2019/12/06 06:05:23.315236 [INFO] serf: EventMemberJoin: Node 98dcbb2e-de6e-44bd-53f3-e35cb351c7b8 127.0.0.1
TestDNS_SOA_Settings - 2019/12/06 06:05:23.316082 [INFO] consul: Adding LAN server Node 98dcbb2e-de6e-44bd-53f3-e35cb351c7b8 (Addr: tcp/127.0.0.1:34744) (DC: dc1)
TestDNS_SOA_Settings - 2019/12/06 06:05:23.316732 [INFO] consul: Handled member-join event for server "Node 98dcbb2e-de6e-44bd-53f3-e35cb351c7b8.dc1" in area "wan"
TestDNS_SOA_Settings - 2019/12/06 06:05:23.318819 [INFO] agent: Started DNS server 127.0.0.1:34739 (tcp)
TestDNS_SOA_Settings - 2019/12/06 06:05:23.319363 [INFO] agent: Started DNS server 127.0.0.1:34739 (udp)
TestDNS_SOA_Settings - 2019/12/06 06:05:23.322102 [INFO] agent: Started HTTP server on 127.0.0.1:34740 (tcp)
TestDNS_SOA_Settings - 2019/12/06 06:05:23.322203 [INFO] agent: started state syncer
2019/12/06 06:05:23 [WARN]  raft: Heartbeat timeout from "" reached, starting election
2019/12/06 06:05:23 [INFO]  raft: Node at 127.0.0.1:34744 [Candidate] entering Candidate state in term 2
2019/12/06 06:05:23 [INFO]  raft: Election won. Tally: 1
2019/12/06 06:05:23 [INFO]  raft: Node at 127.0.0.1:34738 [Leader] entering Leader state
TestDNS_ServiceReverseLookup_IPV6 - 2019/12/06 06:05:23.530517 [INFO] consul: cluster leadership acquired
TestDNS_ServiceReverseLookup_IPV6 - 2019/12/06 06:05:23.530995 [INFO] consul: New leader elected: Node 77702ae4-b5db-0c18-d29b-18da6d5692f7
TestEventFire_token - 2019/12/06 06:05:23.617155 [ERR] acl: failed to apply acl token upgrade batch: raft is already shutdown
TestDNS_ServiceReverseLookup_CustomDomain - 2019/12/06 06:05:23.929835 [INFO] agent: Synced node info
TestDNS_ServiceReverseLookup_CustomDomain - 2019/12/06 06:05:23.929997 [DEBUG] agent: Node info in sync
2019/12/06 06:05:24 [INFO]  raft: Election won. Tally: 1
2019/12/06 06:05:24 [INFO]  raft: Election won. Tally: 1
2019/12/06 06:05:24 [INFO]  raft: Node at 127.0.0.1:34744 [Leader] entering Leader state
TestDNS_SOA_Settings - 2019/12/06 06:05:24.179042 [INFO] consul: cluster leadership acquired
TestDNS_SOA_Settings - 2019/12/06 06:05:24.179599 [INFO] consul: New leader elected: Node 98dcbb2e-de6e-44bd-53f3-e35cb351c7b8
TestDNS_ServiceReverseLookup_IPV6 - 2019/12/06 06:05:24.181456 [INFO] agent: Synced node info
2019/12/06 06:05:24 [INFO]  raft: Node at 127.0.0.1:34750 [Leader] entering Leader state
TestDNS_ReverseLookup_IPV6 - 2019/12/06 06:05:24.184139 [INFO] consul: cluster leadership acquired
TestDNS_ReverseLookup_IPV6 - 2019/12/06 06:05:24.185550 [INFO] consul: New leader elected: Node 236e77c8-9d58-fbbf-e428-480adea6e91c
TestEventFire_token - 2019/12/06 06:05:24.617089 [ERR] acl: failed to apply acl token upgrade batch: raft is already shutdown
TestDNS_SOA_Settings - 2019/12/06 06:05:24.637387 [INFO] agent: Synced node info
TestDNS_SOA_Settings - 2019/12/06 06:05:24.637524 [DEBUG] agent: Node info in sync
TestDNS_SOA_Settings - 2019/12/06 06:05:24.661529 [DEBUG] dns: request for name nofoo.node.dc1.consul. type ANY class IN (took 448.011µs) from client 127.0.0.1:58033 (udp)
TestDNS_SOA_Settings - 2019/12/06 06:05:24.661869 [INFO] agent: Requesting shutdown
TestDNS_SOA_Settings - 2019/12/06 06:05:24.661943 [INFO] consul: shutting down server
TestDNS_SOA_Settings - 2019/12/06 06:05:24.661988 [WARN] serf: Shutdown without a Leave
TestDNS_SOA_Settings - 2019/12/06 06:05:24.855031 [WARN] serf: Shutdown without a Leave
TestDNS_ReverseLookup_IPV6 - 2019/12/06 06:05:24.857083 [INFO] agent: Synced node info
TestDNS_SOA_Settings - 2019/12/06 06:05:24.961967 [INFO] manager: shutting down
TestDNS_SOA_Settings - 2019/12/06 06:05:24.965188 [ERR] consul: failed to establish leadership: raft is already shutdown
TestDNS_SOA_Settings - 2019/12/06 06:05:24.965729 [INFO] agent: consul server down
TestDNS_SOA_Settings - 2019/12/06 06:05:24.965776 [INFO] agent: shutdown complete
TestDNS_SOA_Settings - 2019/12/06 06:05:24.965825 [INFO] agent: Stopping DNS server 127.0.0.1:34739 (tcp)
TestDNS_SOA_Settings - 2019/12/06 06:05:24.965947 [INFO] agent: Stopping DNS server 127.0.0.1:34739 (udp)
TestDNS_SOA_Settings - 2019/12/06 06:05:24.966084 [INFO] agent: Stopping HTTP server 127.0.0.1:34740 (tcp)
TestDNS_SOA_Settings - 2019/12/06 06:05:24.966251 [INFO] agent: Waiting for endpoints to shut down
TestDNS_SOA_Settings - 2019/12/06 06:05:24.966313 [INFO] agent: Endpoints down
WARNING: bootstrap = true: do not enable unless necessary
TestDNS_SOA_Settings - 2019/12/06 06:05:25.023644 [WARN] agent: Node name "Node d5b5fb69-8b09-43b1-6ed9-cbeca131d8fc" will not be discoverable via DNS due to invalid characters. Valid characters include all alpha-numerics and dashes.
TestDNS_SOA_Settings - 2019/12/06 06:05:25.024008 [DEBUG] tlsutil: Update with version 1
TestDNS_SOA_Settings - 2019/12/06 06:05:25.026075 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
TestDNS_ReverseLookup_IPV6 - 2019/12/06 06:05:25.214433 [DEBUG] agent: Node info in sync
TestDNS_ReverseLookup_IPV6 - 2019/12/06 06:05:25.214649 [DEBUG] agent: Node info in sync
TestDNS_ServiceReverseLookup_IPV6 - 2019/12/06 06:05:25.249688 [DEBUG] agent: Node info in sync
TestDNS_ServiceReverseLookup_IPV6 - 2019/12/06 06:05:25.249780 [DEBUG] agent: Node info in sync
TestEventFire_token - 2019/12/06 06:05:25.617106 [ERR] acl: failed to apply acl token upgrade batch: raft is already shutdown
TestDNS_ServiceReverseLookup_CustomDomain - 2019/12/06 06:05:25.681254 [DEBUG] dns: request for {2.0.0.127.in-addr.arpa. 255 1} (606.014µs) from client 127.0.0.1:41593 (udp)
TestDNS_ServiceReverseLookup_CustomDomain - 2019/12/06 06:05:25.681325 [INFO] agent: Requesting shutdown
TestDNS_ServiceReverseLookup_CustomDomain - 2019/12/06 06:05:25.681460 [INFO] consul: shutting down server
TestDNS_ServiceReverseLookup_CustomDomain - 2019/12/06 06:05:25.681583 [WARN] serf: Shutdown without a Leave
TestDNS_ServiceReverseLookup_CustomDomain - 2019/12/06 06:05:25.844936 [WARN] serf: Shutdown without a Leave
TestDNS_ServiceReverseLookup_CustomDomain - 2019/12/06 06:05:26.003333 [INFO] manager: shutting down
TestDNS_ServiceReverseLookup_CustomDomain - 2019/12/06 06:05:26.504529 [ERR] agent: failed to sync remote state: No cluster leader
TestEventFire_token - 2019/12/06 06:05:26.626366 [ERR] acl: failed to apply acl token upgrade batch: raft is already shutdown
TestDNS_ReverseLookup_IPV6 - 2019/12/06 06:05:26.627371 [DEBUG] dns: request for {2.4.2.4.2.4.2.4.0.0.0.0.0.0.0.0.0.0.0.0.0.0.0.0.0.0.0.0.0.0.0.0.ip6.arpa. 255 1} (1.045691ms) from client 127.0.0.1:40264 (udp)
TestDNS_ReverseLookup_IPV6 - 2019/12/06 06:05:26.627742 [INFO] agent: Requesting shutdown
TestDNS_ReverseLookup_IPV6 - 2019/12/06 06:05:26.627845 [INFO] consul: shutting down server
TestDNS_ReverseLookup_IPV6 - 2019/12/06 06:05:26.627907 [WARN] serf: Shutdown without a Leave
TestDNS_ReverseLookup_IPV6 - 2019/12/06 06:05:26.832352 [WARN] serf: Shutdown without a Leave
TestDNS_ReverseLookup_IPV6 - 2019/12/06 06:05:27.028582 [INFO] manager: shutting down
TestDNS_ServiceReverseLookup_CustomDomain - 2019/12/06 06:05:27.029606 [ERR] consul: failed to establish leadership: leadership lost while committing log
TestDNS_ServiceReverseLookup_IPV6 - 2019/12/06 06:05:27.029705 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
TestDNS_ServiceReverseLookup_CustomDomain - 2019/12/06 06:05:27.029766 [INFO] agent: consul server down
TestDNS_ServiceReverseLookup_CustomDomain - 2019/12/06 06:05:27.029811 [INFO] agent: shutdown complete
TestDNS_ServiceReverseLookup_CustomDomain - 2019/12/06 06:05:27.029868 [INFO] agent: Stopping DNS server 127.0.0.1:34727 (tcp)
TestDNS_ServiceReverseLookup_CustomDomain - 2019/12/06 06:05:27.030036 [INFO] agent: Stopping DNS server 127.0.0.1:34727 (udp)
TestDNS_ServiceReverseLookup_CustomDomain - 2019/12/06 06:05:27.030247 [INFO] agent: Stopping HTTP server 127.0.0.1:34728 (tcp)
TestDNS_ServiceReverseLookup_CustomDomain - 2019/12/06 06:05:27.030514 [INFO] agent: Waiting for endpoints to shut down
TestDNS_ServiceReverseLookup_CustomDomain - 2019/12/06 06:05:27.030594 [INFO] agent: Endpoints down
--- PASS: TestDNS_ServiceReverseLookup_CustomDomain (8.40s)
=== CONT  TestCoordinate_Update
TestDNS_ServiceReverseLookup_IPV6 - 2019/12/06 06:05:27.037089 [DEBUG] dns: request for {9.2.3.8.2.4.0.0.0.0.f.f.0.0.0.0.0.0.0.0.0.0.0.0.8.b.d.0.1.0.0.2.ip6.arpa. 255 1} (851.02µs) from client 127.0.0.1:36068 (udp)
TestDNS_ServiceReverseLookup_IPV6 - 2019/12/06 06:05:27.042523 [INFO] agent: Requesting shutdown
TestDNS_ServiceReverseLookup_IPV6 - 2019/12/06 06:05:27.042625 [INFO] consul: shutting down server
TestDNS_ServiceReverseLookup_IPV6 - 2019/12/06 06:05:27.042679 [WARN] serf: Shutdown without a Leave
WARNING: bootstrap = true: do not enable unless necessary
TestCoordinate_Update - 2019/12/06 06:05:27.098512 [WARN] agent: Node name "Node cbb60d59-2fe3-b186-e279-7d41b8096d95" will not be discoverable via DNS due to invalid characters. Valid characters include all alpha-numerics and dashes.
TestCoordinate_Update - 2019/12/06 06:05:27.098915 [DEBUG] tlsutil: Update with version 1
TestCoordinate_Update - 2019/12/06 06:05:27.101362 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
TestDNS_ServiceReverseLookup_IPV6 - 2019/12/06 06:05:27.155595 [WARN] serf: Shutdown without a Leave
TestDNS_ReverseLookup_IPV6 - 2019/12/06 06:05:27.155836 [INFO] agent: consul server down
TestDNS_ReverseLookup_IPV6 - 2019/12/06 06:05:27.155889 [INFO] agent: shutdown complete
TestDNS_ReverseLookup_IPV6 - 2019/12/06 06:05:27.155946 [INFO] agent: Stopping DNS server 127.0.0.1:34745 (tcp)
TestDNS_ReverseLookup_IPV6 - 2019/12/06 06:05:27.156103 [INFO] agent: Stopping DNS server 127.0.0.1:34745 (udp)
TestDNS_ReverseLookup_IPV6 - 2019/12/06 06:05:27.156274 [INFO] agent: Stopping HTTP server 127.0.0.1:34746 (tcp)
TestDNS_ReverseLookup_IPV6 - 2019/12/06 06:05:27.156548 [INFO] agent: Waiting for endpoints to shut down
TestDNS_ReverseLookup_IPV6 - 2019/12/06 06:05:27.156642 [INFO] agent: Endpoints down
--- PASS: TestDNS_ReverseLookup_IPV6 (6.86s)
=== CONT  TestDNS_ReverseLookup_CustomDomain
TestDNS_ReverseLookup_IPV6 - 2019/12/06 06:05:27.161746 [ERR] consul: failed to establish leadership: leadership lost while committing log
WARNING: bootstrap = true: do not enable unless necessary
TestDNS_ReverseLookup_CustomDomain - 2019/12/06 06:05:27.228111 [WARN] agent: Node name "Node b42d58f2-0ece-d61a-c7fb-7621e5637a09" will not be discoverable via DNS due to invalid characters. Valid characters include all alpha-numerics and dashes.
TestDNS_ReverseLookup_CustomDomain - 2019/12/06 06:05:27.228609 [DEBUG] tlsutil: Update with version 1
TestDNS_ReverseLookup_CustomDomain - 2019/12/06 06:05:27.230948 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
TestDNS_ServiceReverseLookup_IPV6 - 2019/12/06 06:05:27.398457 [INFO] manager: shutting down
TestDNS_ServiceReverseLookup_IPV6 - 2019/12/06 06:05:27.399310 [INFO] agent: consul server down
TestDNS_ServiceReverseLookup_IPV6 - 2019/12/06 06:05:27.399379 [INFO] agent: shutdown complete
TestDNS_ServiceReverseLookup_IPV6 - 2019/12/06 06:05:27.399437 [INFO] agent: Stopping DNS server 127.0.0.1:34733 (tcp)
TestDNS_ServiceReverseLookup_IPV6 - 2019/12/06 06:05:27.399592 [INFO] agent: Stopping DNS server 127.0.0.1:34733 (udp)
TestDNS_ServiceReverseLookup_IPV6 - 2019/12/06 06:05:27.399752 [INFO] agent: Stopping HTTP server 127.0.0.1:34734 (tcp)
TestDNS_ServiceReverseLookup_IPV6 - 2019/12/06 06:05:27.399980 [INFO] agent: Waiting for endpoints to shut down
TestDNS_ServiceReverseLookup_IPV6 - 2019/12/06 06:05:27.400056 [INFO] agent: Endpoints down
--- PASS: TestDNS_ServiceReverseLookup_IPV6 (7.60s)
=== CONT  TestDNS_ReverseLookup
TestDNS_ServiceReverseLookup_IPV6 - 2019/12/06 06:05:27.417416 [ERR] consul: failed to establish leadership: error generating CA root certificate: raft is already shutdown
TestDNS_ServiceReverseLookup_IPV6 - 2019/12/06 06:05:27.418026 [ERR] consul: failed to transfer leadership attempt 0/3: raft is already shutdown
TestDNS_ServiceReverseLookup_IPV6 - 2019/12/06 06:05:27.418111 [ERR] consul: failed to transfer leadership attempt 1/3: raft is already shutdown
TestDNS_ServiceReverseLookup_IPV6 - 2019/12/06 06:05:27.418168 [ERR] consul: failed to transfer leadership attempt 2/3: raft is already shutdown
TestDNS_ServiceReverseLookup_IPV6 - 2019/12/06 06:05:27.418226 [ERR] consul: failed to transfer leadership in 3 attempts
WARNING: bootstrap = true: do not enable unless necessary
TestDNS_ReverseLookup - 2019/12/06 06:05:27.592814 [WARN] agent: Node name "Node dcaba9dd-69dc-ecc3-4a9a-f8efaf35bf1a" will not be discoverable via DNS due to invalid characters. Valid characters include all alpha-numerics and dashes.
TestDNS_ReverseLookup - 2019/12/06 06:05:27.593297 [DEBUG] tlsutil: Update with version 1
TestDNS_ReverseLookup - 2019/12/06 06:05:27.595595 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
TestEventFire_token - 2019/12/06 06:05:27.617277 [ERR] acl: failed to apply acl token upgrade batch: raft is already shutdown
2019/12/06 06:05:28 [INFO]  raft: Initial configuration (index=1): [{Suffrage:Voter ID:d5b5fb69-8b09-43b1-6ed9-cbeca131d8fc Address:127.0.0.1:34756}]
2019/12/06 06:05:28 [INFO]  raft: Node at 127.0.0.1:34756 [Follower] entering Follower state (Leader: "")
2019/12/06 06:05:28 [WARN]  raft: Heartbeat timeout from "" reached, starting election
2019/12/06 06:05:28 [INFO]  raft: Node at 127.0.0.1:34756 [Candidate] entering Candidate state in term 2
TestDNS_SOA_Settings - 2019/12/06 06:05:28.511544 [WARN] raft: Unable to get address for server id d5b5fb69-8b09-43b1-6ed9-cbeca131d8fc, using fallback address 127.0.0.1:34756: Could not find address for server id d5b5fb69-8b09-43b1-6ed9-cbeca131d8fc
TestDNS_SOA_Settings - 2019/12/06 06:05:28.587247 [INFO] serf: EventMemberJoin: Node d5b5fb69-8b09-43b1-6ed9-cbeca131d8fc.dc1 127.0.0.1
TestDNS_SOA_Settings - 2019/12/06 06:05:28.593964 [INFO] serf: EventMemberJoin: Node d5b5fb69-8b09-43b1-6ed9-cbeca131d8fc 127.0.0.1
TestDNS_SOA_Settings - 2019/12/06 06:05:28.595839 [INFO] consul: Adding LAN server Node d5b5fb69-8b09-43b1-6ed9-cbeca131d8fc (Addr: tcp/127.0.0.1:34756) (DC: dc1)
TestDNS_SOA_Settings - 2019/12/06 06:05:28.596293 [INFO] consul: Handled member-join event for server "Node d5b5fb69-8b09-43b1-6ed9-cbeca131d8fc.dc1" in area "wan"
TestDNS_SOA_Settings - 2019/12/06 06:05:28.599343 [INFO] agent: Started DNS server 127.0.0.1:34751 (udp)
TestDNS_SOA_Settings - 2019/12/06 06:05:28.599469 [INFO] agent: Started DNS server 127.0.0.1:34751 (tcp)
TestDNS_SOA_Settings - 2019/12/06 06:05:28.607015 [INFO] agent: Started HTTP server on 127.0.0.1:34752 (tcp)
TestDNS_SOA_Settings - 2019/12/06 06:05:28.607128 [INFO] agent: started state syncer
TestEventFire_token - 2019/12/06 06:05:28.617550 [ERR] acl: failed to apply acl token upgrade batch: raft is already shutdown
2019/12/06 06:05:28 [INFO]  raft: Initial configuration (index=1): [{Suffrage:Voter ID:cbb60d59-2fe3-b186-e279-7d41b8096d95 Address:127.0.0.1:34762}]
2019/12/06 06:05:28 [INFO]  raft: Node at 127.0.0.1:34762 [Follower] entering Follower state (Leader: "")
TestCoordinate_Update - 2019/12/06 06:05:28.691975 [INFO] serf: EventMemberJoin: Node cbb60d59-2fe3-b186-e279-7d41b8096d95.dc1 127.0.0.1
TestCoordinate_Update - 2019/12/06 06:05:28.695841 [INFO] serf: EventMemberJoin: Node cbb60d59-2fe3-b186-e279-7d41b8096d95 127.0.0.1
TestCoordinate_Update - 2019/12/06 06:05:28.696977 [INFO] consul: Adding LAN server Node cbb60d59-2fe3-b186-e279-7d41b8096d95 (Addr: tcp/127.0.0.1:34762) (DC: dc1)
TestCoordinate_Update - 2019/12/06 06:05:28.697513 [INFO] consul: Handled member-join event for server "Node cbb60d59-2fe3-b186-e279-7d41b8096d95.dc1" in area "wan"
TestCoordinate_Update - 2019/12/06 06:05:28.698741 [INFO] agent: Started DNS server 127.0.0.1:34757 (udp)
TestCoordinate_Update - 2019/12/06 06:05:28.698816 [INFO] agent: Started DNS server 127.0.0.1:34757 (tcp)
TestCoordinate_Update - 2019/12/06 06:05:28.705932 [INFO] agent: Started HTTP server on 127.0.0.1:34758 (tcp)
TestCoordinate_Update - 2019/12/06 06:05:28.706401 [INFO] agent: started state syncer
2019/12/06 06:05:28 [WARN]  raft: Heartbeat timeout from "" reached, starting election
2019/12/06 06:05:28 [INFO]  raft: Node at 127.0.0.1:34762 [Candidate] entering Candidate state in term 2
2019/12/06 06:05:29 [INFO]  raft: Election won. Tally: 1
2019/12/06 06:05:29 [INFO]  raft: Node at 127.0.0.1:34756 [Leader] entering Leader state
TestDNS_SOA_Settings - 2019/12/06 06:05:29.429026 [INFO] consul: cluster leadership acquired
TestDNS_SOA_Settings - 2019/12/06 06:05:29.429952 [INFO] consul: New leader elected: Node d5b5fb69-8b09-43b1-6ed9-cbeca131d8fc
jones - 2019/12/06 06:05:29.525000 [DEBUG] agent: Skipping remote check "serfHealth" since it is managed automatically
jones - 2019/12/06 06:05:29.525087 [DEBUG] agent: Node info in sync
TestEventFire_token - 2019/12/06 06:05:29.617147 [ERR] acl: failed to apply acl token upgrade batch: raft is already shutdown
2019/12/06 06:05:29 [INFO]  raft: Initial configuration (index=1): [{Suffrage:Voter ID:b42d58f2-0ece-d61a-c7fb-7621e5637a09 Address:127.0.0.1:34768}]
2019/12/06 06:05:29 [INFO]  raft: Node at 127.0.0.1:34768 [Follower] entering Follower state (Leader: "")
TestDNS_ReverseLookup_CustomDomain - 2019/12/06 06:05:29.668566 [INFO] serf: EventMemberJoin: Node b42d58f2-0ece-d61a-c7fb-7621e5637a09.dc1 127.0.0.1
TestDNS_ReverseLookup_CustomDomain - 2019/12/06 06:05:29.680284 [INFO] serf: EventMemberJoin: Node b42d58f2-0ece-d61a-c7fb-7621e5637a09 127.0.0.1
TestDNS_ReverseLookup_CustomDomain - 2019/12/06 06:05:29.684261 [INFO] consul: Adding LAN server Node b42d58f2-0ece-d61a-c7fb-7621e5637a09 (Addr: tcp/127.0.0.1:34768) (DC: dc1)
TestDNS_ReverseLookup_CustomDomain - 2019/12/06 06:05:29.684844 [INFO] consul: Handled member-join event for server "Node b42d58f2-0ece-d61a-c7fb-7621e5637a09.dc1" in area "wan"
TestDNS_ReverseLookup_CustomDomain - 2019/12/06 06:05:29.690135 [INFO] agent: Started DNS server 127.0.0.1:34763 (tcp)
TestDNS_ReverseLookup_CustomDomain - 2019/12/06 06:05:29.690216 [INFO] agent: Started DNS server 127.0.0.1:34763 (udp)
TestDNS_ReverseLookup_CustomDomain - 2019/12/06 06:05:29.693208 [INFO] agent: Started HTTP server on 127.0.0.1:34764 (tcp)
TestDNS_ReverseLookup_CustomDomain - 2019/12/06 06:05:29.693297 [INFO] agent: started state syncer
2019/12/06 06:05:29 [WARN]  raft: Heartbeat timeout from "" reached, starting election
2019/12/06 06:05:29 [INFO]  raft: Node at 127.0.0.1:34768 [Candidate] entering Candidate state in term 2
TestDNS_SOA_Settings - 2019/12/06 06:05:29.921109 [INFO] agent: Synced node info
TestDNS_SOA_Settings - 2019/12/06 06:05:29.933438 [INFO] agent: Requesting shutdown
TestDNS_SOA_Settings - 2019/12/06 06:05:29.933563 [INFO] consul: shutting down server
TestDNS_SOA_Settings - 2019/12/06 06:05:29.933615 [WARN] serf: Shutdown without a Leave
TestDNS_SOA_Settings - 2019/12/06 06:05:29.936401 [DEBUG] dns: request for name nofoo.node.dc1.consul. type ANY class IN (took 4.582439ms) from client 127.0.0.1:51284 (udp)
2019/12/06 06:05:30 [INFO]  raft: Initial configuration (index=1): [{Suffrage:Voter ID:dcaba9dd-69dc-ecc3-4a9a-f8efaf35bf1a Address:127.0.0.1:34774}]
2019/12/06 06:05:30 [INFO]  raft: Node at 127.0.0.1:34774 [Follower] entering Follower state (Leader: "")
TestDNS_ReverseLookup - 2019/12/06 06:05:30.195202 [INFO] serf: EventMemberJoin: Node dcaba9dd-69dc-ecc3-4a9a-f8efaf35bf1a.dc1 127.0.0.1
TestDNS_SOA_Settings - 2019/12/06 06:05:30.196278 [WARN] serf: Shutdown without a Leave
TestDNS_ReverseLookup - 2019/12/06 06:05:30.200073 [INFO] serf: EventMemberJoin: Node dcaba9dd-69dc-ecc3-4a9a-f8efaf35bf1a 127.0.0.1
TestDNS_ReverseLookup - 2019/12/06 06:05:30.201065 [INFO] consul: Adding LAN server Node dcaba9dd-69dc-ecc3-4a9a-f8efaf35bf1a (Addr: tcp/127.0.0.1:34774) (DC: dc1)
TestDNS_ReverseLookup - 2019/12/06 06:05:30.201366 [INFO] consul: Handled member-join event for server "Node dcaba9dd-69dc-ecc3-4a9a-f8efaf35bf1a.dc1" in area "wan"
TestDNS_ReverseLookup - 2019/12/06 06:05:30.201613 [INFO] agent: Started DNS server 127.0.0.1:34769 (udp)
TestDNS_ReverseLookup - 2019/12/06 06:05:30.201851 [INFO] agent: Started DNS server 127.0.0.1:34769 (tcp)
TestDNS_ReverseLookup - 2019/12/06 06:05:30.204343 [INFO] agent: Started HTTP server on 127.0.0.1:34770 (tcp)
TestDNS_ReverseLookup - 2019/12/06 06:05:30.204870 [INFO] agent: started state syncer
TestDNS_SOA_Settings - 2019/12/06 06:05:30.231027 [DEBUG] agent: Node info in sync
TestDNS_SOA_Settings - 2019/12/06 06:05:30.231137 [DEBUG] agent: Node info in sync
2019/12/06 06:05:30 [WARN]  raft: Heartbeat timeout from "" reached, starting election
2019/12/06 06:05:30 [INFO]  raft: Node at 127.0.0.1:34774 [Candidate] entering Candidate state in term 2
TestDNS_SOA_Settings - 2019/12/06 06:05:30.438990 [INFO] manager: shutting down
TestDNS_SOA_Settings - 2019/12/06 06:05:30.439777 [INFO] agent: consul server down
TestDNS_SOA_Settings - 2019/12/06 06:05:30.439863 [INFO] agent: shutdown complete
TestDNS_SOA_Settings - 2019/12/06 06:05:30.439922 [INFO] agent: Stopping DNS server 127.0.0.1:34751 (tcp)
TestDNS_SOA_Settings - 2019/12/06 06:05:30.439781 [ERR] consul: failed to establish leadership: raft is already shutdown
TestDNS_SOA_Settings - 2019/12/06 06:05:30.440082 [INFO] agent: Stopping DNS server 127.0.0.1:34751 (udp)
TestDNS_SOA_Settings - 2019/12/06 06:05:30.440278 [INFO] agent: Stopping HTTP server 127.0.0.1:34752 (tcp)
TestDNS_SOA_Settings - 2019/12/06 06:05:30.440494 [INFO] agent: Waiting for endpoints to shut down
TestDNS_SOA_Settings - 2019/12/06 06:05:30.440561 [INFO] agent: Endpoints down
WARNING: bootstrap = true: do not enable unless necessary
TestDNS_SOA_Settings - 2019/12/06 06:05:30.551985 [WARN] agent: Node name "Node 5a6c8e34-0535-5d63-7e79-e0f886da3551" will not be discoverable via DNS due to invalid characters. Valid characters include all alpha-numerics and dashes.
TestDNS_SOA_Settings - 2019/12/06 06:05:30.552696 [DEBUG] tlsutil: Update with version 1
TestDNS_SOA_Settings - 2019/12/06 06:05:30.557792 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
TestEventFire_token - 2019/12/06 06:05:30.617646 [ERR] acl: failed to apply acl token upgrade batch: raft is already shutdown
TestEventFire_token - 2019/12/06 06:05:31.617313 [ERR] acl: failed to apply acl token upgrade batch: raft is already shutdown
TestEventFire_token - 2019/12/06 06:05:32.617250 [ERR] acl: failed to apply acl token upgrade batch: raft is already shutdown
TestEventFire_token - 2019/12/06 06:05:33.617259 [ERR] acl: failed to apply acl token upgrade batch: raft is already shutdown
2019/12/06 06:05:34 [INFO]  raft: Election won. Tally: 1
2019/12/06 06:05:34 [INFO]  raft: Node at 127.0.0.1:34768 [Leader] entering Leader state
TestDNS_ReverseLookup_CustomDomain - 2019/12/06 06:05:34.479742 [INFO] consul: cluster leadership acquired
TestDNS_ReverseLookup_CustomDomain - 2019/12/06 06:05:34.480246 [INFO] consul: New leader elected: Node b42d58f2-0ece-d61a-c7fb-7621e5637a09
TestEventFire_token - 2019/12/06 06:05:34.617182 [ERR] acl: failed to apply acl token upgrade batch: raft is already shutdown
TestEventFire_token - 2019/12/06 06:05:35.617636 [ERR] acl: failed to apply acl token upgrade batch: raft is already shutdown
TestCoordinate_Update - 2019/12/06 06:05:35.878443 [ERR] agent: failed to sync remote state: No cluster leader
--- FAIL: TestCoordinate_Update (8.90s)
    retry.go:121: testagent.go:210: TestCoordinate_UpdateCatalog.ListNodes failed:No cluster leader
        
=== CONT  TestDNS_EDNS0_ECS
WARNING: bootstrap = true: do not enable unless necessary
TestDNS_EDNS0_ECS - 2019/12/06 06:05:35.980682 [WARN] agent: Node name "Node 0b43c023-9fbf-8988-754a-23a7e8386d43" will not be discoverable via DNS due to invalid characters. Valid characters include all alpha-numerics and dashes.
TestDNS_EDNS0_ECS - 2019/12/06 06:05:35.981114 [DEBUG] tlsutil: Update with version 1
TestDNS_EDNS0_ECS - 2019/12/06 06:05:35.983315 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
TestEventFire_token - 2019/12/06 06:05:36.617193 [ERR] acl: failed to apply acl token upgrade batch: raft is already shutdown
--- FAIL: TestDNS_ReverseLookup (10.10s)
    retry.go:121: testagent.go:210: TestDNS_ReverseLookupCatalog.ListNodes failed:No cluster leader
        
=== CONT  TestDNS_EDNS0
TestDNS_ReverseLookup - 2019/12/06 06:05:37.502018 [ERR] agent: failed to sync remote state: No cluster leader
TestEventFire_token - 2019/12/06 06:05:37.617777 [ERR] acl: failed to apply acl token upgrade batch: raft is already shutdown
WARNING: bootstrap = true: do not enable unless necessary
TestDNS_EDNS0 - 2019/12/06 06:05:37.629086 [WARN] agent: Node name "Node e2f7c3a2-7591-1a27-bed0-1695cd141eef" will not be discoverable via DNS due to invalid characters. Valid characters include all alpha-numerics and dashes.
TestDNS_EDNS0 - 2019/12/06 06:05:37.629867 [DEBUG] tlsutil: Update with version 1
TestDNS_EDNS0 - 2019/12/06 06:05:37.632406 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
TestEventFire_token - 2019/12/06 06:05:38.617603 [ERR] acl: failed to apply acl token upgrade batch: raft is already shutdown
TestEventFire_token - 2019/12/06 06:05:39.617162 [ERR] acl: failed to apply acl token upgrade batch: raft is already shutdown
jones - 2019/12/06 06:05:40.138737 [DEBUG] consul: Skipping self join check for "Node a4d824d2-6dd5-669c-d5c6-f0df82645ad2" since the cluster is too small
TestEventFire_token - 2019/12/06 06:05:40.617442 [ERR] acl: failed to apply acl token upgrade batch: raft is already shutdown
2019/12/06 06:05:40 [INFO]  raft: Election won. Tally: 1
2019/12/06 06:05:40 [INFO]  raft: Node at 127.0.0.1:34774 [Leader] entering Leader state
TestDNS_ReverseLookup - 2019/12/06 06:05:40.673122 [INFO] consul: cluster leadership acquired
TestDNS_ReverseLookup - 2019/12/06 06:05:40.673777 [INFO] consul: New leader elected: Node dcaba9dd-69dc-ecc3-4a9a-f8efaf35bf1a
jones - 2019/12/06 06:05:41.082570 [DEBUG] consul: Skipping self join check for "Node e7ff374a-e051-0dd1-c5b8-1cb2efc19141" since the cluster is too small
2019/12/06 06:05:41 [INFO]  raft: Election won. Tally: 1
2019/12/06 06:05:41 [INFO]  raft: Node at 127.0.0.1:34762 [Leader] entering Leader state
TestCoordinate_Update - 2019/12/06 06:05:41.085751 [INFO] consul: cluster leadership acquired
TestCoordinate_Update - 2019/12/06 06:05:41.086512 [INFO] consul: New leader elected: Node cbb60d59-2fe3-b186-e279-7d41b8096d95
jones - 2019/12/06 06:05:41.415631 [DEBUG] consul: Skipping self join check for "Node 69e40561-243a-545f-340c-f7edd80028d7" since the cluster is too small
TestDNS_ReverseLookup_CustomDomain - 2019/12/06 06:05:41.417068 [INFO] agent: Synced node info
jones - 2019/12/06 06:05:41.535507 [DEBUG] agent: Skipping remote check "serfHealth" since it is managed automatically
jones - 2019/12/06 06:05:41.535586 [DEBUG] agent: Node info in sync
TestEventFire_token - 2019/12/06 06:05:41.617095 [ERR] acl: failed to apply acl token upgrade batch: raft is already shutdown
jones - 2019/12/06 06:05:42.088280 [DEBUG] agent: Skipping remote check "serfHealth" since it is managed automatically
jones - 2019/12/06 06:05:42.088354 [DEBUG] agent: Node info in sync
TestDNS_ReverseLookup_CustomDomain - 2019/12/06 06:05:42.120032 [DEBUG] agent: Node info in sync
TestDNS_ReverseLookup_CustomDomain - 2019/12/06 06:05:42.120158 [DEBUG] agent: Node info in sync
2019/12/06 06:05:42 [INFO]  raft: Initial configuration (index=1): [{Suffrage:Voter ID:5a6c8e34-0535-5d63-7e79-e0f886da3551 Address:127.0.0.1:34780}]
2019/12/06 06:05:42 [INFO]  raft: Node at 127.0.0.1:34780 [Follower] entering Follower state (Leader: "")
TestDNS_SOA_Settings - 2019/12/06 06:05:42.432641 [INFO] serf: EventMemberJoin: Node 5a6c8e34-0535-5d63-7e79-e0f886da3551.dc1 127.0.0.1
TestDNS_SOA_Settings - 2019/12/06 06:05:42.435668 [INFO] serf: EventMemberJoin: Node 5a6c8e34-0535-5d63-7e79-e0f886da3551 127.0.0.1
TestDNS_SOA_Settings - 2019/12/06 06:05:42.436477 [INFO] consul: Handled member-join event for server "Node 5a6c8e34-0535-5d63-7e79-e0f886da3551.dc1" in area "wan"
TestDNS_SOA_Settings - 2019/12/06 06:05:42.437086 [INFO] agent: Started DNS server 127.0.0.1:34775 (udp)
TestDNS_SOA_Settings - 2019/12/06 06:05:42.437159 [INFO] agent: Started DNS server 127.0.0.1:34775 (tcp)
TestDNS_SOA_Settings - 2019/12/06 06:05:42.439124 [INFO] consul: Adding LAN server Node 5a6c8e34-0535-5d63-7e79-e0f886da3551 (Addr: tcp/127.0.0.1:34780) (DC: dc1)
TestDNS_SOA_Settings - 2019/12/06 06:05:42.441053 [INFO] agent: Started HTTP server on 127.0.0.1:34776 (tcp)
TestDNS_SOA_Settings - 2019/12/06 06:05:42.441174 [INFO] agent: started state syncer
2019/12/06 06:05:42 [WARN]  raft: Heartbeat timeout from "" reached, starting election
2019/12/06 06:05:42 [INFO]  raft: Node at 127.0.0.1:34780 [Candidate] entering Candidate state in term 2
TestEventFire_token - 2019/12/06 06:05:42.617194 [ERR] acl: failed to apply acl token upgrade batch: raft is already shutdown
TestDNS_ReverseLookup - 2019/12/06 06:05:42.829535 [INFO] agent: Synced node info
TestDNS_ReverseLookup - 2019/12/06 06:05:42.829644 [DEBUG] agent: Node info in sync
jones - 2019/12/06 06:05:42.829952 [DEBUG] consul: Skipping self join check for "Node 7d555e7b-b226-ede2-fc13-7639f5dd2636" since the cluster is too small
TestDNS_ReverseLookup_CustomDomain - 2019/12/06 06:05:42.833799 [DEBUG] dns: request for {2.0.0.127.in-addr.arpa. 255 1} (1.264363ms) from client 127.0.0.1:36725 (udp)
TestDNS_ReverseLookup_CustomDomain - 2019/12/06 06:05:42.835320 [INFO] agent: Requesting shutdown
TestDNS_ReverseLookup_CustomDomain - 2019/12/06 06:05:42.835532 [INFO] consul: shutting down server
TestDNS_ReverseLookup_CustomDomain - 2019/12/06 06:05:42.835671 [WARN] serf: Shutdown without a Leave
TestDNS_ReverseLookup_CustomDomain - 2019/12/06 06:05:43.128436 [WARN] serf: Shutdown without a Leave
TestDNS_ReverseLookup_CustomDomain - 2019/12/06 06:05:43.129158 [INFO] manager: shutting down
2019/12/06 06:05:43 [INFO]  raft: Initial configuration (index=1): [{Suffrage:Voter ID:e2f7c3a2-7591-1a27-bed0-1695cd141eef Address:127.0.0.1:34792}]
TestDNS_EDNS0 - 2019/12/06 06:05:43.307686 [INFO] serf: EventMemberJoin: Node e2f7c3a2-7591-1a27-bed0-1695cd141eef.dc1 127.0.0.1
TestDNS_EDNS0 - 2019/12/06 06:05:43.312888 [INFO] serf: EventMemberJoin: Node e2f7c3a2-7591-1a27-bed0-1695cd141eef 127.0.0.1
TestDNS_EDNS0 - 2019/12/06 06:05:43.314069 [INFO] agent: Started DNS server 127.0.0.1:34787 (udp)
2019/12/06 06:05:43 [INFO]  raft: Node at 127.0.0.1:34792 [Follower] entering Follower state (Leader: "")
TestDNS_ReverseLookup_CustomDomain - 2019/12/06 06:05:43.315198 [INFO] agent: consul server down
TestDNS_ReverseLookup_CustomDomain - 2019/12/06 06:05:43.315256 [INFO] agent: shutdown complete
TestDNS_ReverseLookup_CustomDomain - 2019/12/06 06:05:43.315315 [INFO] agent: Stopping DNS server 127.0.0.1:34763 (tcp)
TestDNS_ReverseLookup_CustomDomain - 2019/12/06 06:05:43.315455 [INFO] agent: Stopping DNS server 127.0.0.1:34763 (udp)
TestDNS_ReverseLookup_CustomDomain - 2019/12/06 06:05:43.315595 [INFO] agent: Stopping HTTP server 127.0.0.1:34764 (tcp)
TestDNS_ReverseLookup_CustomDomain - 2019/12/06 06:05:43.315802 [INFO] agent: Waiting for endpoints to shut down
TestDNS_ReverseLookup_CustomDomain - 2019/12/06 06:05:43.315874 [INFO] agent: Endpoints down
--- PASS: TestDNS_ReverseLookup_CustomDomain (16.16s)
=== CONT  TestDNS_NodeLookup_CNAME
TestDNS_EDNS0 - 2019/12/06 06:05:43.318331 [INFO] consul: Adding LAN server Node e2f7c3a2-7591-1a27-bed0-1695cd141eef (Addr: tcp/127.0.0.1:34792) (DC: dc1)
TestDNS_EDNS0 - 2019/12/06 06:05:43.318539 [INFO] consul: Handled member-join event for server "Node e2f7c3a2-7591-1a27-bed0-1695cd141eef.dc1" in area "wan"
TestDNS_EDNS0 - 2019/12/06 06:05:43.319277 [INFO] agent: Started DNS server 127.0.0.1:34787 (tcp)
TestDNS_EDNS0 - 2019/12/06 06:05:43.321516 [INFO] agent: Started HTTP server on 127.0.0.1:34788 (tcp)
TestDNS_EDNS0 - 2019/12/06 06:05:43.321599 [INFO] agent: started state syncer
TestDNS_ReverseLookup_CustomDomain - 2019/12/06 06:05:43.321888 [ERR] consul: failed to establish leadership: leadership lost while committing log
2019/12/06 06:05:43 [WARN]  raft: Heartbeat timeout from "" reached, starting election
2019/12/06 06:05:43 [INFO]  raft: Node at 127.0.0.1:34792 [Candidate] entering Candidate state in term 2
WARNING: bootstrap = true: do not enable unless necessary
TestDNS_NodeLookup_CNAME - 2019/12/06 06:05:43.422288 [WARN] agent: Node name "Node 9748613b-cde4-4e12-aa99-18dff0040283" will not be discoverable via DNS due to invalid characters. Valid characters include all alpha-numerics and dashes.
TestDNS_NodeLookup_CNAME - 2019/12/06 06:05:43.422920 [DEBUG] tlsutil: Update with version 1
TestDNS_NodeLookup_CNAME - 2019/12/06 06:05:43.425788 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
TestEventFire_token - 2019/12/06 06:05:43.617264 [ERR] acl: failed to apply acl token upgrade batch: raft is already shutdown
2019/12/06 06:05:44 [INFO]  raft: Election won. Tally: 1
2019/12/06 06:05:44 [INFO]  raft: Node at 127.0.0.1:34780 [Leader] entering Leader state
TestDNS_SOA_Settings - 2019/12/06 06:05:44.007095 [INFO] consul: cluster leadership acquired
TestDNS_SOA_Settings - 2019/12/06 06:05:44.007544 [INFO] consul: New leader elected: Node 5a6c8e34-0535-5d63-7e79-e0f886da3551
2019/12/06 06:05:44 [INFO]  raft: Election won. Tally: 1
2019/12/06 06:05:44 [INFO]  raft: Node at 127.0.0.1:34792 [Leader] entering Leader state
TestCoordinate_Update - 2019/12/06 06:05:44.507571 [INFO] agent: Synced node info
TestCoordinate_Update - 2019/12/06 06:05:44.507709 [DEBUG] agent: Node info in sync
TestDNS_EDNS0 - 2019/12/06 06:05:44.508262 [INFO] consul: cluster leadership acquired
TestDNS_EDNS0 - 2019/12/06 06:05:44.508726 [INFO] consul: New leader elected: Node e2f7c3a2-7591-1a27-bed0-1695cd141eef
TestEventFire_token - 2019/12/06 06:05:44.617226 [ERR] acl: failed to apply acl token upgrade batch: raft is already shutdown
TestDNS_SOA_Settings - 2019/12/06 06:05:44.665895 [INFO] agent: Synced node info
TestDNS_SOA_Settings - 2019/12/06 06:05:44.666033 [DEBUG] agent: Node info in sync
TestDNS_SOA_Settings - 2019/12/06 06:05:44.678529 [DEBUG] dns: request for name nofoo.node.dc1.consul. type ANY class IN (took 638.015µs) from client 127.0.0.1:60604 (udp)
TestDNS_SOA_Settings - 2019/12/06 06:05:44.679125 [INFO] agent: Requesting shutdown
TestDNS_SOA_Settings - 2019/12/06 06:05:44.679267 [INFO] consul: shutting down server
TestDNS_SOA_Settings - 2019/12/06 06:05:44.679319 [WARN] serf: Shutdown without a Leave
TestDNS_SOA_Settings - 2019/12/06 06:05:44.833135 [WARN] serf: Shutdown without a Leave
TestDNS_SOA_Settings - 2019/12/06 06:05:45.172330 [INFO] manager: shutting down
TestDNS_EDNS0 - 2019/12/06 06:05:45.173486 [INFO] agent: Synced node info
TestDNS_ReverseLookup - 2019/12/06 06:05:45.346977 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
jones - 2019/12/06 06:05:45.347179 [DEBUG] consul: Skipping self join check for "Node a587e71c-195b-52ca-e2b1-0bac5467c444" since the cluster is too small
TestDNS_ReverseLookup - 2019/12/06 06:05:45.348327 [INFO] connect: initialized primary datacenter CA with provider "consul"
TestDNS_ReverseLookup - 2019/12/06 06:05:45.348856 [DEBUG] consul: Skipping self join check for "Node dcaba9dd-69dc-ecc3-4a9a-f8efaf35bf1a" since the cluster is too small
TestDNS_ReverseLookup - 2019/12/06 06:05:45.349056 [INFO] consul: member 'Node dcaba9dd-69dc-ecc3-4a9a-f8efaf35bf1a' joined, marking health alive
TestCoordinate_Update - 2019/12/06 06:05:45.349715 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
TestCoordinate_Update - 2019/12/06 06:05:45.350395 [INFO] connect: initialized primary datacenter CA with provider "consul"
TestCoordinate_Update - 2019/12/06 06:05:45.350804 [DEBUG] consul: Skipping self join check for "Node cbb60d59-2fe3-b186-e279-7d41b8096d95" since the cluster is too small
TestCoordinate_Update - 2019/12/06 06:05:45.350962 [INFO] consul: member 'Node cbb60d59-2fe3-b186-e279-7d41b8096d95' joined, marking health alive
TestDNS_SOA_Settings - 2019/12/06 06:05:45.537261 [INFO] agent: consul server down
TestDNS_SOA_Settings - 2019/12/06 06:05:45.537348 [INFO] agent: shutdown complete
TestDNS_SOA_Settings - 2019/12/06 06:05:45.537421 [INFO] agent: Stopping DNS server 127.0.0.1:34775 (tcp)
TestDNS_SOA_Settings - 2019/12/06 06:05:45.537707 [INFO] agent: Stopping DNS server 127.0.0.1:34775 (udp)
TestDNS_SOA_Settings - 2019/12/06 06:05:45.537884 [ERR] consul: failed to establish leadership: leadership lost while committing log
TestDNS_SOA_Settings - 2019/12/06 06:05:45.538014 [INFO] agent: Stopping HTTP server 127.0.0.1:34776 (tcp)
TestDNS_SOA_Settings - 2019/12/06 06:05:45.538202 [ERR] consul: failed to transfer leadership attempt 0/3: raft is already shutdown
TestDNS_SOA_Settings - 2019/12/06 06:05:45.538324 [INFO] agent: Waiting for endpoints to shut down
TestDNS_SOA_Settings - 2019/12/06 06:05:45.538398 [INFO] agent: Endpoints down
--- PASS: TestDNS_SOA_Settings (29.97s)
=== CONT  TestDNSCycleRecursorCheck
2019/12/06 06:05:45 [INFO]  raft: Initial configuration (index=1): [{Suffrage:Voter ID:9748613b-cde4-4e12-aa99-18dff0040283 Address:127.0.0.1:34798}]
TestDNS_NodeLookup_CNAME - 2019/12/06 06:05:45.542974 [INFO] serf: EventMemberJoin: Node 9748613b-cde4-4e12-aa99-18dff0040283.dc1 127.0.0.1
TestDNS_NodeLookup_CNAME - 2019/12/06 06:05:45.546540 [INFO] serf: EventMemberJoin: Node 9748613b-cde4-4e12-aa99-18dff0040283 127.0.0.1
TestDNS_NodeLookup_CNAME - 2019/12/06 06:05:45.547352 [DEBUG] dns: recursor enabled
TestDNS_NodeLookup_CNAME - 2019/12/06 06:05:45.547858 [INFO] agent: Started DNS server 127.0.0.1:34793 (udp)
2019/12/06 06:05:45 [INFO]  raft: Node at 127.0.0.1:34798 [Follower] entering Follower state (Leader: "")
TestDNS_NodeLookup_CNAME - 2019/12/06 06:05:45.549547 [INFO] consul: Adding LAN server Node 9748613b-cde4-4e12-aa99-18dff0040283 (Addr: tcp/127.0.0.1:34798) (DC: dc1)
TestDNS_NodeLookup_CNAME - 2019/12/06 06:05:45.549911 [INFO] consul: Handled member-join event for server "Node 9748613b-cde4-4e12-aa99-18dff0040283.dc1" in area "wan"
TestDNS_NodeLookup_CNAME - 2019/12/06 06:05:45.550197 [DEBUG] dns: recursor enabled
TestDNS_NodeLookup_CNAME - 2019/12/06 06:05:45.550480 [INFO] agent: Started DNS server 127.0.0.1:34793 (tcp)
TestDNS_NodeLookup_CNAME - 2019/12/06 06:05:45.552666 [INFO] agent: Started HTTP server on 127.0.0.1:34794 (tcp)
TestDNS_NodeLookup_CNAME - 2019/12/06 06:05:45.552748 [INFO] agent: started state syncer
WARNING: bootstrap = true: do not enable unless necessary
TestDNSCycleRecursorCheck - 2019/12/06 06:05:45.615784 [WARN] agent: Node name "Node ef16c4cd-2fdd-66d9-3385-96d72ddac818" will not be discoverable via DNS due to invalid characters. Valid characters include all alpha-numerics and dashes.
TestDNSCycleRecursorCheck - 2019/12/06 06:05:45.616386 [DEBUG] tlsutil: Update with version 1
TestEventFire_token - 2019/12/06 06:05:45.617162 [ERR] acl: failed to apply acl token upgrade batch: raft is already shutdown
2019/12/06 06:05:45 [WARN]  raft: Heartbeat timeout from "" reached, starting election
2019/12/06 06:05:45 [INFO]  raft: Node at 127.0.0.1:34798 [Candidate] entering Candidate state in term 2
TestDNSCycleRecursorCheck - 2019/12/06 06:05:45.619356 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
TestDNS_EDNS0 - 2019/12/06 06:05:46.114327 [DEBUG] dns: request for name foo.node.dc1.consul. type ANY class IN (took 589.681µs) from client 127.0.0.1:41999 (udp)
TestDNS_EDNS0 - 2019/12/06 06:05:46.114575 [INFO] agent: Requesting shutdown
TestDNS_EDNS0 - 2019/12/06 06:05:46.114637 [INFO] consul: shutting down server
TestDNS_EDNS0 - 2019/12/06 06:05:46.114679 [WARN] serf: Shutdown without a Leave
TestDNS_EDNS0 - 2019/12/06 06:05:46.256571 [WARN] serf: Shutdown without a Leave
TestDNS_EDNS0 - 2019/12/06 06:05:46.545265 [INFO] manager: shutting down
TestEventFire_token - 2019/12/06 06:05:46.617071 [ERR] acl: failed to apply acl token upgrade batch: raft is already shutdown
TestDNS_EDNS0 - 2019/12/06 06:05:46.772302 [INFO] agent: consul server down
TestDNS_EDNS0 - 2019/12/06 06:05:46.772380 [INFO] agent: shutdown complete
TestDNS_EDNS0 - 2019/12/06 06:05:46.772445 [INFO] agent: Stopping DNS server 127.0.0.1:34787 (tcp)
TestDNS_EDNS0 - 2019/12/06 06:05:46.772626 [INFO] agent: Stopping DNS server 127.0.0.1:34787 (udp)
TestDNS_EDNS0 - 2019/12/06 06:05:46.772790 [INFO] agent: Stopping HTTP server 127.0.0.1:34788 (tcp)
TestDNS_EDNS0 - 2019/12/06 06:05:46.773037 [INFO] agent: Waiting for endpoints to shut down
TestDNS_EDNS0 - 2019/12/06 06:05:46.773112 [INFO] agent: Endpoints down
--- PASS: TestDNS_EDNS0 (9.27s)
=== CONT  TestDNS_NodeLookup_AAAA
TestDNS_EDNS0 - 2019/12/06 06:05:46.774237 [ERR] consul: failed to establish leadership: leadership lost while committing log
TestDNS_EDNS0 - 2019/12/06 06:05:46.775046 [ERR] consul: failed to transfer leadership attempt 0/3: raft is already shutdown
TestDNS_EDNS0 - 2019/12/06 06:05:46.775220 [ERR] consul: failed to transfer leadership attempt 1/3: raft is already shutdown
TestDNS_EDNS0 - 2019/12/06 06:05:46.775371 [ERR] consul: failed to transfer leadership attempt 2/3: raft is already shutdown
TestDNS_EDNS0 - 2019/12/06 06:05:46.775544 [ERR] consul: failed to transfer leadership in 3 attempts
WARNING: bootstrap = true: do not enable unless necessary
TestDNS_NodeLookup_AAAA - 2019/12/06 06:05:46.857048 [WARN] agent: Node name "Node 6284b2ec-975b-c115-4cde-307cdf6bad6f" will not be discoverable via DNS due to invalid characters. Valid characters include all alpha-numerics and dashes.
TestDNS_NodeLookup_AAAA - 2019/12/06 06:05:46.857765 [DEBUG] tlsutil: Update with version 1
TestDNS_NodeLookup_AAAA - 2019/12/06 06:05:46.860421 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
2019/12/06 06:05:46 [INFO]  raft: Election won. Tally: 1
2019/12/06 06:05:46 [INFO]  raft: Node at 127.0.0.1:34798 [Leader] entering Leader state
TestDNS_NodeLookup_CNAME - 2019/12/06 06:05:46.997877 [INFO] consul: cluster leadership acquired
TestDNS_NodeLookup_CNAME - 2019/12/06 06:05:46.998367 [INFO] consul: New leader elected: Node 9748613b-cde4-4e12-aa99-18dff0040283
TestEventFire_token - 2019/12/06 06:05:47.617265 [ERR] acl: failed to apply acl token upgrade batch: raft is already shutdown
jones - 2019/12/06 06:05:47.797425 [DEBUG] consul: Skipping self join check for "Node 5b00a3f9-83b7-6ff9-5316-c7daa24e44b4" since the cluster is too small
2019/12/06 06:05:47 [INFO]  raft: Initial configuration (index=1): [{Suffrage:Voter ID:0b43c023-9fbf-8988-754a-23a7e8386d43 Address:127.0.0.1:34786}]
2019/12/06 06:05:47 [INFO]  raft: Node at 127.0.0.1:34786 [Follower] entering Follower state (Leader: "")
TestDNS_EDNS0_ECS - 2019/12/06 06:05:47.803970 [INFO] serf: EventMemberJoin: Node 0b43c023-9fbf-8988-754a-23a7e8386d43.dc1 127.0.0.1
TestDNS_EDNS0_ECS - 2019/12/06 06:05:47.807060 [INFO] serf: EventMemberJoin: Node 0b43c023-9fbf-8988-754a-23a7e8386d43 127.0.0.1
TestDNS_EDNS0_ECS - 2019/12/06 06:05:47.808260 [INFO] agent: Started DNS server 127.0.0.1:34781 (udp)
TestDNS_EDNS0_ECS - 2019/12/06 06:05:47.808662 [INFO] consul: Adding LAN server Node 0b43c023-9fbf-8988-754a-23a7e8386d43 (Addr: tcp/127.0.0.1:34786) (DC: dc1)
TestDNS_EDNS0_ECS - 2019/12/06 06:05:47.808872 [INFO] consul: Handled member-join event for server "Node 0b43c023-9fbf-8988-754a-23a7e8386d43.dc1" in area "wan"
TestDNS_EDNS0_ECS - 2019/12/06 06:05:47.809467 [INFO] agent: Started DNS server 127.0.0.1:34781 (tcp)
TestDNS_EDNS0_ECS - 2019/12/06 06:05:47.812323 [INFO] agent: Started HTTP server on 127.0.0.1:34782 (tcp)
TestDNS_EDNS0_ECS - 2019/12/06 06:05:47.812433 [INFO] agent: started state syncer
2019/12/06 06:05:47 [WARN]  raft: Heartbeat timeout from "" reached, starting election
2019/12/06 06:05:47 [INFO]  raft: Node at 127.0.0.1:34786 [Candidate] entering Candidate state in term 2
TestDNS_NodeLookup_CNAME - 2019/12/06 06:05:48.247383 [INFO] agent: Synced node info
TestDNS_NodeLookup_CNAME - 2019/12/06 06:05:48.247488 [DEBUG] agent: Node info in sync
2019/12/06 06:05:48 [INFO]  raft: Initial configuration (index=1): [{Suffrage:Voter ID:ef16c4cd-2fdd-66d9-3385-96d72ddac818 Address:127.0.0.1:34804}]
2019/12/06 06:05:48 [INFO]  raft: Node at 127.0.0.1:34804 [Follower] entering Follower state (Leader: "")
2019/12/06 06:05:48 [WARN]  raft: Heartbeat timeout from "" reached, starting election
2019/12/06 06:05:48 [INFO]  raft: Node at 127.0.0.1:34804 [Candidate] entering Candidate state in term 2
TestEventFire_token - 2019/12/06 06:05:48.617157 [ERR] acl: failed to apply acl token upgrade batch: raft is already shutdown
TestDNSCycleRecursorCheck - 2019/12/06 06:05:48.835584 [INFO] serf: EventMemberJoin: Node ef16c4cd-2fdd-66d9-3385-96d72ddac818.dc1 127.0.0.1
TestDNSCycleRecursorCheck - 2019/12/06 06:05:48.838753 [INFO] serf: EventMemberJoin: Node ef16c4cd-2fdd-66d9-3385-96d72ddac818 127.0.0.1
TestDNSCycleRecursorCheck - 2019/12/06 06:05:48.839498 [INFO] consul: Handled member-join event for server "Node ef16c4cd-2fdd-66d9-3385-96d72ddac818.dc1" in area "wan"
TestDNSCycleRecursorCheck - 2019/12/06 06:05:48.839514 [INFO] consul: Adding LAN server Node ef16c4cd-2fdd-66d9-3385-96d72ddac818 (Addr: tcp/127.0.0.1:34804) (DC: dc1)
TestDNSCycleRecursorCheck - 2019/12/06 06:05:48.840062 [DEBUG] dns: recursor enabled
TestDNSCycleRecursorCheck - 2019/12/06 06:05:48.840104 [DEBUG] dns: recursor enabled
TestDNSCycleRecursorCheck - 2019/12/06 06:05:48.840402 [INFO] agent: Started DNS server 127.0.0.1:34799 (tcp)
TestDNSCycleRecursorCheck - 2019/12/06 06:05:48.840554 [INFO] agent: Started DNS server 127.0.0.1:34799 (udp)
TestDNSCycleRecursorCheck - 2019/12/06 06:05:48.842951 [INFO] agent: Started HTTP server on 127.0.0.1:34800 (tcp)
TestDNSCycleRecursorCheck - 2019/12/06 06:05:48.843092 [INFO] agent: started state syncer
jones - 2019/12/06 06:05:49.183972 [DEBUG] agent: Skipping remote check "serfHealth" since it is managed automatically
jones - 2019/12/06 06:05:49.184051 [DEBUG] agent: Service "web1-sidecar-proxy" in sync
jones - 2019/12/06 06:05:49.184090 [DEBUG] agent: Node info in sync
TestDNS_NodeLookup_CNAME - 2019/12/06 06:05:49.349338 [DEBUG] dns: cname recurse RTT for www.google.com. (713.683µs)
TestDNS_NodeLookup_CNAME - 2019/12/06 06:05:49.349622 [DEBUG] dns: request for name google.node.consul. type ANY class IN (took 1.908711ms) from client 127.0.0.1:53122 (udp)
TestDNS_NodeLookup_CNAME - 2019/12/06 06:05:49.349957 [INFO] agent: Requesting shutdown
TestDNS_NodeLookup_CNAME - 2019/12/06 06:05:49.350030 [INFO] consul: shutting down server
TestDNS_NodeLookup_CNAME - 2019/12/06 06:05:49.350073 [WARN] serf: Shutdown without a Leave
TestDNS_NodeLookup_CNAME - 2019/12/06 06:05:49.496951 [WARN] serf: Shutdown without a Leave
TestEventFire_token - 2019/12/06 06:05:49.617095 [ERR] acl: failed to apply acl token upgrade batch: raft is already shutdown
2019/12/06 06:05:49 [INFO]  raft: Election won. Tally: 1
2019/12/06 06:05:49 [INFO]  raft: Node at 127.0.0.1:34786 [Leader] entering Leader state
TestDNS_EDNS0_ECS - 2019/12/06 06:05:49.737575 [INFO] consul: cluster leadership acquired
TestDNS_EDNS0_ECS - 2019/12/06 06:05:49.738028 [INFO] consul: New leader elected: Node 0b43c023-9fbf-8988-754a-23a7e8386d43
2019/12/06 06:05:49 [INFO]  raft: Election won. Tally: 1
2019/12/06 06:05:49 [INFO]  raft: Node at 127.0.0.1:34804 [Leader] entering Leader state
TestDNSCycleRecursorCheck - 2019/12/06 06:05:49.738992 [INFO] consul: cluster leadership acquired
TestDNS_NodeLookup_CNAME - 2019/12/06 06:05:49.739159 [INFO] manager: shutting down
TestDNSCycleRecursorCheck - 2019/12/06 06:05:49.739460 [INFO] consul: New leader elected: Node ef16c4cd-2fdd-66d9-3385-96d72ddac818
TestDNS_NodeLookup_CNAME - 2019/12/06 06:05:49.937857 [INFO] agent: consul server down
TestDNS_NodeLookup_CNAME - 2019/12/06 06:05:49.937929 [INFO] agent: shutdown complete
TestDNS_NodeLookup_CNAME - 2019/12/06 06:05:49.938000 [INFO] agent: Stopping DNS server 127.0.0.1:34793 (tcp)
TestDNS_NodeLookup_CNAME - 2019/12/06 06:05:49.938197 [ERR] consul: failed to establish leadership: leadership lost while committing log
TestDNS_NodeLookup_CNAME - 2019/12/06 06:05:49.938156 [INFO] agent: Stopping DNS server 127.0.0.1:34793 (udp)
TestDNS_NodeLookup_CNAME - 2019/12/06 06:05:49.938497 [INFO] agent: Stopping HTTP server 127.0.0.1:34794 (tcp)
TestDNS_NodeLookup_CNAME - 2019/12/06 06:05:49.938738 [INFO] agent: Waiting for endpoints to shut down
TestDNS_NodeLookup_CNAME - 2019/12/06 06:05:49.938814 [INFO] agent: Endpoints down
--- PASS: TestDNS_NodeLookup_CNAME (6.62s)
=== CONT  TestDNS_NodeLookup_PeriodName
TestDNSCycleRecursorCheck - 2019/12/06 06:05:49.979022 [DEBUG] dns: recurse RTT for {google.com. 1 1} (416.343µs) Recursor queried: 127.0.0.1:38394 Status returned: SERVFAIL
TestDNSCycleRecursorCheck - 2019/12/06 06:05:49.980086 [DEBUG] dns: recurse RTT for {google.com. 1 1} (420.343µs) Recursor queried: 127.0.0.1:58534
TestDNSCycleRecursorCheck - 2019/12/06 06:05:49.980465 [DEBUG] dns: request for {google.com. 1 1} (udp) (2.327387ms) from client 127.0.0.1:50167 (udp)
TestDNSCycleRecursorCheck - 2019/12/06 06:05:49.980750 [INFO] agent: Requesting shutdown
TestDNSCycleRecursorCheck - 2019/12/06 06:05:49.980833 [INFO] consul: shutting down server
TestDNSCycleRecursorCheck - 2019/12/06 06:05:49.980884 [WARN] serf: Shutdown without a Leave
WARNING: bootstrap = true: do not enable unless necessary
TestDNS_NodeLookup_PeriodName - 2019/12/06 06:05:49.994561 [WARN] agent: Node name "Node 7cbdc06f-53d1-6281-2829-aebb05611662" will not be discoverable via DNS due to invalid characters. Valid characters include all alpha-numerics and dashes.
TestDNS_NodeLookup_PeriodName - 2019/12/06 06:05:49.994932 [DEBUG] tlsutil: Update with version 1
TestDNS_NodeLookup_PeriodName - 2019/12/06 06:05:49.996978 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
2019/12/06 06:05:50 [INFO]  raft: Initial configuration (index=1): [{Suffrage:Voter ID:6284b2ec-975b-c115-4cde-307cdf6bad6f Address:127.0.0.1:34810}]
2019/12/06 06:05:50 [INFO]  raft: Node at 127.0.0.1:34810 [Follower] entering Follower state (Leader: "")
TestDNS_NodeLookup_AAAA - 2019/12/06 06:05:50.108839 [INFO] serf: EventMemberJoin: Node 6284b2ec-975b-c115-4cde-307cdf6bad6f.dc1 127.0.0.1
TestDNS_NodeLookup_AAAA - 2019/12/06 06:05:50.111972 [INFO] serf: EventMemberJoin: Node 6284b2ec-975b-c115-4cde-307cdf6bad6f 127.0.0.1
TestDNS_NodeLookup_AAAA - 2019/12/06 06:05:50.112877 [INFO] consul: Adding LAN server Node 6284b2ec-975b-c115-4cde-307cdf6bad6f (Addr: tcp/127.0.0.1:34810) (DC: dc1)
TestDNS_NodeLookup_AAAA - 2019/12/06 06:05:50.113156 [INFO] consul: Handled member-join event for server "Node 6284b2ec-975b-c115-4cde-307cdf6bad6f.dc1" in area "wan"
TestDNS_NodeLookup_AAAA - 2019/12/06 06:05:50.113492 [INFO] agent: Started DNS server 127.0.0.1:34805 (udp)
TestDNS_NodeLookup_AAAA - 2019/12/06 06:05:50.113647 [INFO] agent: Started DNS server 127.0.0.1:34805 (tcp)
TestDNS_NodeLookup_AAAA - 2019/12/06 06:05:50.116047 [INFO] agent: Started HTTP server on 127.0.0.1:34806 (tcp)
TestDNS_NodeLookup_AAAA - 2019/12/06 06:05:50.116247 [INFO] agent: started state syncer
2019/12/06 06:05:50 [WARN]  raft: Heartbeat timeout from "" reached, starting election
2019/12/06 06:05:50 [INFO]  raft: Node at 127.0.0.1:34810 [Candidate] entering Candidate state in term 2
TestDNSCycleRecursorCheck - 2019/12/06 06:05:50.248735 [WARN] serf: Shutdown without a Leave
TestDNSCycleRecursorCheck - 2019/12/06 06:05:50.345257 [INFO] manager: shutting down
TestDNSCycleRecursorCheck - 2019/12/06 06:05:50.545290 [ERR] consul: failed to wait for barrier: leadership lost while committing log
TestDNSCycleRecursorCheck - 2019/12/06 06:05:50.545496 [WARN] agent: Syncing node info failed. leadership lost while committing log
TestDNSCycleRecursorCheck - 2019/12/06 06:05:50.545558 [ERR] agent: failed to sync remote state: leadership lost while committing log
TestDNSCycleRecursorCheck - 2019/12/06 06:05:50.545559 [INFO] agent: consul server down
TestDNSCycleRecursorCheck - 2019/12/06 06:05:50.545702 [INFO] agent: shutdown complete
TestDNSCycleRecursorCheck - 2019/12/06 06:05:50.545756 [INFO] agent: Stopping DNS server 127.0.0.1:34799 (tcp)
TestDNSCycleRecursorCheck - 2019/12/06 06:05:50.545923 [INFO] agent: Stopping DNS server 127.0.0.1:34799 (udp)
TestDNSCycleRecursorCheck - 2019/12/06 06:05:50.546116 [INFO] agent: Stopping HTTP server 127.0.0.1:34800 (tcp)
TestDNS_EDNS0_ECS - 2019/12/06 06:05:50.546221 [INFO] agent: Synced node info
TestDNSCycleRecursorCheck - 2019/12/06 06:05:50.546342 [INFO] agent: Waiting for endpoints to shut down
TestDNSCycleRecursorCheck - 2019/12/06 06:05:50.546420 [INFO] agent: Endpoints down
--- PASS: TestDNSCycleRecursorCheck (5.01s)
=== CONT  TestDNS_CaseInsensitiveNodeLookup
WARNING: bootstrap = true: do not enable unless necessary
TestDNS_CaseInsensitiveNodeLookup - 2019/12/06 06:05:50.615835 [WARN] agent: Node name "Node 29adec63-9be9-f38d-a1f2-d8d003d52bfa" will not be discoverable via DNS due to invalid characters. Valid characters include all alpha-numerics and dashes.
TestDNS_CaseInsensitiveNodeLookup - 2019/12/06 06:05:50.616327 [DEBUG] tlsutil: Update with version 1
TestEventFire_token - 2019/12/06 06:05:50.617131 [ERR] acl: failed to apply acl token upgrade batch: raft is already shutdown
TestDNS_CaseInsensitiveNodeLookup - 2019/12/06 06:05:50.618861 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
TestEventFire_token - 2019/12/06 06:05:51.617097 [ERR] acl: failed to apply acl token upgrade batch: raft is already shutdown
jones - 2019/12/06 06:05:51.660354 [DEBUG] agent: Skipping remote check "serfHealth" since it is managed automatically
jones - 2019/12/06 06:05:51.660423 [DEBUG] agent: Node info in sync
jones - 2019/12/06 06:05:51.771171 [DEBUG] consul: Skipping self join check for "Node 1cd382dc-1b57-ab4c-91ac-41ec3d3abb1e" since the cluster is too small
2019/12/06 06:05:51 [INFO]  raft: Election won. Tally: 1
2019/12/06 06:05:51 [INFO]  raft: Node at 127.0.0.1:34810 [Leader] entering Leader state
TestDNS_NodeLookup_AAAA - 2019/12/06 06:05:51.945610 [INFO] consul: cluster leadership acquired
TestDNS_NodeLookup_AAAA - 2019/12/06 06:05:51.946037 [INFO] consul: New leader elected: Node 6284b2ec-975b-c115-4cde-307cdf6bad6f
TestDNS_EDNS0_ECS - 2019/12/06 06:05:52.180240 [DEBUG] agent: Node info in sync
TestDNS_EDNS0_ECS - 2019/12/06 06:05:52.180339 [DEBUG] agent: Node info in sync
TestEventFire_token - 2019/12/06 06:05:52.617119 [ERR] acl: failed to apply acl token upgrade batch: raft is already shutdown
TestDNS_NodeLookup_AAAA - 2019/12/06 06:05:53.213986 [INFO] agent: Synced node info
TestDNS_NodeLookup_AAAA - 2019/12/06 06:05:53.214112 [DEBUG] agent: Node info in sync
TestEventFire_token - 2019/12/06 06:05:53.617094 [ERR] acl: failed to apply acl token upgrade batch: raft is already shutdown
TestDNS_NodeLookup_AAAA - 2019/12/06 06:05:54.261850 [DEBUG] agent: Node info in sync
TestEventFire_token - 2019/12/06 06:05:54.617267 [ERR] acl: failed to apply acl token upgrade batch: raft is already shutdown
TestEventFire_token - 2019/12/06 06:05:55.619770 [ERR] acl: failed to apply acl token upgrade batch: raft is already shutdown
TestEventFire_token - 2019/12/06 06:05:56.617451 [ERR] acl: failed to apply acl token upgrade batch: raft is already shutdown
TestEventFire_token - 2019/12/06 06:05:57.617082 [ERR] acl: failed to apply acl token upgrade batch: raft is already shutdown
TestEventFire_token - 2019/12/06 06:05:58.617166 [ERR] acl: failed to apply acl token upgrade batch: raft is already shutdown
TestEventFire_token - 2019/12/06 06:05:59.617265 [ERR] acl: failed to apply acl token upgrade batch: raft is already shutdown
TestEventFire_token - 2019/12/06 06:06:00.617104 [ERR] acl: failed to apply acl token upgrade batch: raft is already shutdown
TestEventFire_token - 2019/12/06 06:06:01.617312 [ERR] acl: failed to apply acl token upgrade batch: raft is already shutdown
TestEventFire_token - 2019/12/06 06:06:02.617112 [ERR] acl: failed to apply acl token upgrade batch: raft is already shutdown
TestEventFire_token - 2019/12/06 06:06:03.617098 [ERR] acl: failed to apply acl token upgrade batch: raft is already shutdown
TestEventFire_token - 2019/12/06 06:06:04.617148 [ERR] acl: failed to apply acl token upgrade batch: raft is already shutdown
TestEventFire_token - 2019/12/06 06:06:05.617154 [ERR] acl: failed to apply acl token upgrade batch: raft is already shutdown
TestEventFire_token - 2019/12/06 06:06:06.617145 [ERR] acl: failed to apply acl token upgrade batch: raft is already shutdown
TestDNS_EDNS0_ECS - 2019/12/06 06:06:07.479277 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
TestEventFire_token - 2019/12/06 06:06:07.617111 [ERR] acl: failed to apply acl token upgrade batch: raft is already shutdown
2019/12/06 06:06:07 [INFO]  raft: Initial configuration (index=1): [{Suffrage:Voter ID:7cbdc06f-53d1-6281-2829-aebb05611662 Address:127.0.0.1:34816}]
jones - 2019/12/06 06:06:07.703714 [DEBUG] consul: Skipping self join check for "Node 05379951-a361-1a11-bcaf-fc25c0b4fc85" since the cluster is too small
TestDNS_NodeLookup_AAAA - 2019/12/06 06:06:07.705743 [DEBUG] dns: request for name bar.node.consul. type AAAA class IN (took 619.681µs) from client 127.0.0.1:37240 (udp)
TestDNS_NodeLookup_AAAA - 2019/12/06 06:06:07.706033 [INFO] agent: Requesting shutdown
2019/12/06 06:06:07 [INFO]  raft: Node at 127.0.0.1:34816 [Follower] entering Follower state (Leader: "")
TestDNS_NodeLookup_AAAA - 2019/12/06 06:06:07.706098 [INFO] consul: shutting down server
TestDNS_NodeLookup_AAAA - 2019/12/06 06:06:07.706142 [WARN] serf: Shutdown without a Leave
TestDNS_NodeLookup_PeriodName - 2019/12/06 06:06:07.708489 [INFO] serf: EventMemberJoin: Node 7cbdc06f-53d1-6281-2829-aebb05611662.dc1 127.0.0.1
TestDNS_NodeLookup_PeriodName - 2019/12/06 06:06:07.714101 [INFO] serf: EventMemberJoin: Node 7cbdc06f-53d1-6281-2829-aebb05611662 127.0.0.1
TestDNS_NodeLookup_PeriodName - 2019/12/06 06:06:07.715374 [INFO] consul: Adding LAN server Node 7cbdc06f-53d1-6281-2829-aebb05611662 (Addr: tcp/127.0.0.1:34816) (DC: dc1)
TestDNS_NodeLookup_PeriodName - 2019/12/06 06:06:07.715649 [INFO] consul: Handled member-join event for server "Node 7cbdc06f-53d1-6281-2829-aebb05611662.dc1" in area "wan"
TestDNS_NodeLookup_PeriodName - 2019/12/06 06:06:07.716368 [INFO] agent: Started DNS server 127.0.0.1:34811 (tcp)
TestDNS_NodeLookup_PeriodName - 2019/12/06 06:06:07.716675 [INFO] agent: Started DNS server 127.0.0.1:34811 (udp)
TestDNS_NodeLookup_PeriodName - 2019/12/06 06:06:07.719258 [INFO] agent: Started HTTP server on 127.0.0.1:34812 (tcp)
TestDNS_NodeLookup_PeriodName - 2019/12/06 06:06:07.719399 [INFO] agent: started state syncer
2019/12/06 06:06:07 [WARN]  raft: Heartbeat timeout from "" reached, starting election
2019/12/06 06:06:07 [INFO]  raft: Node at 127.0.0.1:34816 [Candidate] entering Candidate state in term 2
TestDNS_NodeLookup_AAAA - 2019/12/06 06:06:07.869815 [WARN] serf: Shutdown without a Leave
TestDNS_EDNS0_ECS - 2019/12/06 06:06:08.069027 [INFO] agent: Requesting shutdown
TestDNS_EDNS0_ECS - 2019/12/06 06:06:08.069648 [INFO] consul: shutting down server
TestDNS_EDNS0_ECS - 2019/12/06 06:06:08.069821 [WARN] serf: Shutdown without a Leave
TestEventFire_token - 2019/12/06 06:06:08.617287 [ERR] acl: failed to apply acl token upgrade batch: raft is already shutdown
TestEventFire_token - 2019/12/06 06:06:09.617193 [ERR] acl: failed to apply acl token upgrade batch: raft is already shutdown
TestDNS_EDNS0_ECS - 2019/12/06 06:06:09.929487 [WARN] serf: Shutdown without a Leave
TestDNS_NodeLookup_AAAA - 2019/12/06 06:06:09.930822 [INFO] manager: shutting down
TestDNS_EDNS0_ECS - 2019/12/06 06:06:10.512579 [INFO] manager: shutting down
jones - 2019/12/06 06:06:10.513548 [DEBUG] consul: Skipping self join check for "Node 48766921-a98a-d876-447b-181b21701746" since the cluster is too small
2019/12/06 06:06:10 [INFO]  raft: Initial configuration (index=1): [{Suffrage:Voter ID:29adec63-9be9-f38d-a1f2-d8d003d52bfa Address:127.0.0.1:34822}]
TestDNS_EDNS0_ECS - 2019/12/06 06:06:10.514491 [INFO] agent: consul server down
TestDNS_EDNS0_ECS - 2019/12/06 06:06:10.514562 [INFO] agent: shutdown complete
TestDNS_EDNS0_ECS - 2019/12/06 06:06:10.514626 [INFO] agent: Stopping DNS server 127.0.0.1:34781 (tcp)
TestDNS_EDNS0_ECS - 2019/12/06 06:06:10.514782 [INFO] agent: Stopping DNS server 127.0.0.1:34781 (udp)
TestDNS_EDNS0_ECS - 2019/12/06 06:06:10.514952 [INFO] agent: Stopping HTTP server 127.0.0.1:34782 (tcp)
TestDNS_EDNS0_ECS - 2019/12/06 06:06:10.515179 [INFO] agent: Waiting for endpoints to shut down
TestDNS_EDNS0_ECS - 2019/12/06 06:06:10.515263 [INFO] agent: Endpoints down
--- PASS: TestDNS_EDNS0_ECS (34.59s)
=== CONT  TestDNS_Over_TCP
TestDNS_CaseInsensitiveNodeLookup - 2019/12/06 06:06:10.518106 [INFO] serf: EventMemberJoin: Node 29adec63-9be9-f38d-a1f2-d8d003d52bfa.dc1 127.0.0.1
2019/12/06 06:06:10 [INFO]  raft: Node at 127.0.0.1:34822 [Follower] entering Follower state (Leader: "")
TestDNS_NodeLookup_AAAA - 2019/12/06 06:06:10.522610 [INFO] agent: consul server down
TestDNS_EDNS0_ECS - 2019/12/06 06:06:10.522750 [ERR] consul: failed to establish leadership: error configuring provider: leadership lost while committing log
TestDNS_NodeLookup_AAAA - 2019/12/06 06:06:10.524773 [INFO] agent: shutdown complete
TestDNS_NodeLookup_AAAA - 2019/12/06 06:06:10.522690 [ERR] consul: failed to establish leadership: leadership lost while committing log
TestDNS_NodeLookup_AAAA - 2019/12/06 06:06:10.524991 [INFO] agent: Stopping DNS server 127.0.0.1:34805 (tcp)
TestDNS_NodeLookup_AAAA - 2019/12/06 06:06:10.525395 [INFO] agent: Stopping DNS server 127.0.0.1:34805 (udp)
TestDNS_NodeLookup_AAAA - 2019/12/06 06:06:10.525625 [INFO] agent: Stopping HTTP server 127.0.0.1:34806 (tcp)
TestDNS_NodeLookup_AAAA - 2019/12/06 06:06:10.525892 [INFO] agent: Waiting for endpoints to shut down
TestDNS_NodeLookup_AAAA - 2019/12/06 06:06:10.526031 [INFO] agent: Endpoints down
--- PASS: TestDNS_NodeLookup_AAAA (23.75s)
=== CONT  TestRecursorAddr
=== CONT  TestCoordinate_Update_ACLDeny
TestDNS_CaseInsensitiveNodeLookup - 2019/12/06 06:06:10.533147 [INFO] serf: EventMemberJoin: Node 29adec63-9be9-f38d-a1f2-d8d003d52bfa 127.0.0.1
TestDNS_CaseInsensitiveNodeLookup - 2019/12/06 06:06:10.534343 [INFO] agent: Started DNS server 127.0.0.1:34817 (udp)
TestDNS_CaseInsensitiveNodeLookup - 2019/12/06 06:06:10.535437 [INFO] consul: Adding LAN server Node 29adec63-9be9-f38d-a1f2-d8d003d52bfa (Addr: tcp/127.0.0.1:34822) (DC: dc1)
TestDNS_CaseInsensitiveNodeLookup - 2019/12/06 06:06:10.535736 [INFO] consul: Handled member-join event for server "Node 29adec63-9be9-f38d-a1f2-d8d003d52bfa.dc1" in area "wan"
TestDNS_CaseInsensitiveNodeLookup - 2019/12/06 06:06:10.536261 [INFO] agent: Started DNS server 127.0.0.1:34817 (tcp)
TestDNS_CaseInsensitiveNodeLookup - 2019/12/06 06:06:10.538518 [INFO] agent: Started HTTP server on 127.0.0.1:34818 (tcp)
TestDNS_CaseInsensitiveNodeLookup - 2019/12/06 06:06:10.538737 [INFO] agent: started state syncer
2019/12/06 06:06:10 [WARN]  raft: Heartbeat timeout from "" reached, starting election
2019/12/06 06:06:10 [INFO]  raft: Node at 127.0.0.1:34822 [Candidate] entering Candidate state in term 2
WARNING: bootstrap = true: do not enable unless necessary
TestDNS_Over_TCP - 2019/12/06 06:06:10.591517 [WARN] agent: Node name "Node 6c0b22ae-23d6-f73f-474f-68540b402588" will not be discoverable via DNS due to invalid characters. Valid characters include all alpha-numerics and dashes.
TestDNS_Over_TCP - 2019/12/06 06:06:10.591905 [DEBUG] tlsutil: Update with version 1
TestDNS_Over_TCP - 2019/12/06 06:06:10.594308 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
WARNING: The 'acl_datacenter' field is deprecated. Use the 'primary_datacenter' field instead.
WARNING: bootstrap = true: do not enable unless necessary
TestCoordinate_Update_ACLDeny - 2019/12/06 06:06:10.596441 [WARN] agent: Node name "Node 82212284-ca1e-6303-d5ed-041e9288db70" will not be discoverable via DNS due to invalid characters. Valid characters include all alpha-numerics and dashes.
TestCoordinate_Update_ACLDeny - 2019/12/06 06:06:10.596880 [DEBUG] tlsutil: Update with version 1
TestCoordinate_Update_ACLDeny - 2019/12/06 06:06:10.599554 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
TestEventFire_token - 2019/12/06 06:06:10.617046 [ERR] acl: failed to apply acl token upgrade batch: raft is already shutdown
2019/12/06 06:06:11 [INFO]  raft: Election won. Tally: 1
2019/12/06 06:06:11 [INFO]  raft: Node at 127.0.0.1:34816 [Leader] entering Leader state
--- PASS: TestRecursorAddr (0.00s)
TestDNS_NodeLookup_PeriodName - 2019/12/06 06:06:11.378557 [INFO] consul: cluster leadership acquired
TestDNS_NodeLookup_PeriodName - 2019/12/06 06:06:11.378938 [INFO] consul: New leader elected: Node 7cbdc06f-53d1-6281-2829-aebb05611662
TestEventFire_token - 2019/12/06 06:06:11.676238 [ERR] acl: failed to apply acl token upgrade batch: raft is already shutdown
TestEventFire_token - 2019/12/06 06:06:12.617183 [ERR] acl: failed to apply acl token upgrade batch: raft is already shutdown
TestEventFire_token - 2019/12/06 06:06:13.617420 [ERR] acl: failed to apply acl token upgrade batch: raft is already shutdown
2019/12/06 06:06:13 [INFO]  raft: Election won. Tally: 1
2019/12/06 06:06:13 [INFO]  raft: Node at 127.0.0.1:34822 [Leader] entering Leader state
TestDNS_CaseInsensitiveNodeLookup - 2019/12/06 06:06:13.826131 [INFO] consul: cluster leadership acquired
TestDNS_CaseInsensitiveNodeLookup - 2019/12/06 06:06:13.826557 [INFO] consul: New leader elected: Node 29adec63-9be9-f38d-a1f2-d8d003d52bfa
TestEventFire_token - 2019/12/06 06:06:14.617070 [ERR] acl: failed to apply acl token upgrade batch: raft is already shutdown
TestEventFire_token - 2019/12/06 06:06:15.617281 [ERR] acl: failed to apply acl token upgrade batch: raft is already shutdown
TestDNS_NodeLookup_PeriodName - 2019/12/06 06:06:15.999326 [INFO] agent: Synced node info
TestDNS_NodeLookup_PeriodName - 2019/12/06 06:06:15.999431 [DEBUG] agent: Node info in sync
TestDNS_CaseInsensitiveNodeLookup - 2019/12/06 06:06:16.530094 [INFO] agent: Synced node info
TestEventFire_token - 2019/12/06 06:06:16.617219 [ERR] acl: failed to apply acl token upgrade batch: raft is already shutdown
TestDNS_CaseInsensitiveNodeLookup - 2019/12/06 06:06:16.626272 [DEBUG] agent: Node info in sync
TestDNS_CaseInsensitiveNodeLookup - 2019/12/06 06:06:16.626834 [DEBUG] agent: Node info in sync
TestDNS_NodeLookup_PeriodName - 2019/12/06 06:06:16.658138 [DEBUG] dns: request for name foo.bar.node.consul. type ANY class IN (took 581.681µs) from client 127.0.0.1:39066 (udp)
TestDNS_NodeLookup_PeriodName - 2019/12/06 06:06:16.659346 [INFO] agent: Requesting shutdown
TestDNS_NodeLookup_PeriodName - 2019/12/06 06:06:16.659541 [INFO] consul: shutting down server
TestDNS_NodeLookup_PeriodName - 2019/12/06 06:06:16.659675 [WARN] serf: Shutdown without a Leave
TestDNS_NodeLookup_PeriodName - 2019/12/06 06:06:16.660217 [WARN] serf: Shutdown without a Leave
TestDNS_NodeLookup_PeriodName - 2019/12/06 06:06:16.660943 [INFO] manager: shutting down
2019/12/06 06:06:16 [INFO]  raft: Initial configuration (index=1): [{Suffrage:Voter ID:82212284-ca1e-6303-d5ed-041e9288db70 Address:127.0.0.1:34834}]
TestCoordinate_Update_ACLDeny - 2019/12/06 06:06:16.816596 [INFO] serf: EventMemberJoin: Node 82212284-ca1e-6303-d5ed-041e9288db70.dc1 127.0.0.1
2019/12/06 06:06:16 [INFO]  raft: Initial configuration (index=1): [{Suffrage:Voter ID:6c0b22ae-23d6-f73f-474f-68540b402588 Address:127.0.0.1:34828}]
TestDNS_Over_TCP - 2019/12/06 06:06:16.822755 [INFO] serf: EventMemberJoin: Node 6c0b22ae-23d6-f73f-474f-68540b402588.dc1 127.0.0.1
2019/12/06 06:06:16 [INFO]  raft: Node at 127.0.0.1:34834 [Follower] entering Follower state (Leader: "")
TestDNS_Over_TCP - 2019/12/06 06:06:16.828205 [INFO] serf: EventMemberJoin: Node 6c0b22ae-23d6-f73f-474f-68540b402588 127.0.0.1
2019/12/06 06:06:16 [INFO]  raft: Node at 127.0.0.1:34828 [Follower] entering Follower state (Leader: "")
TestCoordinate_Update_ACLDeny - 2019/12/06 06:06:16.830294 [INFO] serf: EventMemberJoin: Node 82212284-ca1e-6303-d5ed-041e9288db70 127.0.0.1
TestDNS_Over_TCP - 2019/12/06 06:06:16.831616 [INFO] agent: Started DNS server 127.0.0.1:34823 (udp)
TestDNS_Over_TCP - 2019/12/06 06:06:16.831811 [INFO] consul: Handled member-join event for server "Node 6c0b22ae-23d6-f73f-474f-68540b402588.dc1" in area "wan"
TestDNS_Over_TCP - 2019/12/06 06:06:16.831865 [INFO] consul: Adding LAN server Node 6c0b22ae-23d6-f73f-474f-68540b402588 (Addr: tcp/127.0.0.1:34828) (DC: dc1)
TestDNS_Over_TCP - 2019/12/06 06:06:16.832286 [INFO] agent: Started DNS server 127.0.0.1:34823 (tcp)
TestCoordinate_Update_ACLDeny - 2019/12/06 06:06:16.832678 [INFO] consul: Adding LAN server Node 82212284-ca1e-6303-d5ed-041e9288db70 (Addr: tcp/127.0.0.1:34834) (DC: dc1)
TestCoordinate_Update_ACLDeny - 2019/12/06 06:06:16.833152 [INFO] consul: Handled member-join event for server "Node 82212284-ca1e-6303-d5ed-041e9288db70.dc1" in area "wan"
TestCoordinate_Update_ACLDeny - 2019/12/06 06:06:16.833674 [INFO] agent: Started DNS server 127.0.0.1:34829 (tcp)
TestCoordinate_Update_ACLDeny - 2019/12/06 06:06:16.834590 [INFO] agent: Started DNS server 127.0.0.1:34829 (udp)
TestCoordinate_Update_ACLDeny - 2019/12/06 06:06:16.836934 [INFO] agent: Started HTTP server on 127.0.0.1:34830 (tcp)
TestDNS_Over_TCP - 2019/12/06 06:06:16.834720 [INFO] agent: Started HTTP server on 127.0.0.1:34824 (tcp)
TestDNS_Over_TCP - 2019/12/06 06:06:16.837185 [INFO] agent: started state syncer
TestCoordinate_Update_ACLDeny - 2019/12/06 06:06:16.837334 [INFO] agent: started state syncer
2019/12/06 06:06:16 [WARN]  raft: Heartbeat timeout from "" reached, starting election
2019/12/06 06:06:16 [INFO]  raft: Node at 127.0.0.1:34834 [Candidate] entering Candidate state in term 2
2019/12/06 06:06:16 [WARN]  raft: Heartbeat timeout from "" reached, starting election
2019/12/06 06:06:16 [INFO]  raft: Node at 127.0.0.1:34828 [Candidate] entering Candidate state in term 2
TestDNS_NodeLookup_PeriodName - 2019/12/06 06:06:17.121017 [INFO] agent: consul server down
TestDNS_NodeLookup_PeriodName - 2019/12/06 06:06:17.121112 [INFO] agent: shutdown complete
TestDNS_NodeLookup_PeriodName - 2019/12/06 06:06:17.121178 [INFO] agent: Stopping DNS server 127.0.0.1:34811 (tcp)
TestDNS_NodeLookup_PeriodName - 2019/12/06 06:06:17.121352 [INFO] agent: Stopping DNS server 127.0.0.1:34811 (udp)
TestDNS_NodeLookup_PeriodName - 2019/12/06 06:06:17.121527 [INFO] agent: Stopping HTTP server 127.0.0.1:34812 (tcp)
TestDNS_NodeLookup_PeriodName - 2019/12/06 06:06:17.121752 [INFO] agent: Waiting for endpoints to shut down
TestDNS_NodeLookup_PeriodName - 2019/12/06 06:06:17.121823 [INFO] agent: Endpoints down
--- PASS: TestDNS_NodeLookup_PeriodName (27.18s)
=== CONT  TestCoordinate_Node
TestDNS_NodeLookup_PeriodName - 2019/12/06 06:06:17.123042 [ERR] consul: failed to establish leadership: leadership lost while committing log
TestDNS_NodeLookup_PeriodName - 2019/12/06 06:06:17.123425 [ERR] consul: failed to transfer leadership attempt 0/3: raft is already shutdown
WARNING: bootstrap = true: do not enable unless necessary
TestCoordinate_Node - 2019/12/06 06:06:17.185728 [WARN] agent: Node name "Node 9e2f37bd-7bbb-9c71-e442-c7dcde3d7b56" will not be discoverable via DNS due to invalid characters. Valid characters include all alpha-numerics and dashes.
TestCoordinate_Node - 2019/12/06 06:06:17.186128 [DEBUG] tlsutil: Update with version 1
TestCoordinate_Node - 2019/12/06 06:06:17.188521 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
TestEventFire_token - 2019/12/06 06:06:17.619310 [ERR] acl: failed to apply acl token upgrade batch: raft is already shutdown
TestEventFire_token - 2019/12/06 06:06:18.619777 [ERR] acl: failed to apply acl token upgrade batch: raft is already shutdown
jones - 2019/12/06 06:06:18.877719 [DEBUG] agent: Skipping remote check "serfHealth" since it is managed automatically
jones - 2019/12/06 06:06:18.877798 [DEBUG] agent: Node info in sync
TestDNS_CaseInsensitiveNodeLookup - 2019/12/06 06:06:19.258224 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
TestDNS_CaseInsensitiveNodeLookup - 2019/12/06 06:06:19.266973 [DEBUG] dns: request for name fOO.node.dc1.consul. type ANY class IN (took 454.677µs) from client 127.0.0.1:47335 (udp)
TestDNS_CaseInsensitiveNodeLookup - 2019/12/06 06:06:19.267214 [INFO] agent: Requesting shutdown
TestDNS_CaseInsensitiveNodeLookup - 2019/12/06 06:06:19.267273 [INFO] consul: shutting down server
TestDNS_CaseInsensitiveNodeLookup - 2019/12/06 06:06:19.267314 [WARN] serf: Shutdown without a Leave
TestDNS_CaseInsensitiveNodeLookup - 2019/12/06 06:06:19.387419 [WARN] serf: Shutdown without a Leave
2019/12/06 06:06:19 [INFO]  raft: Election won. Tally: 1
2019/12/06 06:06:19 [INFO]  raft: Node at 127.0.0.1:34828 [Leader] entering Leader state
TestDNS_CaseInsensitiveNodeLookup - 2019/12/06 06:06:19.388157 [INFO] manager: shutting down
TestDNS_Over_TCP - 2019/12/06 06:06:19.388580 [INFO] consul: cluster leadership acquired
TestDNS_Over_TCP - 2019/12/06 06:06:19.389019 [INFO] consul: New leader elected: Node 6c0b22ae-23d6-f73f-474f-68540b402588
2019/12/06 06:06:19 [INFO]  raft: Election won. Tally: 1
2019/12/06 06:06:19 [INFO]  raft: Node at 127.0.0.1:34834 [Leader] entering Leader state
TestCoordinate_Update_ACLDeny - 2019/12/06 06:06:19.604149 [INFO] consul: cluster leadership acquired
TestCoordinate_Update_ACLDeny - 2019/12/06 06:06:19.604662 [INFO] consul: New leader elected: Node 82212284-ca1e-6303-d5ed-041e9288db70
TestDNS_CaseInsensitiveNodeLookup - 2019/12/06 06:06:19.605181 [ERR] consul: failed to establish leadership: error configuring provider: leadership lost while committing log
TestDNS_CaseInsensitiveNodeLookup - 2019/12/06 06:06:19.605736 [INFO] agent: consul server down
TestDNS_CaseInsensitiveNodeLookup - 2019/12/06 06:06:19.605789 [INFO] agent: shutdown complete
TestDNS_CaseInsensitiveNodeLookup - 2019/12/06 06:06:19.605823 [WARN] consul: error getting server health from "Node 29adec63-9be9-f38d-a1f2-d8d003d52bfa": rpc error making call: EOF
TestDNS_CaseInsensitiveNodeLookup - 2019/12/06 06:06:19.605839 [INFO] agent: Stopping DNS server 127.0.0.1:34817 (tcp)
TestDNS_CaseInsensitiveNodeLookup - 2019/12/06 06:06:19.606022 [INFO] agent: Stopping DNS server 127.0.0.1:34817 (udp)
TestDNS_CaseInsensitiveNodeLookup - 2019/12/06 06:06:19.606167 [INFO] agent: Stopping HTTP server 127.0.0.1:34818 (tcp)
TestDNS_CaseInsensitiveNodeLookup - 2019/12/06 06:06:19.606363 [INFO] agent: Waiting for endpoints to shut down
TestDNS_CaseInsensitiveNodeLookup - 2019/12/06 06:06:19.606423 [INFO] agent: Endpoints down
--- PASS: TestDNS_CaseInsensitiveNodeLookup (29.06s)
=== CONT  TestCoordinate_Nodes
TestEventFire_token - 2019/12/06 06:06:19.617638 [ERR] acl: failed to apply acl token upgrade batch: raft is already shutdown
WARNING: bootstrap = true: do not enable unless necessary
TestCoordinate_Nodes - 2019/12/06 06:06:19.662930 [WARN] agent: Node name "Node ba6e5cc7-7d85-e6d0-d33f-9c22d1c8581f" will not be discoverable via DNS due to invalid characters. Valid characters include all alpha-numerics and dashes.
TestCoordinate_Nodes - 2019/12/06 06:06:19.663357 [DEBUG] tlsutil: Update with version 1
TestCoordinate_Update_ACLDeny - 2019/12/06 06:06:19.664576 [ERR] agent: failed to sync remote state: ACL not found
TestCoordinate_Nodes - 2019/12/06 06:06:19.665481 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
TestCoordinate_Update_ACLDeny - 2019/12/06 06:06:19.985516 [INFO] acl: initializing acls
TestDNS_CaseInsensitiveNodeLookup - 2019/12/06 06:06:20.258275 [WARN] consul: error getting server health from "Node 29adec63-9be9-f38d-a1f2-d8d003d52bfa": context deadline exceeded
TestDNS_CaseInsensitiveNodeLookup - 2019/12/06 06:06:20.258556 [ERR] consul: failed to transfer leadership attempt 0/3: raft is already shutdown
TestDNS_Over_TCP - 2019/12/06 06:06:20.340549 [INFO] agent: Synced node info
TestDNS_Over_TCP - 2019/12/06 06:06:20.340679 [DEBUG] agent: Node info in sync
TestCoordinate_Update_ACLDeny - 2019/12/06 06:06:20.554362 [INFO] acl: initializing acls
TestEventFire_token - 2019/12/06 06:06:20.617197 [ERR] acl: failed to apply acl token upgrade batch: raft is already shutdown
TestDNS_Over_TCP - 2019/12/06 06:06:20.738206 [DEBUG] agent: Node info in sync
2019/12/06 06:06:20 [INFO]  raft: Initial configuration (index=1): [{Suffrage:Voter ID:9e2f37bd-7bbb-9c71-e442-c7dcde3d7b56 Address:127.0.0.1:34840}]
2019/12/06 06:06:20 [INFO]  raft: Node at 127.0.0.1:34840 [Follower] entering Follower state (Leader: "")
TestCoordinate_Node - 2019/12/06 06:06:20.803619 [INFO] serf: EventMemberJoin: Node 9e2f37bd-7bbb-9c71-e442-c7dcde3d7b56.dc1 127.0.0.1
TestCoordinate_Node - 2019/12/06 06:06:20.811691 [INFO] serf: EventMemberJoin: Node 9e2f37bd-7bbb-9c71-e442-c7dcde3d7b56 127.0.0.1
TestCoordinate_Node - 2019/12/06 06:06:20.812752 [INFO] consul: Handled member-join event for server "Node 9e2f37bd-7bbb-9c71-e442-c7dcde3d7b56.dc1" in area "wan"
TestCoordinate_Node - 2019/12/06 06:06:20.812782 [INFO] consul: Adding LAN server Node 9e2f37bd-7bbb-9c71-e442-c7dcde3d7b56 (Addr: tcp/127.0.0.1:34840) (DC: dc1)
TestCoordinate_Node - 2019/12/06 06:06:20.813931 [INFO] agent: Started DNS server 127.0.0.1:34835 (udp)
TestCoordinate_Node - 2019/12/06 06:06:20.814402 [INFO] agent: Started DNS server 127.0.0.1:34835 (tcp)
TestCoordinate_Node - 2019/12/06 06:06:20.816974 [INFO] agent: Started HTTP server on 127.0.0.1:34836 (tcp)
TestCoordinate_Node - 2019/12/06 06:06:20.817214 [INFO] agent: started state syncer
2019/12/06 06:06:20 [WARN]  raft: Heartbeat timeout from "" reached, starting election
2019/12/06 06:06:20 [INFO]  raft: Node at 127.0.0.1:34840 [Candidate] entering Candidate state in term 2
TestCoordinate_Update_ACLDeny - 2019/12/06 06:06:20.972143 [INFO] consul: Created ACL 'global-management' policy
TestCoordinate_Update_ACLDeny - 2019/12/06 06:06:20.972267 [WARN] consul: Configuring a non-UUID master token is deprecated
TestCoordinate_Update_ACLDeny - 2019/12/06 06:06:20.973427 [INFO] consul: Created ACL 'global-management' policy
TestCoordinate_Update_ACLDeny - 2019/12/06 06:06:20.973492 [WARN] consul: Configuring a non-UUID master token is deprecated
TestDNS_Over_TCP - 2019/12/06 06:06:21.192053 [DEBUG] dns: request for name foo.node.dc1.consul. type ANY class IN (took 576.68µs) from client 127.0.0.1:43330 (tcp)
TestDNS_Over_TCP - 2019/12/06 06:06:21.194806 [INFO] agent: Requesting shutdown
TestDNS_Over_TCP - 2019/12/06 06:06:21.194883 [INFO] consul: shutting down server
TestDNS_Over_TCP - 2019/12/06 06:06:21.194930 [WARN] serf: Shutdown without a Leave
TestDNS_Over_TCP - 2019/12/06 06:06:21.556988 [WARN] serf: Shutdown without a Leave
TestEventFire_token - 2019/12/06 06:06:21.617306 [ERR] acl: failed to apply acl token upgrade batch: raft is already shutdown
TestDNS_Over_TCP - 2019/12/06 06:06:21.752259 [INFO] manager: shutting down
TestCoordinate_Update_ACLDeny - 2019/12/06 06:06:21.754384 [INFO] consul: Bootstrapped ACL master token from configuration
2019/12/06 06:06:22 [INFO]  raft: Initial configuration (index=1): [{Suffrage:Voter ID:ba6e5cc7-7d85-e6d0-d33f-9c22d1c8581f Address:127.0.0.1:34846}]
TestDNS_Over_TCP - 2019/12/06 06:06:22.065029 [INFO] agent: consul server down
TestDNS_Over_TCP - 2019/12/06 06:06:22.065089 [INFO] agent: shutdown complete
TestDNS_Over_TCP - 2019/12/06 06:06:22.065144 [INFO] agent: Stopping DNS server 127.0.0.1:34823 (tcp)
TestDNS_Over_TCP - 2019/12/06 06:06:22.065280 [INFO] agent: Stopping DNS server 127.0.0.1:34823 (udp)
TestDNS_Over_TCP - 2019/12/06 06:06:22.065439 [INFO] agent: Stopping HTTP server 127.0.0.1:34824 (tcp)
TestDNS_Over_TCP - 2019/12/06 06:06:22.065649 [INFO] agent: Waiting for endpoints to shut down
TestDNS_Over_TCP - 2019/12/06 06:06:22.065716 [INFO] agent: Endpoints down
--- PASS: TestDNS_Over_TCP (11.55s)
=== CONT  TestCoordinate_Disabled_Response
TestCoordinate_Nodes - 2019/12/06 06:06:22.068074 [INFO] serf: EventMemberJoin: Node ba6e5cc7-7d85-e6d0-d33f-9c22d1c8581f.dc1 127.0.0.1
TestDNS_Over_TCP - 2019/12/06 06:06:22.070395 [ERR] consul: failed to establish leadership: leadership lost while committing log
TestDNS_Over_TCP - 2019/12/06 06:06:22.070584 [ERR] consul: failed to transfer leadership attempt 0/3: raft is already shutdown
2019/12/06 06:06:22 [INFO]  raft: Node at 127.0.0.1:34846 [Follower] entering Follower state (Leader: "")
TestDNS_Over_TCP - 2019/12/06 06:06:22.070641 [ERR] consul: failed to transfer leadership attempt 1/3: raft is already shutdown
TestDNS_Over_TCP - 2019/12/06 06:06:22.070751 [ERR] consul: failed to transfer leadership attempt 2/3: raft is already shutdown
TestDNS_Over_TCP - 2019/12/06 06:06:22.070797 [ERR] consul: failed to transfer leadership in 3 attempts
TestCoordinate_Update_ACLDeny - 2019/12/06 06:06:22.071073 [INFO] consul: Bootstrapped ACL master token from configuration
TestCoordinate_Nodes - 2019/12/06 06:06:22.075165 [INFO] serf: EventMemberJoin: Node ba6e5cc7-7d85-e6d0-d33f-9c22d1c8581f 127.0.0.1
TestCoordinate_Nodes - 2019/12/06 06:06:22.076373 [INFO] consul: Adding LAN server Node ba6e5cc7-7d85-e6d0-d33f-9c22d1c8581f (Addr: tcp/127.0.0.1:34846) (DC: dc1)
TestCoordinate_Nodes - 2019/12/06 06:06:22.076687 [INFO] consul: Handled member-join event for server "Node ba6e5cc7-7d85-e6d0-d33f-9c22d1c8581f.dc1" in area "wan"
TestCoordinate_Nodes - 2019/12/06 06:06:22.079278 [INFO] agent: Started DNS server 127.0.0.1:34841 (tcp)
TestCoordinate_Nodes - 2019/12/06 06:06:22.079393 [INFO] agent: Started DNS server 127.0.0.1:34841 (udp)
TestCoordinate_Nodes - 2019/12/06 06:06:22.082180 [INFO] agent: Started HTTP server on 127.0.0.1:34842 (tcp)
TestCoordinate_Nodes - 2019/12/06 06:06:22.082270 [INFO] agent: started state syncer
2019/12/06 06:06:22 [WARN]  raft: Heartbeat timeout from "" reached, starting election
2019/12/06 06:06:22 [INFO]  raft: Node at 127.0.0.1:34846 [Candidate] entering Candidate state in term 2
WARNING: bootstrap = true: do not enable unless necessary
TestCoordinate_Disabled_Response - 2019/12/06 06:06:22.134853 [WARN] agent: Node name "Node aa11554c-ff80-8e7b-e492-796dc88815f4" will not be discoverable via DNS due to invalid characters. Valid characters include all alpha-numerics and dashes.
TestCoordinate_Disabled_Response - 2019/12/06 06:06:22.135348 [DEBUG] tlsutil: Update with version 1
TestCoordinate_Disabled_Response - 2019/12/06 06:06:22.138353 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
2019/12/06 06:06:22 [INFO]  raft: Election won. Tally: 1
2019/12/06 06:06:22 [INFO]  raft: Node at 127.0.0.1:34840 [Leader] entering Leader state
TestCoordinate_Update_ACLDeny - 2019/12/06 06:06:22.480985 [INFO] consul: Created ACL anonymous token from configuration
TestCoordinate_Update_ACLDeny - 2019/12/06 06:06:22.481243 [DEBUG] acl: transitioning out of legacy ACL mode
TestCoordinate_Update_ACLDeny - 2019/12/06 06:06:22.482107 [INFO] serf: EventMemberUpdate: Node 82212284-ca1e-6303-d5ed-041e9288db70
TestCoordinate_Node - 2019/12/06 06:06:22.482664 [INFO] consul: cluster leadership acquired
TestCoordinate_Node - 2019/12/06 06:06:22.483024 [INFO] consul: New leader elected: Node 9e2f37bd-7bbb-9c71-e442-c7dcde3d7b56
TestCoordinate_Update_ACLDeny - 2019/12/06 06:06:22.484734 [INFO] serf: EventMemberUpdate: Node 82212284-ca1e-6303-d5ed-041e9288db70.dc1
TestEventFire_token - 2019/12/06 06:06:22.617059 [ERR] acl: failed to apply acl token upgrade batch: raft is already shutdown
TestCoordinate_Update_ACLDeny - 2019/12/06 06:06:22.891332 [INFO] agent: Synced node info
TestCoordinate_Update_ACLDeny - 2019/12/06 06:06:22.891435 [DEBUG] agent: Node info in sync
TestCoordinate_Update_ACLDeny - 2019/12/06 06:06:22.891564 [INFO] consul: Created ACL anonymous token from configuration
TestCoordinate_Update_ACLDeny - 2019/12/06 06:06:22.892366 [INFO] serf: EventMemberUpdate: Node 82212284-ca1e-6303-d5ed-041e9288db70
TestCoordinate_Update_ACLDeny - 2019/12/06 06:06:22.892999 [INFO] serf: EventMemberUpdate: Node 82212284-ca1e-6303-d5ed-041e9288db70.dc1
TestCoordinate_Update_ACLDeny - 2019/12/06 06:06:22.910547 [DEBUG] consul: dropping node "Node 82212284-ca1e-6303-d5ed-041e9288db70" from result due to ACLs
TestCoordinate_Update_ACLDeny - 2019/12/06 06:06:22.910721 [INFO] agent: Requesting shutdown
TestCoordinate_Update_ACLDeny - 2019/12/06 06:06:22.910785 [INFO] consul: shutting down server
TestCoordinate_Update_ACLDeny - 2019/12/06 06:06:22.910827 [WARN] serf: Shutdown without a Leave
TestCoordinate_Update_ACLDeny - 2019/12/06 06:06:23.145736 [WARN] serf: Shutdown without a Leave
TestCoordinate_Update_ACLDeny - 2019/12/06 06:06:23.304147 [INFO] manager: shutting down
TestCoordinate_Node - 2019/12/06 06:06:23.305404 [INFO] agent: Synced node info
2019/12/06 06:06:23 [INFO]  raft: Election won. Tally: 1
2019/12/06 06:06:23 [INFO]  raft: Node at 127.0.0.1:34846 [Leader] entering Leader state
TestCoordinate_Update_ACLDeny - 2019/12/06 06:06:23.309901 [ERR] autopilot: failed to initialize config: leadership lost while committing log
TestCoordinate_Update_ACLDeny - 2019/12/06 06:06:23.310183 [ERR] consul: failed to establish leadership: raft is already shutdown
TestCoordinate_Update_ACLDeny - 2019/12/06 06:06:23.310530 [INFO] agent: consul server down
TestCoordinate_Update_ACLDeny - 2019/12/06 06:06:23.310573 [INFO] agent: shutdown complete
TestCoordinate_Update_ACLDeny - 2019/12/06 06:06:23.310621 [INFO] agent: Stopping DNS server 127.0.0.1:34829 (tcp)
TestCoordinate_Nodes - 2019/12/06 06:06:23.310689 [INFO] consul: cluster leadership acquired
TestCoordinate_Update_ACLDeny - 2019/12/06 06:06:23.310739 [INFO] agent: Stopping DNS server 127.0.0.1:34829 (udp)
TestCoordinate_Update_ACLDeny - 2019/12/06 06:06:23.310879 [INFO] agent: Stopping HTTP server 127.0.0.1:34830 (tcp)
TestCoordinate_Nodes - 2019/12/06 06:06:23.311018 [INFO] consul: New leader elected: Node ba6e5cc7-7d85-e6d0-d33f-9c22d1c8581f
TestCoordinate_Update_ACLDeny - 2019/12/06 06:06:23.311051 [INFO] agent: Waiting for endpoints to shut down
TestCoordinate_Update_ACLDeny - 2019/12/06 06:06:23.311109 [INFO] agent: Endpoints down
--- PASS: TestCoordinate_Update_ACLDeny (12.78s)
=== CONT  TestConnectCAConfig
WARNING: bootstrap = true: do not enable unless necessary
TestConnectCAConfig - 2019/12/06 06:06:23.368860 [WARN] agent: Node name "Node d88342ae-66b3-9ecd-0cd5-79fdbe8b798a" will not be discoverable via DNS due to invalid characters. Valid characters include all alpha-numerics and dashes.
TestConnectCAConfig - 2019/12/06 06:06:23.369407 [DEBUG] tlsutil: Update with version 1
TestConnectCAConfig - 2019/12/06 06:06:23.372785 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
TestEventFire_token - 2019/12/06 06:06:23.617040 [ERR] acl: failed to apply acl token upgrade batch: raft is already shutdown
2019/12/06 06:06:24 [INFO]  raft: Initial configuration (index=1): [{Suffrage:Voter ID:aa11554c-ff80-8e7b-e492-796dc88815f4 Address:127.0.0.1:34852}]
TestCoordinate_Disabled_Response - 2019/12/06 06:06:24.160163 [INFO] serf: EventMemberJoin: Node aa11554c-ff80-8e7b-e492-796dc88815f4.dc1 127.0.0.1
TestCoordinate_Nodes - 2019/12/06 06:06:24.162408 [INFO] agent: Synced node info
2019/12/06 06:06:24 [INFO]  raft: Node at 127.0.0.1:34852 [Follower] entering Follower state (Leader: "")
TestCoordinate_Disabled_Response - 2019/12/06 06:06:24.165878 [INFO] serf: EventMemberJoin: Node aa11554c-ff80-8e7b-e492-796dc88815f4 127.0.0.1
TestCoordinate_Disabled_Response - 2019/12/06 06:06:24.166570 [INFO] consul: Adding LAN server Node aa11554c-ff80-8e7b-e492-796dc88815f4 (Addr: tcp/127.0.0.1:34852) (DC: dc1)
TestCoordinate_Disabled_Response - 2019/12/06 06:06:24.166789 [INFO] consul: Handled member-join event for server "Node aa11554c-ff80-8e7b-e492-796dc88815f4.dc1" in area "wan"
TestCoordinate_Disabled_Response - 2019/12/06 06:06:24.167303 [INFO] agent: Started DNS server 127.0.0.1:34847 (tcp)
TestCoordinate_Disabled_Response - 2019/12/06 06:06:24.167377 [INFO] agent: Started DNS server 127.0.0.1:34847 (udp)
TestCoordinate_Disabled_Response - 2019/12/06 06:06:24.176397 [INFO] agent: Started HTTP server on 127.0.0.1:34848 (tcp)
TestCoordinate_Disabled_Response - 2019/12/06 06:06:24.176492 [INFO] agent: started state syncer
2019/12/06 06:06:24 [WARN]  raft: Heartbeat timeout from "" reached, starting election
2019/12/06 06:06:24 [INFO]  raft: Node at 127.0.0.1:34852 [Candidate] entering Candidate state in term 2
TestEventFire_token - 2019/12/06 06:06:24.619421 [ERR] acl: failed to apply acl token upgrade batch: raft is already shutdown
2019/12/06 06:06:25 [INFO]  raft: Initial configuration (index=1): [{Suffrage:Voter ID:d88342ae-66b3-9ecd-0cd5-79fdbe8b798a Address:127.0.0.1:34858}]
TestCoordinate_Node - 2019/12/06 06:06:25.205161 [INFO] connect: initialized primary datacenter CA with provider "consul"
TestCoordinate_Node - 2019/12/06 06:06:25.205728 [DEBUG] consul: Skipping self join check for "Node 9e2f37bd-7bbb-9c71-e442-c7dcde3d7b56" since the cluster is too small
TestCoordinate_Node - 2019/12/06 06:06:25.206042 [INFO] consul: member 'Node 9e2f37bd-7bbb-9c71-e442-c7dcde3d7b56' joined, marking health alive
TestConnectCAConfig - 2019/12/06 06:06:25.208382 [INFO] serf: EventMemberJoin: Node d88342ae-66b3-9ecd-0cd5-79fdbe8b798a.dc1 127.0.0.1
TestConnectCAConfig - 2019/12/06 06:06:25.214042 [INFO] serf: EventMemberJoin: Node d88342ae-66b3-9ecd-0cd5-79fdbe8b798a 127.0.0.1
2019/12/06 06:06:25 [INFO]  raft: Node at 127.0.0.1:34858 [Follower] entering Follower state (Leader: "")
TestConnectCAConfig - 2019/12/06 06:06:25.215355 [INFO] consul: Adding LAN server Node d88342ae-66b3-9ecd-0cd5-79fdbe8b798a (Addr: tcp/127.0.0.1:34858) (DC: dc1)
TestConnectCAConfig - 2019/12/06 06:06:25.215524 [INFO] agent: Started DNS server 127.0.0.1:34853 (udp)
TestConnectCAConfig - 2019/12/06 06:06:25.215566 [INFO] consul: Handled member-join event for server "Node d88342ae-66b3-9ecd-0cd5-79fdbe8b798a.dc1" in area "wan"
TestConnectCAConfig - 2019/12/06 06:06:25.215934 [INFO] agent: Started DNS server 127.0.0.1:34853 (tcp)
TestConnectCAConfig - 2019/12/06 06:06:25.219806 [INFO] agent: Started HTTP server on 127.0.0.1:34854 (tcp)
TestConnectCAConfig - 2019/12/06 06:06:25.219924 [INFO] agent: started state syncer
2019/12/06 06:06:25 [WARN]  raft: Heartbeat timeout from "" reached, starting election
2019/12/06 06:06:25 [INFO]  raft: Node at 127.0.0.1:34858 [Candidate] entering Candidate state in term 2
2019/12/06 06:06:25 [INFO]  raft: Election won. Tally: 1
2019/12/06 06:06:25 [INFO]  raft: Node at 127.0.0.1:34852 [Leader] entering Leader state
TestCoordinate_Disabled_Response - 2019/12/06 06:06:25.339552 [INFO] consul: cluster leadership acquired
TestCoordinate_Disabled_Response - 2019/12/06 06:06:25.339936 [INFO] consul: New leader elected: Node aa11554c-ff80-8e7b-e492-796dc88815f4
TestEventFire_token - 2019/12/06 06:06:25.617392 [ERR] acl: failed to apply acl token upgrade batch: raft is already shutdown
TestCoordinate_Node - 2019/12/06 06:06:25.710433 [DEBUG] agent: Skipping remote check "serfHealth" since it is managed automatically
TestCoordinate_Node - 2019/12/06 06:06:25.710511 [DEBUG] agent: Node info in sync
TestCoordinate_Node - 2019/12/06 06:06:25.710576 [DEBUG] agent: Node info in sync
TestCoordinate_Node - 2019/12/06 06:06:25.759546 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
TestCoordinate_Disabled_Response - 2019/12/06 06:06:25.888564 [INFO] agent: Synced node info
TestCoordinate_Disabled_Response - 2019/12/06 06:06:25.888716 [DEBUG] agent: Node info in sync
TestCoordinate_Nodes - 2019/12/06 06:06:26.063510 [INFO] connect: initialized primary datacenter CA with provider "consul"
TestCoordinate_Nodes - 2019/12/06 06:06:26.063964 [DEBUG] consul: Skipping self join check for "Node ba6e5cc7-7d85-e6d0-d33f-9c22d1c8581f" since the cluster is too small
TestCoordinate_Nodes - 2019/12/06 06:06:26.064120 [INFO] consul: member 'Node ba6e5cc7-7d85-e6d0-d33f-9c22d1c8581f' joined, marking health alive
2019/12/06 06:06:26 [INFO]  raft: Election won. Tally: 1
2019/12/06 06:06:26 [INFO]  raft: Node at 127.0.0.1:34858 [Leader] entering Leader state
TestConnectCAConfig - 2019/12/06 06:06:26.241563 [INFO] consul: cluster leadership acquired
TestConnectCAConfig - 2019/12/06 06:06:26.241979 [INFO] consul: New leader elected: Node d88342ae-66b3-9ecd-0cd5-79fdbe8b798a
TestCoordinate_Node - 2019/12/06 06:06:26.365026 [INFO] agent: Requesting shutdown
TestCoordinate_Node - 2019/12/06 06:06:26.365113 [INFO] consul: shutting down server
TestCoordinate_Node - 2019/12/06 06:06:26.365237 [WARN] serf: Shutdown without a Leave
TestCoordinate_Nodes - 2019/12/06 06:06:26.466811 [DEBUG] agent: Node info in sync
TestCoordinate_Nodes - 2019/12/06 06:06:26.466916 [DEBUG] agent: Node info in sync
TestCoordinate_Nodes - 2019/12/06 06:06:26.515072 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
TestCoordinate_Node - 2019/12/06 06:06:26.515213 [WARN] serf: Shutdown without a Leave
TestEventFire_token - 2019/12/06 06:06:26.617043 [ERR] acl: failed to apply acl token upgrade batch: raft is already shutdown
TestCoordinate_Node - 2019/12/06 06:06:26.687597 [INFO] manager: shutting down
TestCoordinate_Node - 2019/12/06 06:06:26.688489 [INFO] agent: consul server down
TestCoordinate_Node - 2019/12/06 06:06:26.688559 [INFO] agent: shutdown complete
TestCoordinate_Node - 2019/12/06 06:06:26.688616 [INFO] agent: Stopping DNS server 127.0.0.1:34835 (tcp)
TestCoordinate_Node - 2019/12/06 06:06:26.688800 [INFO] agent: Stopping DNS server 127.0.0.1:34835 (udp)
TestCoordinate_Node - 2019/12/06 06:06:26.688964 [INFO] agent: Stopping HTTP server 127.0.0.1:34836 (tcp)
TestCoordinate_Node - 2019/12/06 06:06:26.689275 [INFO] agent: Waiting for endpoints to shut down
TestCoordinate_Node - 2019/12/06 06:06:26.689364 [INFO] agent: Endpoints down
--- FAIL: TestCoordinate_Node (9.57s)
panic: interface conversion: interface {} is nil, not structs.Coordinates [recovered]
	panic: interface conversion: interface {} is nil, not structs.Coordinates

goroutine 1269 [running]:
testing.tRunner.func1(0x4de81e0)
	/usr/lib/go-1.13/src/testing/testing.go:874 +0x360
panic(0x1a0a2c8, 0x60a4f20)
	/usr/lib/go-1.13/src/runtime/panic.go:679 +0x194
github.com/hashicorp/consul/agent.TestCoordinate_Node(0x4de81e0)
	/<<BUILDDIR>>/consul-1.5.2+dfsg1/_build/src/github.com/hashicorp/consul/agent/coordinate_endpoint_test.go:273 +0x117c
testing.tRunner(0x4de81e0, 0x1d9cd78)
	/usr/lib/go-1.13/src/testing/testing.go:909 +0xa8
created by testing.(*T).Run
	/usr/lib/go-1.13/src/testing/testing.go:960 +0x2ac
FAIL	github.com/hashicorp/consul/agent	315.649s
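
The failure above is the usual Go nil-interface pitfall: coordinate_endpoint_test.go:273 performs an unchecked type assertion on an interface value that is still nil (presumably the decoded response), and a bare assertion on a nil interface panics with exactly the message shown, "interface conversion: interface {} is nil, not structs.Coordinates". A minimal standalone sketch of the pattern, plus the comma-ok form that would report the empty value as an ordinary failure instead; the Coordinates type here is a stand-in for illustration, not the real structs.Coordinates:

package main

import "fmt"

// Coordinates stands in for structs.Coordinates from the consul sources.
// Hypothetical type, for illustration only.
type Coordinates []struct{ Node string }

func main() {
	var body interface{} // imagine the decoded response, still nil

	// Safe form: the comma-ok assertion reports the problem instead of panicking.
	if coords, ok := body.(Coordinates); ok {
		fmt.Println("got", len(coords), "coordinates")
	} else {
		fmt.Println("response was not Coordinates; treat as a test failure")
	}

	// Unchecked form: this is the shape of the call that panicked above.
	_ = body.(Coordinates) // panics: interface {} is nil, not Coordinates
}

The panic message itself shows the interface was nil, i.e. no coordinate payload was present before the assertion ran; the unchecked assertion only turned that into a crash of the whole test binary.
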
=== RUN   TestAE_scaleFactor
=== PAUSE TestAE_scaleFactor
=== RUN   TestAE_Pause_nestedPauseResume
=== PAUSE TestAE_Pause_nestedPauseResume
=== RUN   TestAE_Pause_ResumeTriggersSyncChanges
--- PASS: TestAE_Pause_ResumeTriggersSyncChanges (0.00s)
=== RUN   TestAE_staggerDependsOnClusterSize
--- PASS: TestAE_staggerDependsOnClusterSize (0.00s)
=== RUN   TestAE_Run_SyncFullBeforeChanges
--- PASS: TestAE_Run_SyncFullBeforeChanges (0.00s)
=== RUN   TestAE_Run_Quit
=== RUN   TestAE_Run_Quit/Run_panics_without_ClusterSize
=== RUN   TestAE_Run_Quit/runFSM_quits
--- PASS: TestAE_Run_Quit (0.00s)
    --- PASS: TestAE_Run_Quit/Run_panics_without_ClusterSize (0.00s)
    --- PASS: TestAE_Run_Quit/runFSM_quits (0.00s)
=== RUN   TestAE_FSM
=== RUN   TestAE_FSM/fullSyncState
=== RUN   TestAE_FSM/fullSyncState/Paused_->_retryFullSyncState
=== RUN   TestAE_FSM/fullSyncState/SyncFull()_error_->_retryFullSyncState
[ERR] agent: failed to sync remote state: boom
=== RUN   TestAE_FSM/fullSyncState/SyncFull()_OK_->_partialSyncState
=== RUN   TestAE_FSM/retryFullSyncState
=== RUN   TestAE_FSM/retryFullSyncState/shutdownEvent_->_doneState
=== RUN   TestAE_FSM/retryFullSyncState/syncFullNotifEvent_->_fullSyncState
=== RUN   TestAE_FSM/retryFullSyncState/syncFullTimerEvent_->_fullSyncState
=== RUN   TestAE_FSM/retryFullSyncState/invalid_event_->_panic_
=== RUN   TestAE_FSM/partialSyncState
=== RUN   TestAE_FSM/partialSyncState/shutdownEvent_->_doneState
=== RUN   TestAE_FSM/partialSyncState/syncFullNotifEvent_->_fullSyncState
=== RUN   TestAE_FSM/partialSyncState/syncFullTimerEvent_->_fullSyncState
=== RUN   TestAE_FSM/partialSyncState/syncChangesEvent+Paused_->_partialSyncState
=== RUN   TestAE_FSM/partialSyncState/syncChangesEvent+SyncChanges()_error_->_partialSyncState
[ERR] agent: failed to sync changes: boom
=== RUN   TestAE_FSM/partialSyncState/syncChangesEvent+SyncChanges()_OK_->_partialSyncState
=== RUN   TestAE_FSM/partialSyncState/invalid_event_->_panic_
=== RUN   TestAE_FSM/invalid_state_->_panic_
--- PASS: TestAE_FSM (0.01s)
    --- PASS: TestAE_FSM/fullSyncState (0.00s)
        --- PASS: TestAE_FSM/fullSyncState/Paused_->_retryFullSyncState (0.00s)
        --- PASS: TestAE_FSM/fullSyncState/SyncFull()_error_->_retryFullSyncState (0.00s)
        --- PASS: TestAE_FSM/fullSyncState/SyncFull()_OK_->_partialSyncState (0.00s)
    --- PASS: TestAE_FSM/retryFullSyncState (0.00s)
        --- PASS: TestAE_FSM/retryFullSyncState/shutdownEvent_->_doneState (0.00s)
        --- PASS: TestAE_FSM/retryFullSyncState/syncFullNotifEvent_->_fullSyncState (0.00s)
        --- PASS: TestAE_FSM/retryFullSyncState/syncFullTimerEvent_->_fullSyncState (0.00s)
        --- PASS: TestAE_FSM/retryFullSyncState/invalid_event_->_panic_ (0.00s)
    --- PASS: TestAE_FSM/partialSyncState (0.00s)
        --- PASS: TestAE_FSM/partialSyncState/shutdownEvent_->_doneState (0.00s)
        --- PASS: TestAE_FSM/partialSyncState/syncFullNotifEvent_->_fullSyncState (0.00s)
        --- PASS: TestAE_FSM/partialSyncState/syncFullTimerEvent_->_fullSyncState (0.00s)
        --- PASS: TestAE_FSM/partialSyncState/syncChangesEvent+Paused_->_partialSyncState (0.00s)
        --- PASS: TestAE_FSM/partialSyncState/syncChangesEvent+SyncChanges()_error_->_partialSyncState (0.00s)
        --- PASS: TestAE_FSM/partialSyncState/syncChangesEvent+SyncChanges()_OK_->_partialSyncState (0.00s)
        --- PASS: TestAE_FSM/partialSyncState/invalid_event_->_panic_ (0.00s)
    --- PASS: TestAE_FSM/invalid_state_->_panic_ (0.00s)
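
The TestAE_FSM subtests above spell out the state machine behind the syncer: fullSyncState retries on a SyncFull() error and goes partial on success, while retryFullSyncState and partialSyncState react to shutdown, full-sync-notification and timer events, and anything unexpected panics. A compact sketch of the two event-driven states with those transitions (state and event names follow the subtest names; the types and the nextState helper are illustrative, not the ae package's API):

package main

import "fmt"

type fsmState int
type event int

const (
	fullSyncState fsmState = iota
	retryFullSyncState
	partialSyncState
	doneState
)

const (
	shutdownEvent event = iota
	syncFullNotifEvent
	syncFullTimerEvent
	syncChangesNotifEvent
)

// nextState mirrors the transitions asserted above: both states return to a
// full sync on notification or timer, stop on shutdown, and partialSyncState
// stays partial on a changes event; unknown combinations panic, like the
// "invalid_event_->_panic_" cases.
func nextState(s fsmState, e event) fsmState {
	switch s {
	case retryFullSyncState, partialSyncState:
		switch e {
		case shutdownEvent:
			return doneState
		case syncFullNotifEvent, syncFullTimerEvent:
			return fullSyncState
		case syncChangesNotifEvent:
			if s == partialSyncState {
				return partialSyncState
			}
		}
	}
	panic(fmt.Sprintf("invalid transition: state=%d event=%d", s, e))
}

func main() {
	fmt.Println(nextState(partialSyncState, syncFullTimerEvent) == fullSyncState) // true
	fmt.Println(nextState(retryFullSyncState, shutdownEvent) == doneState)        // true
}

fullSyncState is left out of the sketch because, per the subtests, its next state depends on the outcome of SyncFull() rather than on an incoming event.
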
=== RUN   TestAE_RetrySyncFullEvent
=== RUN   TestAE_RetrySyncFullEvent/trigger_shutdownEvent
=== RUN   TestAE_RetrySyncFullEvent/trigger_shutdownEvent_during_FullNotif
=== RUN   TestAE_RetrySyncFullEvent/trigger_syncFullNotifEvent
=== RUN   TestAE_RetrySyncFullEvent/trigger_syncFullTimerEvent
--- PASS: TestAE_RetrySyncFullEvent (0.13s)
    --- PASS: TestAE_RetrySyncFullEvent/trigger_shutdownEvent (0.00s)
    --- PASS: TestAE_RetrySyncFullEvent/trigger_shutdownEvent_during_FullNotif (0.10s)
    --- PASS: TestAE_RetrySyncFullEvent/trigger_syncFullNotifEvent (0.01s)
    --- PASS: TestAE_RetrySyncFullEvent/trigger_syncFullTimerEvent (0.02s)
=== RUN   TestAE_SyncChangesEvent
=== RUN   TestAE_SyncChangesEvent/trigger_shutdownEvent
=== RUN   TestAE_SyncChangesEvent/trigger_shutdownEvent_during_FullNotif
=== RUN   TestAE_SyncChangesEvent/trigger_syncFullNotifEvent
=== RUN   TestAE_SyncChangesEvent/trigger_syncFullTimerEvent
=== RUN   TestAE_SyncChangesEvent/trigger_syncChangesNotifEvent
--- PASS: TestAE_SyncChangesEvent (0.13s)
    --- PASS: TestAE_SyncChangesEvent/trigger_shutdownEvent (0.00s)
    --- PASS: TestAE_SyncChangesEvent/trigger_shutdownEvent_during_FullNotif (0.10s)
    --- PASS: TestAE_SyncChangesEvent/trigger_syncFullNotifEvent (0.01s)
    --- PASS: TestAE_SyncChangesEvent/trigger_syncFullTimerEvent (0.02s)
    --- PASS: TestAE_SyncChangesEvent/trigger_syncChangesNotifEvent (0.00s)
=== CONT  TestAE_scaleFactor
=== CONT  TestAE_Pause_nestedPauseResume
=== RUN   TestAE_scaleFactor/100_nodes
=== RUN   TestAE_scaleFactor/200_nodes
--- PASS: TestAE_Pause_nestedPauseResume (0.00s)
=== RUN   TestAE_scaleFactor/1000_nodes
=== RUN   TestAE_scaleFactor/10000_nodes
--- PASS: TestAE_scaleFactor (0.00s)
    --- PASS: TestAE_scaleFactor/100_nodes (0.00s)
    --- PASS: TestAE_scaleFactor/200_nodes (0.00s)
    --- PASS: TestAE_scaleFactor/1000_nodes (0.00s)
    --- PASS: TestAE_scaleFactor/10000_nodes (0.00s)
PASS
ok  	github.com/hashicorp/consul/agent/ae	0.338s
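
TestAE_scaleFactor and TestAE_staggerDependsOnClusterSize exercise how the anti-entropy ("ae") package stretches its sync interval as the cluster grows, so that full syncs from thousands of agents do not all land on the servers at once. A minimal sketch of that idea, assuming a logarithmic scale factor above a fixed threshold; the constant name and the exact formula here are illustrative, not copied from the ae package:

package main

import (
	"fmt"
	"math"
	"time"
)

// scaleThreshold is the cluster size below which no scaling is applied.
// Hypothetical constant for illustration.
const scaleThreshold = 128

// scaleFactor grows logarithmically with cluster size so the per-agent
// sync interval stretches as more nodes join.
func scaleFactor(nodes int) int {
	if nodes <= scaleThreshold {
		return 1
	}
	return int(math.Ceil(math.Log2(float64(nodes))-math.Log2(scaleThreshold))) + 1
}

func main() {
	base := 1 * time.Minute
	for _, n := range []int{100, 200, 1000, 10000} { // same sizes as the subtests above
		fmt.Printf("%5d nodes -> interval %v\n", n, time.Duration(scaleFactor(n))*base)
	}
}

With these assumptions a 100-node cluster keeps the base interval while a 10000-node cluster spreads its full syncs over several multiples of it, which is the scaling behaviour the subtests check at those four sizes.
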
=== RUN   TestParseFlags
=== RUN   TestParseFlags/#00
=== RUN   TestParseFlags/-bind_a
=== RUN   TestParseFlags/-bootstrap
=== RUN   TestParseFlags/-bootstrap=true
=== RUN   TestParseFlags/-bootstrap=false
=== RUN   TestParseFlags/-config-file_a_-config-dir_b_-config-file_c_-config-dir_d
=== RUN   TestParseFlags/-datacenter_a
=== RUN   TestParseFlags/-dns-port_1
=== RUN   TestParseFlags/-grpc-port_1
=== RUN   TestParseFlags/-serf-lan-port_1
=== RUN   TestParseFlags/-serf-wan-port_1
=== RUN   TestParseFlags/-server-port_1
=== RUN   TestParseFlags/-join_a_-join_b
=== RUN   TestParseFlags/-node-meta_a:b_-node-meta_c:d
=== RUN   TestParseFlags/-bootstrap_true
--- PASS: TestParseFlags (0.04s)
    --- PASS: TestParseFlags/#00 (0.00s)
    --- PASS: TestParseFlags/-bind_a (0.00s)
    --- PASS: TestParseFlags/-bootstrap (0.00s)
    --- PASS: TestParseFlags/-bootstrap=true (0.00s)
    --- PASS: TestParseFlags/-bootstrap=false (0.00s)
    --- PASS: TestParseFlags/-config-file_a_-config-dir_b_-config-file_c_-config-dir_d (0.00s)
    --- PASS: TestParseFlags/-datacenter_a (0.00s)
    --- PASS: TestParseFlags/-dns-port_1 (0.00s)
    --- PASS: TestParseFlags/-grpc-port_1 (0.00s)
    --- PASS: TestParseFlags/-serf-lan-port_1 (0.00s)
    --- PASS: TestParseFlags/-serf-wan-port_1 (0.00s)
    --- PASS: TestParseFlags/-server-port_1 (0.00s)
    --- PASS: TestParseFlags/-join_a_-join_b (0.00s)
    --- PASS: TestParseFlags/-node-meta_a:b_-node-meta_c:d (0.00s)
    --- PASS: TestParseFlags/-bootstrap_true (0.00s)
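
TestParseFlags covers the usual command-line spellings, including the boolean corner case in the last subtest: with Go's standard flag package a boolean flag must be written as -bootstrap or -bootstrap=true, while "-bootstrap true" sets the flag and leaves "true" behind as a positional argument. A small standalone sketch of that behaviour (the flag set below is hypothetical, not the agent's real one):

package main

import (
	"flag"
	"fmt"
)

func parse(args []string) {
	fs := flag.NewFlagSet("agent", flag.ContinueOnError)
	bootstrap := fs.Bool("bootstrap", false, "enter bootstrap mode")
	datacenter := fs.String("datacenter", "dc1", "datacenter name")
	if err := fs.Parse(args); err != nil {
		fmt.Println("parse error:", err)
		return
	}
	fmt.Printf("args=%q bootstrap=%v datacenter=%q rest=%q\n",
		args, *bootstrap, *datacenter, fs.Args())
}

func main() {
	parse([]string{"-bootstrap"})         // bootstrap=true
	parse([]string{"-bootstrap=false"})   // bootstrap=false
	parse([]string{"-bootstrap", "true"}) // bootstrap=true, "true" is left in fs.Args()
	parse([]string{"-datacenter", "a"})   // non-boolean flags may take a separate value
}
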
=== RUN   TestMerge
=== RUN   TestMerge/top_level_fields
--- PASS: TestMerge (0.01s)
    --- PASS: TestMerge/top_level_fields (0.01s)
=== RUN   TestPatchSliceOfMaps
=== RUN   TestPatchSliceOfMaps/00:_{"a":{"b":"c"}}_->_{"a":{"b":"c"}}_skip:_[]
=== RUN   TestPatchSliceOfMaps/01:_{"a":[{"b":"c"}]}_->_{"a":{"b":"c"}}_skip:_[]
=== RUN   TestPatchSliceOfMaps/02:_{"a":[{"b":[{"c":"d"}]}]}_->_{"a":{"b":{"c":"d"}}}_skip:_[]
=== RUN   TestPatchSliceOfMaps/03:_{"a":[{"b":"c"}]}_->_{"a":[{"b":"c"}]}_skip:_[a]
=== RUN   TestPatchSliceOfMaps/04:_{_____"services":_[______{_______"checks":_[________{_________"header":_[__________{"a":"b"}_________]________}_______]______}_____]____}_->_{_____"services":_[______{_______"checks":_[________{_________"header":_{"a":"b"}________}_______]______}_____]____}_skip:_[services_services.checks]
--- PASS: TestPatchSliceOfMaps (0.00s)
    --- PASS: TestPatchSliceOfMaps/00:_{"a":{"b":"c"}}_->_{"a":{"b":"c"}}_skip:_[] (0.00s)
    --- PASS: TestPatchSliceOfMaps/01:_{"a":[{"b":"c"}]}_->_{"a":{"b":"c"}}_skip:_[] (0.00s)
    --- PASS: TestPatchSliceOfMaps/02:_{"a":[{"b":[{"c":"d"}]}]}_->_{"a":{"b":{"c":"d"}}}_skip:_[] (0.00s)
    --- PASS: TestPatchSliceOfMaps/03:_{"a":[{"b":"c"}]}_->_{"a":[{"b":"c"}]}_skip:_[a] (0.00s)
    --- PASS: TestPatchSliceOfMaps/04:_{_____"services":_[______{_______"checks":_[________{_________"header":_[__________{"a":"b"}_________]________}_______]______}_____]____}_->_{_____"services":_[______{_______"checks":_[________{_________"header":_{"a":"b"}________}_______]______}_____]____}_skip:_[services_services.checks] (0.00s)
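
The TestPatchSliceOfMaps cases document a config-normalisation step: HCL decodes nested blocks as slices of maps, and the loader collapses single-element slices of maps back into plain maps, except for paths named in a skip list (such as "services" and "services.checks" in case 04, where only the inner "header" block is collapsed). A minimal standalone sketch of that transformation, written against the patterns in the subtest names rather than the real config package:

package main

import (
	"encoding/json"
	"fmt"
)

// patch collapses one-element slices of maps into maps, recursing into the
// result; paths listed in skip are left as slices. Illustrative helper, not
// the config package's own function.
func patch(path string, v interface{}, skip map[string]bool) interface{} {
	switch x := v.(type) {
	case map[string]interface{}:
		out := map[string]interface{}{}
		for k, vv := range x {
			out[k] = patch(joinPath(path, k), vv, skip)
		}
		return out
	case []interface{}:
		if len(x) == 1 && !skip[path] {
			if m, ok := x[0].(map[string]interface{}); ok {
				return patch(path, m, skip)
			}
		}
		out := make([]interface{}, len(x))
		for i, vv := range x {
			out[i] = patch(path, vv, skip)
		}
		return out
	default:
		return v
	}
}

func joinPath(path, k string) string {
	if path == "" {
		return k
	}
	return path + "." + k
}

func main() {
	var in interface{}
	_ = json.Unmarshal([]byte(`{"a":[{"b":[{"c":"d"}]}]}`), &in)
	out, _ := json.Marshal(patch("", in, map[string]bool{}))
	fmt.Println(string(out)) // {"a":{"b":{"c":"d"}}}, as in case 02

	_ = json.Unmarshal([]byte(`{"a":[{"b":"c"}]}`), &in)
	out, _ = json.Marshal(patch("", in, map[string]bool{"a": true}))
	fmt.Println(string(out)) // {"a":[{"b":"c"}]} kept as a slice because "a" is skipped, as in case 03
}
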
=== RUN   TestConfigFlagsAndEdgecases
=== RUN   TestConfigFlagsAndEdgecases/-advertise
=== RUN   TestConfigFlagsAndEdgecases/-advertise-wan
=== RUN   TestConfigFlagsAndEdgecases/-advertise_and_-advertise-wan
=== RUN   TestConfigFlagsAndEdgecases/-bind
=== RUN   TestConfigFlagsAndEdgecases/-bootstrap
=== RUN   TestConfigFlagsAndEdgecases/-bootstrap-expect
=== RUN   TestConfigFlagsAndEdgecases/-client
=== RUN   TestConfigFlagsAndEdgecases/-config-dir
=== RUN   TestConfigFlagsAndEdgecases/-config-file_json
=== RUN   TestConfigFlagsAndEdgecases/-config-file_hcl_and_json
=== RUN   TestConfigFlagsAndEdgecases/-data-dir_empty
=== RUN   TestConfigFlagsAndEdgecases/-data-dir_non-directory
=== RUN   TestConfigFlagsAndEdgecases/-datacenter
=== RUN   TestConfigFlagsAndEdgecases/-datacenter_empty
=== RUN   TestConfigFlagsAndEdgecases/-dev
=== RUN   TestConfigFlagsAndEdgecases/-disable-host-node-id
=== RUN   TestConfigFlagsAndEdgecases/-disable-keyring-file
=== RUN   TestConfigFlagsAndEdgecases/-dns-port
=== RUN   TestConfigFlagsAndEdgecases/-domain
=== RUN   TestConfigFlagsAndEdgecases/-alt-domain
=== RUN   TestConfigFlagsAndEdgecases/-alt-domain_can't_be_prefixed_by_DC
=== RUN   TestConfigFlagsAndEdgecases/-alt-domain_can't_be_prefixed_by_service
=== RUN   TestConfigFlagsAndEdgecases/-alt-domain_can_be_prefixed_by_non-keywords
=== RUN   TestConfigFlagsAndEdgecases/-alt-domain_can't_be_prefixed_by_DC#01
=== RUN   TestConfigFlagsAndEdgecases/-enable-script-checks
=== RUN   TestConfigFlagsAndEdgecases/-encrypt
=== RUN   TestConfigFlagsAndEdgecases/-config-format_disabled,_skip_unknown_files
=== RUN   TestConfigFlagsAndEdgecases/-config-format=json
=== RUN   TestConfigFlagsAndEdgecases/-config-format=hcl
=== RUN   TestConfigFlagsAndEdgecases/-config-format_invalid
=== RUN   TestConfigFlagsAndEdgecases/-http-port
=== RUN   TestConfigFlagsAndEdgecases/-join
=== RUN   TestConfigFlagsAndEdgecases/-join-wan
=== RUN   TestConfigFlagsAndEdgecases/-log-level
=== RUN   TestConfigFlagsAndEdgecases/-node
=== RUN   TestConfigFlagsAndEdgecases/-node-id
=== RUN   TestConfigFlagsAndEdgecases/-node-meta
=== RUN   TestConfigFlagsAndEdgecases/-non-voting-server
=== RUN   TestConfigFlagsAndEdgecases/-pid-file
=== RUN   TestConfigFlagsAndEdgecases/-protocol
=== RUN   TestConfigFlagsAndEdgecases/-raft-protocol
=== RUN   TestConfigFlagsAndEdgecases/-recursor
=== RUN   TestConfigFlagsAndEdgecases/-rejoin
=== RUN   TestConfigFlagsAndEdgecases/-retry-interval
=== RUN   TestConfigFlagsAndEdgecases/-retry-interval-wan
=== RUN   TestConfigFlagsAndEdgecases/-retry-join
=== RUN   TestConfigFlagsAndEdgecases/-retry-join-wan
=== RUN   TestConfigFlagsAndEdgecases/-retry-max
=== RUN   TestConfigFlagsAndEdgecases/-retry-max-wan
=== RUN   TestConfigFlagsAndEdgecases/-serf-lan-bind
=== RUN   TestConfigFlagsAndEdgecases/-serf-lan-port
=== RUN   TestConfigFlagsAndEdgecases/-serf-wan-bind
=== RUN   TestConfigFlagsAndEdgecases/-serf-wan-port
=== RUN   TestConfigFlagsAndEdgecases/-server
=== RUN   TestConfigFlagsAndEdgecases/-server-port
=== RUN   TestConfigFlagsAndEdgecases/-syslog
=== RUN   TestConfigFlagsAndEdgecases/-ui
=== RUN   TestConfigFlagsAndEdgecases/-ui-dir
=== RUN   TestConfigFlagsAndEdgecases/-ui-content-path
=== RUN   TestConfigFlagsAndEdgecases/json:bind_addr_any_v4
=== RUN   TestConfigFlagsAndEdgecases/hcl:bind_addr_any_v4
=== RUN   TestConfigFlagsAndEdgecases/json:bind_addr_any_v6
=== RUN   TestConfigFlagsAndEdgecases/hcl:bind_addr_any_v6
=== RUN   TestConfigFlagsAndEdgecases/json:bind_addr_any_and_advertise_set_should_not_detect
=== RUN   TestConfigFlagsAndEdgecases/hcl:bind_addr_any_and_advertise_set_should_not_detect
=== RUN   TestConfigFlagsAndEdgecases/json:client_addr_and_ports_==_0
=== RUN   TestConfigFlagsAndEdgecases/hcl:client_addr_and_ports_==_0
=== RUN   TestConfigFlagsAndEdgecases/json:client_addr_and_ports_<_0
=== RUN   TestConfigFlagsAndEdgecases/hcl:client_addr_and_ports_<_0
=== RUN   TestConfigFlagsAndEdgecases/json:client_addr_and_ports_>_0
=== RUN   TestConfigFlagsAndEdgecases/hcl:client_addr_and_ports_>_0
=== RUN   TestConfigFlagsAndEdgecases/json:client_addr,_addresses_and_ports_==_0
=== RUN   TestConfigFlagsAndEdgecases/hcl:client_addr,_addresses_and_ports_==_0
=== RUN   TestConfigFlagsAndEdgecases/json:client_addr,_addresses_and_ports_<_0
=== RUN   TestConfigFlagsAndEdgecases/hcl:client_addr,_addresses_and_ports_<_0
=== RUN   TestConfigFlagsAndEdgecases/json:client_addr,_addresses_and_ports
=== RUN   TestConfigFlagsAndEdgecases/hcl:client_addr,_addresses_and_ports
=== RUN   TestConfigFlagsAndEdgecases/json:client_template_and_ports
=== RUN   TestConfigFlagsAndEdgecases/hcl:client_template_and_ports
=== RUN   TestConfigFlagsAndEdgecases/json:client,_address_template_and_ports
=== RUN   TestConfigFlagsAndEdgecases/hcl:client,_address_template_and_ports
=== RUN   TestConfigFlagsAndEdgecases/json:advertise_address_lan_template
=== RUN   TestConfigFlagsAndEdgecases/hcl:advertise_address_lan_template
=== RUN   TestConfigFlagsAndEdgecases/json:advertise_address_wan_template
=== RUN   TestConfigFlagsAndEdgecases/hcl:advertise_address_wan_template
=== RUN   TestConfigFlagsAndEdgecases/json:advertise_address_lan_with_ports
=== RUN   TestConfigFlagsAndEdgecases/hcl:advertise_address_lan_with_ports
=== RUN   TestConfigFlagsAndEdgecases/json:advertise_address_wan_with_ports
=== RUN   TestConfigFlagsAndEdgecases/hcl:advertise_address_wan_with_ports
=== RUN   TestConfigFlagsAndEdgecases/json:allow_disabling_serf_wan_port
=== RUN   TestConfigFlagsAndEdgecases/hcl:allow_disabling_serf_wan_port
=== RUN   TestConfigFlagsAndEdgecases/json:serf_bind_address_lan_template
=== RUN   TestConfigFlagsAndEdgecases/hcl:serf_bind_address_lan_template
=== RUN   TestConfigFlagsAndEdgecases/json:serf_bind_address_wan_template
=== RUN   TestConfigFlagsAndEdgecases/hcl:serf_bind_address_wan_template
=== RUN   TestConfigFlagsAndEdgecases/json:dns_recursor_templates_with_deduplication
=== RUN   TestConfigFlagsAndEdgecases/hcl:dns_recursor_templates_with_deduplication
=== RUN   TestConfigFlagsAndEdgecases/json:start_join_address_template
=== RUN   TestConfigFlagsAndEdgecases/hcl:start_join_address_template
=== RUN   TestConfigFlagsAndEdgecases/json:start_join_wan_address_template
=== RUN   TestConfigFlagsAndEdgecases/hcl:start_join_wan_address_template
=== RUN   TestConfigFlagsAndEdgecases/json:retry_join_address_template
=== RUN   TestConfigFlagsAndEdgecases/hcl:retry_join_address_template
=== RUN   TestConfigFlagsAndEdgecases/json:retry_join_wan_address_template
=== RUN   TestConfigFlagsAndEdgecases/hcl:retry_join_wan_address_template
=== RUN   TestConfigFlagsAndEdgecases/json:precedence:_merge_order
=== RUN   TestConfigFlagsAndEdgecases/hcl:precedence:_merge_order
=== RUN   TestConfigFlagsAndEdgecases/json:precedence:_flag_before_file
=== RUN   TestConfigFlagsAndEdgecases/hcl:precedence:_flag_before_file
=== RUN   TestConfigFlagsAndEdgecases/json:raft_performance_scaling
=== RUN   TestConfigFlagsAndEdgecases/hcl:raft_performance_scaling
=== RUN   TestConfigFlagsAndEdgecases/json:invalid_input
=== RUN   TestConfigFlagsAndEdgecases/hcl:invalid_input
=== RUN   TestConfigFlagsAndEdgecases/json:datacenter_is_lower-cased
=== RUN   TestConfigFlagsAndEdgecases/hcl:datacenter_is_lower-cased
=== RUN   TestConfigFlagsAndEdgecases/json:acl_datacenter_is_lower-cased
=== RUN   TestConfigFlagsAndEdgecases/hcl:acl_datacenter_is_lower-cased
=== RUN   TestConfigFlagsAndEdgecases/json:acl_replication_token_enables_acl_replication
=== RUN   TestConfigFlagsAndEdgecases/hcl:acl_replication_token_enables_acl_replication
=== RUN   TestConfigFlagsAndEdgecases/json:advertise_address_detect_fails_v4
=== RUN   TestConfigFlagsAndEdgecases/hcl:advertise_address_detect_fails_v4
=== RUN   TestConfigFlagsAndEdgecases/json:advertise_address_detect_none_v4
=== RUN   TestConfigFlagsAndEdgecases/hcl:advertise_address_detect_none_v4
=== RUN   TestConfigFlagsAndEdgecases/json:advertise_address_detect_multiple_v4
=== RUN   TestConfigFlagsAndEdgecases/hcl:advertise_address_detect_multiple_v4
=== RUN   TestConfigFlagsAndEdgecases/json:advertise_address_detect_fails_v6
=== RUN   TestConfigFlagsAndEdgecases/hcl:advertise_address_detect_fails_v6
=== RUN   TestConfigFlagsAndEdgecases/json:advertise_address_detect_none_v6
=== RUN   TestConfigFlagsAndEdgecases/hcl:advertise_address_detect_none_v6
=== RUN   TestConfigFlagsAndEdgecases/json:advertise_address_detect_multiple_v6
=== RUN   TestConfigFlagsAndEdgecases/hcl:advertise_address_detect_multiple_v6
=== RUN   TestConfigFlagsAndEdgecases/ae_interval_invalid_==_0
=== RUN   TestConfigFlagsAndEdgecases/ae_interval_invalid_<_0
=== RUN   TestConfigFlagsAndEdgecases/json:acl_datacenter_invalid
=== RUN   TestConfigFlagsAndEdgecases/hcl:acl_datacenter_invalid
=== RUN   TestConfigFlagsAndEdgecases/json:autopilot.max_trailing_logs_invalid
=== RUN   TestConfigFlagsAndEdgecases/hcl:autopilot.max_trailing_logs_invalid
=== RUN   TestConfigFlagsAndEdgecases/json:bind_addr_cannot_be_empty
=== RUN   TestConfigFlagsAndEdgecases/hcl:bind_addr_cannot_be_empty
=== RUN   TestConfigFlagsAndEdgecases/json:bind_addr_does_not_allow_multiple_addresses
=== RUN   TestConfigFlagsAndEdgecases/hcl:bind_addr_does_not_allow_multiple_addresses
=== RUN   TestConfigFlagsAndEdgecases/json:bind_addr_cannot_be_a_unix_socket
=== RUN   TestConfigFlagsAndEdgecases/hcl:bind_addr_cannot_be_a_unix_socket
=== RUN   TestConfigFlagsAndEdgecases/json:bootstrap_without_server
=== RUN   TestConfigFlagsAndEdgecases/hcl:bootstrap_without_server
=== RUN   TestConfigFlagsAndEdgecases/json:bootstrap-expect_without_server
=== RUN   TestConfigFlagsAndEdgecases/hcl:bootstrap-expect_without_server
=== RUN   TestConfigFlagsAndEdgecases/json:bootstrap-expect_invalid
=== RUN   TestConfigFlagsAndEdgecases/hcl:bootstrap-expect_invalid
=== RUN   TestConfigFlagsAndEdgecases/json:bootstrap-expect_and_dev_mode
=== RUN   TestConfigFlagsAndEdgecases/hcl:bootstrap-expect_and_dev_mode
=== RUN   TestConfigFlagsAndEdgecases/json:bootstrap-expect_and_bootstrap
=== RUN   TestConfigFlagsAndEdgecases/hcl:bootstrap-expect_and_bootstrap
=== RUN   TestConfigFlagsAndEdgecases/json:bootstrap-expect=1_equals_bootstrap
=== RUN   TestConfigFlagsAndEdgecases/hcl:bootstrap-expect=1_equals_bootstrap
=== RUN   TestConfigFlagsAndEdgecases/json:bootstrap-expect=2_warning
=== RUN   TestConfigFlagsAndEdgecases/hcl:bootstrap-expect=2_warning
=== RUN   TestConfigFlagsAndEdgecases/json:bootstrap-expect_>_2_but_even_warning
=== RUN   TestConfigFlagsAndEdgecases/hcl:bootstrap-expect_>_2_but_even_warning
=== RUN   TestConfigFlagsAndEdgecases/json:client_mode_sets_LeaveOnTerm_and_SkipLeaveOnInt_correctly
=== RUN   TestConfigFlagsAndEdgecases/hcl:client_mode_sets_LeaveOnTerm_and_SkipLeaveOnInt_correctly
=== RUN   TestConfigFlagsAndEdgecases/json:client_does_not_allow_socket
=== RUN   TestConfigFlagsAndEdgecases/hcl:client_does_not_allow_socket
=== RUN   TestConfigFlagsAndEdgecases/json:datacenter_invalid
=== RUN   TestConfigFlagsAndEdgecases/hcl:datacenter_invalid
=== RUN   TestConfigFlagsAndEdgecases/json:dns_does_not_allow_socket
=== RUN   TestConfigFlagsAndEdgecases/hcl:dns_does_not_allow_socket
=== RUN   TestConfigFlagsAndEdgecases/json:ui_and_ui_dir
=== RUN   TestConfigFlagsAndEdgecases/hcl:ui_and_ui_dir
=== RUN   TestConfigFlagsAndEdgecases/json:advertise_addr_any
=== RUN   TestConfigFlagsAndEdgecases/hcl:advertise_addr_any
=== RUN   TestConfigFlagsAndEdgecases/json:advertise_addr_wan_any
=== RUN   TestConfigFlagsAndEdgecases/hcl:advertise_addr_wan_any
=== RUN   TestConfigFlagsAndEdgecases/json:recursors_any
=== RUN   TestConfigFlagsAndEdgecases/hcl:recursors_any
=== RUN   TestConfigFlagsAndEdgecases/json:dns_config.udp_answer_limit_invalid
=== RUN   TestConfigFlagsAndEdgecases/hcl:dns_config.udp_answer_limit_invalid
=== RUN   TestConfigFlagsAndEdgecases/json:dns_config.a_record_limit_invalid
=== RUN   TestConfigFlagsAndEdgecases/hcl:dns_config.a_record_limit_invalid
=== RUN   TestConfigFlagsAndEdgecases/json:performance.raft_multiplier_<_0
=== RUN   TestConfigFlagsAndEdgecases/hcl:performance.raft_multiplier_<_0
=== RUN   TestConfigFlagsAndEdgecases/json:performance.raft_multiplier_==_0
=== RUN   TestConfigFlagsAndEdgecases/hcl:performance.raft_multiplier_==_0
=== RUN   TestConfigFlagsAndEdgecases/json:performance.raft_multiplier_>_10
=== RUN   TestConfigFlagsAndEdgecases/hcl:performance.raft_multiplier_>_10
=== RUN   TestConfigFlagsAndEdgecases/node_name_invalid
=== RUN   TestConfigFlagsAndEdgecases/json:node_meta_key_too_long
=== RUN   TestConfigFlagsAndEdgecases/hcl:node_meta_key_too_long
=== RUN   TestConfigFlagsAndEdgecases/json:node_meta_value_too_long
=== RUN   TestConfigFlagsAndEdgecases/hcl:node_meta_value_too_long
=== RUN   TestConfigFlagsAndEdgecases/json:node_meta_too_many_keys
=== RUN   TestConfigFlagsAndEdgecases/hcl:node_meta_too_many_keys
=== RUN   TestConfigFlagsAndEdgecases/json:unique_listeners_dns_vs_http
=== RUN   TestConfigFlagsAndEdgecases/hcl:unique_listeners_dns_vs_http
=== RUN   TestConfigFlagsAndEdgecases/json:unique_listeners_dns_vs_https
=== RUN   TestConfigFlagsAndEdgecases/hcl:unique_listeners_dns_vs_https
=== RUN   TestConfigFlagsAndEdgecases/json:unique_listeners_http_vs_https
=== RUN   TestConfigFlagsAndEdgecases/hcl:unique_listeners_http_vs_https
=== RUN   TestConfigFlagsAndEdgecases/json:unique_advertise_addresses_HTTP_vs_RPC
=== RUN   TestConfigFlagsAndEdgecases/hcl:unique_advertise_addresses_HTTP_vs_RPC
=== RUN   TestConfigFlagsAndEdgecases/json:unique_advertise_addresses_RPC_vs_Serf_LAN
=== RUN   TestConfigFlagsAndEdgecases/hcl:unique_advertise_addresses_RPC_vs_Serf_LAN
=== RUN   TestConfigFlagsAndEdgecases/json:unique_advertise_addresses_RPC_vs_Serf_WAN
=== RUN   TestConfigFlagsAndEdgecases/hcl:unique_advertise_addresses_RPC_vs_Serf_WAN
=== RUN   TestConfigFlagsAndEdgecases/json:sidecar_service_can't_have_ID
=== RUN   TestConfigFlagsAndEdgecases/hcl:sidecar_service_can't_have_ID
=== RUN   TestConfigFlagsAndEdgecases/json:sidecar_service_can't_have_nested_sidecar
=== RUN   TestConfigFlagsAndEdgecases/hcl:sidecar_service_can't_have_nested_sidecar
=== RUN   TestConfigFlagsAndEdgecases/json:sidecar_service_can't_have_managed_proxy
=== RUN   TestConfigFlagsAndEdgecases/hcl:sidecar_service_can't_have_managed_proxy
=== RUN   TestConfigFlagsAndEdgecases/json:telemetry.prefix_filter_cannot_be_empty
=== RUN   TestConfigFlagsAndEdgecases/hcl:telemetry.prefix_filter_cannot_be_empty
=== RUN   TestConfigFlagsAndEdgecases/json:telemetry.prefix_filter_must_start_with_+_or_-
=== RUN   TestConfigFlagsAndEdgecases/hcl:telemetry.prefix_filter_must_start_with_+_or_-
=== RUN   TestConfigFlagsAndEdgecases/json:encrypt_has_invalid_key
=== RUN   TestConfigFlagsAndEdgecases/hcl:encrypt_has_invalid_key
=== RUN   TestConfigFlagsAndEdgecases/json:encrypt_given_but_LAN_keyring_exists
=== RUN   TestConfigFlagsAndEdgecases/hcl:encrypt_given_but_LAN_keyring_exists
=== RUN   TestConfigFlagsAndEdgecases/json:encrypt_given_but_WAN_keyring_exists
=== RUN   TestConfigFlagsAndEdgecases/hcl:encrypt_given_but_WAN_keyring_exists
=== RUN   TestConfigFlagsAndEdgecases/json:multiple_check_files
=== RUN   TestConfigFlagsAndEdgecases/hcl:multiple_check_files
=== RUN   TestConfigFlagsAndEdgecases/json:grpc_check
=== RUN   TestConfigFlagsAndEdgecases/hcl:grpc_check
=== RUN   TestConfigFlagsAndEdgecases/json:alias_check_with_no_node
=== RUN   TestConfigFlagsAndEdgecases/hcl:alias_check_with_no_node
=== RUN   TestConfigFlagsAndEdgecases/json:multiple_service_files
=== RUN   TestConfigFlagsAndEdgecases/hcl:multiple_service_files
=== RUN   TestConfigFlagsAndEdgecases/json:service_with_wrong_meta:_too_long_key
=== RUN   TestConfigFlagsAndEdgecases/hcl:service_with_wrong_meta:_too_long_key
=== RUN   TestConfigFlagsAndEdgecases/json:service_with_wrong_meta:_too_long_value
=== RUN   TestConfigFlagsAndEdgecases/hcl:service_with_wrong_meta:_too_long_value
=== RUN   TestConfigFlagsAndEdgecases/json:service_with_wrong_meta:_too_many_meta
=== RUN   TestConfigFlagsAndEdgecases/hcl:service_with_wrong_meta:_too_many_meta
=== RUN   TestConfigFlagsAndEdgecases/json:translated_keys
=== RUN   TestConfigFlagsAndEdgecases/hcl:translated_keys
=== RUN   TestConfigFlagsAndEdgecases/json:ignore_snapshot_agent_sub-object
=== RUN   TestConfigFlagsAndEdgecases/hcl:ignore_snapshot_agent_sub-object
=== RUN   TestConfigFlagsAndEdgecases/json:Service_managed_proxy_'upstreams'
=== RUN   TestConfigFlagsAndEdgecases/hcl:Service_managed_proxy_'upstreams'
=== RUN   TestConfigFlagsAndEdgecases/json:Multiple_service_managed_proxy_'upstreams'
=== RUN   TestConfigFlagsAndEdgecases/hcl:Multiple_service_managed_proxy_'upstreams'
=== RUN   TestConfigFlagsAndEdgecases/json:enabling_Connect_allow_managed_root
=== RUN   TestConfigFlagsAndEdgecases/hcl:enabling_Connect_allow_managed_root
=== RUN   TestConfigFlagsAndEdgecases/json:enabling_Connect_allow_managed_api_registration
=== RUN   TestConfigFlagsAndEdgecases/hcl:enabling_Connect_allow_managed_api_registration
=== RUN   TestConfigFlagsAndEdgecases/json:service.connectsidecar_service_with_checks_and_upstreams
=== RUN   TestConfigFlagsAndEdgecases/hcl:service.connectsidecar_service_with_checks_and_upstreams
=== RUN   TestConfigFlagsAndEdgecases/json:services.connect.sidecar_service_with_checks_and_upstreams
=== RUN   TestConfigFlagsAndEdgecases/hcl:services.connect.sidecar_service_with_checks_and_upstreams
=== RUN   TestConfigFlagsAndEdgecases/json:verify_server_hostname_implies_verify_outgoing
=== RUN   TestConfigFlagsAndEdgecases/hcl:verify_server_hostname_implies_verify_outgoing
=== RUN   TestConfigFlagsAndEdgecases/json:test_connect_vault_provider_configuration
=== RUN   TestConfigFlagsAndEdgecases/hcl:test_connect_vault_provider_configuration
=== RUN   TestConfigFlagsAndEdgecases/json:ConfigEntry_bootstrap_doesn't_parse
=== RUN   TestConfigFlagsAndEdgecases/hcl:ConfigEntry_bootstrap_doesn't_parse
=== RUN   TestConfigFlagsAndEdgecases/json:ConfigEntry_bootstrap_unknown_kind
=== RUN   TestConfigFlagsAndEdgecases/hcl:ConfigEntry_bootstrap_unknown_kind
=== RUN   TestConfigFlagsAndEdgecases/json:ConfigEntry_bootstrap_invalid
=== RUN   TestConfigFlagsAndEdgecases/hcl:ConfigEntry_bootstrap_invalid
--- PASS: TestConfigFlagsAndEdgecases (30.20s)
    --- PASS: TestConfigFlagsAndEdgecases/-advertise (0.21s)
    --- PASS: TestConfigFlagsAndEdgecases/-advertise-wan (0.14s)
    --- PASS: TestConfigFlagsAndEdgecases/-advertise_and_-advertise-wan (0.18s)
    --- PASS: TestConfigFlagsAndEdgecases/-bind (0.15s)
    --- PASS: TestConfigFlagsAndEdgecases/-bootstrap (0.14s)
    --- PASS: TestConfigFlagsAndEdgecases/-bootstrap-expect (0.17s)
    --- PASS: TestConfigFlagsAndEdgecases/-client (0.20s)
    --- PASS: TestConfigFlagsAndEdgecases/-config-dir (0.17s)
    --- PASS: TestConfigFlagsAndEdgecases/-config-file_json (0.17s)
    --- PASS: TestConfigFlagsAndEdgecases/-config-file_hcl_and_json (0.16s)
    --- PASS: TestConfigFlagsAndEdgecases/-data-dir_empty (0.08s)
    --- PASS: TestConfigFlagsAndEdgecases/-data-dir_non-directory (0.06s)
    --- PASS: TestConfigFlagsAndEdgecases/-datacenter (0.16s)
    --- PASS: TestConfigFlagsAndEdgecases/-datacenter_empty (0.03s)
    --- PASS: TestConfigFlagsAndEdgecases/-dev (0.12s)
    --- PASS: TestConfigFlagsAndEdgecases/-disable-host-node-id (0.18s)
    --- PASS: TestConfigFlagsAndEdgecases/-disable-keyring-file (0.18s)
    --- PASS: TestConfigFlagsAndEdgecases/-dns-port (0.10s)
    --- PASS: TestConfigFlagsAndEdgecases/-domain (0.18s)
    --- PASS: TestConfigFlagsAndEdgecases/-alt-domain (0.14s)
    --- PASS: TestConfigFlagsAndEdgecases/-alt-domain_can't_be_prefixed_by_DC (0.07s)
    --- PASS: TestConfigFlagsAndEdgecases/-alt-domain_can't_be_prefixed_by_service (0.07s)
    --- PASS: TestConfigFlagsAndEdgecases/-alt-domain_can_be_prefixed_by_non-keywords (0.16s)
    --- PASS: TestConfigFlagsAndEdgecases/-alt-domain_can't_be_prefixed_by_DC#01 (0.08s)
    --- PASS: TestConfigFlagsAndEdgecases/-enable-script-checks (0.13s)
    --- PASS: TestConfigFlagsAndEdgecases/-encrypt (0.15s)
    --- PASS: TestConfigFlagsAndEdgecases/-config-format_disabled,_skip_unknown_files (0.16s)
    --- PASS: TestConfigFlagsAndEdgecases/-config-format=json (0.08s)
    --- PASS: TestConfigFlagsAndEdgecases/-config-format=hcl (0.15s)
    --- PASS: TestConfigFlagsAndEdgecases/-config-format_invalid (0.00s)
    --- PASS: TestConfigFlagsAndEdgecases/-http-port (0.14s)
    --- PASS: TestConfigFlagsAndEdgecases/-join (0.13s)
    --- PASS: TestConfigFlagsAndEdgecases/-join-wan (0.17s)
    --- PASS: TestConfigFlagsAndEdgecases/-log-level (0.15s)
    --- PASS: TestConfigFlagsAndEdgecases/-node (0.14s)
    --- PASS: TestConfigFlagsAndEdgecases/-node-id (0.12s)
    --- PASS: TestConfigFlagsAndEdgecases/-node-meta (0.14s)
    --- PASS: TestConfigFlagsAndEdgecases/-non-voting-server (0.15s)
    --- PASS: TestConfigFlagsAndEdgecases/-pid-file (0.15s)
    --- PASS: TestConfigFlagsAndEdgecases/-protocol (0.14s)
    --- PASS: TestConfigFlagsAndEdgecases/-raft-protocol (0.17s)
    --- PASS: TestConfigFlagsAndEdgecases/-recursor (0.16s)
    --- PASS: TestConfigFlagsAndEdgecases/-rejoin (0.14s)
    --- PASS: TestConfigFlagsAndEdgecases/-retry-interval (0.14s)
    --- PASS: TestConfigFlagsAndEdgecases/-retry-interval-wan (0.20s)
    --- PASS: TestConfigFlagsAndEdgecases/-retry-join (0.15s)
    --- PASS: TestConfigFlagsAndEdgecases/-retry-join-wan (0.18s)
    --- PASS: TestConfigFlagsAndEdgecases/-retry-max (0.17s)
    --- PASS: TestConfigFlagsAndEdgecases/-retry-max-wan (0.15s)
    --- PASS: TestConfigFlagsAndEdgecases/-serf-lan-bind (0.18s)
    --- PASS: TestConfigFlagsAndEdgecases/-serf-lan-port (0.13s)
    --- PASS: TestConfigFlagsAndEdgecases/-serf-wan-bind (0.18s)
    --- PASS: TestConfigFlagsAndEdgecases/-serf-wan-port (0.21s)
    --- PASS: TestConfigFlagsAndEdgecases/-server (0.15s)
    --- PASS: TestConfigFlagsAndEdgecases/-server-port (0.18s)
    --- PASS: TestConfigFlagsAndEdgecases/-syslog (0.12s)
    --- PASS: TestConfigFlagsAndEdgecases/-ui (0.07s)
    --- PASS: TestConfigFlagsAndEdgecases/-ui-dir (0.11s)
    --- PASS: TestConfigFlagsAndEdgecases/-ui-content-path (0.15s)
    --- PASS: TestConfigFlagsAndEdgecases/json:bind_addr_any_v4 (0.14s)
    --- PASS: TestConfigFlagsAndEdgecases/hcl:bind_addr_any_v4 (0.19s)
    --- PASS: TestConfigFlagsAndEdgecases/json:bind_addr_any_v6 (0.15s)
    --- PASS: TestConfigFlagsAndEdgecases/hcl:bind_addr_any_v6 (0.14s)
    --- PASS: TestConfigFlagsAndEdgecases/json:bind_addr_any_and_advertise_set_should_not_detect (0.22s)
    --- PASS: TestConfigFlagsAndEdgecases/hcl:bind_addr_any_and_advertise_set_should_not_detect (0.16s)
    --- PASS: TestConfigFlagsAndEdgecases/json:client_addr_and_ports_==_0 (0.14s)
    --- PASS: TestConfigFlagsAndEdgecases/hcl:client_addr_and_ports_==_0 (0.17s)
    --- PASS: TestConfigFlagsAndEdgecases/json:client_addr_and_ports_<_0 (0.15s)
    --- PASS: TestConfigFlagsAndEdgecases/hcl:client_addr_and_ports_<_0 (0.18s)
    --- PASS: TestConfigFlagsAndEdgecases/json:client_addr_and_ports_>_0 (0.16s)
    --- PASS: TestConfigFlagsAndEdgecases/hcl:client_addr_and_ports_>_0 (0.14s)
    --- PASS: TestConfigFlagsAndEdgecases/json:client_addr,_addresses_and_ports_==_0 (0.16s)
    --- PASS: TestConfigFlagsAndEdgecases/hcl:client_addr,_addresses_and_ports_==_0 (0.10s)
    --- PASS: TestConfigFlagsAndEdgecases/json:client_addr,_addresses_and_ports_<_0 (0.16s)
    --- PASS: TestConfigFlagsAndEdgecases/hcl:client_addr,_addresses_and_ports_<_0 (0.17s)
    --- PASS: TestConfigFlagsAndEdgecases/json:client_addr,_addresses_and_ports (0.20s)
    --- PASS: TestConfigFlagsAndEdgecases/hcl:client_addr,_addresses_and_ports (0.18s)
    --- PASS: TestConfigFlagsAndEdgecases/json:client_template_and_ports (0.17s)
    --- PASS: TestConfigFlagsAndEdgecases/hcl:client_template_and_ports (0.18s)
    --- PASS: TestConfigFlagsAndEdgecases/json:client,_address_template_and_ports (0.24s)
    --- PASS: TestConfigFlagsAndEdgecases/hcl:client,_address_template_and_ports (0.20s)
    --- PASS: TestConfigFlagsAndEdgecases/json:advertise_address_lan_template (0.17s)
    --- PASS: TestConfigFlagsAndEdgecases/hcl:advertise_address_lan_template (0.16s)
    --- PASS: TestConfigFlagsAndEdgecases/json:advertise_address_wan_template (0.25s)
    --- PASS: TestConfigFlagsAndEdgecases/hcl:advertise_address_wan_template (0.15s)
    --- PASS: TestConfigFlagsAndEdgecases/json:advertise_address_lan_with_ports (0.21s)
    --- PASS: TestConfigFlagsAndEdgecases/hcl:advertise_address_lan_with_ports (0.17s)
    --- PASS: TestConfigFlagsAndEdgecases/json:advertise_address_wan_with_ports (0.15s)
    --- PASS: TestConfigFlagsAndEdgecases/hcl:advertise_address_wan_with_ports (0.26s)
    --- PASS: TestConfigFlagsAndEdgecases/json:allow_disabling_serf_wan_port (0.21s)
    --- PASS: TestConfigFlagsAndEdgecases/hcl:allow_disabling_serf_wan_port (0.24s)
    --- PASS: TestConfigFlagsAndEdgecases/json:serf_bind_address_lan_template (0.17s)
    --- PASS: TestConfigFlagsAndEdgecases/hcl:serf_bind_address_lan_template (0.16s)
    --- PASS: TestConfigFlagsAndEdgecases/json:serf_bind_address_wan_template (0.27s)
    --- PASS: TestConfigFlagsAndEdgecases/hcl:serf_bind_address_wan_template (0.20s)
    --- PASS: TestConfigFlagsAndEdgecases/json:dns_recursor_templates_with_deduplication (0.22s)
    --- PASS: TestConfigFlagsAndEdgecases/hcl:dns_recursor_templates_with_deduplication (0.28s)
    --- PASS: TestConfigFlagsAndEdgecases/json:start_join_address_template (0.15s)
    --- PASS: TestConfigFlagsAndEdgecases/hcl:start_join_address_template (0.14s)
    --- PASS: TestConfigFlagsAndEdgecases/json:start_join_wan_address_template (0.09s)
    --- PASS: TestConfigFlagsAndEdgecases/hcl:start_join_wan_address_template (0.15s)
    --- PASS: TestConfigFlagsAndEdgecases/json:retry_join_address_template (0.14s)
    --- PASS: TestConfigFlagsAndEdgecases/hcl:retry_join_address_template (0.09s)
    --- PASS: TestConfigFlagsAndEdgecases/json:retry_join_wan_address_template (0.20s)
    --- PASS: TestConfigFlagsAndEdgecases/hcl:retry_join_wan_address_template (0.16s)
    --- PASS: TestConfigFlagsAndEdgecases/json:precedence:_merge_order (0.18s)
    --- PASS: TestConfigFlagsAndEdgecases/hcl:precedence:_merge_order (0.17s)
    --- PASS: TestConfigFlagsAndEdgecases/json:precedence:_flag_before_file (0.20s)
    --- PASS: TestConfigFlagsAndEdgecases/hcl:precedence:_flag_before_file (0.18s)
    --- PASS: TestConfigFlagsAndEdgecases/json:raft_performance_scaling (0.15s)
    --- PASS: TestConfigFlagsAndEdgecases/hcl:raft_performance_scaling (0.16s)
    --- PASS: TestConfigFlagsAndEdgecases/json:invalid_input (0.03s)
    --- PASS: TestConfigFlagsAndEdgecases/hcl:invalid_input (0.03s)
    --- PASS: TestConfigFlagsAndEdgecases/json:datacenter_is_lower-cased (0.20s)
    --- PASS: TestConfigFlagsAndEdgecases/hcl:datacenter_is_lower-cased (0.19s)
    --- PASS: TestConfigFlagsAndEdgecases/json:acl_datacenter_is_lower-cased (0.14s)
    --- PASS: TestConfigFlagsAndEdgecases/hcl:acl_datacenter_is_lower-cased (0.14s)
    --- PASS: TestConfigFlagsAndEdgecases/json:acl_replication_token_enables_acl_replication (0.11s)
    --- PASS: TestConfigFlagsAndEdgecases/hcl:acl_replication_token_enables_acl_replication (0.09s)
    --- PASS: TestConfigFlagsAndEdgecases/json:advertise_address_detect_fails_v4 (0.04s)
    --- PASS: TestConfigFlagsAndEdgecases/hcl:advertise_address_detect_fails_v4 (0.05s)
    --- PASS: TestConfigFlagsAndEdgecases/json:advertise_address_detect_none_v4 (0.06s)
    --- PASS: TestConfigFlagsAndEdgecases/hcl:advertise_address_detect_none_v4 (0.09s)
    --- PASS: TestConfigFlagsAndEdgecases/json:advertise_address_detect_multiple_v4 (0.07s)
    --- PASS: TestConfigFlagsAndEdgecases/hcl:advertise_address_detect_multiple_v4 (0.06s)
    --- PASS: TestConfigFlagsAndEdgecases/json:advertise_address_detect_fails_v6 (0.06s)
    --- PASS: TestConfigFlagsAndEdgecases/hcl:advertise_address_detect_fails_v6 (0.04s)
    --- PASS: TestConfigFlagsAndEdgecases/json:advertise_address_detect_none_v6 (0.04s)
    --- PASS: TestConfigFlagsAndEdgecases/hcl:advertise_address_detect_none_v6 (0.04s)
    --- PASS: TestConfigFlagsAndEdgecases/json:advertise_address_detect_multiple_v6 (0.04s)
    --- PASS: TestConfigFlagsAndEdgecases/hcl:advertise_address_detect_multiple_v6 (0.04s)
    --- PASS: TestConfigFlagsAndEdgecases/ae_interval_invalid_==_0 (0.04s)
    --- PASS: TestConfigFlagsAndEdgecases/ae_interval_invalid_<_0 (0.04s)
    --- PASS: TestConfigFlagsAndEdgecases/json:acl_datacenter_invalid (0.04s)
    --- PASS: TestConfigFlagsAndEdgecases/hcl:acl_datacenter_invalid (0.04s)
    --- PASS: TestConfigFlagsAndEdgecases/json:autopilot.max_trailing_logs_invalid (0.05s)
    --- PASS: TestConfigFlagsAndEdgecases/hcl:autopilot.max_trailing_logs_invalid (0.05s)
    --- PASS: TestConfigFlagsAndEdgecases/json:bind_addr_cannot_be_empty (0.04s)
    --- PASS: TestConfigFlagsAndEdgecases/hcl:bind_addr_cannot_be_empty (0.04s)
    --- PASS: TestConfigFlagsAndEdgecases/json:bind_addr_does_not_allow_multiple_addresses (0.04s)
    --- PASS: TestConfigFlagsAndEdgecases/hcl:bind_addr_does_not_allow_multiple_addresses (0.05s)
    --- PASS: TestConfigFlagsAndEdgecases/json:bind_addr_cannot_be_a_unix_socket (0.06s)
    --- PASS: TestConfigFlagsAndEdgecases/hcl:bind_addr_cannot_be_a_unix_socket (0.08s)
    --- PASS: TestConfigFlagsAndEdgecases/json:bootstrap_without_server (0.06s)
    --- PASS: TestConfigFlagsAndEdgecases/hcl:bootstrap_without_server (0.04s)
    --- PASS: TestConfigFlagsAndEdgecases/json:bootstrap-expect_without_server (0.06s)
    --- PASS: TestConfigFlagsAndEdgecases/hcl:bootstrap-expect_without_server (0.08s)
    --- PASS: TestConfigFlagsAndEdgecases/json:bootstrap-expect_invalid (0.04s)
    --- PASS: TestConfigFlagsAndEdgecases/hcl:bootstrap-expect_invalid (0.05s)
    --- PASS: TestConfigFlagsAndEdgecases/json:bootstrap-expect_and_dev_mode (0.06s)
    --- PASS: TestConfigFlagsAndEdgecases/hcl:bootstrap-expect_and_dev_mode (0.05s)
    --- PASS: TestConfigFlagsAndEdgecases/json:bootstrap-expect_and_bootstrap (0.04s)
    --- PASS: TestConfigFlagsAndEdgecases/hcl:bootstrap-expect_and_bootstrap (0.06s)
    --- PASS: TestConfigFlagsAndEdgecases/json:bootstrap-expect=1_equals_bootstrap (0.09s)
    --- PASS: TestConfigFlagsAndEdgecases/hcl:bootstrap-expect=1_equals_bootstrap (0.14s)
    --- PASS: TestConfigFlagsAndEdgecases/json:bootstrap-expect=2_warning (0.12s)
    --- PASS: TestConfigFlagsAndEdgecases/hcl:bootstrap-expect=2_warning (0.08s)
    --- PASS: TestConfigFlagsAndEdgecases/json:bootstrap-expect_>_2_but_even_warning (0.08s)
    --- PASS: TestConfigFlagsAndEdgecases/hcl:bootstrap-expect_>_2_but_even_warning (0.11s)
    --- PASS: TestConfigFlagsAndEdgecases/json:client_mode_sets_LeaveOnTerm_and_SkipLeaveOnInt_correctly (0.07s)
    --- PASS: TestConfigFlagsAndEdgecases/hcl:client_mode_sets_LeaveOnTerm_and_SkipLeaveOnInt_correctly (0.08s)
    --- PASS: TestConfigFlagsAndEdgecases/json:client_does_not_allow_socket (0.08s)
    --- PASS: TestConfigFlagsAndEdgecases/hcl:client_does_not_allow_socket (0.05s)
    --- PASS: TestConfigFlagsAndEdgecases/json:datacenter_invalid (0.04s)
    --- PASS: TestConfigFlagsAndEdgecases/hcl:datacenter_invalid (0.04s)
    --- PASS: TestConfigFlagsAndEdgecases/json:dns_does_not_allow_socket (0.05s)
    --- PASS: TestConfigFlagsAndEdgecases/hcl:dns_does_not_allow_socket (0.06s)
    --- PASS: TestConfigFlagsAndEdgecases/json:ui_and_ui_dir (0.05s)
    --- PASS: TestConfigFlagsAndEdgecases/hcl:ui_and_ui_dir (0.05s)
    --- PASS: TestConfigFlagsAndEdgecases/json:advertise_addr_any (0.03s)
    --- PASS: TestConfigFlagsAndEdgecases/hcl:advertise_addr_any (0.07s)
    --- PASS: TestConfigFlagsAndEdgecases/json:advertise_addr_wan_any (0.09s)
    --- PASS: TestConfigFlagsAndEdgecases/hcl:advertise_addr_wan_any (0.04s)
    --- PASS: TestConfigFlagsAndEdgecases/json:recursors_any (0.04s)
    --- PASS: TestConfigFlagsAndEdgecases/hcl:recursors_any (0.04s)
    --- PASS: TestConfigFlagsAndEdgecases/json:dns_config.udp_answer_limit_invalid (0.13s)
    --- PASS: TestConfigFlagsAndEdgecases/hcl:dns_config.udp_answer_limit_invalid (0.08s)
    --- PASS: TestConfigFlagsAndEdgecases/json:dns_config.a_record_limit_invalid (0.07s)
    --- PASS: TestConfigFlagsAndEdgecases/hcl:dns_config.a_record_limit_invalid (0.07s)
    --- PASS: TestConfigFlagsAndEdgecases/json:performance.raft_multiplier_<_0 (0.08s)
    --- PASS: TestConfigFlagsAndEdgecases/hcl:performance.raft_multiplier_<_0 (0.08s)
    --- PASS: TestConfigFlagsAndEdgecases/json:performance.raft_multiplier_==_0 (0.08s)
    --- PASS: TestConfigFlagsAndEdgecases/hcl:performance.raft_multiplier_==_0 (0.08s)
    --- PASS: TestConfigFlagsAndEdgecases/json:performance.raft_multiplier_>_10 (0.09s)
    --- PASS: TestConfigFlagsAndEdgecases/hcl:performance.raft_multiplier_>_10 (0.08s)
    --- PASS: TestConfigFlagsAndEdgecases/node_name_invalid (0.07s)
    --- PASS: TestConfigFlagsAndEdgecases/json:node_meta_key_too_long (0.06s)
    --- PASS: TestConfigFlagsAndEdgecases/hcl:node_meta_key_too_long (0.04s)
    --- PASS: TestConfigFlagsAndEdgecases/json:node_meta_value_too_long (0.04s)
    --- PASS: TestConfigFlagsAndEdgecases/hcl:node_meta_value_too_long (0.04s)
    --- PASS: TestConfigFlagsAndEdgecases/json:node_meta_too_many_keys (0.08s)
    --- PASS: TestConfigFlagsAndEdgecases/hcl:node_meta_too_many_keys (0.09s)
    --- PASS: TestConfigFlagsAndEdgecases/json:unique_listeners_dns_vs_http (0.05s)
    --- PASS: TestConfigFlagsAndEdgecases/hcl:unique_listeners_dns_vs_http (0.05s)
    --- PASS: TestConfigFlagsAndEdgecases/json:unique_listeners_dns_vs_https (0.06s)
    --- PASS: TestConfigFlagsAndEdgecases/hcl:unique_listeners_dns_vs_https (0.06s)
    --- PASS: TestConfigFlagsAndEdgecases/json:unique_listeners_http_vs_https (0.04s)
    --- PASS: TestConfigFlagsAndEdgecases/hcl:unique_listeners_http_vs_https (0.05s)
    --- PASS: TestConfigFlagsAndEdgecases/json:unique_advertise_addresses_HTTP_vs_RPC (0.07s)
    --- PASS: TestConfigFlagsAndEdgecases/hcl:unique_advertise_addresses_HTTP_vs_RPC (0.05s)
    --- PASS: TestConfigFlagsAndEdgecases/json:unique_advertise_addresses_RPC_vs_Serf_LAN (0.05s)
    --- PASS: TestConfigFlagsAndEdgecases/hcl:unique_advertise_addresses_RPC_vs_Serf_LAN (0.04s)
    --- PASS: TestConfigFlagsAndEdgecases/json:unique_advertise_addresses_RPC_vs_Serf_WAN (0.04s)
    --- PASS: TestConfigFlagsAndEdgecases/hcl:unique_advertise_addresses_RPC_vs_Serf_WAN (0.04s)
    --- PASS: TestConfigFlagsAndEdgecases/json:sidecar_service_can't_have_ID (0.04s)
    --- PASS: TestConfigFlagsAndEdgecases/hcl:sidecar_service_can't_have_ID (0.05s)
    --- PASS: TestConfigFlagsAndEdgecases/json:sidecar_service_can't_have_nested_sidecar (0.04s)
    --- PASS: TestConfigFlagsAndEdgecases/hcl:sidecar_service_can't_have_nested_sidecar (0.04s)
    --- PASS: TestConfigFlagsAndEdgecases/json:sidecar_service_can't_have_managed_proxy (0.04s)
    --- PASS: TestConfigFlagsAndEdgecases/hcl:sidecar_service_can't_have_managed_proxy (0.04s)
    --- PASS: TestConfigFlagsAndEdgecases/json:telemetry.prefix_filter_cannot_be_empty (0.10s)
    --- PASS: TestConfigFlagsAndEdgecases/hcl:telemetry.prefix_filter_cannot_be_empty (0.08s)
    --- PASS: TestConfigFlagsAndEdgecases/json:telemetry.prefix_filter_must_start_with_+_or_- (0.07s)
    --- PASS: TestConfigFlagsAndEdgecases/hcl:telemetry.prefix_filter_must_start_with_+_or_- (0.14s)
    --- PASS: TestConfigFlagsAndEdgecases/json:encrypt_has_invalid_key (0.04s)
    --- PASS: TestConfigFlagsAndEdgecases/hcl:encrypt_has_invalid_key (0.04s)
    --- PASS: TestConfigFlagsAndEdgecases/json:encrypt_given_but_LAN_keyring_exists (0.10s)
    --- PASS: TestConfigFlagsAndEdgecases/hcl:encrypt_given_but_LAN_keyring_exists (0.09s)
    --- PASS: TestConfigFlagsAndEdgecases/json:encrypt_given_but_WAN_keyring_exists (0.12s)
    --- PASS: TestConfigFlagsAndEdgecases/hcl:encrypt_given_but_WAN_keyring_exists (0.14s)
    --- PASS: TestConfigFlagsAndEdgecases/json:multiple_check_files (0.16s)
    --- PASS: TestConfigFlagsAndEdgecases/hcl:multiple_check_files (0.18s)
    --- PASS: TestConfigFlagsAndEdgecases/json:grpc_check (0.15s)
    --- PASS: TestConfigFlagsAndEdgecases/hcl:grpc_check (0.27s)
    --- PASS: TestConfigFlagsAndEdgecases/json:alias_check_with_no_node (0.23s)
    --- PASS: TestConfigFlagsAndEdgecases/hcl:alias_check_with_no_node (0.19s)
    --- PASS: TestConfigFlagsAndEdgecases/json:multiple_service_files (0.14s)
    --- PASS: TestConfigFlagsAndEdgecases/hcl:multiple_service_files (0.24s)
    --- PASS: TestConfigFlagsAndEdgecases/json:service_with_wrong_meta:_too_long_key (0.10s)
    --- PASS: TestConfigFlagsAndEdgecases/hcl:service_with_wrong_meta:_too_long_key (0.07s)
    --- PASS: TestConfigFlagsAndEdgecases/json:service_with_wrong_meta:_too_long_value (0.04s)
    --- PASS: TestConfigFlagsAndEdgecases/hcl:service_with_wrong_meta:_too_long_value (0.06s)
    --- PASS: TestConfigFlagsAndEdgecases/json:service_with_wrong_meta:_too_many_meta (0.05s)
    --- PASS: TestConfigFlagsAndEdgecases/hcl:service_with_wrong_meta:_too_many_meta (0.10s)
    --- PASS: TestConfigFlagsAndEdgecases/json:translated_keys (0.15s)
    --- PASS: TestConfigFlagsAndEdgecases/hcl:translated_keys (0.18s)
    --- PASS: TestConfigFlagsAndEdgecases/json:ignore_snapshot_agent_sub-object (0.15s)
    --- PASS: TestConfigFlagsAndEdgecases/hcl:ignore_snapshot_agent_sub-object (0.16s)
    --- PASS: TestConfigFlagsAndEdgecases/json:Service_managed_proxy_'upstreams' (0.16s)
    --- PASS: TestConfigFlagsAndEdgecases/hcl:Service_managed_proxy_'upstreams' (0.08s)
    --- PASS: TestConfigFlagsAndEdgecases/json:Multiple_service_managed_proxy_'upstreams' (0.11s)
    --- PASS: TestConfigFlagsAndEdgecases/hcl:Multiple_service_managed_proxy_'upstreams' (0.18s)
    --- PASS: TestConfigFlagsAndEdgecases/json:enabling_Connect_allow_managed_root (0.25s)
    --- PASS: TestConfigFlagsAndEdgecases/hcl:enabling_Connect_allow_managed_root (0.11s)
    --- PASS: TestConfigFlagsAndEdgecases/json:enabling_Connect_allow_managed_api_registration (0.15s)
    --- PASS: TestConfigFlagsAndEdgecases/hcl:enabling_Connect_allow_managed_api_registration (0.11s)
    --- PASS: TestConfigFlagsAndEdgecases/json:service.connectsidecar_service_with_checks_and_upstreams (0.12s)
    --- PASS: TestConfigFlagsAndEdgecases/hcl:service.connectsidecar_service_with_checks_and_upstreams (0.09s)
    --- PASS: TestConfigFlagsAndEdgecases/json:services.connect.sidecar_service_with_checks_and_upstreams (0.11s)
    --- PASS: TestConfigFlagsAndEdgecases/hcl:services.connect.sidecar_service_with_checks_and_upstreams (0.08s)
    --- PASS: TestConfigFlagsAndEdgecases/json:verify_server_hostname_implies_verify_outgoing (0.08s)
    --- PASS: TestConfigFlagsAndEdgecases/hcl:verify_server_hostname_implies_verify_outgoing (0.14s)
    --- PASS: TestConfigFlagsAndEdgecases/json:test_connect_vault_provider_configuration (0.16s)
    --- PASS: TestConfigFlagsAndEdgecases/hcl:test_connect_vault_provider_configuration (0.12s)
    --- PASS: TestConfigFlagsAndEdgecases/json:ConfigEntry_bootstrap_doesn't_parse (0.05s)
    --- PASS: TestConfigFlagsAndEdgecases/hcl:ConfigEntry_bootstrap_doesn't_parse (0.06s)
    --- PASS: TestConfigFlagsAndEdgecases/json:ConfigEntry_bootstrap_unknown_kind (0.08s)
    --- PASS: TestConfigFlagsAndEdgecases/hcl:ConfigEntry_bootstrap_unknown_kind (0.06s)
    --- PASS: TestConfigFlagsAndEdgecases/json:ConfigEntry_bootstrap_invalid (0.07s)
    --- PASS: TestConfigFlagsAndEdgecases/hcl:ConfigEntry_bootstrap_invalid (0.04s)
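[editor's aside] The subtest names above follow a "json:"/"hcl:" prefix pattern, which suggests every edge case is registered once per configuration syntax in a table-driven test. The sketch below only illustrates that naming/looping pattern as it appears in the log; the struct, test name, and config snippets are hypothetical and are not Consul's actual test code (the HCL branch is skipped rather than decoded).

    package config_test

    import (
    	"encoding/json"
    	"strings"
    	"testing"
    )

    // runtimeConfig stands in for the real runtime configuration; only the
    // one field needed by the example is modelled.
    type runtimeConfig struct {
    	Datacenter string `json:"datacenter"`
    }

    func TestFlagsAndEdgecasesSketch(t *testing.T) {
    	cases := []struct {
    		desc string
    		json string
    		hcl  string // shown for symmetry only; not decoded in this sketch
    		want string
    	}{
    		{
    			desc: "datacenter is lower-cased",
    			json: `{"datacenter": "EAST"}`,
    			hcl:  `datacenter = "EAST"`,
    			want: "east",
    		},
    	}
    	for _, tc := range cases {
    		// Each case is run once per syntax, producing subtests named
    		// "json:<desc>" and "hcl:<desc>", with spaces rendered as
    		// underscores in `go test -v` output, as seen above.
    		for _, format := range []string{"json", "hcl"} {
    			t.Run(format+":"+tc.desc, func(t *testing.T) {
    				var rt runtimeConfig
    				switch format {
    				case "json":
    					if err := json.Unmarshal([]byte(tc.json), &rt); err != nil {
    						t.Fatalf("decode: %v", err)
    					}
    				case "hcl":
    					t.Skip("HCL decoding omitted in this sketch")
    				}
    				if got := strings.ToLower(rt.Datacenter); got != tc.want {
    					t.Fatalf("got %q, want %q", got, tc.want)
    				}
    			})
    		}
    	}
    }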
=== RUN   TestFullConfig
=== RUN   TestFullConfig/json
=== RUN   TestFullConfig/hcl
--- PASS: TestFullConfig (0.48s)
    runtime_test.go:4873: "RuntimeConfig.ACLEnableKeyListPolicy" is zero value
    --- PASS: TestFullConfig/json (0.24s)
    --- PASS: TestFullConfig/hcl (0.24s)
=== RUN   TestNonZero
=== RUN   TestNonZero/nil
=== RUN   TestNonZero/zero_bool
=== RUN   TestNonZero/zero_string
=== RUN   TestNonZero/zero_int
=== RUN   TestNonZero/zero_int8
=== RUN   TestNonZero/zero_int16
=== RUN   TestNonZero/zero_int32
=== RUN   TestNonZero/zero_int64
=== RUN   TestNonZero/zero_uint
=== RUN   TestNonZero/zero_uint8
=== RUN   TestNonZero/zero_uint16
=== RUN   TestNonZero/zero_uint32
=== RUN   TestNonZero/zero_uint64
=== RUN   TestNonZero/zero_float32
=== RUN   TestNonZero/zero_float64
=== RUN   TestNonZero/ptr_to_zero_value
=== RUN   TestNonZero/empty_slice
=== RUN   TestNonZero/slice_with_zero_value
=== RUN   TestNonZero/empty_map
=== RUN   TestNonZero/map_with_zero_value_key
=== RUN   TestNonZero/map_with_zero_value_elem
=== RUN   TestNonZero/struct_with_nil_field
=== RUN   TestNonZero/struct_with_zero_value_field
=== RUN   TestNonZero/struct_with_empty_array
--- PASS: TestNonZero (0.02s)
    --- PASS: TestNonZero/nil (0.00s)
    --- PASS: TestNonZero/zero_bool (0.00s)
    --- PASS: TestNonZero/zero_string (0.00s)
    --- PASS: TestNonZero/zero_int (0.00s)
    --- PASS: TestNonZero/zero_int8 (0.00s)
    --- PASS: TestNonZero/zero_int16 (0.00s)
    --- PASS: TestNonZero/zero_int32 (0.00s)
    --- PASS: TestNonZero/zero_int64 (0.00s)
    --- PASS: TestNonZero/zero_uint (0.00s)
    --- PASS: TestNonZero/zero_uint8 (0.00s)
    --- PASS: TestNonZero/zero_uint16 (0.00s)
    --- PASS: TestNonZero/zero_uint32 (0.00s)
    --- PASS: TestNonZero/zero_uint64 (0.00s)
    --- PASS: TestNonZero/zero_float32 (0.00s)
    --- PASS: TestNonZero/zero_float64 (0.00s)
    --- PASS: TestNonZero/ptr_to_zero_value (0.00s)
    --- PASS: TestNonZero/empty_slice (0.00s)
    --- PASS: TestNonZero/slice_with_zero_value (0.00s)
    --- PASS: TestNonZero/empty_map (0.00s)
    --- PASS: TestNonZero/map_with_zero_value_key (0.00s)
    --- PASS: TestNonZero/map_with_zero_value_elem (0.00s)
    --- PASS: TestNonZero/struct_with_nil_field (0.00s)
    --- PASS: TestNonZero/struct_with_zero_value_field (0.00s)
    --- PASS: TestNonZero/struct_with_empty_array (0.00s)
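[editor's aside] TestNonZero's cases (nil, zero scalars, empty slices and maps, zero-valued struct fields) and the runtime_test.go:4873 note above that "RuntimeConfig.ACLEnableKeyListPolicy" is zero value point at a reflection helper that walks a value and flags anything left at its zero value. The function below is a minimal sketch of that idea under assumed rules; the name, the exact traversal, and the error wording are assumptions, not Consul's implementation.

    package config_test

    import (
    	"fmt"
    	"reflect"
    )

    // nonZero reports an error for the first reachable value that is the zero
    // value for its type. Example: nonZero("RuntimeConfig", reflect.ValueOf(cfg))
    // could report `RuntimeConfig.ACLEnableKeyListPolicy is zero value`.
    func nonZero(name string, v reflect.Value) error {
    	switch v.Kind() {
    	case reflect.Ptr, reflect.Interface:
    		if v.IsNil() {
    			return fmt.Errorf("%s is nil", name)
    		}
    		return nonZero(name, v.Elem())
    	case reflect.Slice:
    		if v.Len() == 0 {
    			return fmt.Errorf("%s is empty", name)
    		}
    		for i := 0; i < v.Len(); i++ {
    			if err := nonZero(fmt.Sprintf("%s[%d]", name, i), v.Index(i)); err != nil {
    				return err
    			}
    		}
    		return nil
    	case reflect.Map:
    		if v.Len() == 0 {
    			return fmt.Errorf("%s is empty", name)
    		}
    		return nil // map keys and elements are not walked in this sketch
    	case reflect.Struct:
    		for i := 0; i < v.NumField(); i++ {
    			if err := nonZero(name+"."+v.Type().Field(i).Name, v.Field(i)); err != nil {
    				return err
    			}
    		}
    		return nil
    	default:
    		if v.IsZero() {
    			return fmt.Errorf("%s is zero value", name)
    		}
    		return nil
    	}
    }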
=== RUN   TestConfigDecodeBytes
=== PAUSE TestConfigDecodeBytes
=== RUN   TestSanitize
--- PASS: TestSanitize (0.03s)
=== RUN   TestRuntime_apiAddresses
--- PASS: TestRuntime_apiAddresses (0.01s)
=== RUN   TestRuntime_APIConfigHTTPS
--- PASS: TestRuntime_APIConfigHTTPS (0.00s)
=== RUN   TestRuntime_APIConfigHTTP
--- PASS: TestRuntime_APIConfigHTTP (0.00s)
=== RUN   TestRuntime_APIConfigUNIX
--- PASS: TestRuntime_APIConfigUNIX (0.00s)
=== RUN   TestRuntime_APIConfigANYAddrV4
--- PASS: TestRuntime_APIConfigANYAddrV4 (0.00s)
=== RUN   TestRuntime_APIConfigANYAddrV6
--- PASS: TestRuntime_APIConfigANYAddrV6 (0.00s)
=== RUN   TestRuntime_ClientAddress
--- PASS: TestRuntime_ClientAddress (0.00s)
=== RUN   TestRuntime_ClientAddressAnyV4
--- PASS: TestRuntime_ClientAddressAnyV4 (0.00s)
=== RUN   TestRuntime_ClientAddressAnyV6
--- PASS: TestRuntime_ClientAddressAnyV6 (0.00s)
=== RUN   TestRuntime_ToTLSUtilConfig
--- PASS: TestRuntime_ToTLSUtilConfig (0.00s)
=== RUN   TestReadPath
=== RUN   TestReadPath/dir_skip_non_json_or_hcl_if_config-format_not_set
=== RUN   TestReadPath/dir_read_non_json_or_hcl_if_config-format_set
=== RUN   TestReadPath/file_skip_non_json_or_hcl_if_config-format_not_set
=== RUN   TestReadPath/file_read_non_json_or_hcl_if_config-format_set
--- PASS: TestReadPath (0.05s)
    --- PASS: TestReadPath/dir_skip_non_json_or_hcl_if_config-format_not_set (0.02s)
    --- PASS: TestReadPath/dir_read_non_json_or_hcl_if_config-format_set (0.01s)
    --- PASS: TestReadPath/file_skip_non_json_or_hcl_if_config-format_not_set (0.00s)
    --- PASS: TestReadPath/file_read_non_json_or_hcl_if_config-format_set (0.00s)
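[editor's aside] The TestReadPath case names above describe one rule: files whose extension is neither .json nor .hcl are skipped unless -config-format forces a format. A stand-alone sketch of that selection rule, using a hypothetical helper name:

    package config_test

    import (
    	"path/filepath"
    	"testing"
    )

    // shouldReadFile captures the rule the TestReadPath subtests exercise:
    // without an explicit -config-format, only .json and .hcl files are read;
    // with one, every file is read using that format.
    func shouldReadFile(path, configFormat string) bool {
    	if configFormat != "" {
    		return true
    	}
    	switch filepath.Ext(path) {
    	case ".json", ".hcl":
    		return true
    	default:
    		return false
    	}
    }

    func TestShouldReadFileSketch(t *testing.T) {
    	if shouldReadFile("conf.d/extra.txt", "") {
    		t.Fatal("non-json/hcl file should be skipped when -config-format is not set")
    	}
    	if !shouldReadFile("conf.d/extra.txt", "json") {
    		t.Fatal("file should be read when -config-format is set")
    	}
    }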
=== RUN   Test_UIPathBuilder
--- PASS: Test_UIPathBuilder (0.00s)
=== RUN   TestSegments
=== RUN   TestSegments/json:segment_name_not_in_OSS
=== RUN   TestSegments/hcl:segment_name_not_in_OSS
=== RUN   TestSegments/json:segment_port_must_be_set
=== RUN   TestSegments/hcl:segment_port_must_be_set
=== RUN   TestSegments/json:segments_not_in_OSS
=== RUN   TestSegments/hcl:segments_not_in_OSS
--- PASS: TestSegments (0.48s)
    --- PASS: TestSegments/json:segment_name_not_in_OSS (0.09s)
    --- PASS: TestSegments/hcl:segment_name_not_in_OSS (0.09s)
    --- PASS: TestSegments/json:segment_port_must_be_set (0.07s)
    --- PASS: TestSegments/hcl:segment_port_must_be_set (0.07s)
    --- PASS: TestSegments/json:segments_not_in_OSS (0.10s)
    --- PASS: TestSegments/hcl:segments_not_in_OSS (0.04s)
=== CONT  TestConfigDecodeBytes
--- PASS: TestConfigDecodeBytes (0.00s)
PASS
ok  	github.com/hashicorp/consul/agent/config	31.754s
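[editor's aside] The "=== PAUSE"/"=== CONT" markers around TestConfigDecodeBytes above (and around many of the agent tests below) are what `go test -v` prints for tests that call t.Parallel(): the test registers, is paused until the serial tests in the package finish, and is then continued alongside the other parallel tests. A minimal illustration:

    package example_test

    import "testing"

    // Each of these tests calls t.Parallel(), so `go test -v` reports
    // === RUN / === PAUSE when it registers and === CONT when it resumes.
    func TestParallelA(t *testing.T) {
    	t.Parallel()
    }

    func TestParallelB(t *testing.T) {
    	t.Parallel()
    }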
=== RUN   TestCollectHostInfo
--- PASS: TestCollectHostInfo (0.17s)
PASS
ok  	github.com/hashicorp/consul/agent/debug	0.280s
?   	github.com/hashicorp/consul/agent/exec	[no test files]
=== RUN   TestAgentAntiEntropy_Services
--- SKIP: TestAgentAntiEntropy_Services (0.00s)
    state_test.go:30: DM-skipped
=== RUN   TestAgentAntiEntropy_Services_ConnectProxy
=== PAUSE TestAgentAntiEntropy_Services_ConnectProxy
=== RUN   TestAgent_ServiceWatchCh
=== PAUSE TestAgent_ServiceWatchCh
=== RUN   TestAgentAntiEntropy_EnableTagOverride
--- SKIP: TestAgentAntiEntropy_EnableTagOverride (0.00s)
    state_test.go:507: DM-skipped
=== RUN   TestAgentAntiEntropy_Services_WithChecks
=== PAUSE TestAgentAntiEntropy_Services_WithChecks
=== RUN   TestAgentAntiEntropy_Services_ACLDeny
=== PAUSE TestAgentAntiEntropy_Services_ACLDeny
=== RUN   TestAgentAntiEntropy_Checks
=== PAUSE TestAgentAntiEntropy_Checks
=== RUN   TestAgentAntiEntropy_Checks_ACLDeny
=== PAUSE TestAgentAntiEntropy_Checks_ACLDeny
=== RUN   TestAgent_UpdateCheck_DiscardOutput
--- SKIP: TestAgent_UpdateCheck_DiscardOutput (0.00s)
    state_test.go:1336: DM-skipped
=== RUN   TestAgentAntiEntropy_Check_DeferSync
--- SKIP: TestAgentAntiEntropy_Check_DeferSync (0.00s)
    state_test.go:1388: DM-skipped
=== RUN   TestAgentAntiEntropy_NodeInfo
=== PAUSE TestAgentAntiEntropy_NodeInfo
=== RUN   TestAgent_ServiceTokens
=== PAUSE TestAgent_ServiceTokens
=== RUN   TestAgent_CheckTokens
=== PAUSE TestAgent_CheckTokens
=== RUN   TestAgent_CheckCriticalTime
--- SKIP: TestAgent_CheckCriticalTime (0.00s)
    state_test.go:1707: DM-skipped
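[editor's aside] The SKIP entries above tagged "DM-skipped" are produced by t.Skip calls, presumably added via a Debian-specific patch to disable tests that are unreliable on the build hardware (the reason string is copied verbatim from the log; the patch itself is an assumption). The mechanism is just the standard testing API:

    package example_test

    import "testing"

    // `go test -v` reports this as --- SKIP: TestSometimesFlaky (0.00s)
    // followed by the file:line of the Skip call and its message.
    func TestSometimesFlaky(t *testing.T) {
    	t.Skip("DM-skipped")
    }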
=== RUN   TestAgent_AddCheckFailure
=== PAUSE TestAgent_AddCheckFailure
=== RUN   TestAgent_AliasCheck
=== PAUSE TestAgent_AliasCheck
=== RUN   TestAgent_sendCoordinate
=== PAUSE TestAgent_sendCoordinate
=== RUN   TestState_Notify
=== PAUSE TestState_Notify
=== RUN   TestStateProxyManagement
=== PAUSE TestStateProxyManagement
=== RUN   TestStateProxyRestore
=== PAUSE TestStateProxyRestore
=== RUN   TestAliasNotifications_local
=== PAUSE TestAliasNotifications_local
=== CONT  TestAgentAntiEntropy_Services_ConnectProxy
=== CONT  TestAgent_AddCheckFailure
=== CONT  TestAgentAntiEntropy_Checks_ACLDeny
=== CONT  TestAgentAntiEntropy_Services_ACLDeny
WARNING: bootstrap = true: do not enable unless necessary
TestAgentAntiEntropy_Services_ConnectProxy - 2019/12/06 06:01:07.472566 [WARN] agent: Node name "Node 08308563-4d89-0b55-0c2b-345747013020" will not be discoverable via DNS due to invalid characters. Valid characters include all alpha-numerics and dashes.
TestAgentAntiEntropy_Services_ConnectProxy - 2019/12/06 06:01:07.474112 [DEBUG] tlsutil: Update with version 1
TestAgentAntiEntropy_Services_ConnectProxy - 2019/12/06 06:01:07.480874 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
--- PASS: TestAgent_AddCheckFailure (0.20s)
=== CONT  TestAgent_CheckTokens
WARNING: The 'acl_datacenter' field is deprecated. Use the 'primary_datacenter' field instead.
WARNING: bootstrap = true: do not enable unless necessary
TestAgentAntiEntropy_Checks_ACLDeny - 2019/12/06 06:01:07.539885 [WARN] agent: Node name "Node d8fcb043-3433-e338-9fab-803673f5fccc" will not be discoverable via DNS due to invalid characters. Valid characters include all alpha-numerics and dashes.
TestAgentAntiEntropy_Checks_ACLDeny - 2019/12/06 06:01:07.541485 [DEBUG] tlsutil: Update with version 1
TestAgentAntiEntropy_Checks_ACLDeny - 2019/12/06 06:01:07.546442 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
--- PASS: TestAgent_CheckTokens (0.11s)
=== CONT  TestAgent_ServiceTokens
WARNING: The 'acl_datacenter' field is deprecated. Use the 'primary_datacenter' field instead.
WARNING: bootstrap = true: do not enable unless necessary
TestAgentAntiEntropy_Services_ACLDeny - 2019/12/06 06:01:07.607108 [WARN] agent: Node name "Node 2f0cf84a-db82-00ce-eac7-76ed7ce9db68" will not be discoverable via DNS due to invalid characters. Valid characters include all alpha-numerics and dashes.
TestAgentAntiEntropy_Services_ACLDeny - 2019/12/06 06:01:07.607659 [DEBUG] tlsutil: Update with version 1
TestAgentAntiEntropy_Services_ACLDeny - 2019/12/06 06:01:07.611119 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
--- PASS: TestAgent_ServiceTokens (0.07s)
=== CONT  TestAgentAntiEntropy_NodeInfo
WARNING: bootstrap = true: do not enable unless necessary
TestAgentAntiEntropy_NodeInfo - 2019/12/06 06:01:07.775929 [WARN] agent: Node name "Node d771a3fa-29e1-b649-81af-548ce76461ac" will not be discoverable via DNS due to invalid characters. Valid characters include all alpha-numerics and dashes.
TestAgentAntiEntropy_NodeInfo - 2019/12/06 06:01:07.776356 [DEBUG] tlsutil: Update with version 1
TestAgentAntiEntropy_NodeInfo - 2019/12/06 06:01:07.778455 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
2019/12/06 06:01:09 [INFO]  raft: Initial configuration (index=1): [{Suffrage:Voter ID:2f0cf84a-db82-00ce-eac7-76ed7ce9db68 Address:127.0.0.1:41518}]
2019/12/06 06:01:09 [INFO]  raft: Initial configuration (index=1): [{Suffrage:Voter ID:d8fcb043-3433-e338-9fab-803673f5fccc Address:127.0.0.1:41512}]
2019/12/06 06:01:09 [INFO]  raft: Initial configuration (index=1): [{Suffrage:Voter ID:08308563-4d89-0b55-0c2b-345747013020 Address:127.0.0.1:41506}]
TestAgentAntiEntropy_Services_ACLDeny - 2019/12/06 06:01:09.667258 [INFO] serf: EventMemberJoin: Node 2f0cf84a-db82-00ce-eac7-76ed7ce9db68.dc1 127.0.0.1
2019/12/06 06:01:09 [INFO]  raft: Node at 127.0.0.1:41512 [Follower] entering Follower state (Leader: "")
TestAgentAntiEntropy_Checks_ACLDeny - 2019/12/06 06:01:09.673464 [INFO] serf: EventMemberJoin: Node d8fcb043-3433-e338-9fab-803673f5fccc.dc1 127.0.0.1
2019/12/06 06:01:09 [INFO]  raft: Node at 127.0.0.1:41506 [Follower] entering Follower state (Leader: "")
2019/12/06 06:01:09 [INFO]  raft: Node at 127.0.0.1:41518 [Follower] entering Follower state (Leader: "")
TestAgentAntiEntropy_Services_ACLDeny - 2019/12/06 06:01:09.688084 [INFO] serf: EventMemberJoin: Node 2f0cf84a-db82-00ce-eac7-76ed7ce9db68 127.0.0.1
TestAgentAntiEntropy_Services_ACLDeny - 2019/12/06 06:01:09.691592 [INFO] consul: Adding LAN server Node 2f0cf84a-db82-00ce-eac7-76ed7ce9db68 (Addr: tcp/127.0.0.1:41518) (DC: dc1)
TestAgentAntiEntropy_Checks_ACLDeny - 2019/12/06 06:01:09.707667 [INFO] serf: EventMemberJoin: Node d8fcb043-3433-e338-9fab-803673f5fccc 127.0.0.1
TestAgentAntiEntropy_Services_ConnectProxy - 2019/12/06 06:01:09.708632 [INFO] serf: EventMemberJoin: Node 08308563-4d89-0b55-0c2b-345747013020.dc1 127.0.0.1
TestAgentAntiEntropy_Services_ConnectProxy - 2019/12/06 06:01:09.714474 [INFO] serf: EventMemberJoin: Node 08308563-4d89-0b55-0c2b-345747013020 127.0.0.1
2019/12/06 06:01:09 [WARN]  raft: Heartbeat timeout from "" reached, starting election
2019/12/06 06:01:09 [INFO]  raft: Node at 127.0.0.1:41506 [Candidate] entering Candidate state in term 2
TestAgentAntiEntropy_Services_ConnectProxy - 2019/12/06 06:01:09.720756 [INFO] agent: Started DNS server 127.0.0.1:41501 (udp)
TestAgentAntiEntropy_Services_ACLDeny - 2019/12/06 06:01:09.721031 [INFO] consul: Handled member-join event for server "Node 2f0cf84a-db82-00ce-eac7-76ed7ce9db68.dc1" in area "wan"
TestAgentAntiEntropy_Checks_ACLDeny - 2019/12/06 06:01:09.721314 [INFO] consul: Adding LAN server Node d8fcb043-3433-e338-9fab-803673f5fccc (Addr: tcp/127.0.0.1:41512) (DC: dc1)
TestAgentAntiEntropy_Checks_ACLDeny - 2019/12/06 06:01:09.721794 [INFO] consul: Handled member-join event for server "Node d8fcb043-3433-e338-9fab-803673f5fccc.dc1" in area "wan"
2019/12/06 06:01:09 [WARN]  raft: Heartbeat timeout from "" reached, starting election
2019/12/06 06:01:09 [INFO]  raft: Node at 127.0.0.1:41512 [Candidate] entering Candidate state in term 2
2019/12/06 06:01:09 [WARN]  raft: Heartbeat timeout from "" reached, starting election
2019/12/06 06:01:09 [INFO]  raft: Node at 127.0.0.1:41518 [Candidate] entering Candidate state in term 2
TestAgentAntiEntropy_Services_ConnectProxy - 2019/12/06 06:01:09.739600 [INFO] consul: Handled member-join event for server "Node 08308563-4d89-0b55-0c2b-345747013020.dc1" in area "wan"
TestAgentAntiEntropy_Checks_ACLDeny - 2019/12/06 06:01:09.754740 [INFO] agent: Started DNS server 127.0.0.1:41507 (tcp)
TestAgentAntiEntropy_Services_ConnectProxy - 2019/12/06 06:01:09.755599 [INFO] consul: Adding LAN server Node 08308563-4d89-0b55-0c2b-345747013020 (Addr: tcp/127.0.0.1:41506) (DC: dc1)
TestAgentAntiEntropy_Checks_ACLDeny - 2019/12/06 06:01:09.755794 [INFO] agent: Started DNS server 127.0.0.1:41507 (udp)
TestAgentAntiEntropy_Services_ConnectProxy - 2019/12/06 06:01:09.756252 [INFO] agent: Started DNS server 127.0.0.1:41501 (tcp)
TestAgentAntiEntropy_Checks_ACLDeny - 2019/12/06 06:01:09.758322 [INFO] agent: Started HTTP server on 127.0.0.1:41508 (tcp)
TestAgentAntiEntropy_Checks_ACLDeny - 2019/12/06 06:01:09.758460 [INFO] agent: started state syncer
TestAgentAntiEntropy_Services_ConnectProxy - 2019/12/06 06:01:09.758487 [INFO] agent: Started HTTP server on 127.0.0.1:41502 (tcp)
TestAgentAntiEntropy_Services_ConnectProxy - 2019/12/06 06:01:09.758572 [INFO] agent: started state syncer
TestAgentAntiEntropy_Services_ACLDeny - 2019/12/06 06:01:09.759595 [INFO] agent: Started DNS server 127.0.0.1:41513 (tcp)
TestAgentAntiEntropy_Services_ACLDeny - 2019/12/06 06:01:09.762345 [INFO] agent: Started DNS server 127.0.0.1:41513 (udp)
TestAgentAntiEntropy_Services_ACLDeny - 2019/12/06 06:01:09.764978 [INFO] agent: Started HTTP server on 127.0.0.1:41514 (tcp)
TestAgentAntiEntropy_Services_ACLDeny - 2019/12/06 06:01:09.765510 [INFO] agent: started state syncer
2019/12/06 06:01:10 [INFO]  raft: Initial configuration (index=1): [{Suffrage:Voter ID:40e4a748-2192-161a-0510-9bf59fe950b5 Address:127.0.0.1:41525}]
2019/12/06 06:01:10 [INFO]  raft: Node at 127.0.0.1:41525 [Follower] entering Follower state (Leader: "")
TestAgentAntiEntropy_NodeInfo - 2019/12/06 06:01:10.289735 [INFO] serf: EventMemberJoin: Node d771a3fa-29e1-b649-81af-548ce76461ac.dc1 127.0.0.1
TestAgentAntiEntropy_NodeInfo - 2019/12/06 06:01:10.293706 [INFO] serf: EventMemberJoin: Node d771a3fa-29e1-b649-81af-548ce76461ac 127.0.0.1
TestAgentAntiEntropy_NodeInfo - 2019/12/06 06:01:10.295084 [INFO] agent: Started DNS server 127.0.0.1:41520 (udp)
TestAgentAntiEntropy_NodeInfo - 2019/12/06 06:01:10.295372 [INFO] consul: Adding LAN server Node d771a3fa-29e1-b649-81af-548ce76461ac (Addr: tcp/127.0.0.1:41525) (DC: dc1)
TestAgentAntiEntropy_NodeInfo - 2019/12/06 06:01:10.295565 [INFO] consul: Handled member-join event for server "Node d771a3fa-29e1-b649-81af-548ce76461ac.dc1" in area "wan"
TestAgentAntiEntropy_NodeInfo - 2019/12/06 06:01:10.296052 [INFO] agent: Started DNS server 127.0.0.1:41520 (tcp)
TestAgentAntiEntropy_NodeInfo - 2019/12/06 06:01:10.298349 [INFO] agent: Started HTTP server on 127.0.0.1:41521 (tcp)
TestAgentAntiEntropy_NodeInfo - 2019/12/06 06:01:10.298466 [INFO] agent: started state syncer
2019/12/06 06:01:10 [WARN]  raft: Heartbeat timeout from "" reached, starting election
2019/12/06 06:01:10 [INFO]  raft: Node at 127.0.0.1:41525 [Candidate] entering Candidate state in term 2
2019/12/06 06:01:10 [INFO]  raft: Election won. Tally: 1
2019/12/06 06:01:10 [INFO]  raft: Node at 127.0.0.1:41506 [Leader] entering Leader state
2019/12/06 06:01:10 [INFO]  raft: Election won. Tally: 1
2019/12/06 06:01:10 [INFO]  raft: Node at 127.0.0.1:41518 [Leader] entering Leader state
TestAgentAntiEntropy_Services_ACLDeny - 2019/12/06 06:01:10.751795 [INFO] consul: cluster leadership acquired
TestAgentAntiEntropy_Services_ACLDeny - 2019/12/06 06:01:10.752257 [INFO] consul: New leader elected: Node 2f0cf84a-db82-00ce-eac7-76ed7ce9db68
2019/12/06 06:01:10 [INFO]  raft: Election won. Tally: 1
2019/12/06 06:01:10 [INFO]  raft: Node at 127.0.0.1:41512 [Leader] entering Leader state
TestAgentAntiEntropy_Checks_ACLDeny - 2019/12/06 06:01:10.753177 [INFO] consul: cluster leadership acquired
TestAgentAntiEntropy_Checks_ACLDeny - 2019/12/06 06:01:10.753572 [INFO] consul: New leader elected: Node d8fcb043-3433-e338-9fab-803673f5fccc
TestAgentAntiEntropy_Services_ConnectProxy - 2019/12/06 06:01:10.753649 [INFO] consul: cluster leadership acquired
TestAgentAntiEntropy_Services_ConnectProxy - 2019/12/06 06:01:10.753942 [INFO] consul: New leader elected: Node 08308563-4d89-0b55-0c2b-345747013020
TestAgentAntiEntropy_Services_ACLDeny - 2019/12/06 06:01:10.919275 [ERR] agent: failed to sync remote state: ACL not found
2019/12/06 06:01:10 [INFO]  raft: Election won. Tally: 1
2019/12/06 06:01:10 [INFO]  raft: Node at 127.0.0.1:41525 [Leader] entering Leader state
TestAgentAntiEntropy_NodeInfo - 2019/12/06 06:01:10.946013 [INFO] consul: cluster leadership acquired
TestAgentAntiEntropy_NodeInfo - 2019/12/06 06:01:10.946398 [INFO] consul: New leader elected: Node d771a3fa-29e1-b649-81af-548ce76461ac
TestAgentAntiEntropy_Checks_ACLDeny - 2019/12/06 06:01:10.948597 [ERR] agent: failed to sync remote state: ACL not found
TestAgentAntiEntropy_Services_ConnectProxy - 2019/12/06 06:01:11.135493 [INFO] agent: Synced node info
TestAgentAntiEntropy_Services_ConnectProxy - 2019/12/06 06:01:11.135625 [DEBUG] agent: Node info in sync
TestAgentAntiEntropy_Checks_ACLDeny - 2019/12/06 06:01:11.242800 [INFO] acl: initializing acls
TestAgentAntiEntropy_Services_ACLDeny - 2019/12/06 06:01:11.244907 [INFO] acl: initializing acls
TestAgentAntiEntropy_Services_ACLDeny - 2019/12/06 06:01:11.256428 [INFO] acl: initializing acls
TestAgentAntiEntropy_Checks_ACLDeny - 2019/12/06 06:01:11.276678 [INFO] acl: initializing acls
TestAgentAntiEntropy_NodeInfo - 2019/12/06 06:01:11.433814 [INFO] agent: Synced node info
TestAgentAntiEntropy_Services_ConnectProxy - 2019/12/06 06:01:11.602160 [DEBUG] agent: Node info in sync
TestAgentAntiEntropy_Services_ACLDeny - 2019/12/06 06:01:11.783931 [INFO] consul: Created ACL 'global-management' policy
TestAgentAntiEntropy_Services_ACLDeny - 2019/12/06 06:01:11.784039 [WARN] consul: Configuring a non-UUID master token is deprecated
TestAgentAntiEntropy_Checks_ACLDeny - 2019/12/06 06:01:11.783950 [INFO] consul: Created ACL 'global-management' policy
TestAgentAntiEntropy_Checks_ACLDeny - 2019/12/06 06:01:11.784116 [WARN] consul: Configuring a non-UUID master token is deprecated
TestAgentAntiEntropy_Checks_ACLDeny - 2019/12/06 06:01:11.783985 [INFO] consul: Created ACL 'global-management' policy
TestAgentAntiEntropy_Checks_ACLDeny - 2019/12/06 06:01:11.784521 [WARN] consul: Configuring a non-UUID master token is deprecated
TestAgentAntiEntropy_Services_ACLDeny - 2019/12/06 06:01:11.784043 [INFO] consul: Created ACL 'global-management' policy
TestAgentAntiEntropy_Services_ACLDeny - 2019/12/06 06:01:11.784815 [WARN] consul: Configuring a non-UUID master token is deprecated
TestAgentAntiEntropy_Checks_ACLDeny - 2019/12/06 06:01:12.128716 [ERR] agent: failed to sync remote state: ACL not found
TestAgentAntiEntropy_Services_ACLDeny - 2019/12/06 06:01:12.286340 [INFO] consul: Bootstrapped ACL master token from configuration
TestAgentAntiEntropy_Services_ConnectProxy - 2019/12/06 06:01:12.408545 [INFO] connect: initialized primary datacenter CA with provider "consul"
TestAgentAntiEntropy_Services_ConnectProxy - 2019/12/06 06:01:12.409095 [DEBUG] consul: Skipping self join check for "Node 08308563-4d89-0b55-0c2b-345747013020" since the cluster is too small
TestAgentAntiEntropy_Services_ConnectProxy - 2019/12/06 06:01:12.414353 [INFO] consul: member 'Node 08308563-4d89-0b55-0c2b-345747013020' joined, marking health alive
TestAgentAntiEntropy_Services_ACLDeny - 2019/12/06 06:01:12.590770 [ERR] agent: failed to sync remote state: ACL not found
TestAgentAntiEntropy_Services_ACLDeny - 2019/12/06 06:01:12.663119 [INFO] consul: Bootstrapped ACL master token from configuration
TestAgentAntiEntropy_NodeInfo - 2019/12/06 06:01:12.663559 [INFO] agent: Synced node info
TestAgentAntiEntropy_Checks_ACLDeny - 2019/12/06 06:01:12.663837 [INFO] consul: Bootstrapped ACL master token from configuration
TestAgentAntiEntropy_Checks_ACLDeny - 2019/12/06 06:01:12.663999 [INFO] consul: Bootstrapped ACL master token from configuration
TestAgentAntiEntropy_Services_ACLDeny - 2019/12/06 06:01:12.921396 [INFO] consul: Created ACL anonymous token from configuration
TestAgentAntiEntropy_Services_ACLDeny - 2019/12/06 06:01:12.921520 [DEBUG] acl: transitioning out of legacy ACL mode
TestAgentAntiEntropy_Services_ACLDeny - 2019/12/06 06:01:12.922320 [INFO] serf: EventMemberUpdate: Node 2f0cf84a-db82-00ce-eac7-76ed7ce9db68
TestAgentAntiEntropy_Services_ACLDeny - 2019/12/06 06:01:12.922922 [INFO] serf: EventMemberUpdate: Node 2f0cf84a-db82-00ce-eac7-76ed7ce9db68.dc1
TestAgentAntiEntropy_Checks_ACLDeny - 2019/12/06 06:01:12.924566 [INFO] consul: Created ACL anonymous token from configuration
TestAgentAntiEntropy_Checks_ACLDeny - 2019/12/06 06:01:12.925425 [INFO] serf: EventMemberUpdate: Node d8fcb043-3433-e338-9fab-803673f5fccc
TestAgentAntiEntropy_Checks_ACLDeny - 2019/12/06 06:01:12.926096 [INFO] serf: EventMemberUpdate: Node d8fcb043-3433-e338-9fab-803673f5fccc.dc1
TestAgentAntiEntropy_Checks_ACLDeny - 2019/12/06 06:01:13.185063 [INFO] consul: Created ACL anonymous token from configuration
TestAgentAntiEntropy_Checks_ACLDeny - 2019/12/06 06:01:13.185171 [DEBUG] acl: transitioning out of legacy ACL mode
TestAgentAntiEntropy_Checks_ACLDeny - 2019/12/06 06:01:13.186162 [INFO] serf: EventMemberUpdate: Node d8fcb043-3433-e338-9fab-803673f5fccc
TestAgentAntiEntropy_Checks_ACLDeny - 2019/12/06 06:01:13.186831 [INFO] serf: EventMemberUpdate: Node d8fcb043-3433-e338-9fab-803673f5fccc.dc1
TestAgentAntiEntropy_Services_ACLDeny - 2019/12/06 06:01:13.187483 [INFO] consul: Created ACL anonymous token from configuration
TestAgentAntiEntropy_Services_ACLDeny - 2019/12/06 06:01:13.188478 [INFO] serf: EventMemberUpdate: Node 2f0cf84a-db82-00ce-eac7-76ed7ce9db68
TestAgentAntiEntropy_Services_ACLDeny - 2019/12/06 06:01:13.189593 [INFO] serf: EventMemberUpdate: Node 2f0cf84a-db82-00ce-eac7-76ed7ce9db68.dc1
TestAgentAntiEntropy_Services_ConnectProxy - 2019/12/06 06:01:13.317797 [INFO] agent: Synced service "mysql-proxy"
TestAgentAntiEntropy_Services_ConnectProxy - 2019/12/06 06:01:13.317892 [DEBUG] agent: Node info in sync
TestAgentAntiEntropy_Services_ConnectProxy - 2019/12/06 06:01:13.321447 [DEBUG] agent: Service "mysql-proxy" in sync
TestAgentAntiEntropy_Services_ConnectProxy - 2019/12/06 06:01:13.533810 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
TestAgentAntiEntropy_NodeInfo - 2019/12/06 06:01:13.637729 [INFO] agent: Synced node info
TestAgentAntiEntropy_NodeInfo - 2019/12/06 06:01:13.637851 [DEBUG] agent: Node info in sync
TestAgentAntiEntropy_Services_ConnectProxy - 2019/12/06 06:01:13.830587 [INFO] agent: Synced service "redis-proxy"
TestAgentAntiEntropy_Services_ConnectProxy - 2019/12/06 06:01:13.830654 [DEBUG] agent: Node info in sync
TestAgentAntiEntropy_Services_ConnectProxy - 2019/12/06 06:01:13.831554 [DEBUG] agent: Service "mysql-proxy" in sync
TestAgentAntiEntropy_Services_ConnectProxy - 2019/12/06 06:01:13.831618 [DEBUG] agent: Service "redis-proxy" in sync
TestAgentAntiEntropy_NodeInfo - 2019/12/06 06:01:14.200496 [INFO] connect: initialized primary datacenter CA with provider "consul"
TestAgentAntiEntropy_NodeInfo - 2019/12/06 06:01:14.200995 [DEBUG] consul: Skipping self join check for "Node d771a3fa-29e1-b649-81af-548ce76461ac" since the cluster is too small
TestAgentAntiEntropy_NodeInfo - 2019/12/06 06:01:14.201198 [INFO] consul: member 'Node d771a3fa-29e1-b649-81af-548ce76461ac' joined, marking health alive
TestAgentAntiEntropy_NodeInfo - 2019/12/06 06:01:14.204700 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
TestAgentAntiEntropy_NodeInfo - 2019/12/06 06:01:14.205724 [INFO] agent: Synced node info
TestAgentAntiEntropy_NodeInfo - 2019/12/06 06:01:14.206042 [INFO] agent: Requesting shutdown
TestAgentAntiEntropy_NodeInfo - 2019/12/06 06:01:14.206111 [INFO] consul: shutting down server
TestAgentAntiEntropy_NodeInfo - 2019/12/06 06:01:14.206158 [WARN] serf: Shutdown without a Leave
TestAgentAntiEntropy_NodeInfo - 2019/12/06 06:01:14.210989 [WARN] consul: error getting server health from "Node d771a3fa-29e1-b649-81af-548ce76461ac": rpc error making call: EOF
TestAgentAntiEntropy_NodeInfo - 2019/12/06 06:01:14.349533 [WARN] serf: Shutdown without a Leave
TestAgentAntiEntropy_Services_ConnectProxy - 2019/12/06 06:01:14.357134 [INFO] agent: Synced service "web-proxy"
TestAgentAntiEntropy_Services_ConnectProxy - 2019/12/06 06:01:14.357233 [DEBUG] agent: Node info in sync
TestAgentAntiEntropy_Services_ConnectProxy - 2019/12/06 06:01:14.358081 [DEBUG] agent: Skipping remote check "serfHealth" since it is managed automatically
TestAgentAntiEntropy_Services_ConnectProxy - 2019/12/06 06:01:14.358150 [DEBUG] agent: Service "web-proxy" in sync
TestAgentAntiEntropy_Checks_ACLDeny - 2019/12/06 06:01:14.475241 [INFO] connect: initialized primary datacenter CA with provider "consul"
TestAgentAntiEntropy_Checks_ACLDeny - 2019/12/06 06:01:14.475743 [DEBUG] consul: Skipping self join check for "Node d8fcb043-3433-e338-9fab-803673f5fccc" since the cluster is too small
TestAgentAntiEntropy_Checks_ACLDeny - 2019/12/06 06:01:14.475852 [INFO] consul: member 'Node d8fcb043-3433-e338-9fab-803673f5fccc' joined, marking health alive
TestAgentAntiEntropy_Services_ACLDeny - 2019/12/06 06:01:14.475241 [INFO] connect: initialized primary datacenter CA with provider "consul"
TestAgentAntiEntropy_Services_ACLDeny - 2019/12/06 06:01:14.477506 [DEBUG] consul: Skipping self join check for "Node 2f0cf84a-db82-00ce-eac7-76ed7ce9db68" since the cluster is too small
TestAgentAntiEntropy_Services_ACLDeny - 2019/12/06 06:01:14.477613 [INFO] consul: member 'Node 2f0cf84a-db82-00ce-eac7-76ed7ce9db68' joined, marking health alive
TestAgentAntiEntropy_NodeInfo - 2019/12/06 06:01:14.478095 [INFO] manager: shutting down
TestAgentAntiEntropy_NodeInfo - 2019/12/06 06:01:14.478580 [INFO] agent: consul server down
TestAgentAntiEntropy_NodeInfo - 2019/12/06 06:01:14.478641 [INFO] agent: shutdown complete
TestAgentAntiEntropy_NodeInfo - 2019/12/06 06:01:14.478703 [INFO] agent: Stopping DNS server 127.0.0.1:41520 (tcp)
TestAgentAntiEntropy_NodeInfo - 2019/12/06 06:01:14.478878 [INFO] agent: Stopping DNS server 127.0.0.1:41520 (udp)
TestAgentAntiEntropy_NodeInfo - 2019/12/06 06:01:14.479110 [INFO] agent: Stopping HTTP server 127.0.0.1:41521 (tcp)
TestAgentAntiEntropy_NodeInfo - 2019/12/06 06:01:14.479407 [INFO] agent: Waiting for endpoints to shut down
TestAgentAntiEntropy_NodeInfo - 2019/12/06 06:01:14.479497 [INFO] agent: Endpoints down
--- PASS: TestAgentAntiEntropy_NodeInfo (6.80s)
=== CONT  TestStateProxyManagement
--- PASS: TestStateProxyManagement (0.01s)
=== CONT  TestAliasNotifications_local
WARNING: bootstrap = true: do not enable unless necessary
TestAliasNotifications_local - 2019/12/06 06:01:14.617836 [WARN] agent: Node name "Node 898c163c-a1ce-5dac-1c7a-f6460a43a0f6" will not be discoverable via DNS due to invalid characters. Valid characters include all alpha-numerics and dashes.
TestAliasNotifications_local - 2019/12/06 06:01:14.618318 [DEBUG] tlsutil: Update with version 1
TestAliasNotifications_local - 2019/12/06 06:01:14.620909 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
TestAgentAntiEntropy_Services_ACLDeny - 2019/12/06 06:01:14.727286 [DEBUG] consul: Skipping self join check for "Node 2f0cf84a-db82-00ce-eac7-76ed7ce9db68" since the cluster is too small
TestAgentAntiEntropy_Services_ACLDeny - 2019/12/06 06:01:14.727873 [DEBUG] consul: Skipping self join check for "Node 2f0cf84a-db82-00ce-eac7-76ed7ce9db68" since the cluster is too small
TestAgentAntiEntropy_Checks_ACLDeny - 2019/12/06 06:01:14.735830 [DEBUG] consul: Skipping self join check for "Node d8fcb043-3433-e338-9fab-803673f5fccc" since the cluster is too small
TestAgentAntiEntropy_Checks_ACLDeny - 2019/12/06 06:01:14.736366 [DEBUG] consul: Skipping self join check for "Node d8fcb043-3433-e338-9fab-803673f5fccc" since the cluster is too small
TestAgentAntiEntropy_Services_ACLDeny - 2019/12/06 06:01:14.744009 [DEBUG] consul: dropping node "Node 2f0cf84a-db82-00ce-eac7-76ed7ce9db68" from result due to ACLs
TestAgentAntiEntropy_Checks_ACLDeny - 2019/12/06 06:01:14.744096 [DEBUG] consul: dropping node "Node d8fcb043-3433-e338-9fab-803673f5fccc" from result due to ACLs
TestAgentAntiEntropy_Services_ACLDeny - 2019/12/06 06:01:14.970505 [DEBUG] consul: dropping check "serfHealth" from result due to ACLs
TestAgentAntiEntropy_Services_ACLDeny - 2019/12/06 06:01:14.977131 [WARN] agent: Service "mysql" registration blocked by ACLs
TestAgentAntiEntropy_Services_ConnectProxy - 2019/12/06 06:01:14.975490 [INFO] agent: Synced service "cache-proxy"
TestAgentAntiEntropy_Checks_ACLDeny - 2019/12/06 06:01:14.974914 [DEBUG] consul: dropping check "serfHealth" from result due to ACLs
TestAgentAntiEntropy_NodeInfo - 2019/12/06 06:01:15.204588 [WARN] consul: error getting server health from "Node d771a3fa-29e1-b649-81af-548ce76461ac": context deadline exceeded
TestAgentAntiEntropy_Services_ACLDeny - 2019/12/06 06:01:15.418937 [INFO] agent: Synced service "api"
TestAgentAntiEntropy_Services_ACLDeny - 2019/12/06 06:01:15.419011 [DEBUG] agent: Node info in sync
TestAgentAntiEntropy_Checks_ACLDeny - 2019/12/06 06:01:15.419089 [INFO] agent: Synced service "mysql"
TestAgentAntiEntropy_Services_ConnectProxy - 2019/12/06 06:01:15.420630 [INFO] agent: Deregistered service "lb-proxy"
TestAgentAntiEntropy_Services_ConnectProxy - 2019/12/06 06:01:15.420712 [DEBUG] agent: Service "mysql-proxy" in sync
TestAgentAntiEntropy_Services_ConnectProxy - 2019/12/06 06:01:15.420757 [DEBUG] agent: Service "redis-proxy" in sync
TestAgentAntiEntropy_Services_ConnectProxy - 2019/12/06 06:01:15.420789 [DEBUG] agent: Node info in sync
TestAgentAntiEntropy_Services_ConnectProxy - 2019/12/06 06:01:15.422840 [DEBUG] agent: Service "mysql-proxy" in sync
TestAgentAntiEntropy_Services_ConnectProxy - 2019/12/06 06:01:15.422902 [DEBUG] agent: Service "redis-proxy" in sync
TestAgentAntiEntropy_Services_ConnectProxy - 2019/12/06 06:01:15.422940 [DEBUG] agent: Service "web-proxy" in sync
TestAgentAntiEntropy_Services_ACLDeny - 2019/12/06 06:01:15.425886 [DEBUG] consul: dropping check "serfHealth" from result due to ACLs
TestAgentAntiEntropy_Services_ACLDeny - 2019/12/06 06:01:15.426297 [WARN] agent: Service "mysql" registration blocked by ACLs
TestAgentAntiEntropy_Services_ACLDeny - 2019/12/06 06:01:15.616773 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
TestAgentAntiEntropy_Checks_ACLDeny - 2019/12/06 06:01:15.617035 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
TestAgentAntiEntropy_Services_ACLDeny - 2019/12/06 06:01:15.618417 [INFO] agent: Deregistered service "api"
TestAgentAntiEntropy_Checks_ACLDeny - 2019/12/06 06:01:15.618707 [INFO] agent: Synced service "api"
TestAgentAntiEntropy_Checks_ACLDeny - 2019/12/06 06:01:15.618758 [DEBUG] agent: Node info in sync
TestAgentAntiEntropy_Services_ConnectProxy - 2019/12/06 06:01:15.618943 [INFO] agent: Deregistered service "cache-proxy"
TestAgentAntiEntropy_Services_ConnectProxy - 2019/12/06 06:01:15.618991 [DEBUG] agent: Node info in sync
TestAgentAntiEntropy_Services_ConnectProxy - 2019/12/06 06:01:15.619373 [DEBUG] agent: Skipping remote check "serfHealth" since it is managed automatically
TestAgentAntiEntropy_Services_ConnectProxy - 2019/12/06 06:01:15.619501 [DEBUG] agent: Service "mysql-proxy" in sync
TestAgentAntiEntropy_Services_ConnectProxy - 2019/12/06 06:01:15.619551 [DEBUG] agent: Service "redis-proxy" in sync
TestAgentAntiEntropy_Services_ConnectProxy - 2019/12/06 06:01:15.619594 [DEBUG] agent: Service "web-proxy" in sync
TestAgentAntiEntropy_Checks_ACLDeny - 2019/12/06 06:01:15.619866 [DEBUG] consul: dropping check "serfHealth" from result due to ACLs
2019/12/06 06:01:15 [INFO]  raft: Initial configuration (index=1): [{Suffrage:Voter ID:898c163c-a1ce-5dac-1c7a-f6460a43a0f6 Address:127.0.0.1:41531}]
TestAgentAntiEntropy_Services_ACLDeny - 2019/12/06 06:01:15.825917 [INFO] agent: Synced node info
TestAgentAntiEntropy_Services_ACLDeny - 2019/12/06 06:01:15.826477 [INFO] agent: Requesting shutdown
TestAgentAntiEntropy_Checks_ACLDeny - 2019/12/06 06:01:15.826556 [INFO] agent: Synced service "mysql"
TestAgentAntiEntropy_Services_ConnectProxy - 2019/12/06 06:01:15.828202 [INFO] agent: Deregistered service "cache-proxy"
TestAgentAntiEntropy_Services_ConnectProxy - 2019/12/06 06:01:15.828277 [DEBUG] agent: Node info in sync
TestAgentAntiEntropy_Services_ConnectProxy - 2019/12/06 06:01:15.828686 [DEBUG] agent: Service "mysql-proxy" in sync
TestAgentAntiEntropy_Services_ConnectProxy - 2019/12/06 06:01:15.828743 [DEBUG] agent: Service "redis-proxy" in sync
TestAgentAntiEntropy_Services_ConnectProxy - 2019/12/06 06:01:15.828783 [DEBUG] agent: Service "web-proxy" in sync
TestAgentAntiEntropy_Services_ConnectProxy - 2019/12/06 06:01:15.828811 [DEBUG] agent: Node info in sync
TestAliasNotifications_local - 2019/12/06 06:01:15.828854 [INFO] serf: EventMemberJoin: Node 898c163c-a1ce-5dac-1c7a-f6460a43a0f6.dc1 127.0.0.1
2019/12/06 06:01:15 [INFO]  raft: Node at 127.0.0.1:41531 [Follower] entering Follower state (Leader: "")
TestAliasNotifications_local - 2019/12/06 06:01:15.832499 [INFO] serf: EventMemberJoin: Node 898c163c-a1ce-5dac-1c7a-f6460a43a0f6 127.0.0.1
TestAgentAntiEntropy_Services_ConnectProxy - 2019/12/06 06:01:15.832560 [INFO] agent: Requesting shutdown
TestAgentAntiEntropy_Services_ConnectProxy - 2019/12/06 06:01:15.832785 [INFO] consul: shutting down server
TestAgentAntiEntropy_Services_ConnectProxy - 2019/12/06 06:01:15.832848 [WARN] serf: Shutdown without a Leave
TestAgentAntiEntropy_Services_ACLDeny - 2019/12/06 06:01:15.826563 [INFO] consul: shutting down server
TestAgentAntiEntropy_Services_ACLDeny - 2019/12/06 06:01:15.833093 [WARN] serf: Shutdown without a Leave
TestAliasNotifications_local - 2019/12/06 06:01:15.834165 [INFO] agent: Started DNS server 127.0.0.1:41526 (udp)
TestAliasNotifications_local - 2019/12/06 06:01:15.835617 [INFO] consul: Adding LAN server Node 898c163c-a1ce-5dac-1c7a-f6460a43a0f6 (Addr: tcp/127.0.0.1:41531) (DC: dc1)
TestAliasNotifications_local - 2019/12/06 06:01:15.835892 [INFO] consul: Handled member-join event for server "Node 898c163c-a1ce-5dac-1c7a-f6460a43a0f6.dc1" in area "wan"
TestAliasNotifications_local - 2019/12/06 06:01:15.836452 [INFO] agent: Started DNS server 127.0.0.1:41526 (tcp)
TestAliasNotifications_local - 2019/12/06 06:01:15.838918 [INFO] agent: Started HTTP server on 127.0.0.1:41527 (tcp)
TestAliasNotifications_local - 2019/12/06 06:01:15.839040 [INFO] agent: started state syncer
2019/12/06 06:01:15 [WARN]  raft: Heartbeat timeout from "" reached, starting election
2019/12/06 06:01:15 [INFO]  raft: Node at 127.0.0.1:41531 [Candidate] entering Candidate state in term 2
TestAgentAntiEntropy_Services_ConnectProxy - 2019/12/06 06:01:15.932787 [WARN] serf: Shutdown without a Leave
TestAgentAntiEntropy_Services_ACLDeny - 2019/12/06 06:01:15.933993 [WARN] serf: Shutdown without a Leave
TestAgentAntiEntropy_Services_ConnectProxy - 2019/12/06 06:01:16.107957 [INFO] manager: shutting down
TestAgentAntiEntropy_Services_ConnectProxy - 2019/12/06 06:01:16.108867 [INFO] agent: consul server down
TestAgentAntiEntropy_Services_ConnectProxy - 2019/12/06 06:01:16.109009 [INFO] agent: shutdown complete
TestAgentAntiEntropy_Services_ConnectProxy - 2019/12/06 06:01:16.109071 [INFO] agent: Stopping DNS server 127.0.0.1:41501 (tcp)
TestAgentAntiEntropy_Services_ConnectProxy - 2019/12/06 06:01:16.109330 [INFO] agent: Stopping DNS server 127.0.0.1:41501 (udp)
TestAgentAntiEntropy_Services_ConnectProxy - 2019/12/06 06:01:16.109524 [INFO] agent: Stopping HTTP server 127.0.0.1:41502 (tcp)
TestAgentAntiEntropy_Services_ConnectProxy - 2019/12/06 06:01:16.109772 [INFO] agent: Waiting for endpoints to shut down
TestAgentAntiEntropy_Services_ConnectProxy - 2019/12/06 06:01:16.109860 [INFO] agent: Endpoints down
--- PASS: TestAgentAntiEntropy_Services_ConnectProxy (8.81s)
=== CONT  TestStateProxyRestore
TestAgentAntiEntropy_Services_ACLDeny - 2019/12/06 06:01:16.110682 [INFO] manager: shutting down
TestAgentAntiEntropy_Services_ACLDeny - 2019/12/06 06:01:16.111371 [INFO] agent: consul server down
TestAgentAntiEntropy_Services_ACLDeny - 2019/12/06 06:01:16.111420 [INFO] agent: shutdown complete
TestAgentAntiEntropy_Services_ACLDeny - 2019/12/06 06:01:16.111468 [INFO] agent: Stopping DNS server 127.0.0.1:41513 (tcp)
TestAgentAntiEntropy_Services_ACLDeny - 2019/12/06 06:01:16.111626 [INFO] agent: Stopping DNS server 127.0.0.1:41513 (udp)
TestAgentAntiEntropy_Services_ACLDeny - 2019/12/06 06:01:16.111767 [INFO] agent: Stopping HTTP server 127.0.0.1:41514 (tcp)
TestAgentAntiEntropy_Services_ACLDeny - 2019/12/06 06:01:16.111962 [INFO] agent: Waiting for endpoints to shut down
TestAgentAntiEntropy_Services_ACLDeny - 2019/12/06 06:01:16.112025 [INFO] agent: Endpoints down
--- PASS: TestAgentAntiEntropy_Services_ACLDeny (8.81s)
=== CONT  TestAgentAntiEntropy_Checks
TestAgentAntiEntropy_Checks_ACLDeny - 2019/12/06 06:01:16.115167 [INFO] agent: Synced service "api"
TestAgentAntiEntropy_Checks_ACLDeny - 2019/12/06 06:01:16.116640 [WARN] agent: Check "mysql-check" registration blocked by ACLs
--- PASS: TestStateProxyRestore (0.01s)
=== CONT  TestAgentAntiEntropy_Services_WithChecks
WARNING: bootstrap = true: do not enable unless necessary
TestAgentAntiEntropy_Checks - 2019/12/06 06:01:16.172362 [WARN] agent: Node name "Node 0676901b-c0c8-836c-cadd-61755efd955b" will not be discoverable via DNS due to invalid characters. Valid characters include all alpha-numerics and dashes.
TestAgentAntiEntropy_Checks - 2019/12/06 06:01:16.172853 [DEBUG] tlsutil: Update with version 1
TestAgentAntiEntropy_Checks - 2019/12/06 06:01:16.177887 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
WARNING: bootstrap = true: do not enable unless necessary
TestAgentAntiEntropy_Services_WithChecks - 2019/12/06 06:01:16.189875 [WARN] agent: Node name "Node 97305eeb-28be-e39f-868d-b79661033e70" will not be discoverable via DNS due to invalid characters. Valid characters include all alpha-numerics and dashes.
TestAgentAntiEntropy_Services_WithChecks - 2019/12/06 06:01:16.191397 [DEBUG] tlsutil: Update with version 1
TestAgentAntiEntropy_Services_WithChecks - 2019/12/06 06:01:16.198072 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
TestAgentAntiEntropy_Checks_ACLDeny - 2019/12/06 06:01:16.398714 [INFO] agent: Synced check "api-check"
TestAgentAntiEntropy_Checks_ACLDeny - 2019/12/06 06:01:16.398818 [DEBUG] agent: Node info in sync
TestAgentAntiEntropy_Checks_ACLDeny - 2019/12/06 06:01:16.404428 [DEBUG] consul: dropping check "api-check" from result due to ACLs
TestAgentAntiEntropy_Checks_ACLDeny - 2019/12/06 06:01:16.404520 [DEBUG] consul: dropping check "serfHealth" from result due to ACLs
2019/12/06 06:01:16 [INFO]  raft: Election won. Tally: 1
2019/12/06 06:01:16 [INFO]  raft: Node at 127.0.0.1:41531 [Leader] entering Leader state
TestAliasNotifications_local - 2019/12/06 06:01:16.628015 [INFO] consul: cluster leadership acquired
TestAliasNotifications_local - 2019/12/06 06:01:16.628423 [INFO] consul: New leader elected: Node 898c163c-a1ce-5dac-1c7a-f6460a43a0f6
TestAgentAntiEntropy_Checks_ACLDeny - 2019/12/06 06:01:16.630444 [INFO] agent: Synced service "mysql"
TestAgentAntiEntropy_Checks_ACLDeny - 2019/12/06 06:01:16.959552 [INFO] agent: Synced service "api"
TestAgentAntiEntropy_Checks_ACLDeny - 2019/12/06 06:01:16.960094 [WARN] agent: Check "mysql-check" registration blocked by ACLs
TestAliasNotifications_local - 2019/12/06 06:01:17.050732 [INFO] agent: Synced node info
TestAliasNotifications_local - 2019/12/06 06:01:17.050871 [DEBUG] agent: Node info in sync
TestAgentAntiEntropy_Checks_ACLDeny - 2019/12/06 06:01:17.145725 [INFO] agent: Deregistered check "api-check"
TestAgentAntiEntropy_Checks_ACLDeny - 2019/12/06 06:01:17.145798 [DEBUG] agent: Node info in sync
TestAgentAntiEntropy_Checks_ACLDeny - 2019/12/06 06:01:17.146200 [INFO] agent: Requesting shutdown
TestAgentAntiEntropy_Checks_ACLDeny - 2019/12/06 06:01:17.146282 [INFO] consul: shutting down server
TestAgentAntiEntropy_Checks_ACLDeny - 2019/12/06 06:01:17.146331 [WARN] serf: Shutdown without a Leave
2019/12/06 06:01:17 [INFO]  raft: Initial configuration (index=1): [{Suffrage:Voter ID:0676901b-c0c8-836c-cadd-61755efd955b Address:127.0.0.1:41537}]
2019/12/06 06:01:17 [INFO]  raft: Node at 127.0.0.1:41537 [Follower] entering Follower state (Leader: "")
TestAgentAntiEntropy_Checks_ACLDeny - 2019/12/06 06:01:17.230208 [WARN] serf: Shutdown without a Leave
TestAgentAntiEntropy_Checks - 2019/12/06 06:01:17.234817 [INFO] serf: EventMemberJoin: Node 0676901b-c0c8-836c-cadd-61755efd955b.dc1 127.0.0.1
TestAgentAntiEntropy_Checks - 2019/12/06 06:01:17.238258 [INFO] serf: EventMemberJoin: Node 0676901b-c0c8-836c-cadd-61755efd955b 127.0.0.1
TestAgentAntiEntropy_Checks - 2019/12/06 06:01:17.239541 [INFO] consul: Handled member-join event for server "Node 0676901b-c0c8-836c-cadd-61755efd955b.dc1" in area "wan"
TestAgentAntiEntropy_Checks - 2019/12/06 06:01:17.239806 [INFO] consul: Adding LAN server Node 0676901b-c0c8-836c-cadd-61755efd955b (Addr: tcp/127.0.0.1:41537) (DC: dc1)
TestAgentAntiEntropy_Checks - 2019/12/06 06:01:17.240036 [INFO] agent: Started DNS server 127.0.0.1:41532 (tcp)
TestAgentAntiEntropy_Checks - 2019/12/06 06:01:17.240302 [INFO] agent: Started DNS server 127.0.0.1:41532 (udp)
TestAgentAntiEntropy_Checks - 2019/12/06 06:01:17.242653 [INFO] agent: Started HTTP server on 127.0.0.1:41533 (tcp)
TestAgentAntiEntropy_Checks - 2019/12/06 06:01:17.242742 [INFO] agent: started state syncer
2019/12/06 06:01:17 [WARN]  raft: Heartbeat timeout from "" reached, starting election
2019/12/06 06:01:17 [INFO]  raft: Node at 127.0.0.1:41537 [Candidate] entering Candidate state in term 2
TestAgentAntiEntropy_Checks_ACLDeny - 2019/12/06 06:01:17.350728 [INFO] manager: shutting down
TestAgentAntiEntropy_Checks_ACLDeny - 2019/12/06 06:01:17.351533 [INFO] agent: consul server down
TestAgentAntiEntropy_Checks_ACLDeny - 2019/12/06 06:01:17.351590 [INFO] agent: shutdown complete
TestAgentAntiEntropy_Checks_ACLDeny - 2019/12/06 06:01:17.351649 [INFO] agent: Stopping DNS server 127.0.0.1:41507 (tcp)
TestAgentAntiEntropy_Checks_ACLDeny - 2019/12/06 06:01:17.351820 [INFO] agent: Stopping DNS server 127.0.0.1:41507 (udp)
TestAgentAntiEntropy_Checks_ACLDeny - 2019/12/06 06:01:17.352003 [INFO] agent: Stopping HTTP server 127.0.0.1:41508 (tcp)
TestAgentAntiEntropy_Checks_ACLDeny - 2019/12/06 06:01:17.352244 [INFO] agent: Waiting for endpoints to shut down
TestAgentAntiEntropy_Checks_ACLDeny - 2019/12/06 06:01:17.352326 [INFO] agent: Endpoints down
--- PASS: TestAgentAntiEntropy_Checks_ACLDeny (10.05s)
=== CONT  TestAgent_ServiceWatchCh
2019/12/06 06:01:17 [INFO]  raft: Initial configuration (index=1): [{Suffrage:Voter ID:97305eeb-28be-e39f-868d-b79661033e70 Address:127.0.0.1:41543}]
TestAgentAntiEntropy_Services_WithChecks - 2019/12/06 06:01:17.445963 [INFO] serf: EventMemberJoin: Node 97305eeb-28be-e39f-868d-b79661033e70.dc1 127.0.0.1
TestAgentAntiEntropy_Services_WithChecks - 2019/12/06 06:01:17.449710 [INFO] serf: EventMemberJoin: Node 97305eeb-28be-e39f-868d-b79661033e70 127.0.0.1
TestAgentAntiEntropy_Services_WithChecks - 2019/12/06 06:01:17.451597 [INFO] agent: Started DNS server 127.0.0.1:41538 (udp)
2019/12/06 06:01:17 [INFO]  raft: Node at 127.0.0.1:41543 [Follower] entering Follower state (Leader: "")
TestAgentAntiEntropy_Services_WithChecks - 2019/12/06 06:01:17.457323 [INFO] consul: Adding LAN server Node 97305eeb-28be-e39f-868d-b79661033e70 (Addr: tcp/127.0.0.1:41543) (DC: dc1)
TestAgentAntiEntropy_Services_WithChecks - 2019/12/06 06:01:17.457710 [INFO] consul: Handled member-join event for server "Node 97305eeb-28be-e39f-868d-b79661033e70.dc1" in area "wan"
TestAgentAntiEntropy_Services_WithChecks - 2019/12/06 06:01:17.457998 [INFO] agent: Started DNS server 127.0.0.1:41538 (tcp)
TestAgentAntiEntropy_Services_WithChecks - 2019/12/06 06:01:17.460452 [INFO] agent: Started HTTP server on 127.0.0.1:41539 (tcp)
TestAgentAntiEntropy_Services_WithChecks - 2019/12/06 06:01:17.460567 [INFO] agent: started state syncer
2019/12/06 06:01:17 [WARN]  raft: Heartbeat timeout from "" reached, starting election
2019/12/06 06:01:17 [INFO]  raft: Node at 127.0.0.1:41543 [Candidate] entering Candidate state in term 2
WARNING: bootstrap = true: do not enable unless necessary
TestAgent_ServiceWatchCh - 2019/12/06 06:01:17.512730 [WARN] agent: Node name "Node db53f6bb-8b8f-47ba-6dbd-77c6f165c6f0" will not be discoverable via DNS due to invalid characters. Valid characters include all alpha-numerics and dashes.
TestAgent_ServiceWatchCh - 2019/12/06 06:01:17.513208 [DEBUG] tlsutil: Update with version 1
TestAgent_ServiceWatchCh - 2019/12/06 06:01:17.515520 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
2019/12/06 06:01:18 [INFO]  raft: Election won. Tally: 1
2019/12/06 06:01:18 [INFO]  raft: Node at 127.0.0.1:41537 [Leader] entering Leader state
TestAgentAntiEntropy_Checks - 2019/12/06 06:01:18.067462 [INFO] consul: cluster leadership acquired
TestAgentAntiEntropy_Checks - 2019/12/06 06:01:18.067876 [INFO] consul: New leader elected: Node 0676901b-c0c8-836c-cadd-61755efd955b
TestAliasNotifications_local - 2019/12/06 06:01:18.274273 [DEBUG] agent: Node info in sync
2019/12/06 06:01:18 [INFO]  raft: Election won. Tally: 1
2019/12/06 06:01:18 [INFO]  raft: Node at 127.0.0.1:41543 [Leader] entering Leader state
TestAgentAntiEntropy_Services_WithChecks - 2019/12/06 06:01:18.325113 [INFO] consul: cluster leadership acquired
TestAgentAntiEntropy_Services_WithChecks - 2019/12/06 06:01:18.325625 [INFO] consul: New leader elected: Node 97305eeb-28be-e39f-868d-b79661033e70
TestAliasNotifications_local - 2019/12/06 06:01:18.329652 [INFO] connect: initialized primary datacenter CA with provider "consul"
TestAliasNotifications_local - 2019/12/06 06:01:18.330216 [DEBUG] consul: Skipping self join check for "Node 898c163c-a1ce-5dac-1c7a-f6460a43a0f6" since the cluster is too small
TestAliasNotifications_local - 2019/12/06 06:01:18.330371 [INFO] consul: member 'Node 898c163c-a1ce-5dac-1c7a-f6460a43a0f6' joined, marking health alive
TestAgentAntiEntropy_Checks - 2019/12/06 06:01:18.535167 [INFO] agent: Synced node info
2019/12/06 06:01:18 [INFO]  raft: Initial configuration (index=1): [{Suffrage:Voter ID:db53f6bb-8b8f-47ba-6dbd-77c6f165c6f0 Address:127.0.0.1:41549}]
TestAgent_ServiceWatchCh - 2019/12/06 06:01:18.738167 [INFO] serf: EventMemberJoin: Node db53f6bb-8b8f-47ba-6dbd-77c6f165c6f0.dc1 127.0.0.1
2019/12/06 06:01:18 [INFO]  raft: Node at 127.0.0.1:41549 [Follower] entering Follower state (Leader: "")
TestAgent_ServiceWatchCh - 2019/12/06 06:01:18.757202 [INFO] serf: EventMemberJoin: Node db53f6bb-8b8f-47ba-6dbd-77c6f165c6f0 127.0.0.1
TestAgent_ServiceWatchCh - 2019/12/06 06:01:18.764883 [INFO] agent: Started DNS server 127.0.0.1:41544 (udp)
TestAgent_ServiceWatchCh - 2019/12/06 06:01:18.770304 [INFO] consul: Adding LAN server Node db53f6bb-8b8f-47ba-6dbd-77c6f165c6f0 (Addr: tcp/127.0.0.1:41549) (DC: dc1)
TestAgent_ServiceWatchCh - 2019/12/06 06:01:18.772073 [INFO] consul: Handled member-join event for server "Node db53f6bb-8b8f-47ba-6dbd-77c6f165c6f0.dc1" in area "wan"
TestAgent_ServiceWatchCh - 2019/12/06 06:01:18.777049 [INFO] agent: Started DNS server 127.0.0.1:41544 (tcp)
TestAgent_ServiceWatchCh - 2019/12/06 06:01:18.779984 [INFO] agent: Started HTTP server on 127.0.0.1:41545 (tcp)
TestAgent_ServiceWatchCh - 2019/12/06 06:01:18.780092 [INFO] agent: started state syncer
2019/12/06 06:01:18 [WARN]  raft: Heartbeat timeout from "" reached, starting election
2019/12/06 06:01:18 [INFO]  raft: Node at 127.0.0.1:41549 [Candidate] entering Candidate state in term 2
TestAliasNotifications_local - 2019/12/06 06:01:18.976588 [INFO] agent: Synced service "socat"
TestAliasNotifications_local - 2019/12/06 06:01:19.181408 [INFO] agent: Synced service "socat-sidecar-proxy"
TestAliasNotifications_local - 2019/12/06 06:01:19.181508 [DEBUG] agent: Check "service:socat-sidecar-proxy:2" in sync
TestAliasNotifications_local - 2019/12/06 06:01:19.181553 [DEBUG] agent: Check "service:socat-tcp" in sync
TestAliasNotifications_local - 2019/12/06 06:01:19.181585 [DEBUG] agent: Node info in sync
TestAliasNotifications_local - 2019/12/06 06:01:19.181731 [DEBUG] agent: Service "socat" in sync
TestAliasNotifications_local - 2019/12/06 06:01:19.181780 [DEBUG] agent: Service "socat-sidecar-proxy" in sync
TestAliasNotifications_local - 2019/12/06 06:01:19.181824 [DEBUG] agent: Check "service:socat-tcp" in sync
TestAliasNotifications_local - 2019/12/06 06:01:19.181866 [DEBUG] agent: Check "service:socat-sidecar-proxy:2" in sync
TestAliasNotifications_local - 2019/12/06 06:01:19.181895 [DEBUG] agent: Node info in sync
TestAliasNotifications_local - 2019/12/06 06:01:19.182097 [DEBUG] agent: Service "socat" in sync
TestAliasNotifications_local - 2019/12/06 06:01:19.182159 [DEBUG] agent: Service "socat-sidecar-proxy" in sync
TestAliasNotifications_local - 2019/12/06 06:01:19.182205 [DEBUG] agent: Check "service:socat-tcp" in sync
TestAliasNotifications_local - 2019/12/06 06:01:19.182246 [DEBUG] agent: Check "service:socat-sidecar-proxy:2" in sync
TestAgentAntiEntropy_Services_WithChecks - 2019/12/06 06:01:19.261460 [INFO] agent: Synced node info
TestAgentAntiEntropy_Services_WithChecks - 2019/12/06 06:01:19.261579 [DEBUG] agent: Node info in sync
TestAliasNotifications_local - 2019/12/06 06:01:19.366193 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
TestAliasNotifications_local - 2019/12/06 06:01:19.367540 [INFO] agent: Synced check "service:socat-maintenance"
TestAliasNotifications_local - 2019/12/06 06:01:19.367598 [DEBUG] agent: Node info in sync
TestAliasNotifications_local - 2019/12/06 06:01:19.368462 [DEBUG] agent: Service "socat" in sync
TestAliasNotifications_local - 2019/12/06 06:01:19.368522 [DEBUG] agent: Service "socat-sidecar-proxy" in sync
TestAliasNotifications_local - 2019/12/06 06:01:19.368570 [DEBUG] agent: Check "service:socat-tcp" in sync
2019/12/06 06:01:19 [INFO]  raft: Election won. Tally: 1
2019/12/06 06:01:19 [INFO]  raft: Node at 127.0.0.1:41549 [Leader] entering Leader state
TestAgent_ServiceWatchCh - 2019/12/06 06:01:19.477430 [INFO] consul: cluster leadership acquired
TestAgent_ServiceWatchCh - 2019/12/06 06:01:19.477836 [INFO] consul: New leader elected: Node db53f6bb-8b8f-47ba-6dbd-77c6f165c6f0
TestAliasNotifications_local - 2019/12/06 06:01:19.639555 [INFO] agent: Synced check "service:socat-sidecar-proxy:2"
TestAliasNotifications_local - 2019/12/06 06:01:19.639643 [DEBUG] agent: Check "service:socat-maintenance" in sync
TestAliasNotifications_local - 2019/12/06 06:01:19.639680 [DEBUG] agent: Node info in sync
TestAliasNotifications_local - 2019/12/06 06:01:19.640554 [DEBUG] agent: Service "socat" in sync
TestAliasNotifications_local - 2019/12/06 06:01:19.640617 [DEBUG] agent: Service "socat-sidecar-proxy" in sync
TestAliasNotifications_local - 2019/12/06 06:01:19.640667 [DEBUG] agent: Check "service:socat-tcp" in sync
TestAgentAntiEntropy_Checks - 2019/12/06 06:01:19.800256 [INFO] connect: initialized primary datacenter CA with provider "consul"
TestAgentAntiEntropy_Checks - 2019/12/06 06:01:19.800688 [DEBUG] consul: Skipping self join check for "Node 0676901b-c0c8-836c-cadd-61755efd955b" since the cluster is too small
TestAgentAntiEntropy_Checks - 2019/12/06 06:01:19.800829 [INFO] consul: member 'Node 0676901b-c0c8-836c-cadd-61755efd955b' joined, marking health alive
TestAliasNotifications_local - 2019/12/06 06:01:19.801285 [INFO] agent: Synced check "service:socat-sidecar-proxy:2"
TestAgent_ServiceWatchCh - 2019/12/06 06:01:20.017153 [INFO] agent: Synced node info
TestAliasNotifications_local - 2019/12/06 06:01:20.109955 [INFO] agent: Deregistered check "service:socat-maintenance"
TestAliasNotifications_local - 2019/12/06 06:01:20.110033 [DEBUG] agent: Node info in sync
TestAliasNotifications_local - 2019/12/06 06:01:20.110212 [DEBUG] agent: Service "socat" in sync
TestAliasNotifications_local - 2019/12/06 06:01:20.110271 [DEBUG] agent: Service "socat-sidecar-proxy" in sync
TestAliasNotifications_local - 2019/12/06 06:01:20.110322 [DEBUG] agent: Check "service:socat-tcp" in sync
TestAliasNotifications_local - 2019/12/06 06:01:20.110365 [DEBUG] agent: Check "service:socat-sidecar-proxy:2" in sync
TestAliasNotifications_local - 2019/12/06 06:01:20.110396 [DEBUG] agent: Node info in sync
TestAliasNotifications_local - 2019/12/06 06:01:20.111306 [DEBUG] agent: Service "socat" in sync
TestAliasNotifications_local - 2019/12/06 06:01:20.111414 [DEBUG] agent: Service "socat-sidecar-proxy" in sync
TestAgentAntiEntropy_Services_WithChecks - 2019/12/06 06:01:20.342236 [INFO] connect: initialized primary datacenter CA with provider "consul"
TestAgentAntiEntropy_Services_WithChecks - 2019/12/06 06:01:20.342700 [DEBUG] consul: Skipping self join check for "Node 97305eeb-28be-e39f-868d-b79661033e70" since the cluster is too small
TestAgentAntiEntropy_Services_WithChecks - 2019/12/06 06:01:20.342863 [INFO] consul: member 'Node 97305eeb-28be-e39f-868d-b79661033e70' joined, marking health alive
TestAliasNotifications_local - 2019/12/06 06:01:20.345872 [INFO] agent: Synced check "service:socat-tcp"
TestAgentAntiEntropy_Checks - 2019/12/06 06:01:20.365023 [DEBUG] agent: Skipping remote check "serfHealth" since it is managed automatically
TestAgentAntiEntropy_Checks - 2019/12/06 06:01:20.365137 [DEBUG] agent: Check "mysql" in sync
TestAliasNotifications_local - 2019/12/06 06:01:20.612347 [INFO] agent: Synced check "service:socat-sidecar-proxy:2"
TestAliasNotifications_local - 2019/12/06 06:01:20.612803 [DEBUG] agent: Node info in sync
TestAliasNotifications_local - 2019/12/06 06:01:20.613184 [INFO] agent: Requesting shutdown
TestAliasNotifications_local - 2019/12/06 06:01:20.613185 [DEBUG] agent: Service "socat" in sync
TestAliasNotifications_local - 2019/12/06 06:01:20.613634 [DEBUG] agent: Service "socat-sidecar-proxy" in sync
TestAliasNotifications_local - 2019/12/06 06:01:20.613805 [DEBUG] agent: Check "service:socat-tcp" in sync
TestAliasNotifications_local - 2019/12/06 06:01:20.613972 [DEBUG] agent: Check "service:socat-sidecar-proxy:2" in sync
TestAliasNotifications_local - 2019/12/06 06:01:20.614126 [DEBUG] agent: Node info in sync
TestAliasNotifications_local - 2019/12/06 06:01:20.614533 [INFO] consul: shutting down server
TestAliasNotifications_local - 2019/12/06 06:01:20.614917 [WARN] serf: Shutdown without a Leave
TestAgentAntiEntropy_Services_WithChecks - 2019/12/06 06:01:20.624809 [DEBUG] agent: Skipping remote check "serfHealth" since it is managed automatically
TestAliasNotifications_local - 2019/12/06 06:01:20.718109 [WARN] serf: Shutdown without a Leave
TestAliasNotifications_local - 2019/12/06 06:01:20.817318 [INFO] manager: shutting down
TestAliasNotifications_local - 2019/12/06 06:01:20.818207 [INFO] agent: consul server down
TestAliasNotifications_local - 2019/12/06 06:01:20.818275 [INFO] agent: shutdown complete
TestAliasNotifications_local - 2019/12/06 06:01:20.818334 [INFO] agent: Stopping DNS server 127.0.0.1:41526 (tcp)
TestAliasNotifications_local - 2019/12/06 06:01:20.818490 [INFO] agent: Stopping DNS server 127.0.0.1:41526 (udp)
TestAliasNotifications_local - 2019/12/06 06:01:20.818649 [INFO] agent: Stopping HTTP server 127.0.0.1:41527 (tcp)
TestAliasNotifications_local - 2019/12/06 06:01:20.818866 [INFO] agent: Waiting for endpoints to shut down
TestAliasNotifications_local - 2019/12/06 06:01:20.818947 [INFO] agent: Endpoints down
--- PASS: TestAliasNotifications_local (6.33s)
=== CONT  TestAgent_sendCoordinate
TestAgentAntiEntropy_Checks - 2019/12/06 06:01:20.819725 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
TestAgentAntiEntropy_Checks - 2019/12/06 06:01:20.821386 [INFO] agent: Synced check "redis"
TestAgentAntiEntropy_Checks - 2019/12/06 06:01:20.821446 [DEBUG] agent: Node info in sync
TestAgentAntiEntropy_Checks - 2019/12/06 06:01:20.821541 [DEBUG] agent: Check "mysql" in sync
TestAgentAntiEntropy_Checks - 2019/12/06 06:01:20.821590 [DEBUG] agent: Check "redis" in sync
TestAgentAntiEntropy_Checks - 2019/12/06 06:01:20.821622 [DEBUG] agent: Node info in sync
WARNING: bootstrap = true: do not enable unless necessary
TestAgent_sendCoordinate - 2019/12/06 06:01:20.885132 [WARN] agent: Node name "Node 0ba4c23a-8067-c678-5014-89f39d82376d" will not be discoverable via DNS due to invalid characters. Valid characters include all alpha-numerics and dashes.
TestAgent_sendCoordinate - 2019/12/06 06:01:20.885536 [DEBUG] tlsutil: Update with version 1
TestAgent_sendCoordinate - 2019/12/06 06:01:20.887700 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
TestAgentAntiEntropy_Services_WithChecks - 2019/12/06 06:01:20.985311 [INFO] agent: Synced service "mysql"
TestAgentAntiEntropy_Services_WithChecks - 2019/12/06 06:01:20.985417 [DEBUG] agent: Check "mysql" in sync
TestAgentAntiEntropy_Services_WithChecks - 2019/12/06 06:01:20.985457 [DEBUG] agent: Node info in sync
TestAgentAntiEntropy_Services_WithChecks - 2019/12/06 06:01:20.986245 [DEBUG] agent: Skipping remote check "serfHealth" since it is managed automatically
TestAgentAntiEntropy_Services_WithChecks - 2019/12/06 06:01:21.186821 [INFO] agent: Synced service "mysql"
TestAgent_ServiceWatchCh - 2019/12/06 06:01:21.190561 [INFO] connect: initialized primary datacenter CA with provider "consul"
TestAgent_ServiceWatchCh - 2019/12/06 06:01:21.191523 [DEBUG] consul: Skipping self join check for "Node db53f6bb-8b8f-47ba-6dbd-77c6f165c6f0" since the cluster is too small
TestAgent_ServiceWatchCh - 2019/12/06 06:01:21.191692 [INFO] consul: member 'Node db53f6bb-8b8f-47ba-6dbd-77c6f165c6f0' joined, marking health alive
TestAgentAntiEntropy_Checks - 2019/12/06 06:01:21.293696 [INFO] agent: Synced check "web"
TestAgentAntiEntropy_Checks - 2019/12/06 06:01:21.293792 [DEBUG] agent: Check "mysql" in sync
TestAgentAntiEntropy_Checks - 2019/12/06 06:01:21.293837 [DEBUG] agent: Check "redis" in sync
TestAgentAntiEntropy_Checks - 2019/12/06 06:01:21.293870 [DEBUG] agent: Node info in sync
TestAgentAntiEntropy_Checks - 2019/12/06 06:01:21.296549 [DEBUG] agent: Check "mysql" in sync
TestAgentAntiEntropy_Checks - 2019/12/06 06:01:21.296633 [DEBUG] agent: Check "redis" in sync
TestAgentAntiEntropy_Checks - 2019/12/06 06:01:21.296675 [DEBUG] agent: Check "web" in sync
TestAgentAntiEntropy_Checks - 2019/12/06 06:01:21.296713 [DEBUG] agent: Check "cache" in sync
TestAgentAntiEntropy_Checks - 2019/12/06 06:01:21.296743 [DEBUG] agent: Node info in sync
TestAgentAntiEntropy_Checks - 2019/12/06 06:01:21.298064 [DEBUG] agent: Skipping remote check "serfHealth" since it is managed automatically
TestAgentAntiEntropy_Services_WithChecks - 2019/12/06 06:01:21.386767 [INFO] agent: Synced service "redis"
TestAgentAntiEntropy_Services_WithChecks - 2019/12/06 06:01:21.387074 [DEBUG] agent: Check "mysql" in sync
TestAgentAntiEntropy_Services_WithChecks - 2019/12/06 06:01:21.387297 [DEBUG] agent: Check "redis:1" in sync
TestAgentAntiEntropy_Services_WithChecks - 2019/12/06 06:01:21.387471 [DEBUG] agent: Check "redis:2" in sync
TestAgentAntiEntropy_Services_WithChecks - 2019/12/06 06:01:21.387629 [DEBUG] agent: Node info in sync
TestAgentAntiEntropy_Services_WithChecks - 2019/12/06 06:01:21.388094 [INFO] agent: Requesting shutdown
TestAgentAntiEntropy_Services_WithChecks - 2019/12/06 06:01:21.388319 [INFO] consul: shutting down server
TestAgentAntiEntropy_Services_WithChecks - 2019/12/06 06:01:21.388535 [WARN] serf: Shutdown without a Leave
TestAgentAntiEntropy_Services_WithChecks - 2019/12/06 06:01:21.391249 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
TestAgent_ServiceWatchCh - 2019/12/06 06:01:21.391964 [INFO] agent: Requesting shutdown
TestAgent_ServiceWatchCh - 2019/12/06 06:01:21.392052 [INFO] consul: shutting down server
TestAgent_ServiceWatchCh - 2019/12/06 06:01:21.392100 [WARN] serf: Shutdown without a Leave
TestAgentAntiEntropy_Services_WithChecks - 2019/12/06 06:01:21.395796 [WARN] consul: error getting server health from "Node 97305eeb-28be-e39f-868d-b79661033e70": rpc error making call: EOF
TestAgent_ServiceWatchCh - 2019/12/06 06:01:21.466274 [WARN] serf: Shutdown without a Leave
TestAgentAntiEntropy_Services_WithChecks - 2019/12/06 06:01:21.495567 [WARN] serf: Shutdown without a Leave
TestAgentAntiEntropy_Checks - 2019/12/06 06:01:21.500495 [INFO] agent: Deregistered check "lb"
TestAgentAntiEntropy_Checks - 2019/12/06 06:01:21.500620 [DEBUG] agent: Check "mysql" in sync
TestAgentAntiEntropy_Checks - 2019/12/06 06:01:21.500718 [DEBUG] agent: Check "redis" in sync
TestAgentAntiEntropy_Checks - 2019/12/06 06:01:21.500809 [DEBUG] agent: Check "web" in sync
TestAgent_ServiceWatchCh - 2019/12/06 06:01:21.549936 [INFO] manager: shutting down
TestAgent_ServiceWatchCh - 2019/12/06 06:01:21.550452 [INFO] agent: consul server down
TestAgent_ServiceWatchCh - 2019/12/06 06:01:21.550503 [INFO] agent: shutdown complete
TestAgent_ServiceWatchCh - 2019/12/06 06:01:21.550560 [INFO] agent: Stopping DNS server 127.0.0.1:41544 (tcp)
TestAgent_ServiceWatchCh - 2019/12/06 06:01:21.550704 [INFO] agent: Stopping DNS server 127.0.0.1:41544 (udp)
TestAgent_ServiceWatchCh - 2019/12/06 06:01:21.550851 [INFO] agent: Stopping HTTP server 127.0.0.1:41545 (tcp)
TestAgent_ServiceWatchCh - 2019/12/06 06:01:21.551032 [INFO] agent: Waiting for endpoints to shut down
TestAgent_ServiceWatchCh - 2019/12/06 06:01:21.551095 [INFO] agent: Endpoints down
--- PASS: TestAgent_ServiceWatchCh (4.20s)
=== CONT  TestState_Notify
--- PASS: TestState_Notify (0.00s)
=== CONT  TestAgent_AliasCheck
--- PASS: TestAgent_AliasCheck (0.04s)
TestAgentAntiEntropy_Services_WithChecks - 2019/12/06 06:01:21.616291 [INFO] manager: shutting down
TestAgentAntiEntropy_Services_WithChecks - 2019/12/06 06:01:21.616852 [INFO] agent: consul server down
TestAgentAntiEntropy_Services_WithChecks - 2019/12/06 06:01:21.616941 [INFO] agent: shutdown complete
TestAgentAntiEntropy_Services_WithChecks - 2019/12/06 06:01:21.617004 [INFO] agent: Stopping DNS server 127.0.0.1:41538 (tcp)
TestAgentAntiEntropy_Services_WithChecks - 2019/12/06 06:01:21.617178 [INFO] agent: Stopping DNS server 127.0.0.1:41538 (udp)
TestAgentAntiEntropy_Services_WithChecks - 2019/12/06 06:01:21.617369 [INFO] agent: Stopping HTTP server 127.0.0.1:41539 (tcp)
TestAgentAntiEntropy_Services_WithChecks - 2019/12/06 06:01:21.617614 [INFO] agent: Waiting for endpoints to shut down
TestAgentAntiEntropy_Services_WithChecks - 2019/12/06 06:01:21.617687 [INFO] agent: Endpoints down
--- PASS: TestAgentAntiEntropy_Services_WithChecks (5.50s)
TestAgentAntiEntropy_Checks - 2019/12/06 06:01:21.684406 [INFO] agent: Synced check "cache"
TestAgentAntiEntropy_Checks - 2019/12/06 06:01:21.684490 [DEBUG] agent: Node info in sync
TestAgentAntiEntropy_Checks - 2019/12/06 06:01:21.685286 [DEBUG] agent: Check "cache" in sync
TestAgentAntiEntropy_Checks - 2019/12/06 06:01:21.685345 [DEBUG] agent: Check "mysql" in sync
2019/12/06 06:01:21 [INFO]  raft: Initial configuration (index=1): [{Suffrage:Voter ID:0ba4c23a-8067-c678-5014-89f39d82376d Address:127.0.0.1:41555}]
TestAgentAntiEntropy_Checks - 2019/12/06 06:01:21.825950 [INFO] agent: Deregistered check "redis"
TestAgentAntiEntropy_Checks - 2019/12/06 06:01:21.826045 [DEBUG] agent: Check "web" in sync
TestAgentAntiEntropy_Checks - 2019/12/06 06:01:21.826087 [DEBUG] agent: Node info in sync
TestAgentAntiEntropy_Checks - 2019/12/06 06:01:21.826259 [DEBUG] agent: Skipping remote check "serfHealth" since it is managed automatically
2019/12/06 06:01:21 [INFO]  raft: Node at 127.0.0.1:41555 [Follower] entering Follower state (Leader: "")
TestAgent_sendCoordinate - 2019/12/06 06:01:21.829345 [INFO] serf: EventMemberJoin: Node 0ba4c23a-8067-c678-5014-89f39d82376d.dc1 127.0.0.1
TestAgent_sendCoordinate - 2019/12/06 06:01:21.832666 [INFO] serf: EventMemberJoin: Node 0ba4c23a-8067-c678-5014-89f39d82376d 127.0.0.1
TestAgent_sendCoordinate - 2019/12/06 06:01:21.833188 [INFO] consul: Handled member-join event for server "Node 0ba4c23a-8067-c678-5014-89f39d82376d.dc1" in area "wan"
TestAgent_sendCoordinate - 2019/12/06 06:01:21.833498 [INFO] consul: Adding LAN server Node 0ba4c23a-8067-c678-5014-89f39d82376d (Addr: tcp/127.0.0.1:41555) (DC: dc1)
TestAgent_sendCoordinate - 2019/12/06 06:01:21.834053 [INFO] agent: Started DNS server 127.0.0.1:41550 (udp)
TestAgent_sendCoordinate - 2019/12/06 06:01:21.834130 [INFO] agent: Started DNS server 127.0.0.1:41550 (tcp)
TestAgent_sendCoordinate - 2019/12/06 06:01:21.836667 [INFO] agent: Started HTTP server on 127.0.0.1:41551 (tcp)
TestAgent_sendCoordinate - 2019/12/06 06:01:21.836916 [INFO] agent: started state syncer
2019/12/06 06:01:21 [WARN]  raft: Heartbeat timeout from "" reached, starting election
2019/12/06 06:01:21 [INFO]  raft: Node at 127.0.0.1:41555 [Candidate] entering Candidate state in term 2
TestAgentAntiEntropy_Checks - 2019/12/06 06:01:22.033421 [INFO] agent: Deregistered check "redis"
TestAgentAntiEntropy_Checks - 2019/12/06 06:01:22.033530 [DEBUG] agent: Check "web" in sync
TestAgentAntiEntropy_Checks - 2019/12/06 06:01:22.033578 [DEBUG] agent: Check "cache" in sync
TestAgentAntiEntropy_Checks - 2019/12/06 06:01:22.033678 [DEBUG] agent: Check "mysql" in sync
TestAgentAntiEntropy_Checks - 2019/12/06 06:01:22.176022 [INFO] agent: Synced node info
TestAgentAntiEntropy_Checks - 2019/12/06 06:01:22.176464 [INFO] agent: Requesting shutdown
TestAgentAntiEntropy_Checks - 2019/12/06 06:01:22.176540 [INFO] consul: shutting down server
TestAgentAntiEntropy_Checks - 2019/12/06 06:01:22.176590 [WARN] serf: Shutdown without a Leave
TestAgentAntiEntropy_Checks - 2019/12/06 06:01:22.242595 [WARN] serf: Shutdown without a Leave
2019/12/06 06:01:22 [INFO]  raft: Election won. Tally: 1
2019/12/06 06:01:22 [INFO]  raft: Node at 127.0.0.1:41555 [Leader] entering Leader state
TestAgentAntiEntropy_Checks - 2019/12/06 06:01:22.333022 [INFO] manager: shutting down
TestAgentAntiEntropy_Checks - 2019/12/06 06:01:22.333975 [INFO] agent: consul server down
TestAgentAntiEntropy_Checks - 2019/12/06 06:01:22.334124 [INFO] agent: shutdown complete
TestAgentAntiEntropy_Checks - 2019/12/06 06:01:22.334237 [INFO] agent: Stopping DNS server 127.0.0.1:41532 (tcp)
TestAgentAntiEntropy_Checks - 2019/12/06 06:01:22.334457 [INFO] agent: Stopping DNS server 127.0.0.1:41532 (udp)
TestAgentAntiEntropy_Checks - 2019/12/06 06:01:22.334620 [INFO] agent: Stopping HTTP server 127.0.0.1:41533 (tcp)
TestAgentAntiEntropy_Checks - 2019/12/06 06:01:22.334834 [INFO] agent: Waiting for endpoints to shut down
TestAgentAntiEntropy_Checks - 2019/12/06 06:01:22.334914 [INFO] agent: Endpoints down
--- PASS: TestAgentAntiEntropy_Checks (6.22s)
TestAgent_sendCoordinate - 2019/12/06 06:01:22.335024 [INFO] consul: cluster leadership acquired
TestAgent_sendCoordinate - 2019/12/06 06:01:22.335398 [INFO] consul: New leader elected: Node 0ba4c23a-8067-c678-5014-89f39d82376d
TestAgentAntiEntropy_Services_WithChecks - 2019/12/06 06:01:22.391425 [WARN] consul: error getting server health from "Node 97305eeb-28be-e39f-868d-b79661033e70": context deadline exceeded
TestAgent_sendCoordinate - 2019/12/06 06:01:22.642547 [INFO] agent: Synced node info
TestAgent_sendCoordinate - 2019/12/06 06:01:23.436919 [INFO] agent: Requesting shutdown
TestAgent_sendCoordinate - 2019/12/06 06:01:23.437017 [INFO] consul: shutting down server
TestAgent_sendCoordinate - 2019/12/06 06:01:23.437086 [WARN] serf: Shutdown without a Leave
TestAgent_sendCoordinate - 2019/12/06 06:01:23.611276 [WARN] serf: Shutdown without a Leave
TestAgent_sendCoordinate - 2019/12/06 06:01:23.783006 [INFO] manager: shutting down
TestAgent_sendCoordinate - 2019/12/06 06:01:23.812329 [ERR] agent: Coordinate update error: No cluster leader
TestAgent_sendCoordinate - 2019/12/06 06:01:23.866317 [ERR] consul: failed to establish leadership: error configuring provider: leadership lost while committing log
TestAgent_sendCoordinate - 2019/12/06 06:01:23.866582 [INFO] agent: consul server down
TestAgent_sendCoordinate - 2019/12/06 06:01:23.866634 [INFO] agent: shutdown complete
TestAgent_sendCoordinate - 2019/12/06 06:01:23.866695 [INFO] agent: Stopping DNS server 127.0.0.1:41550 (tcp)
TestAgent_sendCoordinate - 2019/12/06 06:01:23.866860 [INFO] agent: Stopping DNS server 127.0.0.1:41550 (udp)
TestAgent_sendCoordinate - 2019/12/06 06:01:23.867053 [INFO] agent: Stopping HTTP server 127.0.0.1:41551 (tcp)
TestAgent_sendCoordinate - 2019/12/06 06:01:23.867287 [INFO] agent: Waiting for endpoints to shut down
TestAgent_sendCoordinate - 2019/12/06 06:01:23.867366 [INFO] agent: Endpoints down
--- PASS: TestAgent_sendCoordinate (3.05s)
    state_test.go:1859: 10 1 100ms
PASS
ok  	github.com/hashicorp/consul/agent/local	16.992s
=== RUN   TestBuild
=== RUN   TestBuild/no_version
=== RUN   TestBuild/bad_version
=== RUN   TestBuild/good_version
=== RUN   TestBuild/rc_version
=== RUN   TestBuild/ent_version
--- PASS: TestBuild (0.01s)
    --- PASS: TestBuild/no_version (0.00s)
    --- PASS: TestBuild/bad_version (0.00s)
    --- PASS: TestBuild/good_version (0.00s)
    --- PASS: TestBuild/rc_version (0.00s)
    --- PASS: TestBuild/ent_version (0.00s)
=== RUN   TestServer_Key_Equal
--- PASS: TestServer_Key_Equal (0.00s)
=== RUN   TestServer_Key
--- PASS: TestServer_Key (0.00s)
=== RUN   TestServer_Key_params
--- PASS: TestServer_Key_params (0.00s)
=== RUN   TestIsConsulServer
--- PASS: TestIsConsulServer (0.00s)
=== RUN   TestIsConsulServer_Optional
--- PASS: TestIsConsulServer_Optional (0.00s)
PASS
ok  	github.com/hashicorp/consul/agent/metadata	0.077s
?   	github.com/hashicorp/consul/agent/mock	[no test files]
?   	github.com/hashicorp/consul/agent/pool	[no test files]
=== RUN   TestManager_BasicLifecycle
--- SKIP: TestManager_BasicLifecycle (0.00s)
    manager_test.go:42: DM-skipped
=== RUN   TestManager_deliverLatest
--- PASS: TestManager_deliverLatest (0.00s)
=== RUN   TestStateChanged
=== RUN   TestStateChanged/nil_node_service
=== RUN   TestStateChanged/same_service
=== RUN   TestStateChanged/same_service,_different_token
=== RUN   TestStateChanged/different_service_ID
=== RUN   TestStateChanged/different_address
=== RUN   TestStateChanged/different_port
=== RUN   TestStateChanged/different_service_kind
=== RUN   TestStateChanged/different_proxy_target
=== RUN   TestStateChanged/different_proxy_upstreams
--- PASS: TestStateChanged (0.01s)
    --- PASS: TestStateChanged/nil_node_service (0.00s)
    --- PASS: TestStateChanged/same_service (0.00s)
    --- PASS: TestStateChanged/same_service,_different_token (0.00s)
    --- PASS: TestStateChanged/different_service_ID (0.00s)
    --- PASS: TestStateChanged/different_address (0.00s)
    --- PASS: TestStateChanged/different_port (0.00s)
    --- PASS: TestStateChanged/different_service_kind (0.00s)
    --- PASS: TestStateChanged/different_proxy_target (0.00s)
    --- PASS: TestStateChanged/different_proxy_upstreams (0.00s)
PASS
ok  	github.com/hashicorp/consul/agent/proxycfg	0.254s
=== RUN   TestDaemon_impl
--- PASS: TestDaemon_impl (0.00s)
=== RUN   TestDaemonStartStop
=== PAUSE TestDaemonStartStop
=== RUN   TestDaemonRestart
=== PAUSE TestDaemonRestart
=== RUN   TestDaemonLaunchesNewProcessGroup
=== PAUSE TestDaemonLaunchesNewProcessGroup
=== RUN   TestDaemonStop_kill
=== PAUSE TestDaemonStop_kill
=== RUN   TestDaemonStop_killAdopted
=== PAUSE TestDaemonStop_killAdopted
=== RUN   TestDaemonStart_pidFile
=== PAUSE TestDaemonStart_pidFile
=== RUN   TestDaemonRestart_pidFile
=== PAUSE TestDaemonRestart_pidFile
=== RUN   TestDaemonEqual
=== RUN   TestDaemonEqual/Different_type
=== RUN   TestDaemonEqual/Nil
=== RUN   TestDaemonEqual/Equal
=== RUN   TestDaemonEqual/Different_proxy_ID
=== RUN   TestDaemonEqual/Different_path
=== RUN   TestDaemonEqual/Different_dir
=== RUN   TestDaemonEqual/Different_args
=== RUN   TestDaemonEqual/Different_token
--- PASS: TestDaemonEqual (0.00s)
    --- PASS: TestDaemonEqual/Different_type (0.00s)
    --- PASS: TestDaemonEqual/Nil (0.00s)
    --- PASS: TestDaemonEqual/Equal (0.00s)
    --- PASS: TestDaemonEqual/Different_proxy_ID (0.00s)
    --- PASS: TestDaemonEqual/Different_path (0.00s)
    --- PASS: TestDaemonEqual/Different_dir (0.00s)
    --- PASS: TestDaemonEqual/Different_args (0.00s)
    --- PASS: TestDaemonEqual/Different_token (0.00s)
=== RUN   TestDaemonMarshalSnapshot
=== RUN   TestDaemonMarshalSnapshot/stopped_daemon
=== RUN   TestDaemonMarshalSnapshot/basic
--- PASS: TestDaemonMarshalSnapshot (0.00s)
    --- PASS: TestDaemonMarshalSnapshot/stopped_daemon (0.00s)
    --- PASS: TestDaemonMarshalSnapshot/basic (0.00s)
=== RUN   TestDaemonUnmarshalSnapshot
=== PAUSE TestDaemonUnmarshalSnapshot
=== RUN   TestDaemonUnmarshalSnapshot_notRunning
=== PAUSE TestDaemonUnmarshalSnapshot_notRunning
=== RUN   TestManagerClose_noRun
=== PAUSE TestManagerClose_noRun
=== RUN   TestManagerRun_initialSync
=== PAUSE TestManagerRun_initialSync
=== RUN   TestManagerRun_syncNew
=== PAUSE TestManagerRun_syncNew
=== RUN   TestManagerRun_syncDelete
=== PAUSE TestManagerRun_syncDelete
=== RUN   TestManagerRun_syncUpdate
=== PAUSE TestManagerRun_syncUpdate
=== RUN   TestManagerRun_daemonLogs
=== PAUSE TestManagerRun_daemonLogs
=== RUN   TestManagerRun_daemonPid
--- SKIP: TestManagerRun_daemonPid (0.00s)
    manager_test.go:262: DM-skipped
=== RUN   TestManagerPassesEnvironment
=== PAUSE TestManagerPassesEnvironment
=== RUN   TestManagerPassesProxyEnv
--- SKIP: TestManagerPassesProxyEnv (0.00s)
    manager_test.go:354: DM-skipped
=== RUN   TestManagerRun_snapshotRestore
=== PAUSE TestManagerRun_snapshotRestore
=== RUN   TestManagerRun_rootDisallow
2019/12/06 06:00:42 [DEBUG] agent/proxy: managed Connect proxy manager started
2019/12/06 06:00:42 [WARN] agent/proxy: running as root, will not start managed proxies
2019/12/06 06:00:42 [DEBUG] agent/proxy: Stopping managed Connect proxy manager
--- PASS: TestManagerRun_rootDisallow (0.11s)
=== RUN   TestNoop_impl
--- PASS: TestNoop_impl (0.00s)
=== RUN   TestHelperProcess
--- PASS: TestHelperProcess (0.00s)
=== CONT  TestDaemonStartStop
=== CONT  TestManagerClose_noRun
logger: 2019/12/06 06:00:42 [DEBUG] agent/proxy: starting proxy: "/tmp/go-build617967062/b747/proxyprocess.test" []string{"-test.run=TestHelperProcess", "--", "WANT_HELPER_PROCESS", "start-stop", "/tmp/test-agent-proxy209269400/file"}
=== CONT  TestManagerRun_daemonLogs
=== CONT  TestManagerRun_syncUpdate
2019/12/06 06:00:42 [DEBUG] agent/proxy: managed Connect proxy manager started
--- PASS: TestManagerClose_noRun (0.01s)
=== CONT  TestManagerRun_snapshotRestore
2019/12/06 06:00:42 [DEBUG] agent/proxy: starting proxy: "/tmp/go-build617967062/b747/proxyprocess.test" []string{"-test.run=TestHelperProcess", "--", "WANT_HELPER_PROCESS", "output", "/tmp/test-agent-proxy508552074/notify"}
2019/12/06 06:00:42 [DEBUG] agent/proxy: managed Connect proxy manager started
2019/12/06 06:00:42 [DEBUG] agent/proxy: starting proxy: "/tmp/go-build617967062/b747/proxyprocess.test" []string{"-test.run=TestHelperProcess", "--", "WANT_HELPER_PROCESS", "restart", "/tmp/test-agent-proxy738391884/file"}
2019/12/06 06:00:42 [DEBUG] agent/proxy: managed Connect proxy manager started
2019/12/06 06:00:42 [DEBUG] agent/proxy: starting proxy: "/tmp/go-build617967062/b747/proxyprocess.test" []string{"-test.run=TestHelperProcess", "--", "WANT_HELPER_PROCESS", "start-stop", "/tmp/test-agent-proxy388624478/file"}
logger: 2019/12/06 06:00:42 [INFO] agent/proxy: daemon exited with exit code: 0
=== CONT  TestManagerPassesEnvironment
--- PASS: TestDaemonStartStop (0.14s)
2019/12/06 06:00:42 [DEBUG] agent/proxy: managed Connect proxy manager started
2019/12/06 06:00:42 [DEBUG] agent/proxy: starting proxy: "/tmp/go-build617967062/b747/proxyprocess.test" []string{"-test.run=TestHelperProcess", "--", "WANT_HELPER_PROCESS", "environ", "/tmp/test-agent-proxy568892224/env-variables"}
2019/12/06 06:00:42 [INFO] agent/proxy: daemon exited with exit code: 0
2019/12/06 06:00:42 [DEBUG] agent/proxy: starting proxy: "/tmp/go-build617967062/b747/proxyprocess.test" []string{"-test.run=TestHelperProcess", "--", "WANT_HELPER_PROCESS", "restart", "/tmp/test-agent-proxy738391884/file2"}
2019/12/06 06:00:42 [DEBUG] agent/proxy: Stopping managed Connect proxy manager
2019/12/06 06:00:42 [INFO] agent/proxy: daemon exited with exit code: 0
2019/12/06 06:00:42 [DEBUG] agent/proxy: Stopping managed Connect proxy manager
--- PASS: TestManagerPassesEnvironment (0.11s)
=== CONT  TestManagerRun_syncDelete
2019/12/06 06:00:42 [DEBUG] agent/proxy: managed Connect proxy manager started
2019/12/06 06:00:42 [DEBUG] agent/proxy: starting proxy: "/tmp/go-build617967062/b747/proxyprocess.test" []string{"-test.run=TestHelperProcess", "--", "WANT_HELPER_PROCESS", "restart", "/tmp/test-agent-proxy962914930/file"}
2019/12/06 06:00:42 [INFO] agent/proxy: daemon exited with exit code: 0
--- PASS: TestManagerRun_syncUpdate (0.30s)
=== CONT  TestManagerRun_syncNew
2019/12/06 06:00:42 [DEBUG] agent/proxy: managed Connect proxy manager started
2019/12/06 06:00:42 [DEBUG] agent/proxy: Stopping managed Connect proxy manager
2019/12/06 06:00:42 [INFO] agent/proxy: daemon left running
--- PASS: TestManagerRun_daemonLogs (0.32s)
=== CONT  TestManagerRun_initialSync
2019/12/06 06:00:42 [DEBUG] agent/proxy: managed Connect proxy manager started
2019/12/06 06:00:42 [DEBUG] agent/proxy: Stopping managed Connect proxy manager
2019/12/06 06:00:42 [DEBUG] agent/proxy: starting proxy: "/tmp/go-build617967062/b747/proxyprocess.test" []string{"-test.run=TestHelperProcess", "--", "WANT_HELPER_PROCESS", "restart", "/tmp/test-agent-proxy244623171/file"}
2019/12/06 06:00:42 [DEBUG] agent/proxy: managed Connect proxy manager started
2019/12/06 06:00:42 [DEBUG] agent/proxy: starting proxy: "/tmp/go-build617967062/b747/proxyprocess.test" []string{"-test.run=TestHelperProcess", "--", "WANT_HELPER_PROCESS", "start-stop", "/tmp/test-agent-proxy388624478/file2"}
2019/12/06 06:00:43 [DEBUG] agent/proxy: Stopping managed Connect proxy manager
2019/12/06 06:00:43 [DEBUG] agent/proxy: starting proxy: "/tmp/go-build617967062/b747/proxyprocess.test" []string{"-test.run=TestHelperProcess", "--", "WANT_HELPER_PROCESS", "restart", "/tmp/test-agent-proxy419299949/file"}
2019/12/06 06:00:43 [DEBUG] agent/proxy: Stopping managed Connect proxy manager
2019/12/06 06:00:43 [INFO] agent/proxy: daemon exited with exit code: 0
2019/12/06 06:00:43 [DEBUG] agent/proxy: Stopping managed Connect proxy manager
2019/12/06 06:00:43 [INFO] agent/proxy: daemon exited with exit code: 0
2019/12/06 06:00:43 [INFO] agent/proxy: daemon exited with exit code: 0
2019/12/06 06:00:43 [INFO] agent/proxy: daemon exited with exit code: 0
--- PASS: TestManagerRun_initialSync (0.12s)
=== CONT  TestDaemonUnmarshalSnapshot_notRunning
=== CONT  TestDaemonStart_pidFile
--- PASS: TestManagerRun_syncDelete (0.20s)
logger: 2019/12/06 06:00:43 [DEBUG] agent/proxy: starting proxy: "/tmp/go-build617967062/b747/proxyprocess.test" []string{"-test.run=TestHelperProcess", "--", "WANT_HELPER_PROCESS", "start-stop", "/tmp/test-agent-proxy656155943/file"}
logger: 2019/12/06 06:00:43 [DEBUG] agent/proxy: starting proxy: "/tmp/go-build617967062/b747/proxyprocess.test" []string{"-test.run=TestHelperProcess", "--", "WANT_HELPER_PROCESS", "start-once", "/tmp/test-agent-proxy075913448/file"}
Unknown command: "start-once"
logger: 2019/12/06 06:00:43 [INFO] agent/proxy: daemon exited with exit code: 0
--- PASS: TestDaemonUnmarshalSnapshot_notRunning (0.11s)
=== CONT  TestDaemonUnmarshalSnapshot
logger: 2019/12/06 06:00:43 [DEBUG] agent/proxy: starting proxy: "/tmp/go-build617967062/b747/proxyprocess.test" []string{"-test.run=TestHelperProcess", "--", "WANT_HELPER_PROCESS", "start-stop", "/tmp/test-agent-proxy110890586/file"}
2019/12/06 06:00:43 [DEBUG] agent/proxy: starting proxy: "/tmp/go-build617967062/b747/proxyprocess.test" []string{"-test.run=TestHelperProcess", "--", "WANT_HELPER_PROCESS", "restart", "/tmp/test-agent-proxy419299949/file2"}
logger: 2019/12/06 06:00:43 [INFO] agent/proxy: daemon exited with exit code: 2
logger: 2019/12/06 06:00:43 [DEBUG] agent/proxy: starting proxy: "/tmp/go-build617967062/b747/proxyprocess.test" []string{"-test.run=TestHelperProcess", "--", "WANT_HELPER_PROCESS", "start-once", "/tmp/test-agent-proxy075913448/file"}
Unknown command: "start-once"
logger: 2019/12/06 06:00:43 [INFO] agent/proxy: daemon exited with exit code: 2
--- PASS: TestDaemonStart_pidFile (0.22s)
=== CONT  TestDaemonRestart_pidFile
logger: 2019/12/06 06:00:43 [DEBUG] agent/proxy: starting proxy: "/tmp/go-build617967062/b747/proxyprocess.test" []string{"-test.run=TestHelperProcess", "--", "WANT_HELPER_PROCESS", "restart", "/tmp/test-agent-proxy026086897/file"}
2019/12/06 06:00:43 [DEBUG] agent/proxy: Stopping managed Connect proxy manager
logger: 2019/12/06 06:00:43 [INFO] agent/proxy: daemon exited with exit code: 0
2019/12/06 06:00:43 [INFO] agent/proxy: daemon exited with exit code: 0
2019/12/06 06:00:43 [DEBUG] agent/proxy: starting proxy: "/tmp/go-build617967062/b747/proxyprocess.test" []string{"-test.run=TestHelperProcess", "--", "WANT_HELPER_PROCESS", "restart", "/tmp/test-agent-proxy419299949/file2"}
2019/12/06 06:00:43 [INFO] agent/proxy: daemon exited with exit code: 0
logger: 2019/12/06 06:00:43 [INFO] agent/proxy: daemon exited with exit code: 0
logger: 2019/12/06 06:00:43 [DEBUG] agent/proxy: starting proxy: "/tmp/go-build617967062/b747/proxyprocess.test" []string{"-test.run=TestHelperProcess", "--", "WANT_HELPER_PROCESS", "restart", "/tmp/test-agent-proxy026086897/file"}
2019/12/06 06:00:43 [INFO] agent/proxy: daemon exited with exit code: 1
--- PASS: TestManagerRun_syncNew (0.49s)
=== CONT  TestDaemonStop_kill
logger: 2019/12/06 06:00:43 [DEBUG] agent/proxy: starting proxy: "/tmp/go-build617967062/b747/proxyprocess.test" []string{"-test.run=TestHelperProcess", "--", "WANT_HELPER_PROCESS", "stop-kill", "/tmp/test-agent-proxy186770588/file"}
logger: 2019/12/06 06:00:43 [INFO] agent/proxy: daemon exited with exit code: 0
=== CONT  TestDaemonStop_killAdopted
--- PASS: TestDaemonRestart_pidFile (0.25s)
logger: 2019/12/06 06:00:43 [DEBUG] agent/proxy: graceful wait of 200ms passed, killing
logger: 2019/12/06 06:00:43 [INFO] agent/proxy: daemon left running
--- PASS: TestDaemonStop_kill (0.36s)
=== CONT  TestDaemonLaunchesNewProcessGroup
logger: 2019/12/06 06:00:43 [DEBUG] agent/proxy: graceful wait of 200ms passed, killing
2019/12/06 06:00:43 Started child
logger: 2019/12/06 06:00:43 [DEBUG] agent/proxy: starting proxy: "/tmp/go-build617967062/b747/proxyprocess.test" []string{"-test.run=TestHelperProcess", "--", "WANT_HELPER_PROCESS", "start-stop", "/tmp/test-agent-proxy273962542/file"}
--- PASS: TestDaemonStop_killAdopted (0.37s)
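
The "graceful wait of 200ms passed, killing" lines above describe the stop sequence these daemon tests exercise: the managed child is first asked to exit and is only force-killed once a grace period has elapsed. The following is a minimal sketch of that idea, assuming a plain os/exec child and an interrupt signal; gracefulStop is a hypothetical helper written for this log, not the agent/proxy implementation.

    // graceful_stop_sketch.go — illustrative only; gracefulStop is a
    // hypothetical name, not an API from the consul source.
    package sketch

    import (
    	"os"
    	"os/exec"
    	"time"
    )

    // gracefulStop asks the child to exit and force-kills it only after the
    // grace period has passed — the "graceful wait of 200ms passed, killing"
    // step seen in the DEBUG lines above.
    func gracefulStop(cmd *exec.Cmd, grace time.Duration) error {
    	if err := cmd.Process.Signal(os.Interrupt); err != nil {
    		return err
    	}
    	done := make(chan error, 1)
    	go func() { done <- cmd.Wait() }()
    	select {
    	case err := <-done:
    		return err // exited on its own within the grace period
    	case <-time.After(grace):
    		_ = cmd.Process.Kill() // grace period passed, killing
    		return <-done
    	}
    }

With a 200ms grace value this reproduces the two outcomes visible in the log: a clean "daemon exited with exit code: 0" when the child cooperates, and a forced kill when it does not.
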
=== CONT  TestDaemonRestart
logger: 2019/12/06 06:00:43 [DEBUG] agent/proxy: starting proxy: "/tmp/go-build617967062/b747/proxyprocess.test" []string{"-test.run=TestHelperProcess", "--", "WANT_HELPER_PROCESS", "restart", "/tmp/test-agent-proxy404509109/file"}
2019/12/06 06:00:43 [INFO] agent/proxy: daemon exited with error: process 15055 is dead or running as another user
--- PASS: TestManagerRun_snapshotRestore (1.33s)
logger: 2019/12/06 06:00:44 [INFO] agent/proxy: daemon exited with exit code: 0
logger: 2019/12/06 06:00:44 [DEBUG] agent/proxy: starting proxy: "/tmp/go-build617967062/b747/proxyprocess.test" []string{"-test.run=TestHelperProcess", "--", "WANT_HELPER_PROCESS", "restart", "/tmp/test-agent-proxy404509109/file"}
2019/12/06 06:00:44 Started child
logger: 2019/12/06 06:00:44 [DEBUG] agent/proxy: starting proxy: "/tmp/go-build617967062/b747/proxyprocess.test" []string{"-test.run=TestHelperProcess", "--", "WANT_HELPER_PROCESS", "start-stop", "/tmp/test-agent-proxy273962542/file"}
logger: 2019/12/06 06:00:44 [INFO] agent/proxy: daemon exited with exit code: 0
--- PASS: TestDaemonRestart (0.20s)
--- PASS: TestDaemonLaunchesNewProcessGroup (0.35s)
    daemon_test.go:224: Child PID was 15235 and still 15235
    daemon_test.go:241: Child PID was 15235 and is now 15267
logger: 2019/12/06 06:00:44 [INFO] agent/proxy: daemon exited with error: process 15158 is dead or running as another user
--- PASS: TestDaemonUnmarshalSnapshot (1.11s)
PASS
ok  	github.com/hashicorp/consul/agent/proxyprocess	1.969s
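
The argv dumps repeated throughout the proxyprocess output above ("-test.run=TestHelperProcess", "--", "WANT_HELPER_PROCESS", <command>, <file>) come from the common Go trick of re-executing the test binary itself as the fake daemon under management. The sketch below illustrates that re-exec pattern in the argument-sentinel form the log shows; the helper name, the "stop" command and the package name are assumptions made for the example, not the consul test code.

    // helperprocess_sketch_test.go — a generic sketch of the re-exec pattern,
    // not the proxyprocess test suite itself.
    package sketch

    import (
    	"fmt"
    	"os"
    	"os/exec"
    	"testing"
    )

    // TestHelperProcess is not a normal test: when the binary is re-run with
    // the WANT_HELPER_PROCESS sentinel after "--", it plays the role of the
    // managed daemon that the proxy manager starts, restarts and stops.
    func TestHelperProcess(t *testing.T) {
    	args := os.Args
    	for i, a := range args {
    		if a == "--" {
    			args = args[i+1:]
    			break
    		}
    	}
    	if len(args) == 0 || args[0] != "WANT_HELPER_PROCESS" {
    		return // invoked as an ordinary test: do nothing
    	}
    	args = args[1:]
    	if len(args) == 0 {
    		fmt.Fprintln(os.Stderr, "no command")
    		os.Exit(2)
    	}
    	switch args[0] {
    	case "stop": // hypothetical command for the sketch
    		os.Exit(0)
    	default:
    		// Unrecognised commands report themselves and exit non-zero,
    		// which is the 'Unknown command: "start-once"' / exit code 2
    		// behaviour visible later in this log.
    		fmt.Fprintf(os.Stderr, "Unknown command: %q\n", args[0])
    		os.Exit(2)
    	}
    }

    // helperProcess re-executes the current test binary so that only
    // TestHelperProcess runs, passing the fake command after "--". This is
    // what produces the "-test.run=TestHelperProcess" argv in the log.
    func helperProcess(command string, rest ...string) *exec.Cmd {
    	args := append([]string{"-test.run=TestHelperProcess", "--",
    		"WANT_HELPER_PROCESS", command}, rest...)
    	return exec.Command(os.Args[0], args...)
    }
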
=== RUN   TestManagerInternal_cycleServer
--- PASS: TestManagerInternal_cycleServer (0.00s)
=== RUN   TestManagerInternal_getServerList
--- PASS: TestManagerInternal_getServerList (0.00s)
=== RUN   TestManagerInternal_New
--- PASS: TestManagerInternal_New (0.00s)
=== RUN   TestManagerInternal_reconcileServerList
2019/12/06 06:01:02 [DEBUG] manager: No healthy servers during rebalance, aborting
2019/12/06 06:01:02 [WARN] manager: No servers available
2019/12/06 06:01:02 [DEBUG] manager: No healthy servers during rebalance, aborting
2019/12/06 06:01:02 [WARN] manager: No servers available
2019/12/06 06:01:02 [DEBUG] manager: Rebalanced 2 servers, next active server is s01 (Addr: /) (DC: )
2019/12/06 06:01:02 [DEBUG] manager: pinging server "s00 (Addr: /) (DC: )" failed: %!s(<nil>)
2019/12/06 06:01:02 [DEBUG] manager: No healthy servers during rebalance, aborting
2019/12/06 06:01:02 [DEBUG] manager: No healthy servers during rebalance, aborting
2019/12/06 06:01:02 [WARN] manager: No servers available
2019/12/06 06:01:02 [DEBUG] manager: Rebalanced 2 servers, next active server is s03 (Addr: /) (DC: )
2019/12/06 06:01:02 [DEBUG] manager: pinging server "s05 (Addr: /) (DC: )" failed: %!s(<nil>)
2019/12/06 06:01:02 [DEBUG] manager: Rebalanced 8 servers, next active server is s04 (Addr: /) (DC: )
2019/12/06 06:01:02 [DEBUG] manager: Rebalanced 72 servers, next active server is s83 (Addr: /) (DC: )
--- PASS: TestManagerInternal_reconcileServerList (0.00s)
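
The "failed: %!s(<nil>)" fragments in the pinging-server lines are not log corruption: they are Go's fmt package rendering a nil error with the %s verb, which happens when the test pinger reports a failure but returns a nil error value. A minimal demonstration:

    package main

    import "fmt"

    func main() {
    	var err error // nil interface value
    	// fmt has no string form for a nil interface under %s, so it emits
    	// the diagnostic "%!s(<nil>)" seen in the manager DEBUG lines.
    	fmt.Printf("pinging server failed: %s\n", err)
    	// Output: pinging server failed: %!s(<nil>)
    }
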
=== RUN   TestManagerInternal_refreshServerRebalanceTimer
--- PASS: TestManagerInternal_refreshServerRebalanceTimer (0.00s)
=== RUN   TestManagerInternal_saveServerList
--- PASS: TestManagerInternal_saveServerList (0.00s)
=== RUN   TestRouter_Shutdown
2019/12/06 06:01:02 [INFO] manager: shutting down
2019/12/06 06:01:02 [INFO] manager: shutting down
2019/12/06 06:01:02 [INFO] manager: shutting down
2019/12/06 06:01:02 [INFO] manager: shutting down
2019/12/06 06:01:02 [INFO] manager: shutting down
--- PASS: TestRouter_Shutdown (0.00s)
=== RUN   TestRouter_Routing
2019/12/06 06:01:02 [INFO] manager: shutting down
2019/12/06 06:01:02 [INFO] manager: shutting down
2019/12/06 06:01:02 [INFO] manager: shutting down
2019/12/06 06:01:02 [INFO] manager: shutting down
2019/12/06 06:01:02 [INFO] manager: shutting down
2019/12/06 06:01:02 [INFO] manager: shutting down
2019/12/06 06:01:02 [INFO] manager: shutting down
--- PASS: TestRouter_Routing (0.00s)
=== RUN   TestRouter_Routing_Offline
2019/12/06 06:01:02 [DEBUG] manager: pinging server "node4.dc1 (Addr: tcp/127.0.0.5:8300) (DC: dc1)" failed: %!s(<nil>)
2019/12/06 06:01:02 [DEBUG] manager: pinging server "node3.dc1 (Addr: tcp/127.0.0.4:8300) (DC: dc1)" failed: %!s(<nil>)
2019/12/06 06:01:02 [DEBUG] manager: pinging server "node1.dc1 (Addr: tcp/127.0.0.2:8300) (DC: dc1)" failed: %!s(<nil>)
2019/12/06 06:01:02 [DEBUG] manager: pinging server "node2.dc1 (Addr: tcp/127.0.0.3:8300) (DC: dc1)" failed: %!s(<nil>)
2019/12/06 06:01:02 [DEBUG] manager: No healthy servers during rebalance, aborting
--- PASS: TestRouter_Routing_Offline (0.00s)
=== RUN   TestRouter_GetDatacenters
--- PASS: TestRouter_GetDatacenters (0.00s)
=== RUN   TestRouter_distanceSorter
--- PASS: TestRouter_distanceSorter (0.00s)
=== RUN   TestRouter_GetDatacentersByDistance
--- PASS: TestRouter_GetDatacentersByDistance (0.00s)
=== RUN   TestRouter_GetDatacenterMaps
--- PASS: TestRouter_GetDatacenterMaps (0.00s)
=== RUN   TestServers_AddServer
--- PASS: TestServers_AddServer (0.00s)
=== RUN   TestServers_IsOffline
2019/12/06 06:01:02 [DEBUG] manager: Rebalanced 1 servers, next active server is s1 (Addr: /) (DC: )
2019/12/06 06:01:02 [DEBUG] manager: No healthy servers during rebalance, aborting
2019/12/06 06:01:02 [DEBUG] manager: Rebalanced 1 servers, next active server is s1 (Addr: /) (DC: )
2019/12/06 06:01:02 [DEBUG] manager: pinging server "s1 (Addr: /) (DC: )" failed: %!s(<nil>)
2019/12/06 06:01:02 [DEBUG] manager: No healthy servers during rebalance, aborting
2019/12/06 06:01:02 [DEBUG] manager: Rebalanced 1 servers, next active server is s1 (Addr: /) (DC: )
2019/12/06 06:01:02 [DEBUG] manager: Rebalanced 1 servers, next active server is s1 (Addr: /) (DC: )
2019/12/06 06:01:02 [DEBUG] manager: pinging server "s1 (Addr: /) (DC: )" failed: %!s(<nil>)
2019/12/06 06:01:02 [DEBUG] manager: No healthy servers during rebalance, aborting
2019/12/06 06:01:02 [DEBUG] manager: pinging server "s1 (Addr: /) (DC: )" failed: %!s(<nil>)
2019/12/06 06:01:02 [DEBUG] manager: No healthy servers during rebalance, aborting
2019/12/06 06:01:02 [DEBUG] manager: pinging server "s1 (Addr: /) (DC: )" failed: %!s(<nil>)
2019/12/06 06:01:02 [DEBUG] manager: No healthy servers during rebalance, aborting
2019/12/06 06:01:02 [DEBUG] manager: Rebalanced 1 servers, next active server is s1 (Addr: /) (DC: )
2019/12/06 06:01:02 [DEBUG] manager: pinging server "s1 (Addr: /) (DC: )" failed: %!s(<nil>)
2019/12/06 06:01:02 [DEBUG] manager: No healthy servers during rebalance, aborting
2019/12/06 06:01:02 [DEBUG] manager: Rebalanced 1 servers, next active server is s1 (Addr: /) (DC: )
2019/12/06 06:01:02 [DEBUG] manager: pinging server "s1 (Addr: /) (DC: )" failed: %!s(<nil>)
2019/12/06 06:01:02 [DEBUG] manager: No healthy servers during rebalance, aborting
2019/12/06 06:01:02 [DEBUG] manager: pinging server "s1 (Addr: /) (DC: )" failed: %!s(<nil>)
2019/12/06 06:01:02 [DEBUG] manager: No healthy servers during rebalance, aborting
2019/12/06 06:01:02 [DEBUG] manager: Rebalanced 1 servers, next active server is s1 (Addr: /) (DC: )
2019/12/06 06:01:02 [DEBUG] manager: Rebalanced 1 servers, next active server is s1 (Addr: /) (DC: )
2019/12/06 06:01:02 [DEBUG] manager: pinging server "s1 (Addr: /) (DC: )" failed: %!s(<nil>)
2019/12/06 06:01:02 [DEBUG] manager: No healthy servers during rebalance, aborting
2019/12/06 06:01:02 [DEBUG] manager: pinging server "s1 (Addr: /) (DC: )" failed: %!s(<nil>)
2019/12/06 06:01:02 [DEBUG] manager: No healthy servers during rebalance, aborting
2019/12/06 06:01:02 [DEBUG] manager: Rebalanced 1 servers, next active server is s1 (Addr: /) (DC: )
2019/12/06 06:01:02 [DEBUG] manager: pinging server "s1 (Addr: /) (DC: )" failed: %!s(<nil>)
2019/12/06 06:01:02 [DEBUG] manager: No healthy servers during rebalance, aborting
2019/12/06 06:01:02 [DEBUG] manager: Rebalanced 1 servers, next active server is s1 (Addr: /) (DC: )
2019/12/06 06:01:02 [DEBUG] manager: Rebalanced 1 servers, next active server is s1 (Addr: /) (DC: )
2019/12/06 06:01:02 [DEBUG] manager: Rebalanced 1 servers, next active server is s1 (Addr: /) (DC: )
2019/12/06 06:01:02 [DEBUG] manager: pinging server "s1 (Addr: /) (DC: )" failed: %!s(<nil>)
2019/12/06 06:01:02 [DEBUG] manager: No healthy servers during rebalance, aborting
2019/12/06 06:01:02 [DEBUG] manager: Rebalanced 1 servers, next active server is s1 (Addr: /) (DC: )
2019/12/06 06:01:02 [DEBUG] manager: pinging server "s1 (Addr: /) (DC: )" failed: %!s(<nil>)
2019/12/06 06:01:02 [DEBUG] manager: No healthy servers during rebalance, aborting
2019/12/06 06:01:02 [DEBUG] manager: pinging server "s1 (Addr: /) (DC: )" failed: %!s(<nil>)
2019/12/06 06:01:02 [DEBUG] manager: No healthy servers during rebalance, aborting
2019/12/06 06:01:02 [DEBUG] manager: pinging server "s1 (Addr: /) (DC: )" failed: %!s(<nil>)
2019/12/06 06:01:02 [DEBUG] manager: No healthy servers during rebalance, aborting
2019/12/06 06:01:02 [DEBUG] manager: Rebalanced 1 servers, next active server is s1 (Addr: /) (DC: )
2019/12/06 06:01:02 [DEBUG] manager: Rebalanced 1 servers, next active server is s1 (Addr: /) (DC: )
2019/12/06 06:01:02 [DEBUG] manager: Rebalanced 1 servers, next active server is s1 (Addr: /) (DC: )
2019/12/06 06:01:02 [DEBUG] manager: Rebalanced 1 servers, next active server is s1 (Addr: /) (DC: )
2019/12/06 06:01:02 [DEBUG] manager: Rebalanced 1 servers, next active server is s1 (Addr: /) (DC: )
2019/12/06 06:01:02 [DEBUG] manager: Rebalanced 1 servers, next active server is s1 (Addr: /) (DC: )
2019/12/06 06:01:02 [DEBUG] manager: pinging server "s1 (Addr: /) (DC: )" failed: %!s(<nil>)
2019/12/06 06:01:02 [DEBUG] manager: No healthy servers during rebalance, aborting
2019/12/06 06:01:02 [DEBUG] manager: pinging server "s1 (Addr: /) (DC: )" failed: %!s(<nil>)
2019/12/06 06:01:02 [DEBUG] manager: No healthy servers during rebalance, aborting
2019/12/06 06:01:02 [DEBUG] manager: pinging server "s1 (Addr: /) (DC: )" failed: %!s(<nil>)
2019/12/06 06:01:02 [DEBUG] manager: No healthy servers during rebalance, aborting
2019/12/06 06:01:02 [DEBUG] manager: pinging server "s1 (Addr: /) (DC: )" failed: %!s(<nil>)
2019/12/06 06:01:02 [DEBUG] manager: No healthy servers during rebalance, aborting
2019/12/06 06:01:02 [DEBUG] manager: pinging server "s1 (Addr: /) (DC: )" failed: %!s(<nil>)
2019/12/06 06:01:02 [DEBUG] manager: No healthy servers during rebalance, aborting
2019/12/06 06:01:02 [DEBUG] manager: Rebalanced 1 servers, next active server is s1 (Addr: /) (DC: )
2019/12/06 06:01:02 [DEBUG] manager: pinging server "s1 (Addr: /) (DC: )" failed: %!s(<nil>)
2019/12/06 06:01:02 [DEBUG] manager: No healthy servers during rebalance, aborting
2019/12/06 06:01:02 [DEBUG] manager: Rebalanced 1 servers, next active server is s1 (Addr: /) (DC: )
2019/12/06 06:01:02 [DEBUG] manager: pinging server "s1 (Addr: /) (DC: )" failed: %!s(<nil>)
2019/12/06 06:01:02 [DEBUG] manager: No healthy servers during rebalance, aborting
2019/12/06 06:01:02 [DEBUG] manager: pinging server "s1 (Addr: /) (DC: )" failed: %!s(<nil>)
2019/12/06 06:01:02 [DEBUG] manager: No healthy servers during rebalance, aborting
2019/12/06 06:01:02 [DEBUG] manager: pinging server "s1 (Addr: /) (DC: )" failed: %!s(<nil>)
2019/12/06 06:01:02 [DEBUG] manager: No healthy servers during rebalance, aborting
2019/12/06 06:01:02 [DEBUG] manager: pinging server "s1 (Addr: /) (DC: )" failed: %!s(<nil>)
2019/12/06 06:01:02 [DEBUG] manager: No healthy servers during rebalance, aborting
2019/12/06 06:01:02 [DEBUG] manager: Rebalanced 1 servers, next active server is s1 (Addr: /) (DC: )
2019/12/06 06:01:02 [DEBUG] manager: pinging server "s1 (Addr: /) (DC: )" failed: %!s(<nil>)
2019/12/06 06:01:02 [DEBUG] manager: No healthy servers during rebalance, aborting
2019/12/06 06:01:02 [DEBUG] manager: pinging server "s1 (Addr: /) (DC: )" failed: %!s(<nil>)
2019/12/06 06:01:02 [DEBUG] manager: No healthy servers during rebalance, aborting
2019/12/06 06:01:02 [DEBUG] manager: pinging server "s1 (Addr: /) (DC: )" failed: %!s(<nil>)
2019/12/06 06:01:02 [DEBUG] manager: No healthy servers during rebalance, aborting
2019/12/06 06:01:02 [DEBUG] manager: pinging server "s1 (Addr: /) (DC: )" failed: %!s(<nil>)
2019/12/06 06:01:02 [DEBUG] manager: No healthy servers during rebalance, aborting
2019/12/06 06:01:02 [DEBUG] manager: pinging server "s1 (Addr: /) (DC: )" failed: %!s(<nil>)
2019/12/06 06:01:02 [DEBUG] manager: No healthy servers during rebalance, aborting
2019/12/06 06:01:02 [DEBUG] manager: Rebalanced 1 servers, next active server is s1 (Addr: /) (DC: )
2019/12/06 06:01:02 [DEBUG] manager: pinging server "s1 (Addr: /) (DC: )" failed: %!s(<nil>)
2019/12/06 06:01:02 [DEBUG] manager: No healthy servers during rebalance, aborting
2019/12/06 06:01:02 [DEBUG] manager: pinging server "s1 (Addr: /) (DC: )" failed: %!s(<nil>)
2019/12/06 06:01:02 [DEBUG] manager: No healthy servers during rebalance, aborting
2019/12/06 06:01:02 [DEBUG] manager: Rebalanced 1 servers, next active server is s1 (Addr: /) (DC: )
2019/12/06 06:01:02 [DEBUG] manager: Rebalanced 1 servers, next active server is s1 (Addr: /) (DC: )
2019/12/06 06:01:02 [DEBUG] manager: Rebalanced 1 servers, next active server is s1 (Addr: /) (DC: )
2019/12/06 06:01:02 [DEBUG] manager: pinging server "s1 (Addr: /) (DC: )" failed: %!s(<nil>)
2019/12/06 06:01:02 [DEBUG] manager: No healthy servers during rebalance, aborting
2019/12/06 06:01:02 [DEBUG] manager: pinging server "s1 (Addr: /) (DC: )" failed: %!s(<nil>)
2019/12/06 06:01:02 [DEBUG] manager: No healthy servers during rebalance, aborting
2019/12/06 06:01:02 [DEBUG] manager: pinging server "s1 (Addr: /) (DC: )" failed: %!s(<nil>)
2019/12/06 06:01:02 [DEBUG] manager: No healthy servers during rebalance, aborting
2019/12/06 06:01:02 [DEBUG] manager: Rebalanced 1 servers, next active server is s1 (Addr: /) (DC: )
2019/12/06 06:01:02 [DEBUG] manager: Rebalanced 1 servers, next active server is s1 (Addr: /) (DC: )
2019/12/06 06:01:02 [DEBUG] manager: Rebalanced 1 servers, next active server is s1 (Addr: /) (DC: )
2019/12/06 06:01:02 [DEBUG] manager: Rebalanced 1 servers, next active server is s1 (Addr: /) (DC: )
2019/12/06 06:01:02 [DEBUG] manager: Rebalanced 1 servers, next active server is s1 (Addr: /) (DC: )
2019/12/06 06:01:02 [DEBUG] manager: pinging server "s1 (Addr: /) (DC: )" failed: %!s(<nil>)
2019/12/06 06:01:02 [DEBUG] manager: No healthy servers during rebalance, aborting
2019/12/06 06:01:02 [DEBUG] manager: Rebalanced 1 servers, next active server is s1 (Addr: /) (DC: )
2019/12/06 06:01:02 [DEBUG] manager: Rebalanced 1 servers, next active server is s1 (Addr: /) (DC: )
2019/12/06 06:01:02 [DEBUG] manager: pinging server "s1 (Addr: /) (DC: )" failed: %!s(<nil>)
2019/12/06 06:01:02 [DEBUG] manager: No healthy servers during rebalance, aborting
2019/12/06 06:01:02 [DEBUG] manager: Rebalanced 1 servers, next active server is s1 (Addr: /) (DC: )
2019/12/06 06:01:02 [DEBUG] manager: pinging server "s1 (Addr: /) (DC: )" failed: %!s(<nil>)
2019/12/06 06:01:02 [DEBUG] manager: No healthy servers during rebalance, aborting
2019/12/06 06:01:02 [DEBUG] manager: Rebalanced 1 servers, next active server is s1 (Addr: /) (DC: )
2019/12/06 06:01:02 [DEBUG] manager: Rebalanced 1 servers, next active server is s1 (Addr: /) (DC: )
2019/12/06 06:01:02 [DEBUG] manager: pinging server "s1 (Addr: /) (DC: )" failed: %!s(<nil>)
2019/12/06 06:01:02 [DEBUG] manager: No healthy servers during rebalance, aborting
2019/12/06 06:01:02 [DEBUG] manager: Rebalanced 1 servers, next active server is s1 (Addr: /) (DC: )
2019/12/06 06:01:02 [DEBUG] manager: pinging server "s1 (Addr: /) (DC: )" failed: %!s(<nil>)
2019/12/06 06:01:02 [DEBUG] manager: No healthy servers during rebalance, aborting
2019/12/06 06:01:02 [DEBUG] manager: Rebalanced 1 servers, next active server is s1 (Addr: /) (DC: )
2019/12/06 06:01:02 [DEBUG] manager: pinging server "s1 (Addr: /) (DC: )" failed: %!s(<nil>)
2019/12/06 06:01:02 [DEBUG] manager: No healthy servers during rebalance, aborting
2019/12/06 06:01:02 [DEBUG] manager: Rebalanced 1 servers, next active server is s1 (Addr: /) (DC: )
2019/12/06 06:01:02 [DEBUG] manager: Rebalanced 1 servers, next active server is s1 (Addr: /) (DC: )
2019/12/06 06:01:02 [DEBUG] manager: pinging server "s1 (Addr: /) (DC: )" failed: %!s(<nil>)
2019/12/06 06:01:02 [DEBUG] manager: No healthy servers during rebalance, aborting
2019/12/06 06:01:02 [DEBUG] manager: pinging server "s1 (Addr: /) (DC: )" failed: %!s(<nil>)
2019/12/06 06:01:02 [DEBUG] manager: No healthy servers during rebalance, aborting
2019/12/06 06:01:02 [DEBUG] manager: Rebalanced 1 servers, next active server is s1 (Addr: /) (DC: )
2019/12/06 06:01:02 [DEBUG] manager: Rebalanced 1 servers, next active server is s1 (Addr: /) (DC: )
2019/12/06 06:01:02 [DEBUG] manager: Rebalanced 1 servers, next active server is s1 (Addr: /) (DC: )
2019/12/06 06:01:02 [DEBUG] manager: Rebalanced 1 servers, next active server is s1 (Addr: /) (DC: )
2019/12/06 06:01:02 [DEBUG] manager: Rebalanced 1 servers, next active server is s1 (Addr: /) (DC: )
2019/12/06 06:01:02 [DEBUG] manager: pinging server "s1 (Addr: /) (DC: )" failed: %!s(<nil>)
2019/12/06 06:01:02 [DEBUG] manager: No healthy servers during rebalance, aborting
2019/12/06 06:01:02 [DEBUG] manager: Rebalanced 1 servers, next active server is s1 (Addr: /) (DC: )
2019/12/06 06:01:02 [DEBUG] manager: Rebalanced 1 servers, next active server is s1 (Addr: /) (DC: )
2019/12/06 06:01:02 [DEBUG] manager: pinging server "s1 (Addr: /) (DC: )" failed: %!s(<nil>)
2019/12/06 06:01:02 [DEBUG] manager: No healthy servers during rebalance, aborting
2019/12/06 06:01:02 [DEBUG] manager: pinging server "s1 (Addr: /) (DC: )" failed: %!s(<nil>)
2019/12/06 06:01:02 [DEBUG] manager: No healthy servers during rebalance, aborting
2019/12/06 06:01:02 [DEBUG] manager: pinging server "s1 (Addr: /) (DC: )" failed: %!s(<nil>)
2019/12/06 06:01:02 [DEBUG] manager: No healthy servers during rebalance, aborting
2019/12/06 06:01:02 [DEBUG] manager: Rebalanced 1 servers, next active server is s1 (Addr: /) (DC: )
2019/12/06 06:01:02 [DEBUG] manager: Rebalanced 1 servers, next active server is s1 (Addr: /) (DC: )
2019/12/06 06:01:02 [DEBUG] manager: pinging server "s1 (Addr: /) (DC: )" failed: %!s(<nil>)
2019/12/06 06:01:02 [DEBUG] manager: No healthy servers during rebalance, aborting
2019/12/06 06:01:02 [DEBUG] manager: Rebalanced 1 servers, next active server is s1 (Addr: /) (DC: )
2019/12/06 06:01:02 [DEBUG] manager: Rebalanced 1 servers, next active server is s1 (Addr: /) (DC: )
2019/12/06 06:01:02 [DEBUG] manager: pinging server "s1 (Addr: /) (DC: )" failed: %!s(<nil>)
2019/12/06 06:01:02 [DEBUG] manager: No healthy servers during rebalance, aborting
2019/12/06 06:01:02 [DEBUG] manager: pinging server "s1 (Addr: /) (DC: )" failed: %!s(<nil>)
2019/12/06 06:01:02 [DEBUG] manager: No healthy servers during rebalance, aborting
2019/12/06 06:01:02 [DEBUG] manager: pinging server "s1 (Addr: /) (DC: )" failed: %!s(<nil>)
2019/12/06 06:01:02 [DEBUG] manager: No healthy servers during rebalance, aborting
--- PASS: TestServers_IsOffline (0.01s)
=== RUN   TestServers_FindServer
2019/12/06 06:01:02 [WARN] manager: No servers available
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s1"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s2"
--- PASS: TestServers_FindServer (0.00s)
=== RUN   TestServers_New
--- PASS: TestServers_New (0.00s)
=== RUN   TestServers_NotifyFailedServer
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s1"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s2"
--- PASS: TestServers_NotifyFailedServer (0.00s)
=== RUN   TestServers_NumServers
--- PASS: TestServers_NumServers (0.00s)
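
The manager lines throughout this package ("Rebalanced N servers, next active server is ..." and "cycled away from server ...") reflect a client-side server list that is periodically shuffled and then rotated off a failed entry, which is what the rebalance test below drives through all 100 fake servers. A minimal sketch of that idea follows, using a hypothetical serverList type rather than the real router/manager structures.

    package sketch

    import "math/rand"

    // serverList is a hypothetical stand-in for the manager's server slice.
    type serverList []string

    // rebalance shuffles the list so request load spreads across servers; the
    // element left in front is the "next active server" the DEBUG lines name.
    func (s serverList) rebalance(rng *rand.Rand) {
    	rng.Shuffle(len(s), func(i, j int) { s[i], s[j] = s[j], s[i] })
    }

    // cycle moves the current (failed) front server to the back — the
    // "cycled away from server" step repeated in the log below.
    func (s serverList) cycle() {
    	if len(s) < 2 {
    		return
    	}
    	first := s[0]
    	copy(s, s[1:])
    	s[len(s)-1] = first
    }
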
=== RUN   TestServers_RebalanceServers
2019/12/06 06:01:02 [DEBUG] manager: Rebalanced 100 servers, next active server is s00 (Addr: /) (DC: )
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s00"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s64"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s24"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s17"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s49"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s99"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s54"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s47"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s73"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s06"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s92"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s39"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s58"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s90"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s40"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s19"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s14"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s69"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s29"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s60"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s50"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s02"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s98"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s55"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s11"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s76"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s65"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s09"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s59"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s70"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s36"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s74"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s28"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s20"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s07"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s18"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s56"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s16"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s83"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s48"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s33"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s86"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s66"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s57"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s03"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s12"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s10"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s30"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s68"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s44"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s77"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s08"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s84"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s87"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s52"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s71"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s31"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s15"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s43"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s42"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s93"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s79"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s35"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s67"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s05"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s94"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s22"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s04"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s27"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s41"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s26"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s89"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s25"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s80"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s91"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s95"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s53"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s23"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s75"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s88"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s78"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s34"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s85"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s97"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s62"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s01"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s81"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s13"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s72"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s21"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s32"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s51"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s46"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s63"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s45"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s38"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s37"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s96"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s82"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s61"
2019/12/06 06:01:02 [DEBUG] manager: pinging server "s22 (Addr: /) (DC: )" failed: %!s(<nil>)
2019/12/06 06:01:02 [DEBUG] manager: Rebalanced 100 servers, next active server is s21 (Addr: /) (DC: )
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s21"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s45"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s41"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s47"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s13"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s98"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s80"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s09"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s88"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s57"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s43"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s87"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s93"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s82"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s48"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s07"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s76"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s36"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s34"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s70"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s14"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s31"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s11"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s39"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s58"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s44"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s51"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s18"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s95"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s12"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s26"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s35"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s20"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s72"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s00"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s68"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s61"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s37"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s55"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s78"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s69"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s49"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s38"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s90"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s92"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s24"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s99"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s53"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s16"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s30"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s52"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s46"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s56"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s74"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s63"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s96"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s81"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s10"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s50"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s84"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s28"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s66"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s42"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s25"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s03"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s85"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s08"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s65"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s05"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s83"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s77"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s29"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s89"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s62"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s73"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s01"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s32"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s54"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s67"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s17"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s04"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s23"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s64"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s79"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s40"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s02"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s94"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s19"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s59"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s86"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s71"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s33"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s97"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s27"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s91"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s60"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s15"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s06"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s75"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s22"
2019/12/06 06:01:02 [DEBUG] manager: pinging server "s53 (Addr: /) (DC: )" failed: %!s(<nil>)
2019/12/06 06:01:02 [DEBUG] manager: pinging server "s26 (Addr: /) (DC: )" failed: %!s(<nil>)
2019/12/06 06:01:02 [DEBUG] manager: Rebalanced 100 servers, next active server is s12 (Addr: /) (DC: )
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s12"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s25"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s42"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s00"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s86"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s79"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s59"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s39"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s52"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s22"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s23"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s38"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s20"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s46"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s68"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s96"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s29"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s94"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s31"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s50"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s11"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s66"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s35"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s01"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s37"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s57"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s92"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s30"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s33"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s43"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s82"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s32"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s10"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s06"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s90"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s49"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s74"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s45"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s75"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s07"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s95"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s15"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s70"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s61"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s02"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s60"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s18"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s65"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s55"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s41"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s72"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s71"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s27"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s89"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s64"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s73"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s28"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s81"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s48"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s05"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s13"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s91"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s19"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s09"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s78"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s03"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s93"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s76"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s88"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s21"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s56"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s58"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s04"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s24"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s69"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s16"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s36"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s34"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s98"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s84"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s54"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s17"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s63"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s14"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s80"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s51"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s44"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s77"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s97"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s99"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s62"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s67"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s83"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s40"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s85"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s08"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s87"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s47"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s53"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s26"
2019/12/06 06:01:02 [DEBUG] manager: Rebalanced 100 servers, next active server is s01 (Addr: /) (DC: )
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s01"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s23"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s88"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s64"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s44"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s37"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s30"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s32"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s91"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s36"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s70"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s61"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s48"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s03"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s08"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s90"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s92"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s78"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s38"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s09"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s20"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s40"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s47"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s16"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s77"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s53"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s71"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s24"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s41"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s87"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s00"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s98"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s35"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s43"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s33"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s97"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s49"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s22"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s80"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s85"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s45"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s74"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s42"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s89"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s54"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s63"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s65"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s26"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s94"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s62"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s11"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s83"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s06"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s75"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s02"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s51"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s81"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s69"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s57"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s27"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s84"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s50"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s18"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s73"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s72"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s21"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s95"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s52"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s55"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s28"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s04"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s07"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s67"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s12"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s34"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s93"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s13"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s15"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s10"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s56"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s29"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s17"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s96"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s31"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s79"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s19"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s14"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s76"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s46"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s59"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s39"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s99"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s66"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s86"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s05"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s25"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s68"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s82"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s60"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s58"
2019/12/06 06:01:02 [DEBUG] manager: Rebalanced 100 servers, next active server is s23 (Addr: /) (DC: )
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s23"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s22"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s14"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s47"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s77"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s46"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s15"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s90"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s86"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s01"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s17"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s10"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s96"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s84"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s29"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s66"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s28"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s26"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s83"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s98"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s93"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s32"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s19"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s34"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s44"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s97"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s88"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s18"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s00"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s36"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s56"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s03"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s09"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s11"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s55"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s38"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s40"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s63"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s54"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s89"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s85"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s43"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s12"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s35"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s78"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s80"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s13"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s49"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s08"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s79"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s30"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s39"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s74"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s50"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s60"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s61"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s24"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s67"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s73"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s68"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s07"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s71"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s04"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s59"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s25"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s70"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s41"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s62"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s37"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s51"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s02"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s76"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s99"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s57"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s06"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s05"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s52"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s75"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s64"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s72"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s91"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s21"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s31"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s20"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s95"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s48"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s82"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s42"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s81"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s27"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s87"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s65"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s33"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s16"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s45"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s53"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s94"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s58"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s92"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s69"
2019/12/06 06:01:02 [DEBUG] manager: Rebalanced 100 servers, next active server is s34 (Addr: /) (DC: )
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s34"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s40"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s66"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s78"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s68"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s94"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s27"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s69"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s87"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s51"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s64"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s42"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s93"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s91"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s60"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s56"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s98"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s26"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s06"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s50"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s77"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s07"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s03"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s75"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s53"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s82"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s25"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s28"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s29"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s71"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s83"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s09"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s31"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s67"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s62"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s16"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s17"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s35"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s58"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s47"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s96"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s22"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s99"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s32"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s81"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s05"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s20"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s24"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s59"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s43"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s00"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s85"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s30"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s21"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s46"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s72"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s41"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s44"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s97"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s33"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s23"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s65"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s14"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s38"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s95"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s45"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s88"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s37"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s79"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s18"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s61"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s01"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s54"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s57"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s74"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s73"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s84"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s92"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s89"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s04"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s80"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s13"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s49"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s63"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s12"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s15"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s36"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s55"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s90"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s11"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s86"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s48"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s52"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s76"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s19"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s02"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s10"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s70"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s08"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s39"
2019/12/06 06:01:02 [DEBUG] manager: pinging server "s27 (Addr: /) (DC: )" failed: %!s(<nil>)
2019/12/06 06:01:02 [DEBUG] manager: pinging server "s20 (Addr: /) (DC: )" failed: %!s(<nil>)
2019/12/06 06:01:02 [DEBUG] manager: Rebalanced 100 servers, next active server is s10 (Addr: /) (DC: )
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s10"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s92"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s48"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s02"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s36"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s01"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s69"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s63"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s18"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s31"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s32"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s13"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s43"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s70"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s28"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s40"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s94"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s99"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s96"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s90"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s49"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s62"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s60"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s66"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s78"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s64"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s85"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s68"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s45"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s39"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s25"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s73"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s88"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s67"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s89"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s15"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s07"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s44"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s84"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s09"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s93"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s37"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s83"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s38"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s79"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s47"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s77"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s72"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s46"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s11"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s95"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s12"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s33"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s74"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s81"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s29"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s59"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s57"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s19"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s75"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s26"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s35"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s17"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s00"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s61"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s56"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s16"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s76"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s87"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s30"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s21"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s71"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s98"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s08"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s42"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s06"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s22"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s24"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s34"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s52"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s82"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s04"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s65"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s58"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s80"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s86"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s23"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s50"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s54"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s03"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s51"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s97"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s53"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s55"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s41"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s91"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s05"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s14"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s27"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s20"
2019/12/06 06:01:02 [DEBUG] manager: pinging server "s54 (Addr: /) (DC: )" failed: %!s(<nil>)
2019/12/06 06:01:02 [DEBUG] manager: pinging server "s83 (Addr: /) (DC: )" failed: %!s(<nil>)
2019/12/06 06:01:02 [DEBUG] manager: pinging server "s26 (Addr: /) (DC: )" failed: %!s(<nil>)
2019/12/06 06:01:02 [DEBUG] manager: Rebalanced 100 servers, next active server is s99 (Addr: /) (DC: )
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s99"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s72"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s08"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s29"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s59"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s89"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s69"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s47"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s13"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s56"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s64"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s28"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s88"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s93"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s65"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s34"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s97"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s82"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s24"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s79"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s55"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s12"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s78"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s63"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s05"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s09"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s46"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s43"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s38"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s92"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s23"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s76"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s06"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s60"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s20"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s96"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s53"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s67"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s21"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s57"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s19"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s77"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s27"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s62"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s95"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s40"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s86"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s74"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s14"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s68"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s01"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s41"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s04"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s70"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s35"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s10"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s03"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s50"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s80"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s39"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s58"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s07"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s17"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s71"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s98"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s32"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s33"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s91"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s11"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s16"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s66"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s18"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s85"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s61"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s25"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s45"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s30"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s73"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s22"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s52"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s36"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s02"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s90"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s75"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s31"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s00"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s81"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s94"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s37"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s44"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s48"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s15"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s84"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s87"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s42"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s49"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s51"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s54"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s83"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s26"
2019/12/06 06:01:02 [DEBUG] manager: Rebalanced 100 servers, next active server is s49 (Addr: /) (DC: )
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s49"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s57"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s64"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s55"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s14"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s33"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s50"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s70"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s82"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s65"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s81"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s26"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s92"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s40"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s51"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s52"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s85"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s08"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s42"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s48"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s25"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s12"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s34"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s95"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s89"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s61"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s43"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s72"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s29"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s15"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s60"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s45"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s97"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s32"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s80"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s90"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s78"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s10"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s24"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s01"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s71"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s00"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s76"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s83"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s94"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s59"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s04"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s87"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s93"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s06"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s67"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s63"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s56"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s20"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s86"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s36"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s66"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s02"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s18"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s75"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s23"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s91"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s77"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s37"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s58"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s88"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s07"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s21"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s44"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s11"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s79"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s62"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s31"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s41"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s30"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s96"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s27"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s35"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s98"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s73"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s69"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s84"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s13"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s99"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s09"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s46"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s68"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s38"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s28"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s74"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s03"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s53"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s19"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s39"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s22"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s05"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s16"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s47"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s17"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s54"
2019/12/06 06:01:02 [DEBUG] manager: Rebalanced 100 servers, next active server is s75 (Addr: /) (DC: )
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s75"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s08"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s06"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s70"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s68"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s49"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s46"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s19"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s47"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s71"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s07"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s37"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s48"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s55"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s24"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s60"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s69"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s14"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s96"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s23"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s17"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s42"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s03"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s89"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s93"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s94"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s45"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s90"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s20"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s12"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s73"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s58"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s32"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s22"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s72"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s38"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s85"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s59"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s83"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s99"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s00"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s11"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s86"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s52"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s25"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s01"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s91"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s39"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s82"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s63"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s34"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s79"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s65"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s33"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s51"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s02"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s04"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s05"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s43"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s29"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s76"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s92"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s28"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s26"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s13"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s62"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s54"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s67"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s81"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s98"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s18"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s88"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s64"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s35"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s41"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s15"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s31"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s50"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s74"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s66"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s77"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s84"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s30"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s56"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s27"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s97"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s53"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s57"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s80"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s16"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s95"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s36"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s10"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s78"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s61"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s40"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s09"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s44"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s87"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s21"
2019/12/06 06:01:02 [DEBUG] manager: pinging server "s35 (Addr: /) (DC: )" failed: %!s(<nil>)
2019/12/06 06:01:02 [DEBUG] manager: pinging server "s41 (Addr: /) (DC: )" failed: %!s(<nil>)
2019/12/06 06:01:02 [DEBUG] manager: Rebalanced 100 servers, next active server is s68 (Addr: /) (DC: )
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s68"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s93"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s90"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s25"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s00"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s38"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s26"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s09"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s11"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s92"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s74"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s52"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s63"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s04"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s55"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s01"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s32"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s99"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s46"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s71"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s75"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s88"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s12"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s05"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s96"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s84"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s58"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s19"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s78"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s95"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s47"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s72"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s60"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s22"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s23"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s65"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s27"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s45"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s44"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s03"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s53"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s21"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s91"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s10"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s28"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s59"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s31"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s51"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s08"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s94"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s42"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s16"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s56"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s24"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s40"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s67"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s79"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s80"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s77"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s50"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s87"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s61"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s39"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s48"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s43"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s81"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s82"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s98"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s13"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s20"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s54"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s02"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s64"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s89"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s73"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s69"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s17"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s29"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s49"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s83"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s34"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s30"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s06"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s62"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s70"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s57"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s14"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s33"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s66"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s18"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s36"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s37"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s15"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s07"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s85"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s97"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s76"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s86"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s35"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s41"
2019/12/06 06:01:02 [DEBUG] manager: Rebalanced 100 servers, next active server is s67 (Addr: /) (DC: )
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s67"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s54"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s57"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s36"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s43"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s48"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s30"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s14"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s22"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s79"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s35"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s97"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s46"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s77"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s08"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s39"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s32"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s41"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s58"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s90"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s31"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s59"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s13"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s99"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s49"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s47"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s10"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s44"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s33"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s85"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s11"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s55"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s03"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s23"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s95"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s75"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s84"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s69"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s66"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s21"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s87"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s68"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s61"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s78"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s07"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s62"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s93"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s25"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s15"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s16"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s96"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s91"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s45"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s12"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s81"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s88"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s71"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s80"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s73"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s34"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s98"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s63"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s82"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s06"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s05"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s18"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s50"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s70"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s04"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s56"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s51"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s89"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s40"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s92"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s09"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s29"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s37"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s74"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s26"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s94"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s53"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s28"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s27"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s72"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s65"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s38"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s76"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s42"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s24"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s17"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s64"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s00"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s86"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s52"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s60"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s02"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s20"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s01"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s19"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s83"
2019/12/06 06:01:02 [DEBUG] manager: pinging server "s61 (Addr: /) (DC: )" failed: %!s(<nil>)
2019/12/06 06:01:02 [DEBUG] manager: Rebalanced 100 servers, next active server is s30 (Addr: /) (DC: )
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s30"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s63"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s96"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s19"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s41"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s57"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s02"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s42"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s83"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s93"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s21"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s40"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s66"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s90"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s29"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s91"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s18"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s58"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s56"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s47"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s86"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s26"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s92"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s06"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s98"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s87"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s70"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s64"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s31"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s71"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s65"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s53"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s28"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s43"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s03"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s84"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s54"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s95"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s59"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s11"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s36"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s20"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s34"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s04"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s97"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s51"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s15"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s27"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s45"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s52"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s77"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s33"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s94"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s79"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s46"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s75"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s88"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s76"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s81"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s69"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s82"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s12"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s62"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s49"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s50"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s17"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s60"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s13"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s68"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s08"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s24"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s01"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s37"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s89"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s48"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s39"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s99"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s44"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s32"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s67"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s80"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s38"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s35"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s73"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s22"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s00"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s85"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s72"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s09"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s16"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s05"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s10"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s23"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s78"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s07"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s14"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s55"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s74"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s25"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s61"
2019/12/06 06:01:02 [DEBUG] manager: pinging server "s11 (Addr: /) (DC: )" failed: %!s(<nil>)
2019/12/06 06:01:02 [DEBUG] manager: pinging server "s80 (Addr: /) (DC: )" failed: %!s(<nil>)
2019/12/06 06:01:02 [DEBUG] manager: Rebalanced 100 servers, next active server is s44 (Addr: /) (DC: )
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s44"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s37"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s54"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s73"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s21"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s01"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s35"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s56"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s75"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s42"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s62"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s86"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s00"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s93"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s36"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s67"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s49"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s77"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s09"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s40"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s12"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s83"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s39"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s03"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s10"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s24"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s20"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s18"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s53"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s06"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s52"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s27"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s02"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s64"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s58"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s96"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s66"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s19"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s63"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s29"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s13"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s91"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s57"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s31"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s33"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s30"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s99"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s26"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s41"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s22"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s81"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s94"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s84"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s69"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s28"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s72"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s98"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s76"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s92"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s97"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s38"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s50"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s05"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s47"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s87"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s51"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s55"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s14"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s68"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s74"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s43"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s95"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s48"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s34"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s65"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s79"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s70"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s45"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s78"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s08"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s23"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s16"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s60"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s59"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s07"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s82"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s90"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s85"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s17"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s61"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s25"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s15"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s46"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s71"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s04"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s89"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s32"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s88"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s11"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s80"
2019/12/06 06:01:02 [DEBUG] manager: pinging server "s30 (Addr: /) (DC: )" failed: %!s(<nil>)
2019/12/06 06:01:02 [DEBUG] manager: pinging server "s56 (Addr: /) (DC: )" failed: %!s(<nil>)
2019/12/06 06:01:02 [DEBUG] manager: Rebalanced 100 servers, next active server is s43 (Addr: /) (DC: )
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s43"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s88"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s54"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s49"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s70"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s00"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s25"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s89"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s27"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s33"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s96"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s17"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s81"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s79"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s73"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s68"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s18"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s92"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s76"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s74"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s71"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s98"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s15"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s87"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s83"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s36"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s38"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s91"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s67"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s53"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s12"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s75"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s63"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s28"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s13"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s37"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s61"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s62"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s55"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s35"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s85"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s47"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s31"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s29"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s19"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s14"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s86"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s97"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s77"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s64"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s99"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s45"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s05"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s24"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s40"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s90"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s06"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s93"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s20"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s50"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s16"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s08"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s72"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s09"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s32"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s01"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s11"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s57"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s58"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s51"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s95"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s03"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s21"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s23"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s46"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s39"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s59"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s44"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s69"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s82"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s48"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s26"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s04"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s80"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s42"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s52"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s94"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s66"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s07"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s84"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s22"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s60"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s65"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s10"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s34"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s41"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s02"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s78"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s30"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s56"
2019/12/06 06:01:02 [DEBUG] manager: Rebalanced 100 servers, next active server is s99 (Addr: /) (DC: )
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s99"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s76"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s13"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s38"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s64"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s24"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s98"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s58"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s31"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s20"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s77"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s48"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s02"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s41"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s35"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s43"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s67"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s17"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s68"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s07"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s09"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s85"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s74"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s82"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s23"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s45"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s19"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s27"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s89"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s51"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s42"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s08"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s37"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s18"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s73"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s00"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s30"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s12"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s70"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s78"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s94"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s52"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s16"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s40"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s10"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s47"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s66"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s36"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s29"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s65"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s71"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s92"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s06"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s90"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s75"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s62"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s49"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s44"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s03"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s93"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s33"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s55"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s83"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s32"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s61"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s05"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s22"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s60"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s59"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s95"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s46"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s69"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s80"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s39"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s88"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s72"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s63"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s01"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s86"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s84"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s79"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s25"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s15"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s53"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s96"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s81"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s57"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s34"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s26"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s28"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s11"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s54"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s04"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s56"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s14"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s91"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s21"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s50"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s87"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s97"
2019/12/06 06:01:02 [DEBUG] manager: pinging server "s59 (Addr: /) (DC: )" failed: %!s(<nil>)
2019/12/06 06:01:02 [DEBUG] manager: Rebalanced 100 servers, next active server is s21 (Addr: /) (DC: )
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s21"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s93"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s75"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s34"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s24"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s64"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s10"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s76"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s05"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s37"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s98"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s23"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s63"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s87"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s56"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s06"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s03"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s19"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s18"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s25"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s85"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s51"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s27"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s74"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s55"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s40"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s52"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s89"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s30"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s09"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s39"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s65"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s83"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s88"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s15"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s54"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s84"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s69"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s82"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s96"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s99"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s14"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s66"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s72"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s41"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s62"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s70"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s68"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s58"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s50"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s42"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s53"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s47"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s08"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s44"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s11"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s36"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s16"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s04"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s13"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s86"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s32"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s79"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s61"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s12"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s73"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s07"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s91"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s33"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s01"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s57"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s46"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s94"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s95"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s38"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s00"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s29"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s28"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s22"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s60"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s31"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s78"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s80"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s45"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s35"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s97"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s77"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s81"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s20"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s26"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s71"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s48"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s67"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s90"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s17"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s92"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s02"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s43"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s49"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s59"
2019/12/06 06:01:02 [DEBUG] manager: Rebalanced 100 servers, next active server is s65 (Addr: /) (DC: )
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s65"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s70"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s91"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s93"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s84"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s24"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s07"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s49"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s96"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s69"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s18"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s80"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s43"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s35"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s01"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s72"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s29"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s64"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s78"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s16"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s36"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s22"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s55"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s38"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s31"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s10"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s59"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s46"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s97"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s92"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s48"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s68"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s32"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s94"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s81"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s33"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s50"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s98"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s62"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s76"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s42"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s28"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s39"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s30"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s00"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s74"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s06"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s51"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s56"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s82"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s73"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s75"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s95"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s23"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s60"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s14"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s66"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s52"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s71"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s57"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s04"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s88"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s63"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s12"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s99"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s15"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s40"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s02"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s44"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s21"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s83"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s67"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s53"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s05"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s54"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s17"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s25"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s85"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s87"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s77"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s58"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s89"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s03"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s20"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s86"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s90"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s79"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s27"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s13"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s61"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s19"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s41"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s09"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s45"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s47"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s11"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s37"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s26"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s08"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s34"
2019/12/06 06:01:02 [DEBUG] manager: pinging server "s99 (Addr: /) (DC: )" failed: %!s(<nil>)
2019/12/06 06:01:02 [DEBUG] manager: pinging server "s08 (Addr: /) (DC: )" failed: %!s(<nil>)
2019/12/06 06:01:02 [DEBUG] manager: pinging server "s13 (Addr: /) (DC: )" failed: %!s(<nil>)
2019/12/06 06:01:02 [DEBUG] manager: pinging server "s79 (Addr: /) (DC: )" failed: %!s(<nil>)
2019/12/06 06:01:02 [DEBUG] manager: pinging server "s37 (Addr: /) (DC: )" failed: %!s(<nil>)
2019/12/06 06:01:02 [DEBUG] manager: Rebalanced 100 servers, next active server is s72 (Addr: /) (DC: )
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s72"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s82"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s73"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s55"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s35"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s09"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s11"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s91"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s44"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s58"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s03"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s32"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s54"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s48"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s25"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s28"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s42"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s95"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s94"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s89"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s63"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s52"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s53"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s78"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s12"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s51"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s00"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s97"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s33"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s41"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s88"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s74"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s90"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s40"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s43"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s83"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s21"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s46"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s02"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s16"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s87"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s20"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s15"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s62"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s24"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s36"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s47"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s86"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s67"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s85"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s64"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s75"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s39"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s26"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s10"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s04"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s01"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s98"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s29"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s77"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s45"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s56"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s49"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s81"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s06"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s23"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s80"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s93"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s57"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s38"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s61"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s68"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s14"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s31"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s34"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s65"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s07"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s70"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s60"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s76"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s19"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s59"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s69"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s17"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s84"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s27"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s71"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s22"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s66"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s05"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s18"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s30"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s96"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s50"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s92"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s99"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s08"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s13"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s79"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s37"
2019/12/06 06:01:02 [DEBUG] manager: Rebalanced 100 servers, next active server is s18 (Addr: /) (DC: )
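[Editor's note] The repeating blocks above ("cycled away from server ..." runs ending in a few ping failures and a "Rebalanced 100 servers, next active server is ..." line) come from a test exercising the client-side server manager's round-robin rebalancing over 100 synthetic servers. The log suggests the pattern: shuffle the list, cycle entries whose ping fails, and report the surviving front entry as the next active server. The following rough sketch illustrates that reading; the names serverList, cycleServer, rebalance and the ping callback are hypothetical stand-ins, not the consul API.

    package main

    import (
        "fmt"
        "math/rand"
    )

    // serverList is a hypothetical stand-in for the manager's internal list.
    type serverList struct {
        servers []string
    }

    // cycleServer moves the current front server to the back of the list,
    // mirroring the "cycled away from server" lines in the log.
    func (l *serverList) cycleServer() {
        if len(l.servers) < 2 {
            return
        }
        first := l.servers[0]
        l.servers = append(l.servers[1:], first)
        fmt.Printf("[DEBUG] manager: cycled away from server %q\n", first)
    }

    // rebalance shuffles the list, cycles past servers whose ping fails, and
    // reports the surviving front entry, as in the "Rebalanced ..." lines.
    func (l *serverList) rebalance(ping func(string) bool) {
        rand.Shuffle(len(l.servers), func(i, j int) {
            l.servers[i], l.servers[j] = l.servers[j], l.servers[i]
        })
        for i := 0; i < len(l.servers); i++ {
            if ping(l.servers[0]) {
                break
            }
            fmt.Printf("[DEBUG] manager: pinging server %q failed\n", l.servers[0])
            l.cycleServer()
        }
        fmt.Printf("[DEBUG] manager: Rebalanced %d servers, next active server is %s\n",
            len(l.servers), l.servers[0])
    }

    func main() {
        l := &serverList{}
        for i := 0; i < 100; i++ {
            l.servers = append(l.servers, fmt.Sprintf("s%02d", i))
        }
        // Pretend a couple of servers fail their ping, as in the test output above.
        bad := map[string]bool{"s11": true, "s80": true}
        l.rebalance(func(s string) bool { return !bad[s] })
    }

This is only a plausible model of what the test is doing; the exact order in which the real test cycles and pings servers may differ.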
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s18"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s96"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s97"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s42"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s29"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s72"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s23"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s64"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s83"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s54"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s93"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s85"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s30"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s25"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s08"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s51"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s62"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s02"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s74"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s60"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s10"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s36"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s81"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s46"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s27"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s87"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s31"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s12"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s94"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s56"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s76"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s95"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s21"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s73"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s79"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s32"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s22"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s48"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s14"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s61"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s04"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s50"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s47"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s69"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s67"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s84"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s71"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s59"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s11"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s13"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s38"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s63"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s92"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s37"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s40"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s65"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s24"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s07"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s06"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s89"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s43"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s90"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s17"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s15"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s00"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s70"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s53"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s98"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s16"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s45"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s20"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s33"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s75"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s91"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s78"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s99"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s01"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s28"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s88"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s44"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s77"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s55"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s80"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s35"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s86"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s68"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s39"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s03"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s09"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s05"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s57"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s19"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s58"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s26"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s82"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s52"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s49"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s34"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s41"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s66"
2019/12/06 06:01:02 [DEBUG] manager: pinging server "s36 (Addr: /) (DC: )" failed: %!s(<nil>)
2019/12/06 06:01:02 [DEBUG] manager: pinging server "s64 (Addr: /) (DC: )" failed: %!s(<nil>)
2019/12/06 06:01:02 [DEBUG] manager: pinging server "s26 (Addr: /) (DC: )" failed: %!s(<nil>)
2019/12/06 06:01:02 [DEBUG] manager: pinging server "s23 (Addr: /) (DC: )" failed: %!s(<nil>)
2019/12/06 06:01:02 [DEBUG] manager: Rebalanced 100 servers, next active server is s63 (Addr: /) (DC: )
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s63"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s85"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s82"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s37"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s43"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s84"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s75"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s92"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s19"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s39"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s13"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s73"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s53"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s07"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s70"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s18"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s99"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s56"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s60"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s09"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s00"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s86"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s93"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s32"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s50"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s51"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s57"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s94"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s47"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s65"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s40"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s20"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s80"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s91"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s77"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s30"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s68"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s89"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s38"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s17"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s97"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s28"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s45"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s41"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s79"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s76"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s34"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s98"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s90"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s27"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s78"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s15"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s35"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s96"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s04"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s31"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s59"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s05"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s67"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s58"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s14"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s71"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s54"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s03"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s74"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s33"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s46"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s16"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s02"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s88"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s24"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s08"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s55"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s52"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s25"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s11"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s22"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s29"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s69"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s62"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s95"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s49"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s21"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s10"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s48"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s66"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s87"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s72"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s42"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s12"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s06"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s61"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s81"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s01"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s44"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s83"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s36"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s64"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s26"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s23"
2019/12/06 06:01:02 [DEBUG] manager: pinging server "s37 (Addr: /) (DC: )" failed: %!s(<nil>)
2019/12/06 06:01:02 [DEBUG] manager: pinging server "s61 (Addr: /) (DC: )" failed: %!s(<nil>)
2019/12/06 06:01:02 [DEBUG] manager: Rebalanced 100 servers, next active server is s65 (Addr: /) (DC: )
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s65"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s81"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s78"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s04"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s91"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s88"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s95"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s17"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s30"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s92"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s94"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s70"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s93"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s03"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s13"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s01"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s29"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s99"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s69"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s43"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s07"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s34"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s36"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s71"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s22"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s53"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s28"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s87"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s54"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s51"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s26"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s20"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s55"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s74"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s39"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s12"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s98"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s56"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s23"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s24"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s96"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s14"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s40"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s18"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s45"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s42"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s11"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s52"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s82"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s47"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s59"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s58"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s66"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s83"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s49"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s21"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s77"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s67"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s84"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s57"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s89"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s27"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s10"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s19"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s63"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s80"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s50"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s38"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s05"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s16"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s97"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s48"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s31"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s02"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s85"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s15"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s86"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s09"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s73"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s62"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s68"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s79"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s35"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s32"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s60"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s33"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s06"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s46"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s44"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s64"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s75"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s41"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s90"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s72"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s00"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s25"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s76"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s08"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s37"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s61"
2019/12/06 06:01:02 [DEBUG] manager: pinging server "s55 (Addr: /) (DC: )" failed: %!s(<nil>)
2019/12/06 06:01:02 [DEBUG] manager: pinging server "s37 (Addr: /) (DC: )" failed: %!s(<nil>)
2019/12/06 06:01:02 [DEBUG] manager: Rebalanced 100 servers, next active server is s09 (Addr: /) (DC: )
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s09"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s97"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s92"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s06"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s20"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s19"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s21"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s73"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s18"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s63"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s58"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s14"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s34"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s25"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s99"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s61"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s23"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s71"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s49"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s44"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s52"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s45"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s33"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s04"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s77"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s56"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s03"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s40"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s26"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s98"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s11"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s85"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s84"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s01"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s89"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s94"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s95"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s76"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s51"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s13"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s90"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s66"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s27"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s42"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s74"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s32"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s46"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s54"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s72"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s81"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s86"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s38"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s43"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s47"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s28"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s30"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s78"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s65"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s79"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s07"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s12"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s62"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s16"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s96"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s31"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s50"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s68"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s93"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s60"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s91"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s10"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s48"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s05"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s67"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s08"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s64"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s36"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s53"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s87"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s02"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s22"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s41"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s80"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s59"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s24"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s35"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s82"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s15"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s00"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s75"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s70"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s88"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s39"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s29"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s17"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s83"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s69"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s57"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s55"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s37"
2019/12/06 06:01:02 [DEBUG] manager: Rebalanced 100 servers, next active server is s59 (Addr: /) (DC: )
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s59"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s66"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s17"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s94"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s16"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s29"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s24"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s78"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s10"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s72"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s54"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s62"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s19"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s28"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s67"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s04"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s93"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s97"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s52"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s41"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s84"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s40"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s07"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s50"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s70"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s23"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s15"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s89"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s43"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s35"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s18"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s91"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s88"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s95"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s48"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s96"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s06"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s75"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s46"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s58"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s98"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s21"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s81"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s09"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s05"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s69"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s86"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s55"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s08"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s39"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s25"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s90"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s12"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s64"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s68"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s76"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s42"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s77"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s61"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s22"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s74"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s49"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s13"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s79"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s38"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s60"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s02"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s36"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s51"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s03"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s83"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s20"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s30"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s14"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s82"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s26"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s65"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s99"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s45"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s32"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s37"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s11"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s85"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s27"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s31"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s71"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s73"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s53"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s01"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s56"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s33"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s00"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s92"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s87"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s63"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s80"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s57"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s47"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s44"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s34"
2019/12/06 06:01:02 [DEBUG] manager: pinging server "s74 (Addr: /) (DC: )" failed: %!s(<nil>)
2019/12/06 06:01:02 [DEBUG] manager: Rebalanced 100 servers, next active server is s48 (Addr: /) (DC: )
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s48"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s27"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s62"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s59"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s19"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s55"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s09"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s03"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s87"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s24"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s67"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s73"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s29"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s71"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s84"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s02"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s41"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s37"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s96"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s16"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s22"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s49"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s65"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s40"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s72"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s42"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s31"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s18"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s60"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s98"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s94"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s79"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s80"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s32"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s04"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s21"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s68"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s52"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s54"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s26"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s53"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s89"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s45"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s25"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s12"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s83"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s23"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s39"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s78"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s91"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s51"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s01"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s46"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s69"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s85"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s88"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s44"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s17"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s30"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s47"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s61"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s43"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s70"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s81"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s50"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s20"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s11"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s28"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s77"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s75"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s64"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s95"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s90"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s99"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s07"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s36"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s34"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s76"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s33"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s58"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s13"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s57"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s05"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s35"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s00"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s92"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s10"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s14"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s97"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s08"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s63"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s38"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s15"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s56"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s86"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s66"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s93"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s82"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s06"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s74"
2019/12/06 06:01:02 [DEBUG] manager: pinging server "s85 (Addr: /) (DC: )" failed: %!s(<nil>)
2019/12/06 06:01:02 [DEBUG] manager: Rebalanced 100 servers, next active server is s40 (Addr: /) (DC: )
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s40"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s50"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s23"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s32"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s43"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s72"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s39"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s62"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s33"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s55"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s60"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s00"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s01"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s51"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s21"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s84"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s58"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s27"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s02"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s99"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s30"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s65"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s70"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s90"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s16"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s12"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s09"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s41"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s26"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s38"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s05"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s11"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s54"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s42"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s83"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s95"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s07"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s82"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s56"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s75"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s44"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s78"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s64"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s87"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s61"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s74"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s28"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s45"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s66"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s17"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s36"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s15"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s37"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s48"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s03"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s92"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s13"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s19"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s73"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s47"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s20"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s10"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s93"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s71"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s79"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s86"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s67"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s69"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s57"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s08"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s68"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s18"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s76"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s97"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s81"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s98"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s49"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s46"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s24"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s94"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s89"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s53"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s06"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s77"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s34"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s63"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s35"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s88"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s52"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s14"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s29"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s22"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s91"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s59"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s31"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s80"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s04"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s96"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s25"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s85"
2019/12/06 06:01:02 [DEBUG] manager: Rebalanced 100 servers, next active server is s09 (Addr: /) (DC: )
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s09"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s91"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s64"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s54"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s94"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s59"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s05"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s52"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s11"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s57"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s83"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s49"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s69"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s18"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s82"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s06"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s45"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s29"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s53"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s73"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s32"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s93"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s33"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s51"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s75"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s48"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s56"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s79"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s58"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s36"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s97"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s26"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s92"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s65"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s72"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s62"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s47"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s13"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s10"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s38"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s89"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s63"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s43"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s31"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s67"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s70"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s01"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s39"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s77"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s30"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s25"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s86"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s34"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s46"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s71"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s12"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s55"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s90"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s27"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s61"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s03"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s16"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s22"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s23"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s04"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s24"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s40"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s00"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s84"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s15"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s98"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s88"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s41"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s85"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s68"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s28"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s95"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s21"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s20"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s02"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s99"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s44"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s50"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s81"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s60"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s74"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s78"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s17"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s76"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s66"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s80"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s87"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s96"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s07"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s19"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s37"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s42"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s14"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s35"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s08"
2019/12/06 06:01:02 [DEBUG] manager: Rebalanced 100 servers, next active server is s76 (Addr: /) (DC: )
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s76"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s73"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s56"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s81"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s49"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s90"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s18"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s13"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s63"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s00"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s70"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s45"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s14"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s44"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s12"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s07"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s05"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s71"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s51"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s84"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s21"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s99"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s04"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s96"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s33"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s40"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s83"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s85"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s69"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s22"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s74"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s65"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s11"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s78"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s72"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s10"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s91"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s42"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s59"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s41"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s94"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s88"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s79"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s29"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s93"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s35"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s15"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s60"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s62"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s27"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s24"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s46"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s52"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s50"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s43"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s97"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s31"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s68"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s06"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s61"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s38"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s26"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s02"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s47"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s34"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s57"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s48"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s17"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s28"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s03"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s80"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s16"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s66"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s77"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s20"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s95"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s98"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s32"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s89"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s36"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s09"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s55"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s75"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s92"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s01"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s54"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s67"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s37"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s58"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s08"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s64"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s53"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s39"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s19"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s25"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s30"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s86"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s23"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s82"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s87"
2019/12/06 06:01:02 [DEBUG] manager: pinging server "s77 (Addr: /) (DC: )" failed: %!s(<nil>)
2019/12/06 06:01:02 [DEBUG] manager: pinging server "s46 (Addr: /) (DC: )" failed: %!s(<nil>)
2019/12/06 06:01:02 [DEBUG] manager: pinging server "s88 (Addr: /) (DC: )" failed: %!s(<nil>)
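The trailing "%!s(<nil>)" on the pinging-failed lines is Go's fmt package flagging that a nil value was formatted with the %s verb; in other words, the simulated ping reported failure without carrying a concrete error object. A minimal reproduction of that formatting behaviour (the message layout is copied from the log; the surrounding code is an assumption):

package main

import "fmt"

func main() {
	var err error // nil: no concrete error value was produced

	// Passing a nil interface to %s makes fmt emit "%!s(<nil>)", which is
	// exactly the suffix seen after "failed:" in the DEBUG lines above.
	fmt.Printf("[DEBUG] manager: pinging server %q failed: %s\n",
		"s88 (Addr: /) (DC: )", err)
	// Output: [DEBUG] manager: pinging server "s88 (Addr: /) (DC: )" failed: %!s(<nil>)
}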
2019/12/06 06:01:02 [DEBUG] manager: Rebalanced 100 servers, next active server is s47 (Addr: /) (DC: )
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s47"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s66"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s36"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s02"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s33"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s63"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s44"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s75"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s93"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s87"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s07"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s06"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s83"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s81"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s58"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s70"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s39"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s91"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s80"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s20"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s32"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s60"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s98"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s61"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s41"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s67"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s53"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s79"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s68"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s31"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s42"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s25"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s30"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s21"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s86"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s62"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s97"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s24"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s85"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s95"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s04"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s14"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s55"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s69"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s73"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s28"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s40"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s09"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s56"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s29"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s13"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s43"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s45"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s96"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s90"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s92"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s59"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s57"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s16"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s00"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s27"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s18"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s94"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s49"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s37"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s15"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s76"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s65"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s89"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s99"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s84"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s11"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s72"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s08"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s48"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s64"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s22"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s12"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s26"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s52"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s74"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s23"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s78"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s19"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s34"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s01"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s17"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s50"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s05"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s54"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s71"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s03"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s10"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s35"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s82"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s38"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s51"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s77"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s46"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s88"
2019/12/06 06:01:02 [DEBUG] manager: Rebalanced 100 servers, next active server is s11 (Addr: /) (DC: )
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s11"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s13"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s15"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s80"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s58"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s61"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s78"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s17"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s37"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s53"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s94"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s88"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s39"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s91"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s42"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s83"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s64"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s03"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s14"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s24"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s32"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s63"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s26"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s02"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s46"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s07"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s59"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s16"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s05"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s93"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s22"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s09"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s06"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s86"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s87"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s18"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s98"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s01"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s79"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s47"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s29"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s19"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s92"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s51"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s33"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s82"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s65"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s12"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s71"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s89"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s90"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s35"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s38"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s20"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s08"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s49"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s45"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s27"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s70"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s81"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s48"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s36"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s75"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s95"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s04"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s66"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s67"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s54"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s96"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s62"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s84"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s60"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s73"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s72"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s77"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s69"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s50"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s44"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s52"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s85"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s34"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s31"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s28"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s10"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s76"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s55"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s68"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s43"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s30"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s56"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s41"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s57"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s99"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s40"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s74"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s21"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s25"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s97"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s00"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s23"
2019/12/06 06:01:02 [DEBUG] manager: Rebalanced 100 servers, next active server is s32 (Addr: /) (DC: )
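
These DEBUG lines appear to come from Consul's server-manager unit tests run during the build: the manager keeps a rotating list of servers (here clearly mocks, given the empty "Addr: /" and "DC:" fields), "cycles away" from the head of the list after each use, and periodically reshuffles the whole list, logging "Rebalanced N servers, next active server is ...". The standalone Go sketch below only approximates that rotate-and-shuffle pattern to make the log easier to read; the types, method names and output format are illustrative assumptions, not Consul's actual agent/router code.

    // Illustrative sketch of the rotate-and-shuffle behaviour suggested by the
    // log above. Not Consul's implementation; names are made up for clarity.
    package main

    import (
            "fmt"
            "math/rand"
    )

    type server struct{ name string }

    type manager struct{ servers []*server }

    // cycle moves the current head of the list to the tail, so the next request
    // goes to a different server ("cycled away from server ...").
    func (m *manager) cycle() {
            if len(m.servers) < 2 {
                    return
            }
            head := m.servers[0]
            m.servers = append(m.servers[1:], head)
            fmt.Printf("[DEBUG] manager: cycled away from server %q\n", head.name)
    }

    // rebalance shuffles the list so load spreads across servers
    // ("Rebalanced N servers, next active server is ...").
    func (m *manager) rebalance() {
            rand.Shuffle(len(m.servers), func(i, j int) {
                    m.servers[i], m.servers[j] = m.servers[j], m.servers[i]
            })
            fmt.Printf("[DEBUG] manager: Rebalanced %d servers, next active server is %s\n",
                    len(m.servers), m.servers[0].name)
    }

    func main() {
            m := &manager{}
            for i := 0; i < 100; i++ {
                    m.servers = append(m.servers, &server{name: fmt.Sprintf("s%02d", i)})
            }
            // Two passes: cycle through every server, then reshuffle, mirroring
            // the repeated cycle/rebalance blocks seen in the log.
            for pass := 0; pass < 2; pass++ {
                    for range m.servers {
                            m.cycle()
                    }
                    m.rebalance()
            }
    }
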
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s32"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s65"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s39"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s63"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s42"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s34"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s07"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s52"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s09"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s98"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s76"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s10"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s68"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s51"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s26"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s91"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s55"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s19"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s48"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s79"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s60"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s92"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s38"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s13"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s54"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s97"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s67"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s33"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s05"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s77"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s18"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s99"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s62"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s80"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s64"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s24"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s71"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s74"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s49"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s28"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s73"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s85"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s69"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s37"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s35"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s86"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s61"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s41"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s46"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s43"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s44"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s93"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s84"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s14"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s08"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s95"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s56"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s36"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s70"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s25"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s20"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s78"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s21"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s11"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s30"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s23"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s82"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s00"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s17"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s12"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s94"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s53"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s06"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s83"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s15"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s75"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s88"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s47"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s29"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s31"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s96"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s90"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s40"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s59"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s81"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s01"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s58"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s22"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s89"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s66"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s27"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s04"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s72"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s16"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s45"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s57"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s50"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s02"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s87"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s03"
2019/12/06 06:01:02 [DEBUG] manager: Rebalanced 100 servers, next active server is s58 (Addr: /) (DC: )
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s58"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s10"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s93"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s69"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s55"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s13"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s89"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s11"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s51"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s42"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s63"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s71"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s14"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s61"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s47"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s21"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s15"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s79"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s59"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s22"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s50"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s17"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s81"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s87"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s27"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s97"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s78"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s04"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s75"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s01"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s23"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s74"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s12"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s31"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s05"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s30"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s28"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s66"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s62"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s39"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s82"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s60"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s24"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s02"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s88"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s73"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s06"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s18"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s77"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s03"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s65"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s67"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s96"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s00"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s90"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s54"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s08"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s07"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s45"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s57"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s29"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s16"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s20"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s38"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s43"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s34"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s98"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s76"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s80"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s48"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s09"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s49"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s32"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s33"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s52"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s95"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s64"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s19"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s84"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s44"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s36"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s40"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s25"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s83"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s26"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s92"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s37"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s91"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s46"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s99"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s41"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s85"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s86"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s56"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s72"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s94"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s70"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s35"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s68"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s53"
2019/12/06 06:01:02 [DEBUG] manager: Rebalanced 100 servers, next active server is s79 (Addr: /) (DC: )
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s79"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s70"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s13"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s84"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s72"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s11"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s44"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s31"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s91"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s27"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s66"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s95"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s81"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s35"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s33"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s75"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s17"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s76"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s23"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s83"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s93"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s60"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s19"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s22"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s39"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s43"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s88"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s06"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s61"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s86"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s21"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s45"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s80"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s98"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s62"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s18"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s08"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s16"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s48"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s30"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s68"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s57"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s29"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s56"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s74"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s51"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s53"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s67"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s46"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s55"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s28"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s20"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s85"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s49"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s34"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s58"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s82"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s01"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s71"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s12"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s77"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s07"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s97"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s63"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s41"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s36"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s04"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s15"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s24"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s10"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s73"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s32"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s02"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s52"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s99"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s94"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s47"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s14"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s92"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s03"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s90"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s69"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s65"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s87"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s96"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s64"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s37"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s26"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s09"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s59"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s54"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s78"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s89"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s38"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s25"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s00"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s50"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s40"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s05"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s42"
2019/12/06 06:01:02 [DEBUG] manager: Rebalanced 100 servers, next active server is s23 (Addr: /) (DC: )
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s23"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s29"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s31"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s21"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s99"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s98"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s90"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s62"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s88"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s22"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s51"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s95"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s59"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s89"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s61"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s36"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s57"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s79"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s78"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s47"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s83"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s30"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s13"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s35"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s16"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s38"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s81"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s17"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s82"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s65"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s87"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s52"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s08"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s45"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s18"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s77"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s25"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s73"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s44"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s09"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s55"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s12"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s58"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s91"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s41"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s19"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s42"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s64"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s34"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s71"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s15"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s94"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s10"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s06"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s26"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s54"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s96"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s00"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s48"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s43"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s68"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s05"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s72"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s92"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s39"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s93"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s86"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s03"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s07"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s33"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s97"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s27"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s02"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s40"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s32"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s70"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s66"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s74"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s84"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s46"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s85"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s67"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s28"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s37"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s80"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s01"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s63"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s11"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s75"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s04"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s76"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s60"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s49"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s53"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s56"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s69"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s50"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s14"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s20"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s24"
2019/12/06 06:01:02 [DEBUG] manager: Rebalanced 100 servers, next active server is s84 (Addr: /) (DC: )
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s84"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s95"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s89"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s57"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s85"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s83"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s86"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s93"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s91"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s31"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s14"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s62"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s28"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s59"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s88"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s46"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s41"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s01"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s44"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s66"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s68"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s42"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s34"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s23"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s71"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s81"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s25"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s18"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s74"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s56"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s02"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s92"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s64"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s10"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s54"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s13"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s63"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s08"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s52"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s00"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s60"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s37"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s76"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s29"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s72"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s51"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s22"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s04"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s99"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s87"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s15"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s12"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s80"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s09"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s61"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s47"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s94"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s96"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s26"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s55"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s06"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s50"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s07"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s79"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s69"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s30"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s27"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s97"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s39"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s05"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s75"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s20"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s43"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s35"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s53"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s32"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s45"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s48"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s77"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s33"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s82"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s90"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s73"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s03"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s70"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s17"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s24"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s78"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s58"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s67"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s38"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s11"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s16"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s65"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s98"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s49"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s40"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s21"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s36"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s19"
2019/12/06 06:01:02 [DEBUG] manager: Rebalanced 100 servers, next active server is s05 (Addr: /) (DC: )
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s05"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s12"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s31"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s08"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s59"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s34"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s73"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s99"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s60"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s29"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s35"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s78"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s57"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s36"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s43"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s91"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s27"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s83"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s95"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s13"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s87"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s45"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s68"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s25"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s81"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s49"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s02"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s33"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s28"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s74"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s38"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s16"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s93"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s14"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s23"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s66"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s92"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s82"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s01"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s63"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s90"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s55"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s84"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s30"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s51"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s26"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s64"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s88"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s65"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s52"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s09"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s89"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s67"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s15"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s21"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s07"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s50"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s62"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s39"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s58"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s61"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s72"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s53"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s17"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s37"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s47"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s46"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s00"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s71"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s44"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s54"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s76"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s98"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s85"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s11"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s48"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s79"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s70"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s03"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s96"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s69"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s24"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s86"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s56"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s04"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s94"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s75"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s20"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s19"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s42"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s10"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s06"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s22"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s77"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s80"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s40"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s32"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s18"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s41"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s97"
2019/12/06 06:01:02 [DEBUG] manager: pinging server "s34 (Addr: /) (DC: )" failed: %!s(<nil>)
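
The "%!s(<nil>)" suffix on the ping-failure line above is how Go's fmt package renders a nil value passed to the %s verb, so the test presumably logged a nil error here rather than a real failure message. A minimal demonstration of that fmt behaviour follows; the server name is just an example value, not taken from Consul's code.

    // Shows where the "%!s(<nil>)" text in the log comes from: formatting a nil
    // error interface with %s.
    package main

    import "fmt"

    func main() {
            var err error // a nil error interface value
            fmt.Printf("[DEBUG] manager: pinging server %q failed: %s\n", "s34", err)
            // Prints: [DEBUG] manager: pinging server "s34" failed: %!s(<nil>)
    }
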
2019/12/06 06:01:02 [DEBUG] manager: Rebalanced 100 servers, next active server is s04 (Addr: /) (DC: )
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s04"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s06"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s59"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s72"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s46"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s26"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s45"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s38"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s41"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s62"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s03"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s79"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s77"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s57"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s47"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s37"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s10"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s36"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s32"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s49"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s28"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s63"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s85"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s48"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s61"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s02"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s20"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s23"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s94"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s56"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s80"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s66"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s83"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s05"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s68"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s09"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s53"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s44"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s89"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s95"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s96"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s42"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s13"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s29"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s64"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s18"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s52"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s86"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s22"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s78"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s75"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s51"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s27"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s82"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s73"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s35"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s39"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s50"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s40"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s88"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s60"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s87"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s07"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s69"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s91"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s08"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s43"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s01"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s70"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s17"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s98"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s58"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s31"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s67"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s97"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s16"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s14"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s30"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s76"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s11"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s00"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s12"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s24"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s84"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s90"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s33"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s99"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s92"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s19"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s55"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s71"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s93"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s54"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s15"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s21"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s74"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s25"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s81"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s65"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s34"
2019/12/06 06:01:02 [DEBUG] manager: pinging server "s94 (Addr: /) (DC: )" failed: %!s(<nil>)
2019/12/06 06:01:02 [DEBUG] manager: pinging server "s81 (Addr: /) (DC: )" failed: %!s(<nil>)
2019/12/06 06:01:02 [DEBUG] manager: Rebalanced 100 servers, next active server is s00 (Addr: /) (DC: )
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s00"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s51"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s60"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s67"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s98"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s08"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s66"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s73"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s75"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s70"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s07"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s92"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s82"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s85"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s37"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s25"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s06"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s58"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s93"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s87"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s10"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s12"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s47"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s48"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s97"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s33"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s95"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s99"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s36"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s03"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s54"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s43"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s56"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s46"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s62"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s65"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s86"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s22"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s76"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s77"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s88"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s14"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s02"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s72"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s52"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s71"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s23"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s57"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s96"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s44"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s55"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s16"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s21"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s74"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s13"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s91"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s39"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s89"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s35"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s41"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s24"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s45"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s79"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s30"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s38"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s63"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s18"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s69"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s27"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s68"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s59"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s42"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s61"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s05"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s31"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s64"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s15"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s90"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s17"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s80"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s83"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s26"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s09"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s84"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s34"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s40"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s01"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s20"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s29"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s19"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s04"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s78"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s11"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s49"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s53"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s50"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s28"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s32"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s94"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s81"
2019/12/06 06:01:02 [DEBUG] manager: pinging server "s22 (Addr: /) (DC: )" failed: %!s(<nil>)
2019/12/06 06:01:02 [DEBUG] manager: pinging server "s55 (Addr: /) (DC: )" failed: %!s(<nil>)
2019/12/06 06:01:02 [DEBUG] manager: Rebalanced 100 servers, next active server is s46 (Addr: /) (DC: )
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s46"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s68"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s77"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s30"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s10"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s70"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s99"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s76"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s83"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s78"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s23"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s12"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s37"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s71"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s29"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s98"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s45"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s34"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s88"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s16"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s41"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s33"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s95"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s94"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s02"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s11"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s72"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s32"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s01"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s17"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s39"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s66"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s74"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s96"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s79"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s06"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s27"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s28"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s21"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s07"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s08"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s00"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s50"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s64"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s18"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s19"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s93"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s52"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s47"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s90"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s59"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s75"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s40"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s25"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s15"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s48"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s82"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s56"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s81"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s91"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s60"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s31"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s85"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s49"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s09"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s26"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s03"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s54"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s20"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s44"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s62"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s13"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s69"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s53"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s14"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s87"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s38"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s65"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s86"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s84"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s73"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s05"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s67"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s24"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s04"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s89"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s61"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s58"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s92"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s43"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s35"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s42"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s97"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s57"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s80"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s36"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s63"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s51"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s22"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s55"
2019/12/06 06:01:02 [DEBUG] manager: pinging server "s81 (Addr: /) (DC: )" failed: %!s(<nil>)
2019/12/06 06:01:02 [DEBUG] manager: Rebalanced 100 servers, next active server is s94 (Addr: /) (DC: )
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s94"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s05"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s17"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s64"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s28"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s77"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s03"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s38"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s95"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s06"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s99"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s42"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s19"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s62"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s34"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s60"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s53"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s24"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s55"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s73"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s76"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s75"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s47"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s40"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s71"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s65"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s13"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s41"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s92"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s79"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s10"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s87"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s67"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s61"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s80"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s15"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s68"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s86"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s45"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s35"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s91"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s21"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s08"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s69"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s31"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s30"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s23"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s18"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s83"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s66"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s78"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s72"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s11"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s36"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s46"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s59"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s56"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s88"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s84"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s02"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s57"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s09"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s89"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s32"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s48"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s39"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s04"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s20"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s16"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s54"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s27"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s33"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s37"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s07"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s43"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s25"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s74"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s22"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s51"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s98"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s85"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s00"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s49"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s70"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s14"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s26"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s63"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s50"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s93"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s82"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s12"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s90"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s58"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s97"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s96"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s29"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s52"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s44"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s01"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s81"
2019/12/06 06:01:02 [DEBUG] manager: pinging server "s71 (Addr: /) (DC: )" failed: %!s(<nil>)
2019/12/06 06:01:02 [DEBUG] manager: pinging server "s15 (Addr: /) (DC: )" failed: %!s(<nil>)
2019/12/06 06:01:02 [DEBUG] manager: pinging server "s46 (Addr: /) (DC: )" failed: %!s(<nil>)
2019/12/06 06:01:02 [DEBUG] manager: pinging server "s82 (Addr: /) (DC: )" failed: %!s(<nil>)
2019/12/06 06:01:02 [DEBUG] manager: pinging server "s20 (Addr: /) (DC: )" failed: %!s(<nil>)
2019/12/06 06:01:02 [DEBUG] manager: Rebalanced 100 servers, next active server is s26 (Addr: /) (DC: )
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s26"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s45"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s49"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s85"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s07"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s73"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s31"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s43"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s17"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s51"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s09"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s59"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s27"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s86"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s89"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s04"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s61"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s90"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s58"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s93"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s84"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s30"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s80"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s54"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s44"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s52"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s41"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s38"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s35"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s23"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s16"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s62"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s21"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s11"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s56"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s78"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s67"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s60"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s37"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s28"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s79"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s00"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s69"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s72"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s92"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s83"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s53"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s22"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s95"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s50"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s55"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s39"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s91"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s63"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s40"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s24"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s03"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s65"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s64"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s08"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s47"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s97"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s94"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s13"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s05"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s25"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s33"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s01"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s74"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s48"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s14"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s98"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s66"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s10"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s87"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s99"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s29"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s32"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s96"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s12"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s57"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s06"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s34"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s18"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s70"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s88"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s36"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s02"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s19"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s75"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s77"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s42"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s68"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s81"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s76"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s71"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s15"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s46"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s82"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s20"
2019/12/06 06:01:02 [DEBUG] manager: Rebalanced 100 servers, next active server is s21 (Addr: /) (DC: )
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s21"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s88"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s89"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s24"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s58"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s79"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s59"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s49"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s60"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s06"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s25"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s98"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s19"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s29"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s55"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s54"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s95"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s64"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s05"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s12"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s70"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s39"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s09"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s84"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s62"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s18"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s96"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s42"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s45"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s86"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s14"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s51"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s37"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s68"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s92"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s26"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s61"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s75"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s71"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s20"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s85"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s38"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s47"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s11"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s23"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s43"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s36"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s40"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s16"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s22"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s56"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s57"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s94"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s50"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s03"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s63"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s93"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s10"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s15"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s28"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s66"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s17"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s83"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s52"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s13"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s02"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s69"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s65"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s07"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s67"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s00"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s81"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s35"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s48"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s78"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s80"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s33"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s97"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s46"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s01"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s30"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s74"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s77"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s91"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s99"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s72"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s34"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s08"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s31"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s41"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s44"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s76"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s27"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s82"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s87"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s73"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s04"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s53"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s32"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s90"
2019/12/06 06:01:02 [DEBUG] manager: pinging server "s81 (Addr: /) (DC: )" failed: %!s(<nil>)
2019/12/06 06:01:02 [DEBUG] manager: Rebalanced 100 servers, next active server is s02 (Addr: /) (DC: )
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s02"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s67"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s26"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s54"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s57"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s17"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s36"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s91"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s39"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s69"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s41"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s15"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s24"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s53"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s83"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s29"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s84"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s06"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s09"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s93"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s92"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s23"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s66"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s14"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s62"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s75"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s44"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s60"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s61"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s35"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s13"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s12"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s31"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s71"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s42"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s08"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s49"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s52"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s88"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s72"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s64"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s21"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s37"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s05"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s04"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s03"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s68"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s16"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s07"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s87"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s51"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s30"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s80"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s01"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s28"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s00"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s34"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s85"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s18"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s98"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s50"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s47"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s90"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s43"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s46"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s86"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s48"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s65"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s38"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s89"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s33"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s32"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s59"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s78"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s63"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s22"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s55"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s76"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s95"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s25"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s77"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s19"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s27"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s94"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s73"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s58"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s79"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s40"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s10"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s56"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s96"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s45"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s70"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s82"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s97"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s11"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s74"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s20"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s99"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s81"
2019/12/06 06:01:02 [DEBUG] manager: Rebalanced 100 servers, next active server is s25 (Addr: /) (DC: )
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s25"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s17"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s75"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s78"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s56"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s08"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s98"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s86"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s04"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s01"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s70"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s68"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s50"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s64"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s16"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s89"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s10"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s91"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s79"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s90"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s35"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s85"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s07"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s19"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s59"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s38"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s21"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s52"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s40"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s11"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s81"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s46"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s09"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s49"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s77"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s66"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s34"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s41"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s30"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s05"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s37"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s02"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s87"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s13"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s28"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s12"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s99"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s29"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s51"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s73"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s88"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s15"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s18"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s96"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s62"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s06"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s22"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s53"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s20"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s14"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s00"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s94"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s48"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s80"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s58"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s76"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s63"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s43"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s67"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s54"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s69"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s03"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s61"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s60"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s65"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s92"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s57"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s42"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s93"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s31"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s39"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s44"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s26"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s45"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s32"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s71"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s72"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s84"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s74"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s95"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s24"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s33"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s27"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s23"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s36"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s83"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s97"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s82"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s47"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s55"
2019/12/06 06:01:02 [DEBUG] manager: pinging server "s85 (Addr: /) (DC: )" failed: %!s(<nil>)
2019/12/06 06:01:02 [DEBUG] manager: Rebalanced 100 servers, next active server is s02 (Addr: /) (DC: )
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s02"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s97"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s53"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s96"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s52"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s49"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s83"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s06"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s25"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s40"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s59"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s55"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s94"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s58"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s74"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s47"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s80"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s71"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s69"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s87"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s04"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s62"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s48"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s81"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s22"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s30"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s89"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s05"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s41"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s54"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s99"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s35"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s82"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s10"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s01"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s13"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s77"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s09"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s65"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s68"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s70"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s73"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s15"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s50"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s93"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s98"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s29"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s91"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s34"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s44"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s20"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s76"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s38"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s57"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s03"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s72"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s07"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s12"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s86"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s63"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s42"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s31"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s75"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s45"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s08"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s37"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s33"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s95"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s19"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s11"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s28"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s64"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s60"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s67"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s84"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s78"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s23"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s51"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s66"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s27"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s61"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s39"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s36"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s00"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s16"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s14"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s21"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s46"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s56"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s32"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s24"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s17"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s92"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s79"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s18"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s88"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s90"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s26"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s43"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s85"
2019/12/06 06:01:02 [DEBUG] manager: pinging server "s02 (Addr: /) (DC: )" failed: %!s(<nil>)
2019/12/06 06:01:02 [DEBUG] manager: Rebalanced 100 servers, next active server is s11 (Addr: /) (DC: )
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s11"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s97"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s12"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s85"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s29"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s04"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s40"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s17"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s51"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s67"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s60"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s30"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s06"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s81"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s44"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s14"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s53"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s79"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s71"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s99"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s26"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s27"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s36"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s25"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s22"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s58"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s87"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s80"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s39"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s49"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s19"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s93"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s78"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s74"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s03"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s08"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s35"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s98"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s16"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s64"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s89"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s96"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s69"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s75"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s00"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s63"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s56"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s52"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s92"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s83"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s77"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s95"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s61"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s28"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s47"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s50"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s66"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s18"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s33"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s01"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s41"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s48"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s37"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s86"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s91"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s34"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s55"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s31"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s42"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s23"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s13"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s38"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s76"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s57"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s24"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s54"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s09"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s59"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s82"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s72"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s90"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s88"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s45"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s20"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s15"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s65"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s46"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s05"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s68"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s32"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s94"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s21"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s10"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s73"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s62"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s70"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s43"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s07"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s84"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s02"
2019/12/06 06:01:02 [DEBUG] manager: Rebalanced 100 servers, next active server is s27 (Addr: /) (DC: )
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s27"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s24"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s19"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s03"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s09"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s16"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s68"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s41"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s96"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s99"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s29"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s93"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s90"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s53"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s45"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s37"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s55"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s39"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s88"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s44"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s08"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s86"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s05"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s98"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s83"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s73"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s36"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s56"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s62"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s35"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s25"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s75"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s82"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s21"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s77"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s63"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s20"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s94"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s65"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s07"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s38"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s97"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s04"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s58"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s74"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s52"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s95"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s84"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s70"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s57"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s60"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s43"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s06"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s49"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s64"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s87"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s76"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s61"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s01"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s31"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s79"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s54"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s40"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s67"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s23"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s30"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s59"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s10"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s15"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s13"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s47"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s28"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s85"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s80"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s51"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s81"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s69"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s34"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s89"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s92"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s17"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s22"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s71"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s18"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s11"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s32"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s78"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s14"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s02"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s12"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s00"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s46"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s26"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s72"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s42"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s48"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s50"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s66"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s33"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s91"
2019/12/06 06:01:02 [DEBUG] manager: pinging server "s75 (Addr: /) (DC: )" failed: %!s(<nil>)
2019/12/06 06:01:02 [DEBUG] manager: Rebalanced 100 servers, next active server is s78 (Addr: /) (DC: )
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s78"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s79"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s48"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s50"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s02"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s91"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s62"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s64"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s34"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s56"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s47"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s59"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s44"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s96"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s87"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s38"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s92"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s67"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s39"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s66"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s61"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s01"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s53"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s52"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s97"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s30"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s06"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s99"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s41"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s43"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s60"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s45"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s54"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s00"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s17"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s81"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s51"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s09"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s26"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s13"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s93"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s98"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s68"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s16"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s20"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s31"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s23"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s94"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s71"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s21"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s76"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s37"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s07"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s49"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s84"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s10"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s74"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s28"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s95"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s27"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s80"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s82"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s05"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s90"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s89"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s55"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s69"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s08"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s83"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s58"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s88"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s22"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s70"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s36"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s72"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s11"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s86"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s46"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s14"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s32"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s57"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s40"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s33"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s73"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s12"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s18"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s04"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s15"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s19"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s03"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s35"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s65"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s29"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s24"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s25"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s77"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s85"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s63"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s42"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s75"
2019/12/06 06:01:02 [DEBUG] manager: Rebalanced 100 servers, next active server is s12 (Addr: /) (DC: )
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s12"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s63"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s07"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s75"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s10"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s86"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s77"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s51"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s41"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s80"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s61"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s38"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s84"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s81"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s89"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s59"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s46"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s65"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s47"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s72"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s48"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s53"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s19"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s60"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s11"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s14"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s31"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s21"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s91"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s06"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s92"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s93"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s44"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s30"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s20"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s39"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s15"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s52"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s78"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s02"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s26"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s23"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s87"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s28"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s34"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s40"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s01"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s70"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s13"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s43"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s50"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s33"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s96"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s37"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s83"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s82"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s16"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s74"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s00"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s69"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s05"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s22"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s85"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s45"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s71"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s03"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s76"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s99"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s17"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s97"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s42"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s09"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s57"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s67"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s68"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s36"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s73"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s62"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s18"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s35"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s56"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s66"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s32"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s49"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s27"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s98"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s08"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s04"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s55"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s94"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s64"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s90"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s25"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s58"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s54"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s88"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s29"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s95"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s79"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s24"
2019/12/06 06:01:02 [DEBUG] manager: Rebalanced 100 servers, next active server is s76 (Addr: /) (DC: )
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s76"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s52"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s88"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s26"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s98"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s74"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s44"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s70"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s55"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s65"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s38"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s02"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s61"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s72"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s92"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s03"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s83"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s63"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s69"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s94"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s46"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s53"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s64"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s43"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s95"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s97"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s07"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s93"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s42"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s14"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s59"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s13"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s48"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s11"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s91"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s23"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s99"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s31"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s54"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s28"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s67"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s89"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s40"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s04"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s08"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s21"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s62"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s06"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s20"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s34"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s90"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s35"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s39"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s51"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s96"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s78"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s36"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s30"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s73"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s19"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s75"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s85"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s45"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s58"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s87"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s50"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s24"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s15"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s82"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s37"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s01"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s86"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s79"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s16"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s10"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s41"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s49"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s00"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s68"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s71"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s60"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s66"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s05"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s32"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s17"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s56"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s84"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s77"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s09"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s47"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s25"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s27"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s81"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s33"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s18"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s80"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s29"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s12"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s22"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s57"
2019/12/06 06:01:02 [DEBUG] manager: pinging server "s45 (Addr: /) (DC: )" failed: %!s(<nil>)
2019/12/06 06:01:02 [DEBUG] manager: pinging server "s57 (Addr: /) (DC: )" failed: %!s(<nil>)
2019/12/06 06:01:02 [DEBUG] manager: pinging server "s98 (Addr: /) (DC: )" failed: %!s(<nil>)
2019/12/06 06:01:02 [DEBUG] manager: pinging server "s83 (Addr: /) (DC: )" failed: %!s(<nil>)
2019/12/06 06:01:02 [DEBUG] manager: Rebalanced 100 servers, next active server is s04 (Addr: /) (DC: )
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s04"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s32"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s59"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s69"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s07"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s53"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s09"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s91"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s94"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s11"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s35"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s00"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s71"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s58"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s84"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s78"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s24"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s61"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s90"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s80"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s10"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s26"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s74"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s12"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s96"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s72"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s67"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s41"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s16"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s08"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s22"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s36"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s86"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s39"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s48"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s29"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s23"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s64"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s40"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s85"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s25"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s03"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s34"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s15"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s82"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s44"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s76"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s05"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s47"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s31"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s81"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s54"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s52"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s65"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s13"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s99"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s56"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s19"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s77"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s73"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s30"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s46"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s17"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s38"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s97"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s75"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s14"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s95"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s01"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s68"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s89"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s70"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s21"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s63"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s93"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s43"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s87"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s06"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s60"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s92"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s51"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s49"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s88"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s18"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s62"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s02"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s42"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s27"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s50"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s33"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s79"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s55"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s37"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s66"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s28"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s20"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s45"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s57"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s98"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s83"
2019/12/06 06:01:02 [DEBUG] manager: pinging server "s10 (Addr: /) (DC: )" failed: %!s(<nil>)
2019/12/06 06:01:02 [DEBUG] manager: pinging server "s86 (Addr: /) (DC: )" failed: %!s(<nil>)
2019/12/06 06:01:02 [DEBUG] manager: pinging server "s01 (Addr: /) (DC: )" failed: %!s(<nil>)
2019/12/06 06:01:02 [DEBUG] manager: pinging server "s23 (Addr: /) (DC: )" failed: %!s(<nil>)
2019/12/06 06:01:02 [DEBUG] manager: pinging server "s11 (Addr: /) (DC: )" failed: %!s(<nil>)
2019/12/06 06:01:02 [DEBUG] manager: Rebalanced 100 servers, next active server is s79 (Addr: /) (DC: )
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s79"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s80"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s54"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s64"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s83"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s98"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s69"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s18"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s06"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s29"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s26"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s21"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s58"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s24"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s07"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s61"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s08"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s16"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s52"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s34"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s19"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s84"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s09"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s56"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s32"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s87"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s85"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s39"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s78"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s57"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s62"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s65"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s99"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s30"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s89"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s28"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s44"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s71"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s04"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s51"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s93"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s05"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s48"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s15"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s95"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s35"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s37"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s63"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s74"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s59"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s00"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s88"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s45"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s50"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s49"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s94"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s77"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s31"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s68"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s20"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s72"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s97"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s36"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s42"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s47"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s33"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s27"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s60"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s66"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s38"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s91"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s17"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s73"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s40"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s46"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s75"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s82"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s67"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s96"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s02"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s55"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s70"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s53"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s41"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s12"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s81"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s92"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s25"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s22"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s03"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s13"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s76"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s43"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s90"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s14"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s10"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s86"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s01"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s23"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s11"
2019/12/06 06:01:02 [DEBUG] manager: Rebalanced 100 servers, next active server is s51 (Addr: /) (DC: )
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s51"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s60"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s63"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s94"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s05"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s11"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s03"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s95"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s21"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s83"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s71"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s39"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s40"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s65"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s73"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s43"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s98"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s53"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s25"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s23"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s08"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s52"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s89"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s04"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s44"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s78"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s99"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s67"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s86"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s92"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s96"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s24"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s29"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s61"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s01"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s93"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s49"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s09"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s85"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s81"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s45"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s84"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s57"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s75"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s37"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s50"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s59"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s82"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s02"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s46"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s18"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s79"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s74"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s90"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s42"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s14"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s56"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s35"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s58"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s47"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s27"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s17"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s66"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s13"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s07"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s68"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s28"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s36"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s31"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s12"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s00"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s80"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s34"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s33"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s22"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s10"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s87"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s30"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s91"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s41"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s20"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s77"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s48"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s76"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s06"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s16"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s38"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s26"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s15"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s97"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s64"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s55"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s69"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s70"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s62"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s72"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s88"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s32"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s54"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s19"
2019/12/06 06:01:02 [DEBUG] manager: Rebalanced 100 servers, next active server is s27 (Addr: /) (DC: )
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s27"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s17"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s71"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s83"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s56"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s10"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s33"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s51"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s97"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s26"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s48"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s37"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s09"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s85"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s04"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s32"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s13"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s61"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s20"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s24"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s15"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s84"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s42"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s66"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s89"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s02"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s78"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s70"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s53"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s79"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s62"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s14"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s25"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s74"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s73"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s60"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s63"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s21"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s03"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s64"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s98"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s47"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s45"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s95"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s05"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s35"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s28"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s99"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s36"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s54"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s34"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s49"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s19"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s68"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s38"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s67"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s43"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s69"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s52"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s81"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s90"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s92"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s41"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s57"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s11"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s82"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s59"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s29"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s31"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s18"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s93"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s58"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s65"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s75"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s94"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s40"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s07"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s91"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s23"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s50"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s01"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s55"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s16"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s87"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s39"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s80"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s44"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s30"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s22"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s86"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s88"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s00"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s76"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s06"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s72"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s08"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s77"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s12"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s96"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s46"
2019/12/06 06:01:02 [DEBUG] manager: pinging server "s70 (Addr: /) (DC: )" failed: %!s(<nil>)
2019/12/06 06:01:02 [DEBUG] manager: pinging server "s11 (Addr: /) (DC: )" failed: %!s(<nil>)
2019/12/06 06:01:02 [DEBUG] manager: Rebalanced 100 servers, next active server is s01 (Addr: /) (DC: )
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s01"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s39"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s07"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s90"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s47"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s86"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s60"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s15"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s33"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s98"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s59"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s27"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s66"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s75"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s97"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s09"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s74"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s05"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s25"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s19"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s78"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s24"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s22"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s14"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s80"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s57"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s49"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s00"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s89"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s17"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s83"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s51"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s79"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s18"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s81"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s84"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s03"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s87"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s04"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s10"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s64"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s71"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s08"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s56"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s40"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s53"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s31"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s72"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s65"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s94"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s88"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s48"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s93"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s99"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s52"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s29"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s23"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s30"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s82"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s28"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s43"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s67"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s55"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s68"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s36"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s73"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s92"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s06"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s85"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s58"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s96"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s37"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s95"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s76"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s63"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s35"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s77"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s32"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s69"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s54"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s46"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s21"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s16"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s42"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s50"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s34"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s62"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s41"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s26"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s91"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s61"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s20"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s12"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s38"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s45"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s44"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s13"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s02"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s70"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s11"
2019/12/06 06:01:02 [DEBUG] manager: pinging server "s04 (Addr: /) (DC: )" failed: %!s(<nil>)
2019/12/06 06:01:02 [DEBUG] manager: pinging server "s34 (Addr: /) (DC: )" failed: %!s(<nil>)
2019/12/06 06:01:02 [DEBUG] manager: Rebalanced 100 servers, next active server is s44 (Addr: /) (DC: )
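The "%!s(<nil>)" suffix on the "pinging server ... failed" lines above is not an additional failure of its own: it is how Go's fmt package renders a nil value passed to the %s verb, i.e. the ping result carried a nil error but was still pushed through an error-formatting directive. A stand-alone illustration (hypothetical program, not consul code):

// Shows where the "%!s(<nil>)" text in the ping-failure lines comes from:
// formatting a nil error with the %s verb.
package main

import "fmt"

func main() {
	var err error // nil: the reported ping "failure" carries no error value
	fmt.Printf("[DEBUG] manager: pinging server %q failed: %s\n", "s04", err)
	// Output: [DEBUG] manager: pinging server "s04" failed: %!s(<nil>)
}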
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s44"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s78"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s63"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s07"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s48"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s23"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s98"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s94"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s19"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s29"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s43"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s46"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s01"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s54"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s28"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s51"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s06"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s18"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s05"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s45"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s25"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s50"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s52"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s56"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s58"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s86"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s87"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s97"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s38"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s27"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s73"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s91"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s92"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s57"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s31"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s20"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s02"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s99"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s53"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s55"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s76"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s13"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s41"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s11"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s35"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s14"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s90"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s85"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s00"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s75"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s12"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s71"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s15"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s66"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s61"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s36"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s16"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s17"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s60"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s89"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s30"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s47"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s82"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s74"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s59"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s39"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s68"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s80"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s62"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s69"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s67"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s81"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s32"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s65"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s64"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s42"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s93"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s95"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s77"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s26"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s08"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s70"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s33"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s96"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s84"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s37"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s72"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s24"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s88"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s03"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s21"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s09"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s83"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s79"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s22"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s49"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s40"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s10"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s04"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s34"
2019/12/06 06:01:02 [DEBUG] manager: Rebalanced 100 servers, next active server is s12 (Addr: /) (DC: )
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s12"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s62"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s85"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s35"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s88"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s81"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s59"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s23"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s78"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s17"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s66"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s56"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s90"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s73"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s69"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s22"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s84"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s38"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s99"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s70"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s71"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s34"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s96"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s18"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s47"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s89"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s46"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s10"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s03"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s86"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s51"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s42"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s77"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s09"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s92"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s08"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s72"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s64"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s48"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s61"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s41"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s27"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s60"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s16"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s06"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s24"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s26"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s65"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s33"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s58"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s05"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s44"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s76"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s07"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s25"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s45"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s95"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s80"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s37"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s19"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s29"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s28"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s67"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s11"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s98"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s87"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s97"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s94"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s53"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s79"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s31"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s39"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s55"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s82"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s93"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s15"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s50"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s00"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s49"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s63"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s14"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s91"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s01"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s20"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s04"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s32"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s57"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s75"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s52"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s54"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s21"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s13"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s30"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s36"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s74"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s43"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s83"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s02"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s40"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s68"
2019/12/06 06:01:02 [DEBUG] manager: Rebalanced 100 servers, next active server is s88 (Addr: /) (DC: )
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s88"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s18"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s32"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s55"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s29"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s05"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s38"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s25"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s56"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s77"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s34"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s26"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s53"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s49"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s67"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s99"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s52"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s02"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s62"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s39"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s44"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s10"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s20"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s76"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s07"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s64"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s14"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s09"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s89"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s79"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s69"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s03"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s51"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s40"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s68"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s60"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s06"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s92"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s11"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s59"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s65"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s66"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s54"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s41"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s87"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s96"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s50"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s13"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s33"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s78"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s48"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s17"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s61"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s04"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s36"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s31"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s57"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s73"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s23"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s95"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s72"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s98"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s30"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s82"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s71"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s00"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s94"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s37"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s80"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s28"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s91"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s58"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s24"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s81"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s75"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s12"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s21"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s45"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s01"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s86"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s84"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s16"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s83"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s97"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s63"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s35"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s43"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s93"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s22"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s42"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s70"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s47"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s08"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s15"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s27"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s46"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s90"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s74"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s85"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s19"
2019/12/06 06:01:02 [DEBUG] manager: pinging server "s58 (Addr: /) (DC: )" failed: %!s(<nil>)
2019/12/06 06:01:02 [DEBUG] manager: Rebalanced 100 servers, next active server is s84 (Addr: /) (DC: )
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s84"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s45"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s92"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s70"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s09"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s26"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s20"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s18"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s68"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s82"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s77"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s11"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s80"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s57"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s35"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s93"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s76"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s53"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s21"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s81"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s83"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s00"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s31"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s23"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s12"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s05"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s46"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s08"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s47"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s36"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s69"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s27"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s51"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s74"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s19"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s98"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s61"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s66"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s87"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s78"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s97"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s06"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s34"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s94"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s85"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s89"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s17"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s25"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s71"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s52"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s99"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s16"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s49"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s56"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s48"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s59"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s91"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s90"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s86"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s41"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s79"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s22"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s62"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s14"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s38"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s55"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s50"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s15"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s30"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s28"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s72"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s02"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s40"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s67"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s60"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s65"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s96"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s54"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s73"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s04"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s64"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s63"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s01"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s39"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s29"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s88"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s42"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s95"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s43"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s24"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s03"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s07"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s32"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s44"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s33"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s37"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s75"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s10"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s13"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s58"
2019/12/06 06:01:02 [DEBUG] manager: pinging server "s83 (Addr: /) (DC: )" failed: %!s(<nil>)
2019/12/06 06:01:02 [DEBUG] manager: Rebalanced 100 servers, next active server is s76 (Addr: /) (DC: )
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s76"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s46"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s75"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s72"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s33"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s27"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s80"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s93"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s94"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s92"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s86"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s74"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s64"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s20"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s42"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s63"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s21"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s52"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s48"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s43"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s45"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s47"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s66"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s53"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s23"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s78"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s17"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s50"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s57"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s61"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s01"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s58"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s44"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s84"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s18"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s60"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s56"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s28"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s73"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s98"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s88"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s12"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s96"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s40"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s24"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s03"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s87"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s36"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s90"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s77"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s97"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s15"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s59"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s51"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s38"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s82"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s14"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s65"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s39"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s89"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s16"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s11"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s34"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s35"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s29"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s32"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s22"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s68"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s41"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s49"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s02"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s10"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s19"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s55"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s71"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s08"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s91"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s79"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s09"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s06"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s81"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s62"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s05"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s67"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s26"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s25"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s00"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s69"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s95"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s54"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s85"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s13"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s07"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s30"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s37"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s31"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s99"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s70"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s04"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s83"
2019/12/06 06:01:02 [DEBUG] manager: pinging server "s45 (Addr: /) (DC: )" failed: %!s(<nil>)
2019/12/06 06:01:02 [DEBUG] manager: pinging server "s68 (Addr: /) (DC: )" failed: %!s(<nil>)
2019/12/06 06:01:02 [DEBUG] manager: Rebalanced 100 servers, next active server is s07 (Addr: /) (DC: )
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s07"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s30"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s35"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s72"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s85"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s33"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s13"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s09"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s10"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s84"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s71"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s63"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s77"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s94"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s22"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s12"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s52"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s76"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s70"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s25"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s61"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s97"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s98"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s69"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s42"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s49"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s66"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s24"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s36"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s28"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s06"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s82"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s65"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s56"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s54"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s55"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s57"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s91"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s26"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s01"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s83"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s04"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s17"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s34"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s27"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s59"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s81"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s50"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s53"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s67"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s75"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s93"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s18"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s15"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s60"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s99"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s47"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s02"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s37"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s74"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s40"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s46"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s90"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s14"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s05"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s11"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s32"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s86"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s88"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s19"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s62"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s43"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s39"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s92"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s21"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s64"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s31"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s58"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s79"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s87"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s03"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s29"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s95"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s48"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s44"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s23"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s41"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s00"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s73"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s16"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s80"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s38"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s20"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s08"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s96"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s78"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s51"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s89"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s45"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s68"
2019/12/06 06:01:02 [DEBUG] manager: Rebalanced 100 servers, next active server is s10 (Addr: /) (DC: )
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s10"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s94"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s50"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s86"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s37"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s00"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s87"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s16"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s20"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s47"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s18"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s43"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s92"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s11"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s23"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s91"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s51"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s78"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s60"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s36"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s03"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s69"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s57"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s93"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s90"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s27"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s75"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s40"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s04"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s30"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s45"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s31"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s06"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s14"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s48"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s85"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s05"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s02"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s28"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s97"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s72"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s25"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s83"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s89"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s74"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s96"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s84"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s39"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s38"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s99"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s32"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s17"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s71"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s29"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s46"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s95"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s64"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s53"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s41"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s34"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s66"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s56"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s55"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s54"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s33"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s15"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s49"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s59"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s62"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s79"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s22"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s58"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s77"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s24"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s67"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s07"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s68"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s26"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s35"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s80"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s82"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s52"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s12"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s98"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s09"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s81"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s70"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s19"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s65"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s08"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s63"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s76"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s61"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s01"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s44"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s13"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s21"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s42"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s73"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s88"
2019/12/06 06:01:02 [DEBUG] manager: Rebalanced 100 servers, next active server is s87 (Addr: /) (DC: )
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s87"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s66"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s22"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s85"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s99"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s94"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s96"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s20"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s74"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s44"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s08"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s36"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s28"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s60"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s98"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s81"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s51"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s01"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s86"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s69"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s75"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s12"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s46"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s59"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s18"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s23"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s17"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s27"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s65"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s43"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s25"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s07"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s49"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s52"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s21"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s79"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s29"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s30"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s92"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s55"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s42"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s09"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s45"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s93"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s50"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s78"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s35"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s88"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s40"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s19"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s67"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s80"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s89"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s54"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s16"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s56"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s34"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s33"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s15"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s02"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s24"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s58"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s73"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s64"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s71"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s57"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s82"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s13"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s37"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s62"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s39"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s41"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s53"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s83"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s70"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s76"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s90"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s10"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s47"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s77"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s63"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s11"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s95"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s68"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s31"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s61"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s00"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s06"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s48"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s38"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s91"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s05"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s84"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s32"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s14"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s03"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s26"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s97"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s04"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s72"
2019/12/06 06:01:02 [DEBUG] manager: Rebalanced 100 servers, next active server is s71 (Addr: /) (DC: )
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s71"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s38"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s48"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s34"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s06"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s03"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s88"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s53"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s02"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s75"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s27"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s69"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s52"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s42"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s77"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s93"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s20"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s47"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s30"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s73"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s79"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s10"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s19"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s35"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s16"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s13"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s85"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s36"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s04"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s65"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s44"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s74"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s56"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s11"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s86"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s33"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s60"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s62"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s46"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s72"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s94"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s41"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s37"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s23"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s15"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s80"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s70"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s24"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s98"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s39"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s64"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s59"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s01"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s08"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s89"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s43"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s95"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s17"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s82"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s57"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s25"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s51"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s40"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s14"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s91"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s78"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s90"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s26"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s09"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s99"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s32"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s76"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s66"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s29"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s58"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s45"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s22"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s97"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s92"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s67"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s31"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s05"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s87"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s00"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s84"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s28"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s49"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s96"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s54"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s63"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s07"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s81"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s50"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s21"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s12"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s68"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s18"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s83"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s55"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s61"
2019/12/06 06:01:02 [DEBUG] manager: pinging server "s54 (Addr: /) (DC: )" failed: %!s(<nil>)
2019/12/06 06:01:02 [DEBUG] manager: Rebalanced 100 servers, next active server is s69 (Addr: /) (DC: )
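
The "failed: %!s(<nil>)" suffix in the pinging lines above is not log corruption: it is Go's fmt package rendering a nil interface operand passed to a %s verb, so these mocked pings appear to report a nil error value. A minimal, hypothetical reproduction (not the package's own test code):

    package main

    import "fmt"

    func main() {
    	var err error // nil interface value, as apparently logged by the test
    	// fmt renders a nil %s operand as "%!s(<nil>)", matching the
    	// "pinging server ... failed: %!s(<nil>)" lines in this log.
    	fmt.Printf("pinging server \"s54\" failed: %s\n", err)
    }

Running this prints: pinging server "s54" failed: %!s(<nil>)
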
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s69"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s71"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s90"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s27"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s64"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s41"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s68"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s98"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s29"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s02"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s16"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s00"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s73"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s40"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s67"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s56"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s75"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s13"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s07"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s65"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s37"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s87"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s94"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s80"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s20"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s19"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s34"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s83"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s39"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s12"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s42"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s97"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s57"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s89"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s84"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s60"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s45"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s04"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s44"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s59"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s70"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s47"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s58"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s74"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s43"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s35"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s62"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s48"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s22"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s31"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s30"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s18"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s52"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s46"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s51"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s36"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s09"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s53"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s55"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s32"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s99"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s91"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s63"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s06"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s17"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s49"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s08"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s95"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s93"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s01"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s81"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s85"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s21"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s88"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s25"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s24"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s96"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s38"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s28"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s15"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s14"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s05"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s92"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s86"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s23"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s66"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s82"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s77"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s11"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s50"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s78"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s72"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s61"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s10"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s79"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s03"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s33"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s76"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s26"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s54"
2019/12/06 06:01:02 [DEBUG] manager: Rebalanced 100 servers, next active server is s59 (Addr: /) (DC: )
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s59"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s21"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s76"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s94"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s32"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s42"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s70"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s86"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s19"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s23"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s47"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s88"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s82"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s98"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s78"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s24"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s05"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s69"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s52"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s79"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s89"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s61"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s66"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s04"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s81"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s77"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s90"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s51"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s41"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s68"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s00"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s18"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s56"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s87"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s27"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s45"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s16"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s20"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s97"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s48"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s72"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s36"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s14"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s15"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s29"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s83"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s09"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s93"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s35"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s34"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s17"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s91"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s44"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s50"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s38"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s54"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s25"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s55"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s37"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s71"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s12"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s84"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s06"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s40"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s58"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s73"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s10"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s39"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s53"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s75"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s02"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s31"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s11"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s43"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s64"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s49"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s01"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s62"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s74"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s57"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s46"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s03"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s60"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s13"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s08"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s63"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s22"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s99"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s30"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s96"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s67"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s65"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s07"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s92"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s28"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s85"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s26"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s33"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s80"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s95"
2019/12/06 06:01:02 [DEBUG] manager: pinging server "s39 (Addr: /) (DC: )" failed: %!s(<nil>)
2019/12/06 06:01:02 [DEBUG] manager: pinging server "s68 (Addr: /) (DC: )" failed: %!s(<nil>)
2019/12/06 06:01:02 [DEBUG] manager: Rebalanced 100 servers, next active server is s35 (Addr: /) (DC: )
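
The surrounding blocks repeat one pattern: the manager under test cycles through every entry in a 100-server list ("cycled away from server ..."), then reshuffles the list and reports a new head ("Rebalanced 100 servers, next active server is ..."); the empty Addr/DC fields suggest mock server entries with no real address. Below is a minimal sketch of that cycle-and-rebalance behaviour, assuming nothing about Consul's actual manager implementation; all names here are illustrative only.

    // Sketch only: a list that is rotated on "cycle" and shuffled on
    // "rebalance", mirroring the DEBUG messages in this log.
    package main

    import (
    	"fmt"
    	"math/rand"
    )

    type manager struct {
    	servers []string // hypothetical: just names like "s00".."s99"
    }

    // cycleServer logs the current head and rotates it to the back,
    // mirroring the "cycled away from server ..." messages.
    func (m *manager) cycleServer() {
    	if len(m.servers) < 2 {
    		return
    	}
    	fmt.Printf("cycled away from server %q\n", m.servers[0])
    	m.servers = append(m.servers[1:], m.servers[0])
    }

    // rebalance shuffles the list and reports the new head, mirroring the
    // "Rebalanced 100 servers, next active server is ..." messages.
    func (m *manager) rebalance() {
    	rand.Shuffle(len(m.servers), func(i, j int) {
    		m.servers[i], m.servers[j] = m.servers[j], m.servers[i]
    	})
    	fmt.Printf("Rebalanced %d servers, next active server is %s\n",
    		len(m.servers), m.servers[0])
    }

    func main() {
    	m := &manager{}
    	for i := 0; i < 100; i++ {
    		m.servers = append(m.servers, fmt.Sprintf("s%02d", i))
    	}
    	m.rebalance()
    	for i := 0; i < 5; i++ {
    		m.cycleServer()
    	}
    }

The shuffle order in this sketch depends on the random source; the ordering in the log above is whatever the real test harness produced.
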
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s35"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s85"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s81"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s57"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s56"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s50"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s11"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s01"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s03"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s41"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s18"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s30"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s61"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s47"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s82"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s20"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s60"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s42"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s89"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s96"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s63"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s91"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s40"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s67"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s65"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s04"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s08"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s62"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s23"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s54"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s02"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s71"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s27"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s97"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s90"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s66"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s76"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s69"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s84"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s93"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s49"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s48"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s98"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s36"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s92"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s22"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s58"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s44"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s09"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s80"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s59"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s73"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s46"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s32"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s26"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s16"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s13"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s83"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s99"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s24"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s70"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s31"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s78"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s29"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s10"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s38"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s64"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s88"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s12"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s28"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s15"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s86"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s72"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s21"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s37"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s95"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s75"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s05"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s51"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s53"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s06"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s17"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s33"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s43"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s14"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s77"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s87"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s55"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s34"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s19"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s94"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s00"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s45"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s25"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s52"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s74"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s79"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s07"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s39"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s68"
2019/12/06 06:01:02 [DEBUG] manager: Rebalanced 100 servers, next active server is s89 (Addr: /) (DC: )
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s89"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s37"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s97"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s08"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s52"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s56"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s77"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s60"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s23"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s01"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s68"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s98"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s84"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s11"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s14"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s09"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s44"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s12"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s65"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s00"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s62"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s41"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s96"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s93"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s75"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s26"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s33"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s99"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s86"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s05"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s92"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s17"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s53"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s61"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s73"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s45"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s72"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s15"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s50"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s18"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s46"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s03"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s59"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s36"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s95"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s81"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s28"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s88"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s74"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s64"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s31"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s63"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s70"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s66"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s29"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s39"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s24"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s35"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s19"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s13"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s10"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s80"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s51"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s38"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s82"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s54"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s40"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s94"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s90"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s21"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s32"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s25"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s04"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s48"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s43"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s47"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s69"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s06"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s20"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s78"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s79"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s42"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s76"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s91"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s67"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s71"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s85"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s55"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s57"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s16"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s27"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s49"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s58"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s22"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s87"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s83"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s07"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s34"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s02"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s30"
2019/12/06 06:01:02 [DEBUG] manager: Rebalanced 100 servers, next active server is s54 (Addr: /) (DC: )
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s54"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s89"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s01"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s21"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s25"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s74"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s86"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s17"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s79"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s61"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s82"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s27"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s94"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s20"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s38"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s44"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s49"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s59"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s81"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s24"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s35"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s87"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s99"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s73"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s90"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s96"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s91"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s52"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s84"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s40"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s46"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s71"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s76"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s08"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s13"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s93"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s67"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s12"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s45"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s18"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s23"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s80"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s00"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s47"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s34"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s60"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s11"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s14"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s42"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s58"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s30"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s75"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s04"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s66"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s98"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s02"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s65"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s62"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s37"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s05"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s03"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s43"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s33"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s51"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s07"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s56"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s29"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s39"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s55"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s09"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s69"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s53"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s06"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s41"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s64"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s19"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s10"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s36"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s57"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s15"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s48"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s22"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s32"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s16"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s26"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s83"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s77"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s68"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s88"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s95"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s72"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s92"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s97"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s50"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s85"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s28"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s70"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s78"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s63"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s31"
2019/12/06 06:01:02 [DEBUG] manager: Rebalanced 100 servers, next active server is s21 (Addr: /) (DC: )
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s21"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s46"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s74"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s29"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s53"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s51"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s92"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s94"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s83"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s56"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s60"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s12"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s50"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s37"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s75"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s77"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s85"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s30"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s23"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s45"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s96"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s24"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s00"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s82"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s79"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s13"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s52"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s41"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s01"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s09"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s19"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s68"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s38"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s49"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s25"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s67"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s26"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s95"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s48"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s62"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s54"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s89"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s66"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s44"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s04"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s63"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s16"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s78"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s91"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s93"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s98"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s02"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s57"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s18"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s81"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s17"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s35"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s28"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s59"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s03"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s99"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s42"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s86"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s65"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s64"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s10"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s58"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s08"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s06"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s55"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s15"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s61"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s84"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s70"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s32"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s34"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s11"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s72"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s20"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s71"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s31"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s87"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s36"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s14"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s47"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s88"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s39"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s22"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s69"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s33"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s90"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s80"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s27"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s73"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s40"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s43"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s07"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s05"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s97"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s76"
2019/12/06 06:01:02 [DEBUG] manager: pinging server "s30 (Addr: /) (DC: )" failed: %!s(<nil>)
2019/12/06 06:01:02 [DEBUG] manager: Rebalanced 100 servers, next active server is s96 (Addr: /) (DC: )
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s96"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s75"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s72"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s01"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s32"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s40"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s86"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s70"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s81"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s79"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s02"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s76"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s44"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s56"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s55"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s65"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s18"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s99"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s06"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s59"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s62"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s50"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s77"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s03"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s47"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s39"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s87"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s97"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s67"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s23"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s83"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s28"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s05"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s36"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s53"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s98"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s26"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s49"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s42"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s60"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s19"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s08"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s84"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s25"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s43"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s57"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s38"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s68"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s34"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s27"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s12"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s11"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s66"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s61"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s69"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s14"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s71"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s90"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s73"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s29"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s91"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s52"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s58"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s45"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s64"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s63"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s80"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s92"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s46"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s37"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s48"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s10"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s13"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s31"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s00"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s22"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s20"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s41"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s93"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s85"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s09"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s82"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s88"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s54"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s94"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s07"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s21"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s95"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s89"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s33"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s74"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s35"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s78"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s24"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s04"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s17"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s51"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s16"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s15"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s30"
2019/12/06 06:01:02 [DEBUG] manager: pinging server "s94 (Addr: /) (DC: )" failed: %!s(<nil>)
2019/12/06 06:01:02 [DEBUG] manager: Rebalanced 100 servers, next active server is s29 (Addr: /) (DC: )
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s29"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s72"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s07"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s40"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s68"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s62"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s96"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s92"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s31"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s95"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s51"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s46"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s21"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s81"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s41"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s70"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s14"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s37"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s13"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s32"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s82"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s45"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s26"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s50"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s22"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s28"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s97"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s42"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s87"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s39"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s19"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s71"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s25"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s09"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s18"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s27"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s47"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s57"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s49"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s05"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s02"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s98"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s80"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s67"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s75"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s34"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s74"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s76"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s20"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s86"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s65"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s63"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s11"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s53"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s83"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s01"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s17"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s06"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s12"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s04"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s35"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s73"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s00"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s61"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s52"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s43"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s66"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s10"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s99"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s64"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s78"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s24"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s59"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s91"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s55"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s54"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s03"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s23"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s16"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s79"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s89"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s33"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s60"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s08"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s36"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s58"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s90"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s44"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s77"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s56"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s15"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s48"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s93"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s85"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s69"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s30"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s84"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s88"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s38"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s94"
2019/12/06 06:01:02 [DEBUG] manager: pinging server "s48 (Addr: /) (DC: )" failed: %!s(<nil>)
2019/12/06 06:01:02 [DEBUG] manager: pinging server "s20 (Addr: /) (DC: )" failed: %!s(<nil>)
2019/12/06 06:01:02 [DEBUG] manager: Rebalanced 100 servers, next active server is s41 (Addr: /) (DC: )
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s41"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s76"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s27"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s99"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s71"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s74"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s21"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s25"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s90"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s80"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s19"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s14"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s77"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s35"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s63"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s56"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s07"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s84"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s00"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s68"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s08"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s22"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s33"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s85"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s24"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s23"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s06"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s26"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s18"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s44"
2019/12/06 06:01:02 [DEBUG] manager: cycled away from server "s97"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s47"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s42"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s50"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s49"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s72"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s43"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s32"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s83"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s93"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s62"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s60"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s36"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s61"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s30"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s52"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s04"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s94"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s01"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s39"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s73"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s17"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s89"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s66"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s75"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s16"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s82"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s31"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s81"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s70"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s37"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s95"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s12"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s34"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s55"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s40"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s53"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s29"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s10"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s88"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s98"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s38"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s13"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s51"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s64"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s03"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s57"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s28"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s11"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s87"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s65"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s09"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s86"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s02"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s58"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s05"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s54"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s45"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s96"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s92"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s91"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s46"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s79"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s15"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s69"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s78"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s67"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s59"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s48"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s20"
2019/12/06 06:01:03 [DEBUG] manager: pinging server "s82 (Addr: /) (DC: )" failed: %!s(<nil>)
2019/12/06 06:01:03 [DEBUG] manager: Rebalanced 100 servers, next active server is s41 (Addr: /) (DC: )
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s41"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s06"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s33"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s07"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s88"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s05"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s32"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s47"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s43"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s73"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s93"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s83"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s94"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s74"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s22"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s01"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s56"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s13"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s15"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s09"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s95"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s45"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s21"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s69"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s71"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s90"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s58"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s03"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s04"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s30"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s77"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s80"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s44"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s17"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s64"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s49"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s16"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s14"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s65"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s42"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s46"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s72"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s55"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s35"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s36"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s51"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s19"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s25"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s52"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s28"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s24"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s10"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s66"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s26"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s92"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s39"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s76"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s11"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s86"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s81"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s53"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s84"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s20"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s57"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s29"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s85"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s99"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s23"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s54"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s27"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s37"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s00"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s87"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s89"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s79"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s31"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s50"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s40"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s63"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s48"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s02"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s62"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s68"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s97"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s38"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s08"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s61"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s78"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s67"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s70"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s60"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s34"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s75"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s18"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s98"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s12"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s96"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s59"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s91"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s82"
2019/12/06 06:01:03 [DEBUG] manager: pinging server "s07 (Addr: /) (DC: )" failed: %!s(<nil>)
2019/12/06 06:01:03 [DEBUG] manager: pinging server "s81 (Addr: /) (DC: )" failed: %!s(<nil>)
2019/12/06 06:01:03 [DEBUG] manager: pinging server "s25 (Addr: /) (DC: )" failed: %!s(<nil>)
2019/12/06 06:01:03 [DEBUG] manager: pinging server "s32 (Addr: /) (DC: )" failed: %!s(<nil>)
2019/12/06 06:01:03 [DEBUG] manager: Rebalanced 100 servers, next active server is s33 (Addr: /) (DC: )
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s33"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s87"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s39"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s79"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s08"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s19"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s76"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s62"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s71"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s63"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s67"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s05"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s34"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s54"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s53"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s80"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s46"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s56"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s85"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s89"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s97"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s98"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s13"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s41"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s14"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s59"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s37"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s21"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s29"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s02"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s74"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s03"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s47"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s35"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s69"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s55"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s23"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s77"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s38"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s99"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s57"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s18"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s51"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s17"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s15"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s45"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s92"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s68"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s61"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s22"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s82"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s16"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s78"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s70"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s04"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s26"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s52"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s27"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s84"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s44"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s72"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s06"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s91"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s48"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s94"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s65"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s11"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s88"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s66"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s30"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s12"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s73"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s43"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s83"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s58"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s95"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s36"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s09"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s20"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s40"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s93"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s75"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s24"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s01"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s00"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s60"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s42"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s90"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s49"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s86"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s96"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s64"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s10"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s31"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s50"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s28"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s07"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s81"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s25"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s32"
2019/12/06 06:01:03 [DEBUG] manager: pinging server "s49 (Addr: /) (DC: )" failed: %!s(<nil>)
2019/12/06 06:01:03 [DEBUG] manager: pinging server "s85 (Addr: /) (DC: )" failed: %!s(<nil>)
2019/12/06 06:01:03 [DEBUG] manager: pinging server "s30 (Addr: /) (DC: )" failed: %!s(<nil>)
2019/12/06 06:01:03 [DEBUG] manager: Rebalanced 100 servers, next active server is s66 (Addr: /) (DC: )
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s66"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s51"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s08"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s78"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s91"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s55"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s06"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s67"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s16"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s72"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s59"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s56"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s62"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s41"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s53"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s94"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s39"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s12"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s18"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s87"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s50"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s23"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s61"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s17"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s04"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s32"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s34"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s36"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s00"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s73"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s03"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s33"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s75"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s60"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s35"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s70"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s02"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s13"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s77"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s22"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s19"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s26"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s98"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s11"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s83"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s79"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s80"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s48"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s37"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s65"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s57"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s09"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s10"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s27"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s97"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s84"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s81"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s07"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s24"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s44"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s99"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s20"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s46"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s25"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s28"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s38"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s90"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s14"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s95"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s82"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s93"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s01"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s74"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s76"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s29"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s21"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s45"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s05"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s15"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s68"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s40"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s96"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s42"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s69"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s71"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s52"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s63"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s64"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s89"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s92"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s47"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s54"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s86"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s43"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s88"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s58"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s31"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s49"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s85"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s30"
2019/12/06 06:01:03 [DEBUG] manager: pinging server "s79 (Addr: /) (DC: )" failed: %!s(<nil>)
2019/12/06 06:01:03 [DEBUG] manager: pinging server "s01 (Addr: /) (DC: )" failed: %!s(<nil>)
2019/12/06 06:01:03 [DEBUG] manager: pinging server "s45 (Addr: /) (DC: )" failed: %!s(<nil>)
2019/12/06 06:01:03 [DEBUG] manager: Rebalanced 100 servers, next active server is s36 (Addr: /) (DC: )
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s36"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s52"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s37"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s43"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s74"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s61"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s32"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s33"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s88"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s23"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s70"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s49"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s05"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s99"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s08"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s89"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s65"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s80"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s42"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s11"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s21"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s76"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s55"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s62"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s69"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s40"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s91"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s18"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s28"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s51"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s20"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s15"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s27"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s85"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s59"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s35"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s17"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s07"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s39"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s54"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s09"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s24"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s26"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s47"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s41"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s93"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s82"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s10"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s06"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s25"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s92"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s60"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s73"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s50"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s97"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s96"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s13"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s63"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s04"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s02"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s83"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s72"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s38"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s46"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s44"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s56"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s87"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s71"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s90"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s31"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s22"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s29"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s12"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s14"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s86"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s75"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s67"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s30"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s58"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s48"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s81"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s94"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s95"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s98"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s66"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s16"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s78"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s57"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s00"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s53"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s03"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s34"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s64"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s19"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s68"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s84"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s77"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s79"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s01"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s45"
2019/12/06 06:01:03 [DEBUG] manager: pinging server "s83 (Addr: /) (DC: )" failed: %!s(<nil>)
2019/12/06 06:01:03 [DEBUG] manager: pinging server "s02 (Addr: /) (DC: )" failed: %!s(<nil>)
2019/12/06 06:01:03 [DEBUG] manager: pinging server "s96 (Addr: /) (DC: )" failed: %!s(<nil>)
2019/12/06 06:01:03 [DEBUG] manager: Rebalanced 100 servers, next active server is s01 (Addr: /) (DC: )
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s01"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s19"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s41"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s78"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s14"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s69"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s94"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s97"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s77"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s87"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s26"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s16"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s47"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s03"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s68"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s93"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s49"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s63"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s39"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s09"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s82"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s27"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s25"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s21"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s88"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s31"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s20"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s35"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s17"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s59"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s85"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s45"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s66"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s61"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s37"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s30"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s91"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s33"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s80"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s08"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s46"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s32"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s36"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s75"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s24"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s90"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s13"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s58"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s12"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s64"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s60"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s51"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s00"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s54"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s57"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s38"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s05"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s23"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s34"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s56"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s18"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s44"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s10"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s79"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s73"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s52"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s62"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s29"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s99"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s86"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s74"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s53"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s81"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s06"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s11"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s55"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s43"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s65"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s95"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s48"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s84"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s71"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s07"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s28"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s76"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s04"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s40"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s50"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s92"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s98"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s89"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s42"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s67"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s15"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s72"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s70"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s22"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s83"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s02"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s96"
2019/12/06 06:01:03 [DEBUG] manager: Rebalanced 100 servers, next active server is s40 (Addr: /) (DC: )
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s40"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s75"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s64"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s44"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s07"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s80"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s94"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s49"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s78"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s55"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s56"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s32"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s01"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s28"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s79"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s22"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s14"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s50"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s84"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s13"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s71"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s38"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s68"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s36"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s89"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s69"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s37"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s10"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s73"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s83"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s57"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s87"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s81"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s99"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s11"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s16"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s77"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s46"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s33"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s17"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s91"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s65"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s86"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s34"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s43"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s42"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s53"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s58"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s90"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s05"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s20"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s51"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s04"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s52"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s31"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s27"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s67"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s08"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s03"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s63"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s30"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s48"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s21"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s41"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s45"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s54"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s18"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s92"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s06"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s85"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s95"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s39"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s72"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s76"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s93"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s26"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s88"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s97"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s23"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s60"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s35"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s96"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s19"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s12"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s47"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s70"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s82"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s24"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s09"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s15"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s61"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s59"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s00"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s62"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s29"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s02"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s66"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s74"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s98"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s25"
2019/12/06 06:01:03 [DEBUG] manager: pinging server "s38 (Addr: /) (DC: )" failed: %!s(<nil>)
2019/12/06 06:01:03 [DEBUG] manager: Rebalanced 100 servers, next active server is s06 (Addr: /) (DC: )
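The runs of "cycled away from server" and "Rebalanced 100 servers, next active server is ..." messages above come from consul's server-manager tests exercising round-robin failover across a list of mock servers: the server at the front of the list is rotated to the back on each cycle, and a rebalance shuffles the list and pings from the front, cycling past unreachable servers until a healthy one is found. The following is a minimal Go sketch of that pattern under those assumptions; every name in it (serverList, cycle, rebalance, ping) is illustrative and is not Consul's actual router/manager API.

    // Illustrative sketch only: a round-robin server list with a
    // shuffle-and-ping rebalance, loosely mirroring the behaviour the
    // DEBUG lines above exercise. Not Consul's actual implementation.
    package main

    import (
        "fmt"
        "math/rand"
    )

    type server struct {
        name string
    }

    type serverList struct {
        servers []*server
    }

    // cycle rotates the first server to the end of the list, so the next
    // request goes to a different server ("cycled away from server ...").
    func (l *serverList) cycle() {
        if len(l.servers) < 2 {
            return
        }
        first := l.servers[0]
        copy(l.servers, l.servers[1:])
        l.servers[len(l.servers)-1] = first
        fmt.Printf("[DEBUG] manager: cycled away from server %q\n", first.name)
    }

    // rebalance shuffles the list, then pings the front of the list,
    // cycling past any server whose ping fails until a reachable one is
    // found ("Rebalanced N servers, next active server is ...").
    func (l *serverList) rebalance(ping func(*server) bool) {
        rand.Shuffle(len(l.servers), func(i, j int) {
            l.servers[i], l.servers[j] = l.servers[j], l.servers[i]
        })
        for range l.servers {
            if ping(l.servers[0]) {
                break
            }
            fmt.Printf("[DEBUG] manager: pinging server %q failed\n", l.servers[0].name)
            l.cycle()
        }
        fmt.Printf("[DEBUG] manager: Rebalanced %d servers, next active server is %s\n",
            len(l.servers), l.servers[0].name)
    }

    func main() {
        l := &serverList{}
        for i := 0; i < 10; i++ {
            l.servers = append(l.servers, &server{name: fmt.Sprintf("s%02d", i)})
        }
        // Assume roughly one ping in three fails, as a flaky mock transport might.
        flaky := func(s *server) bool { return rand.Intn(3) != 0 }
        for i := 0; i < 3; i++ {
            l.rebalance(flaky)
        }
    }

Running a sketch like this produces the same shape of output seen in the log: long runs of cycle messages, occasional ping failures, and a "Rebalanced" line naming the next active server.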
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s06"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s52"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s57"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s13"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s37"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s08"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s42"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s35"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s21"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s77"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s49"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s53"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s22"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s34"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s93"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s88"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s26"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s32"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s80"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s66"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s44"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s62"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s56"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s83"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s55"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s85"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s70"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s96"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s40"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s20"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s05"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s60"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s79"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s04"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s94"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s03"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s47"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s48"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s59"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s86"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s43"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s99"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s01"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s07"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s69"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s61"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s87"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s91"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s72"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s65"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s98"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s75"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s54"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s25"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s73"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s82"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s76"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s58"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s10"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s39"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s97"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s29"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s16"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s67"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s31"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s89"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s92"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s51"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s84"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s68"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s71"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s14"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s46"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s50"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s02"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s17"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s09"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s00"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s90"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s28"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s36"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s11"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s18"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s81"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s45"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s19"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s30"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s23"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s12"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s33"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s63"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s64"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s41"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s27"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s78"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s74"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s15"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s24"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s95"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s38"
2019/12/06 06:01:03 [DEBUG] manager: Rebalanced 100 servers, next active server is s57 (Addr: /) (DC: )
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s57"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s83"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s67"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s82"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s00"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s44"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s50"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s32"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s99"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s45"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s10"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s86"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s21"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s14"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s81"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s52"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s12"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s73"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s30"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s51"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s20"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s28"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s98"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s46"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s49"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s77"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s97"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s94"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s38"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s54"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s39"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s15"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s68"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s25"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s66"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s71"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s35"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s17"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s05"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s18"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s48"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s29"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s42"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s07"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s03"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s87"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s85"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s84"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s04"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s61"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s01"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s93"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s53"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s40"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s23"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s26"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s74"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s78"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s34"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s65"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s43"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s88"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s89"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s37"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s69"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s60"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s90"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s80"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s47"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s36"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s08"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s33"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s70"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s13"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s09"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s24"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s59"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s58"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s95"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s72"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s92"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s62"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s63"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s11"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s75"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s22"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s19"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s06"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s76"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s64"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s31"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s41"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s16"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s55"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s02"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s27"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s96"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s91"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s79"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s56"
2019/12/06 06:01:03 [DEBUG] manager: Rebalanced 100 servers, next active server is s49 (Addr: /) (DC: )
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s49"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s47"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s97"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s79"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s84"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s33"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s34"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s39"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s59"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s41"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s99"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s87"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s67"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s57"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s19"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s44"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s04"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s32"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s29"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s21"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s15"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s76"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s16"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s62"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s68"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s06"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s92"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s48"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s52"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s64"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s74"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s69"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s73"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s53"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s45"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s28"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s63"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s01"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s95"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s08"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s56"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s81"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s70"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s23"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s89"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s24"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s31"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s75"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s51"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s46"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s96"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s03"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s82"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s93"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s26"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s40"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s35"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s36"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s07"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s91"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s05"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s11"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s14"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s02"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s50"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s38"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s18"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s22"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s30"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s85"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s86"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s17"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s12"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s37"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s90"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s20"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s13"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s60"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s00"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s55"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s94"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s25"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s10"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s71"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s43"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s72"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s42"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s88"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s98"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s80"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s65"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s66"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s78"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s61"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s54"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s27"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s58"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s77"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s09"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s83"
2019/12/06 06:01:03 [DEBUG] manager: Rebalanced 100 servers, next active server is s86 (Addr: /) (DC: )
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s86"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s11"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s43"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s26"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s08"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s62"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s65"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s40"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s34"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s71"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s53"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s88"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s76"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s93"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s87"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s80"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s20"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s03"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s37"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s54"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s61"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s00"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s48"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s67"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s02"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s96"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s22"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s84"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s68"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s59"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s72"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s45"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s95"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s42"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s21"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s64"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s15"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s60"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s82"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s91"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s35"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s63"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s07"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s29"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s39"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s56"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s23"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s90"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s70"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s12"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s78"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s94"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s50"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s10"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s69"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s97"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s41"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s27"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s44"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s74"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s06"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s38"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s28"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s77"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s99"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s36"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s58"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s18"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s52"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s16"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s17"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s14"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s24"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s51"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s75"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s57"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s47"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s04"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s25"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s89"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s55"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s30"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s92"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s81"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s13"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s09"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s31"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s73"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s33"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s46"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s49"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s66"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s85"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s01"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s98"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s79"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s32"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s19"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s05"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s83"
2019/12/06 06:01:03 [DEBUG] manager: Rebalanced 100 servers, next active server is s91 (Addr: /) (DC: )
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s91"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s43"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s29"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s61"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s14"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s92"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s01"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s98"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s08"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s33"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s35"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s49"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s78"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s67"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s76"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s04"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s99"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s27"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s59"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s63"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s74"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s05"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s22"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s28"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s75"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s54"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s48"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s42"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s45"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s69"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s68"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s21"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s46"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s66"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s58"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s64"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s77"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s06"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s19"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s84"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s85"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s38"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s13"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s51"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s12"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s87"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s88"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s40"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s34"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s55"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s89"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s18"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s11"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s15"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s02"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s00"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s47"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s95"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s41"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s39"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s03"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s20"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s32"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s52"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s56"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s82"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s23"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s70"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s36"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s31"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s96"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s24"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s57"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s16"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s90"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s17"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s71"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s79"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s80"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s83"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s86"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s94"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s65"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s73"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s97"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s25"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s09"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s62"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s26"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s60"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s53"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s07"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s30"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s72"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s10"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s81"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s93"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s50"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s37"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s44"
2019/12/06 06:01:03 [DEBUG] manager: Rebalanced 100 servers, next active server is s27 (Addr: /) (DC: )
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s27"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s86"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s98"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s16"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s66"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s77"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s40"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s07"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s45"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s34"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s48"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s42"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s36"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s67"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s65"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s11"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s37"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s43"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s93"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s08"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s39"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s73"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s04"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s51"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s95"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s41"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s01"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s61"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s85"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s83"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s75"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s68"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s02"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s05"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s47"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s44"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s88"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s29"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s96"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s64"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s55"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s09"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s90"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s10"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s71"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s63"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s99"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s49"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s22"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s20"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s97"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s84"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s23"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s54"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s72"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s31"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s46"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s50"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s60"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s13"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s03"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s76"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s33"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s52"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s74"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s82"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s94"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s70"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s62"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s59"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s89"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s56"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s00"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s78"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s26"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s35"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s24"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s28"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s06"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s14"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s80"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s17"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s81"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s58"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s38"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s15"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s91"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s92"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s19"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s32"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s12"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s25"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s30"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s87"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s79"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s57"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s21"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s18"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s69"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s53"
2019/12/06 06:01:03 [DEBUG] manager: Rebalanced 100 servers, next active server is s28 (Addr: /) (DC: )
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s28"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s56"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s67"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s50"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s29"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s97"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s85"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s07"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s06"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s57"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s54"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s98"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s02"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s21"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s74"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s91"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s40"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s31"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s10"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s17"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s65"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s16"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s84"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s30"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s13"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s51"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s94"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s41"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s88"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s32"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s00"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s26"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s03"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s44"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s89"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s70"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s53"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s47"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s42"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s33"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s04"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s09"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s36"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s93"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s61"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s27"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s92"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s78"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s37"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s86"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s11"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s59"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s63"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s87"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s46"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s38"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s55"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s19"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s14"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s22"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s60"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s49"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s75"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s52"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s34"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s71"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s95"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s35"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s80"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s15"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s01"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s66"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s96"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s77"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s81"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s99"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s58"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s76"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s18"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s90"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s43"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s68"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s25"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s05"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s39"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s08"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s23"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s79"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s64"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s62"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s83"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s48"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s24"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s45"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s73"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s72"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s12"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s82"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s69"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s20"
2019/12/06 06:01:03 [DEBUG] manager: Rebalanced 100 servers, next active server is s98 (Addr: /) (DC: )
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s98"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s24"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s50"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s74"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s21"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s23"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s93"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s17"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s16"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s31"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s15"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s84"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s04"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s82"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s47"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s99"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s28"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s20"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s01"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s96"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s45"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s60"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s67"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s10"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s51"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s83"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s06"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s64"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s13"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s39"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s02"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s11"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s77"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s76"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s89"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s68"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s65"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s18"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s87"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s54"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s14"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s52"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s27"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s91"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s34"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s92"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s80"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s75"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s56"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s35"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s00"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s08"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s25"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s29"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s81"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s09"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s41"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s46"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s59"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s37"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s71"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s95"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s03"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s94"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s38"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s97"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s43"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s32"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s58"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s73"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s63"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s53"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s40"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s44"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s26"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s33"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s55"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s12"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s62"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s85"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s05"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s49"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s66"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s88"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s36"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s61"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s86"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s42"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s90"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s19"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s57"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s22"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s48"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s07"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s79"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s69"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s78"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s72"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s70"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s30"
2019/12/06 06:01:03 [DEBUG] manager: Rebalanced 100 servers, next active server is s53 (Addr: /) (DC: )
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s53"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s21"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s30"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s92"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s22"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s36"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s59"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s04"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s24"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s73"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s34"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s81"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s12"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s58"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s33"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s91"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s49"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s63"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s95"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s68"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s96"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s47"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s08"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s88"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s29"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s40"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s82"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s62"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s98"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s06"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s80"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s27"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s89"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s66"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s79"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s19"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s74"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s01"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s71"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s93"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s69"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s15"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s76"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s43"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s84"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s05"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s37"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s32"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s17"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s72"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s90"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s86"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s65"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s55"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s85"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s11"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s70"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s38"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s28"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s61"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s60"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s67"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s26"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s16"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s00"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s10"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s46"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s35"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s54"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s78"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s57"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s64"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s56"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s07"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s52"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s77"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s45"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s09"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s25"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s23"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s14"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s99"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s83"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s51"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s42"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s39"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s13"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s02"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s75"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s03"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s31"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s44"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s97"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s50"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s41"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s18"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s48"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s87"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s94"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s20"
2019/12/06 06:01:03 [DEBUG] manager: Rebalanced 100 servers, next active server is s85 (Addr: /) (DC: )
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s85"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s25"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s30"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s64"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s92"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s22"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s15"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s63"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s21"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s48"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s27"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s42"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s31"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s13"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s34"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s32"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s43"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s41"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s26"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s14"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s69"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s52"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s28"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s71"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s83"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s77"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s24"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s96"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s56"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s38"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s18"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s00"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s68"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s58"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s70"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s20"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s16"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s87"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s44"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s29"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s97"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s74"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s33"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s05"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s11"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s80"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s62"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s55"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s47"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s54"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s06"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s61"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s50"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s39"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s93"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s82"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s51"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s12"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s78"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s19"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s95"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s89"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s53"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s40"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s57"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s10"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s98"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s99"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s91"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s88"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s35"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s81"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s09"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s90"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s75"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s02"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s72"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s08"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s60"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s65"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s49"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s86"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s03"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s45"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s84"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s17"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s66"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s73"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s67"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s94"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s46"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s79"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s04"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s01"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s59"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s36"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s76"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s37"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s07"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s23"
2019/12/06 06:01:03 [DEBUG] manager: pinging server "s51 (Addr: /) (DC: )" failed: %!s(<nil>)
2019/12/06 06:01:03 [DEBUG] manager: Rebalanced 100 servers, next active server is s83 (Addr: /) (DC: )
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s83"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s67"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s41"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s22"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s47"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s05"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s28"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s55"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s94"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s84"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s74"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s13"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s70"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s56"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s25"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s57"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s12"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s82"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s69"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s85"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s23"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s20"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s33"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s43"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s48"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s39"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s21"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s63"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s40"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s87"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s98"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s80"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s90"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s26"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s29"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s42"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s35"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s37"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s44"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s91"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s45"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s32"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s59"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s01"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s81"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s93"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s66"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s52"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s15"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s65"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s75"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s19"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s58"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s89"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s36"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s31"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s18"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s30"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s50"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s68"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s92"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s96"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s27"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s02"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s10"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s06"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s16"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s61"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s09"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s78"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s77"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s53"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s24"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s07"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s17"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s11"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s60"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s64"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s88"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s99"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s03"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s14"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s00"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s73"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s76"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s71"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s49"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s79"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s38"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s46"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s62"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s04"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s08"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s34"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s97"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s95"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s54"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s86"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s72"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s51"
2019/12/06 06:01:03 [DEBUG] manager: pinging server "s64 (Addr: /) (DC: )" failed: %!s(<nil>)
2019/12/06 06:01:03 [DEBUG] manager: pinging server "s69 (Addr: /) (DC: )" failed: %!s(<nil>)
2019/12/06 06:01:03 [DEBUG] manager: pinging server "s92 (Addr: /) (DC: )" failed: %!s(<nil>)
2019/12/06 06:01:03 [DEBUG] manager: Rebalanced 100 servers, next active server is s58 (Addr: /) (DC: )
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s58"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s60"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s71"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s91"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s55"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s94"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s12"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s28"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s96"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s74"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s16"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s99"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s65"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s50"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s24"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s87"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s53"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s51"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s38"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s82"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s81"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s31"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s93"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s84"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s27"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s77"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s80"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s29"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s46"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s15"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s59"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s83"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s07"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s18"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s41"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s05"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s78"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s56"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s25"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s98"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s86"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s48"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s13"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s42"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s70"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s34"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s57"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s37"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s36"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s95"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s08"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s76"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s33"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s01"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s45"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s73"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s09"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s79"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s61"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s19"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s68"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s97"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s11"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s21"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s23"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s17"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s89"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s40"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s30"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s10"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s04"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s03"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s47"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s22"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s14"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s62"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s90"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s54"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s88"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s02"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s49"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s66"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s20"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s06"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s35"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s44"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s63"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s39"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s75"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s85"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s32"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s52"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s43"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s72"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s00"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s67"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s26"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s64"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s69"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s92"
2019/12/06 06:01:03 [DEBUG] manager: Rebalanced 100 servers, next active server is s44 (Addr: /) (DC: )
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s44"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s88"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s89"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s87"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s05"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s78"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s42"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s47"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s74"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s39"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s79"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s73"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s62"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s37"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s06"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s02"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s28"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s15"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s86"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s83"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s48"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s49"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s45"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s25"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s14"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s22"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s20"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s16"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s00"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s92"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s93"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s71"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s11"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s80"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s59"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s64"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s03"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s91"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s31"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s55"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s77"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s76"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s99"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s56"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s33"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s52"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s43"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s46"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s60"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s53"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s65"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s01"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s70"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s50"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s66"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s38"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s40"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s57"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s90"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s36"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s69"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s41"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s97"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s17"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s10"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s94"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s32"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s84"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s67"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s08"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s72"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s96"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s61"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s34"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s95"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s24"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s35"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s26"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s81"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s13"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s29"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s07"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s23"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s09"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s04"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s85"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s19"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s18"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s27"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s98"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s21"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s63"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s51"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s82"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s54"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s58"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s30"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s75"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s12"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s68"
2019/12/06 06:01:03 [DEBUG] manager: Rebalanced 100 servers, next active server is s33 (Addr: /) (DC: )
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s33"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s35"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s78"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s86"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s53"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s36"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s76"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s81"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s85"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s68"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s58"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s92"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s34"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s04"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s55"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s57"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s14"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s02"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s49"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s95"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s28"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s97"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s61"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s23"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s31"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s24"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s99"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s32"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s21"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s98"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s37"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s96"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s20"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s52"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s74"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s47"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s77"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s18"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s90"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s27"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s39"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s93"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s10"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s26"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s56"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s15"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s00"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s43"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s38"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s25"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s01"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s80"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s87"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s54"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s82"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s45"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s88"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s30"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s79"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s67"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s42"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s69"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s12"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s59"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s44"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s16"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s41"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s62"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s13"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s29"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s07"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s03"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s72"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s63"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s71"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s70"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s73"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s22"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s64"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s91"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s05"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s06"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s89"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s40"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s94"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s11"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s08"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s51"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s75"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s65"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s83"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s50"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s48"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s66"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s19"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s17"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s84"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s60"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s09"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s46"
2019/12/06 06:01:03 [DEBUG] manager: Rebalanced 100 servers, next active server is s37 (Addr: /) (DC: )
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s37"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s54"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s86"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s35"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s23"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s63"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s90"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s02"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s28"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s45"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s97"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s59"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s72"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s01"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s70"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s57"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s99"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s07"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s31"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s52"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s68"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s42"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s55"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s98"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s41"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s39"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s58"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s13"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s46"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s84"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s25"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s67"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s83"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s16"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s48"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s89"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s66"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s18"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s21"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s03"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s75"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s56"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s29"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s26"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s85"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s77"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s43"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s10"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s82"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s44"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s80"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s61"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s22"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s87"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s76"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s65"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s30"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s95"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s17"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s15"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s49"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s09"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s81"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s27"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s60"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s36"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s64"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s71"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s19"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s47"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s96"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s08"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s91"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s04"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s79"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s51"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s94"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s20"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s73"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s74"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s50"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s53"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s05"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s24"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s32"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s92"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s62"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s34"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s38"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s00"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s11"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s14"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s40"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s78"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s69"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s33"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s12"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s88"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s06"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s93"
2019/12/06 06:01:03 [DEBUG] manager: pinging server "s65 (Addr: /) (DC: )" failed: %!s(<nil>)
2019/12/06 06:01:03 [DEBUG] manager: pinging server "s90 (Addr: /) (DC: )" failed: %!s(<nil>)
2019/12/06 06:01:03 [DEBUG] manager: pinging server "s71 (Addr: /) (DC: )" failed: %!s(<nil>)
2019/12/06 06:01:03 [DEBUG] manager: Rebalanced 100 servers, next active server is s80 (Addr: /) (DC: )
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s80"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s10"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s29"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s36"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s98"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s69"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s85"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s52"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s41"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s35"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s78"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s03"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s77"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s62"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s07"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s09"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s23"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s40"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s49"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s72"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s01"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s43"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s28"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s38"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s51"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s20"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s76"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s97"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s25"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s60"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s99"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s34"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s19"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s32"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s88"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s79"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s53"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s24"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s82"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s44"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s92"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s05"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s21"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s94"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s14"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s54"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s30"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s27"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s00"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s18"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s64"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s37"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s17"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s56"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s89"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s42"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s02"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s06"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s91"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s15"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s13"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s86"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s50"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s87"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s95"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s84"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s75"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s83"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s66"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s58"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s81"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s93"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s45"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s31"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s59"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s74"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s55"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s70"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s57"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s26"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s47"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s67"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s11"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s48"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s63"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s73"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s39"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s68"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s96"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s08"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s22"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s46"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s16"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s12"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s04"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s61"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s33"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s65"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s90"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s71"
2019/12/06 06:01:03 [DEBUG] manager: pinging server "s14 (Addr: /) (DC: )" failed: %!s(<nil>)
2019/12/06 06:01:03 [DEBUG] manager: Rebalanced 100 servers, next active server is s16 (Addr: /) (DC: )
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s16"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s46"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s44"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s66"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s29"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s70"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s30"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s39"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s75"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s43"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s37"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s23"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s35"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s87"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s48"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s68"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s50"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s24"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s05"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s89"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s93"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s11"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s06"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s34"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s88"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s91"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s95"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s13"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s82"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s52"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s07"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s84"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s92"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s58"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s10"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s63"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s22"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s04"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s25"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s80"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s90"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s96"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s32"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s73"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s78"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s26"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s97"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s45"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s00"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s42"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s36"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s09"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s61"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s27"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s38"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s18"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s83"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s64"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s19"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s21"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s57"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s99"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s47"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s41"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s62"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s69"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s79"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s20"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s53"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s67"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s72"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s31"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s54"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s81"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s56"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s98"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s77"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s33"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s12"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s28"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s02"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s74"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s15"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s59"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s49"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s76"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s55"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s65"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s60"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s08"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s03"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s94"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s17"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s51"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s40"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s86"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s71"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s85"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s01"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s14"
2019/12/06 06:01:03 [DEBUG] manager: pinging server "s35 (Addr: /) (DC: )" failed: %!s(<nil>)
2019/12/06 06:01:03 [DEBUG] manager: Rebalanced 100 servers, next active server is s44 (Addr: /) (DC: )
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s44"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s12"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s06"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s32"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s34"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s91"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s14"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s85"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s19"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s01"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s36"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s82"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s86"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s78"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s39"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s53"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s77"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s45"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s95"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s57"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s73"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s87"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s09"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s25"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s46"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s37"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s13"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s28"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s81"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s23"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s74"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s48"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s97"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s38"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s90"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s27"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s18"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s66"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s76"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s16"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s59"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s94"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s15"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s79"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s41"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s00"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s63"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s83"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s30"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s42"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s72"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s50"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s20"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s40"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s99"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s75"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s67"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s24"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s71"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s84"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s54"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s21"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s07"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s33"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s22"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s64"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s96"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s02"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s04"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s98"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s65"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s08"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s31"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s55"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s52"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s47"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s56"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s05"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s62"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s89"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s92"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s17"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s93"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s80"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s69"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s88"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s68"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s60"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s49"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s29"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s43"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s61"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s03"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s11"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s26"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s70"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s58"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s51"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s10"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s35"
2019/12/06 06:01:03 [DEBUG] manager: Rebalanced 100 servers, next active server is s17 (Addr: /) (DC: )
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s17"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s11"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s29"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s99"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s89"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s73"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s78"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s10"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s48"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s08"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s88"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s37"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s69"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s68"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s54"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s82"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s72"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s52"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s04"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s87"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s03"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s14"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s43"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s61"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s77"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s24"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s80"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s86"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s15"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s85"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s26"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s79"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s65"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s93"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s90"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s21"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s31"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s39"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s28"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s27"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s19"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s55"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s25"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s97"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s95"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s94"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s23"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s66"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s62"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s51"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s59"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s96"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s71"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s09"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s36"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s05"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s83"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s18"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s70"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s57"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s60"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s58"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s53"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s98"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s42"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s56"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s76"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s07"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s38"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s06"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s67"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s74"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s47"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s92"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s64"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s35"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s02"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s32"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s16"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s13"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s34"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s46"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s01"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s44"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s22"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s12"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s45"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s63"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s33"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s49"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s81"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s40"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s41"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s30"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s20"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s91"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s84"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s00"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s75"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s50"
2019/12/06 06:01:03 [DEBUG] manager: Rebalanced 100 servers, next active server is s88 (Addr: /) (DC: )
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s88"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s09"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s59"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s69"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s49"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s11"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s85"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s93"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s55"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s12"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s41"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s26"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s78"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s74"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s40"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s64"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s19"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s00"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s72"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s43"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s77"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s38"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s51"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s76"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s70"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s87"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s25"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s29"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s18"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s21"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s22"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s32"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s34"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s92"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s60"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s17"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s68"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s79"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s52"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s14"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s84"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s28"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s71"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s03"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s62"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s47"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s81"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s82"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s20"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s31"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s15"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s44"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s73"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s37"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s57"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s98"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s89"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s07"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s90"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s97"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s10"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s27"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s56"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s99"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s45"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s30"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s83"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s13"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s91"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s46"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s33"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s50"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s95"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s48"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s05"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s75"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s24"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s96"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s61"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s08"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s36"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s06"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s63"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s01"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s66"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s67"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s23"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s02"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s65"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s16"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s42"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s53"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s86"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s54"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s80"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s39"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s04"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s94"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s35"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s58"
2019/12/06 06:01:03 [DEBUG] manager: Rebalanced 100 servers, next active server is s79 (Addr: /) (DC: )
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s79"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s90"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s67"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s43"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s50"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s21"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s54"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s34"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s88"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s95"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s99"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s03"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s72"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s52"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s46"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s53"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s35"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s39"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s36"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s78"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s87"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s61"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s26"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s65"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s91"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s27"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s81"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s09"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s82"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s60"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s68"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s04"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s51"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s30"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s59"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s16"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s73"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s07"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s66"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s14"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s28"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s80"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s96"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s48"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s97"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s13"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s47"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s24"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s42"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s44"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s64"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s02"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s25"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s40"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s71"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s15"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s56"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s06"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s69"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s57"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s98"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s86"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s11"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s74"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s84"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s55"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s22"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s63"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s37"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s29"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s10"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s05"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s49"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s92"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s32"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s33"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s08"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s17"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s31"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s01"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s38"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s76"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s18"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s75"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s89"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s20"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s62"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s93"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s94"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s85"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s41"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s45"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s83"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s19"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s77"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s00"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s12"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s58"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s70"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s23"
--- PASS: TestServers_RebalanceServers (0.68s)
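
The wall of DEBUG lines above is the verbose output of the router package's rebalance test: a list of 100 fake servers (s00..s99) is shuffled, the new front of the list is reported as the next active server, and the test then cycles once past every entry before the next "Rebalanced 100 servers" line. The stray "%!s(<nil>)" token is simply what Go's fmt package prints when a %s verb is handed a nil interface value, so "pinging server ... failed: %!s(<nil>)" means a nil error was formatted with %s. A minimal sketch of that shuffle-and-cycle pattern, using hypothetical names rather than the actual agent/router Manager code:

package main

import (
	"fmt"
	"math/rand"
)

// ring is a hypothetical stand-in for a list of servers being rebalanced.
type ring struct {
	servers []string
}

// rebalance shuffles the list and reports the new front entry, in the
// shape of the "Rebalanced N servers, next active server is ..." lines.
func (r *ring) rebalance() {
	rand.Shuffle(len(r.servers), func(i, j int) {
		r.servers[i], r.servers[j] = r.servers[j], r.servers[i]
	})
	fmt.Printf("Rebalanced %d servers, next active server is %s\n",
		len(r.servers), r.servers[0])
}

// cycle rotates the front server to the back, in the shape of the
// "cycled away from server ..." lines.
func (r *ring) cycle() {
	front := r.servers[0]
	r.servers = append(r.servers[1:], front)
	fmt.Printf("cycled away from server %q\n", front)
}

func main() {
	r := &ring{}
	for i := 0; i < 100; i++ {
		r.servers = append(r.servers, fmt.Sprintf("s%02d", i))
	}
	r.rebalance()
	for i := 0; i < len(r.servers); i++ {
		r.cycle() // one full pass over the ring, as in the log above
	}

	// The %!s(<nil>) token: deliberately formatting a nil error with %s.
	var err error
	fmt.Printf("pinging server failed: %s\n", err) // prints: pinging server failed: %!s(<nil>)
}
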
=== RUN   TestServers_RebalanceServers_AvoidFailed
2019/12/06 06:01:03 [DEBUG] manager: pinging server "s2 (Addr: faux/s2) (DC: )" failed: %!s(<nil>)
2019/12/06 06:01:03 [DEBUG] manager: Rebalanced 3 servers, next active server is s1 (Addr: faux/s1) (DC: )
2019/12/06 06:01:03 [DEBUG] manager: Rebalanced 3 servers, next active server is s3 (Addr: faux/s3) (DC: )
2019/12/06 06:01:03 [DEBUG] manager: Rebalanced 3 servers, next active server is s1 (Addr: faux/s1) (DC: )
2019/12/06 06:01:03 [DEBUG] manager: pinging server "s2 (Addr: faux/s2) (DC: )" failed: %!s(<nil>)
2019/12/06 06:01:03 [DEBUG] manager: Rebalanced 3 servers, next active server is s1 (Addr: faux/s1) (DC: )
2019/12/06 06:01:03 [DEBUG] manager: pinging server "s2 (Addr: faux/s2) (DC: )" failed: %!s(<nil>)
2019/12/06 06:01:03 [DEBUG] manager: Rebalanced 3 servers, next active server is s1 (Addr: faux/s1) (DC: )
2019/12/06 06:01:03 [DEBUG] manager: pinging server "s2 (Addr: faux/s2) (DC: )" failed: %!s(<nil>)
2019/12/06 06:01:03 [DEBUG] manager: Rebalanced 3 servers, next active server is s3 (Addr: faux/s3) (DC: )
2019/12/06 06:01:03 [DEBUG] manager: pinging server "s2 (Addr: faux/s2) (DC: )" failed: %!s(<nil>)
2019/12/06 06:01:03 [DEBUG] manager: Rebalanced 3 servers, next active server is s3 (Addr: faux/s3) (DC: )
2019/12/06 06:01:03 [DEBUG] manager: pinging server "s2 (Addr: faux/s2) (DC: )" failed: %!s(<nil>)
2019/12/06 06:01:03 [DEBUG] manager: Rebalanced 3 servers, next active server is s3 (Addr: faux/s3) (DC: )
2019/12/06 06:01:03 [DEBUG] manager: pinging server "s2 (Addr: faux/s2) (DC: )" failed: %!s(<nil>)
2019/12/06 06:01:03 [DEBUG] manager: Rebalanced 3 servers, next active server is s1 (Addr: faux/s1) (DC: )
2019/12/06 06:01:03 [DEBUG] manager: Rebalanced 3 servers, next active server is s3 (Addr: faux/s3) (DC: )
2019/12/06 06:01:03 [DEBUG] manager: pinging server "s2 (Addr: faux/s2) (DC: )" failed: %!s(<nil>)
2019/12/06 06:01:03 [DEBUG] manager: Rebalanced 3 servers, next active server is s3 (Addr: faux/s3) (DC: )
2019/12/06 06:01:03 [DEBUG] manager: Rebalanced 3 servers, next active server is s1 (Addr: faux/s1) (DC: )
2019/12/06 06:01:03 [DEBUG] manager: Rebalanced 3 servers, next active server is s3 (Addr: faux/s3) (DC: )
2019/12/06 06:01:03 [DEBUG] manager: Rebalanced 3 servers, next active server is s3 (Addr: faux/s3) (DC: )
2019/12/06 06:01:03 [DEBUG] manager: Rebalanced 3 servers, next active server is s1 (Addr: faux/s1) (DC: )
2019/12/06 06:01:03 [DEBUG] manager: Rebalanced 3 servers, next active server is s1 (Addr: faux/s1) (DC: )
2019/12/06 06:01:03 [DEBUG] manager: Rebalanced 3 servers, next active server is s1 (Addr: faux/s1) (DC: )
2019/12/06 06:01:03 [DEBUG] manager: Rebalanced 3 servers, next active server is s1 (Addr: faux/s1) (DC: )
2019/12/06 06:01:03 [DEBUG] manager: Rebalanced 3 servers, next active server is s3 (Addr: faux/s3) (DC: )
2019/12/06 06:01:03 [DEBUG] manager: Rebalanced 3 servers, next active server is s3 (Addr: faux/s3) (DC: )
2019/12/06 06:01:03 [DEBUG] manager: pinging server "s2 (Addr: faux/s2) (DC: )" failed: %!s(<nil>)
2019/12/06 06:01:03 [DEBUG] manager: Rebalanced 3 servers, next active server is s3 (Addr: faux/s3) (DC: )
2019/12/06 06:01:03 [DEBUG] manager: Rebalanced 3 servers, next active server is s1 (Addr: faux/s1) (DC: )
2019/12/06 06:01:03 [DEBUG] manager: Rebalanced 3 servers, next active server is s3 (Addr: faux/s3) (DC: )
2019/12/06 06:01:03 [DEBUG] manager: Rebalanced 3 servers, next active server is s1 (Addr: faux/s1) (DC: )
2019/12/06 06:01:03 [DEBUG] manager: Rebalanced 3 servers, next active server is s1 (Addr: faux/s1) (DC: )
2019/12/06 06:01:03 [DEBUG] manager: pinging server "s2 (Addr: faux/s2) (DC: )" failed: %!s(<nil>)
2019/12/06 06:01:03 [DEBUG] manager: Rebalanced 3 servers, next active server is s3 (Addr: faux/s3) (DC: )
2019/12/06 06:01:03 [DEBUG] manager: Rebalanced 3 servers, next active server is s3 (Addr: faux/s3) (DC: )
2019/12/06 06:01:03 [DEBUG] manager: Rebalanced 3 servers, next active server is s3 (Addr: faux/s3) (DC: )
2019/12/06 06:01:03 [DEBUG] manager: Rebalanced 3 servers, next active server is s3 (Addr: faux/s3) (DC: )
2019/12/06 06:01:03 [DEBUG] manager: Rebalanced 3 servers, next active server is s3 (Addr: faux/s3) (DC: )
2019/12/06 06:01:03 [DEBUG] manager: pinging server "s2 (Addr: faux/s2) (DC: )" failed: %!s(<nil>)
2019/12/06 06:01:03 [DEBUG] manager: Rebalanced 3 servers, next active server is s3 (Addr: faux/s3) (DC: )
2019/12/06 06:01:03 [DEBUG] manager: pinging server "s2 (Addr: faux/s2) (DC: )" failed: %!s(<nil>)
2019/12/06 06:01:03 [DEBUG] manager: Rebalanced 3 servers, next active server is s1 (Addr: faux/s1) (DC: )
2019/12/06 06:01:03 [DEBUG] manager: Rebalanced 3 servers, next active server is s1 (Addr: faux/s1) (DC: )
2019/12/06 06:01:03 [DEBUG] manager: pinging server "s2 (Addr: faux/s2) (DC: )" failed: %!s(<nil>)
2019/12/06 06:01:03 [DEBUG] manager: Rebalanced 3 servers, next active server is s1 (Addr: faux/s1) (DC: )
2019/12/06 06:01:03 [DEBUG] manager: pinging server "s2 (Addr: faux/s2) (DC: )" failed: %!s(<nil>)
2019/12/06 06:01:03 [DEBUG] manager: Rebalanced 3 servers, next active server is s1 (Addr: faux/s1) (DC: )
2019/12/06 06:01:03 [DEBUG] manager: Rebalanced 3 servers, next active server is s1 (Addr: faux/s1) (DC: )
2019/12/06 06:01:03 [DEBUG] manager: Rebalanced 3 servers, next active server is s1 (Addr: faux/s1) (DC: )
2019/12/06 06:01:03 [DEBUG] manager: Rebalanced 3 servers, next active server is s1 (Addr: faux/s1) (DC: )
2019/12/06 06:01:03 [DEBUG] manager: Rebalanced 3 servers, next active server is s1 (Addr: faux/s1) (DC: )
2019/12/06 06:01:03 [DEBUG] manager: Rebalanced 3 servers, next active server is s3 (Addr: faux/s3) (DC: )
2019/12/06 06:01:03 [DEBUG] manager: Rebalanced 3 servers, next active server is s3 (Addr: faux/s3) (DC: )
2019/12/06 06:01:03 [DEBUG] manager: pinging server "s2 (Addr: faux/s2) (DC: )" failed: %!s(<nil>)
2019/12/06 06:01:03 [DEBUG] manager: Rebalanced 3 servers, next active server is s3 (Addr: faux/s3) (DC: )
2019/12/06 06:01:03 [DEBUG] manager: Rebalanced 3 servers, next active server is s3 (Addr: faux/s3) (DC: )
2019/12/06 06:01:03 [DEBUG] manager: Rebalanced 3 servers, next active server is s1 (Addr: faux/s1) (DC: )
2019/12/06 06:01:03 [DEBUG] manager: Rebalanced 3 servers, next active server is s3 (Addr: faux/s3) (DC: )
2019/12/06 06:01:03 [DEBUG] manager: Rebalanced 3 servers, next active server is s1 (Addr: faux/s1) (DC: )
2019/12/06 06:01:03 [DEBUG] manager: Rebalanced 3 servers, next active server is s3 (Addr: faux/s3) (DC: )
2019/12/06 06:01:03 [DEBUG] manager: Rebalanced 3 servers, next active server is s3 (Addr: faux/s3) (DC: )
2019/12/06 06:01:03 [DEBUG] manager: Rebalanced 3 servers, next active server is s3 (Addr: faux/s3) (DC: )
2019/12/06 06:01:03 [DEBUG] manager: Rebalanced 3 servers, next active server is s3 (Addr: faux/s3) (DC: )
2019/12/06 06:01:03 [DEBUG] manager: Rebalanced 3 servers, next active server is s3 (Addr: faux/s3) (DC: )
2019/12/06 06:01:03 [DEBUG] manager: Rebalanced 3 servers, next active server is s1 (Addr: faux/s1) (DC: )
2019/12/06 06:01:03 [DEBUG] manager: pinging server "s2 (Addr: faux/s2) (DC: )" failed: %!s(<nil>)
2019/12/06 06:01:03 [DEBUG] manager: Rebalanced 3 servers, next active server is s1 (Addr: faux/s1) (DC: )
2019/12/06 06:01:03 [DEBUG] manager: Rebalanced 3 servers, next active server is s3 (Addr: faux/s3) (DC: )
2019/12/06 06:01:03 [DEBUG] manager: Rebalanced 3 servers, next active server is s3 (Addr: faux/s3) (DC: )
2019/12/06 06:01:03 [DEBUG] manager: Rebalanced 3 servers, next active server is s3 (Addr: faux/s3) (DC: )
2019/12/06 06:01:03 [DEBUG] manager: Rebalanced 3 servers, next active server is s3 (Addr: faux/s3) (DC: )
2019/12/06 06:01:03 [DEBUG] manager: pinging server "s2 (Addr: faux/s2) (DC: )" failed: %!s(<nil>)
2019/12/06 06:01:03 [DEBUG] manager: Rebalanced 3 servers, next active server is s1 (Addr: faux/s1) (DC: )
2019/12/06 06:01:03 [DEBUG] manager: pinging server "s2 (Addr: faux/s2) (DC: )" failed: %!s(<nil>)
2019/12/06 06:01:03 [DEBUG] manager: Rebalanced 3 servers, next active server is s1 (Addr: faux/s1) (DC: )
2019/12/06 06:01:03 [DEBUG] manager: Rebalanced 3 servers, next active server is s3 (Addr: faux/s3) (DC: )
2019/12/06 06:01:03 [DEBUG] manager: Rebalanced 3 servers, next active server is s3 (Addr: faux/s3) (DC: )
2019/12/06 06:01:03 [DEBUG] manager: pinging server "s2 (Addr: faux/s2) (DC: )" failed: %!s(<nil>)
2019/12/06 06:01:03 [DEBUG] manager: Rebalanced 3 servers, next active server is s1 (Addr: faux/s1) (DC: )
2019/12/06 06:01:03 [DEBUG] manager: pinging server "s2 (Addr: faux/s2) (DC: )" failed: %!s(<nil>)
2019/12/06 06:01:03 [DEBUG] manager: Rebalanced 3 servers, next active server is s3 (Addr: faux/s3) (DC: )
2019/12/06 06:01:03 [DEBUG] manager: pinging server "s2 (Addr: faux/s2) (DC: )" failed: %!s(<nil>)
2019/12/06 06:01:03 [DEBUG] manager: Rebalanced 3 servers, next active server is s3 (Addr: faux/s3) (DC: )
2019/12/06 06:01:03 [DEBUG] manager: pinging server "s2 (Addr: faux/s2) (DC: )" failed: %!s(<nil>)
2019/12/06 06:01:03 [DEBUG] manager: Rebalanced 3 servers, next active server is s1 (Addr: faux/s1) (DC: )
2019/12/06 06:01:03 [DEBUG] manager: Rebalanced 3 servers, next active server is s1 (Addr: faux/s1) (DC: )
2019/12/06 06:01:03 [DEBUG] manager: Rebalanced 3 servers, next active server is s3 (Addr: faux/s3) (DC: )
2019/12/06 06:01:03 [DEBUG] manager: Rebalanced 3 servers, next active server is s3 (Addr: faux/s3) (DC: )
2019/12/06 06:01:03 [DEBUG] manager: Rebalanced 3 servers, next active server is s3 (Addr: faux/s3) (DC: )
2019/12/06 06:01:03 [DEBUG] manager: Rebalanced 3 servers, next active server is s3 (Addr: faux/s3) (DC: )
2019/12/06 06:01:03 [DEBUG] manager: pinging server "s2 (Addr: faux/s2) (DC: )" failed: %!s(<nil>)
2019/12/06 06:01:03 [DEBUG] manager: Rebalanced 3 servers, next active server is s3 (Addr: faux/s3) (DC: )
2019/12/06 06:01:03 [DEBUG] manager: pinging server "s2 (Addr: faux/s2) (DC: )" failed: %!s(<nil>)
2019/12/06 06:01:03 [DEBUG] manager: Rebalanced 3 servers, next active server is s3 (Addr: faux/s3) (DC: )
2019/12/06 06:01:03 [DEBUG] manager: Rebalanced 3 servers, next active server is s1 (Addr: faux/s1) (DC: )
2019/12/06 06:01:03 [DEBUG] manager: Rebalanced 3 servers, next active server is s3 (Addr: faux/s3) (DC: )
2019/12/06 06:01:03 [DEBUG] manager: Rebalanced 3 servers, next active server is s3 (Addr: faux/s3) (DC: )
2019/12/06 06:01:03 [DEBUG] manager: Rebalanced 3 servers, next active server is s3 (Addr: faux/s3) (DC: )
2019/12/06 06:01:03 [DEBUG] manager: pinging server "s2 (Addr: faux/s2) (DC: )" failed: %!s(<nil>)
2019/12/06 06:01:03 [DEBUG] manager: Rebalanced 3 servers, next active server is s3 (Addr: faux/s3) (DC: )
2019/12/06 06:01:03 [DEBUG] manager: pinging server "s2 (Addr: faux/s2) (DC: )" failed: %!s(<nil>)
2019/12/06 06:01:03 [DEBUG] manager: Rebalanced 3 servers, next active server is s3 (Addr: faux/s3) (DC: )
2019/12/06 06:01:03 [DEBUG] manager: Rebalanced 3 servers, next active server is s1 (Addr: faux/s1) (DC: )
2019/12/06 06:01:03 [DEBUG] manager: Rebalanced 3 servers, next active server is s3 (Addr: faux/s3) (DC: )
2019/12/06 06:01:03 [DEBUG] manager: pinging server "s2 (Addr: faux/s2) (DC: )" failed: %!s(<nil>)
2019/12/06 06:01:03 [DEBUG] manager: Rebalanced 3 servers, next active server is s3 (Addr: faux/s3) (DC: )
2019/12/06 06:01:03 [DEBUG] manager: Rebalanced 3 servers, next active server is s3 (Addr: faux/s3) (DC: )
2019/12/06 06:01:03 [DEBUG] manager: Rebalanced 3 servers, next active server is s1 (Addr: faux/s1) (DC: )
2019/12/06 06:01:03 [DEBUG] manager: pinging server "s2 (Addr: faux/s2) (DC: )" failed: %!s(<nil>)
2019/12/06 06:01:03 [DEBUG] manager: Rebalanced 3 servers, next active server is s3 (Addr: faux/s3) (DC: )
2019/12/06 06:01:03 [DEBUG] manager: Rebalanced 3 servers, next active server is s1 (Addr: faux/s1) (DC: )
2019/12/06 06:01:03 [DEBUG] manager: Rebalanced 3 servers, next active server is s3 (Addr: faux/s3) (DC: )
2019/12/06 06:01:03 [DEBUG] manager: pinging server "s2 (Addr: faux/s2) (DC: )" failed: %!s(<nil>)
2019/12/06 06:01:03 [DEBUG] manager: Rebalanced 3 servers, next active server is s1 (Addr: faux/s1) (DC: )
2019/12/06 06:01:03 [DEBUG] manager: Rebalanced 3 servers, next active server is s3 (Addr: faux/s3) (DC: )
2019/12/06 06:01:03 [DEBUG] manager: Rebalanced 3 servers, next active server is s3 (Addr: faux/s3) (DC: )
2019/12/06 06:01:03 [DEBUG] manager: Rebalanced 3 servers, next active server is s3 (Addr: faux/s3) (DC: )
2019/12/06 06:01:03 [DEBUG] manager: Rebalanced 3 servers, next active server is s1 (Addr: faux/s1) (DC: )
2019/12/06 06:01:03 [DEBUG] manager: Rebalanced 3 servers, next active server is s1 (Addr: faux/s1) (DC: )
2019/12/06 06:01:03 [DEBUG] manager: pinging server "s2 (Addr: faux/s2) (DC: )" failed: %!s(<nil>)
2019/12/06 06:01:03 [DEBUG] manager: Rebalanced 3 servers, next active server is s1 (Addr: faux/s1) (DC: )
2019/12/06 06:01:03 [DEBUG] manager: pinging server "s2 (Addr: faux/s2) (DC: )" failed: %!s(<nil>)
2019/12/06 06:01:03 [DEBUG] manager: Rebalanced 3 servers, next active server is s1 (Addr: faux/s1) (DC: )
2019/12/06 06:01:03 [DEBUG] manager: Rebalanced 3 servers, next active server is s3 (Addr: faux/s3) (DC: )
2019/12/06 06:01:03 [DEBUG] manager: Rebalanced 3 servers, next active server is s1 (Addr: faux/s1) (DC: )
2019/12/06 06:01:03 [DEBUG] manager: pinging server "s2 (Addr: faux/s2) (DC: )" failed: %!s(<nil>)
2019/12/06 06:01:03 [DEBUG] manager: Rebalanced 3 servers, next active server is s1 (Addr: faux/s1) (DC: )
2019/12/06 06:01:03 [DEBUG] manager: Rebalanced 3 servers, next active server is s1 (Addr: faux/s1) (DC: )
2019/12/06 06:01:03 [DEBUG] manager: pinging server "s2 (Addr: faux/s2) (DC: )" failed: %!s(<nil>)
2019/12/06 06:01:03 [DEBUG] manager: Rebalanced 3 servers, next active server is s1 (Addr: faux/s1) (DC: )
2019/12/06 06:01:03 [DEBUG] manager: pinging server "s2 (Addr: faux/s2) (DC: )" failed: %!s(<nil>)
2019/12/06 06:01:03 [DEBUG] manager: Rebalanced 3 servers, next active server is s3 (Addr: faux/s3) (DC: )
--- PASS: TestServers_RebalanceServers_AvoidFailed (0.01s)
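
In the AvoidFailed block just above, every ping of s2 is reported as failed and every "Rebalanced 3 servers" line names s1 or s3 as the next active server, never s2: the rebalance keeps cycling past a server whose health ping fails. A minimal sketch of that selection step, assuming a caller-supplied ping callback and hypothetical helper names (not the actual Manager implementation):

package main

import "fmt"

// nextActive keeps rotating the ring until the server at the front passes
// its ping, so a server whose ping fails is never chosen as "next active".
func nextActive(servers []string, ping func(string) bool) (string, []string) {
	for range servers {
		if ping(servers[0]) {
			return servers[0], servers
		}
		// front server failed its ping: cycle it to the back and retry
		servers = append(servers[1:], servers[0])
	}
	return "", servers // every server failed its ping
}

func main() {
	servers := []string{"s2", "s1", "s3"}
	next, servers := nextActive(servers, func(s string) bool { return s != "s2" })
	fmt.Printf("Rebalanced %d servers, next active server is %s\n", len(servers), next)
}

With s2 marked failed this prints "Rebalanced 3 servers, next active server is s1", matching the shape of the lines above.
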
=== RUN   TestManager_RemoveServer
2019/12/06 06:01:03 [DEBUG] manager: Rebalanced 19 servers, next active server is s06 (Addr: /) (DC: )
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s19"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s12"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s10"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s14"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s03"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s02"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s08"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s11"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s01"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s17"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s07"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s13"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s09"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s18"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s16"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s15"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s03"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s08"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s01"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s07"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s09"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s16"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s03"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s01"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s09"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s03"
2019/12/06 06:01:03 [DEBUG] manager: cycled away from server "s09"
--- PASS: TestManager_RemoveServer (0.00s)
PASS
ok  	github.com/hashicorp/consul/agent/router	0.788s
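
TestManager_RemoveServer above rebalances 19 servers and then makes successively shorter passes over the ring as servers are removed between passes. A minimal sketch of the underlying list operation, removing one named entry while keeping the remaining order (a hypothetical helper, not the router package's API):

package main

import "fmt"

// removeServer returns the list without the named server, preserving the
// order of the remaining entries.
func removeServer(servers []string, name string) []string {
	out := make([]string, 0, len(servers)) // fresh storage; the input slice is left untouched
	for _, s := range servers {
		if s != name {
			out = append(out, s)
		}
	}
	return out
}

func main() {
	servers := []string{"s01", "s02", "s03", "s04"}
	servers = removeServer(servers, "s02")
	fmt.Println(servers) // [s01 s03 s04]
}
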
=== RUN   TestStructs_ACLCaches
=== PAUSE TestStructs_ACLCaches
=== RUN   TestStructs_ACL_IsSame
--- PASS: TestStructs_ACL_IsSame (0.00s)
=== RUN   TestStructs_ACL_Convert
=== PAUSE TestStructs_ACL_Convert
=== RUN   TestStructs_ACLToken_Convert
=== PAUSE TestStructs_ACLToken_Convert
=== RUN   TestStructs_ACLToken_PolicyIDs
=== PAUSE TestStructs_ACLToken_PolicyIDs
=== RUN   TestStructs_ACLToken_EmbeddedPolicy
=== PAUSE TestStructs_ACLToken_EmbeddedPolicy
=== RUN   TestStructs_ACLServiceIdentity_SyntheticPolicy
=== PAUSE TestStructs_ACLServiceIdentity_SyntheticPolicy
=== RUN   TestStructs_ACLToken_SetHash
=== PAUSE TestStructs_ACLToken_SetHash
=== RUN   TestStructs_ACLToken_EstimateSize
=== PAUSE TestStructs_ACLToken_EstimateSize
=== RUN   TestStructs_ACLToken_Stub
=== PAUSE TestStructs_ACLToken_Stub
=== RUN   TestStructs_ACLTokens_Sort
=== PAUSE TestStructs_ACLTokens_Sort
=== RUN   TestStructs_ACLTokenListStubs_Sort
=== PAUSE TestStructs_ACLTokenListStubs_Sort
=== RUN   TestStructs_ACLPolicy_Stub
=== PAUSE TestStructs_ACLPolicy_Stub
=== RUN   TestStructs_ACLPolicy_SetHash
=== PAUSE TestStructs_ACLPolicy_SetHash
=== RUN   TestStructs_ACLPolicy_EstimateSize
=== PAUSE TestStructs_ACLPolicy_EstimateSize
=== RUN   TestStructs_ACLPolicies_Sort
=== PAUSE TestStructs_ACLPolicies_Sort
=== RUN   TestStructs_ACLPolicyListStubs_Sort
=== PAUSE TestStructs_ACLPolicyListStubs_Sort
=== RUN   TestStructs_ACLPolicies_resolveWithCache
=== PAUSE TestStructs_ACLPolicies_resolveWithCache
=== RUN   TestStructs_ACLPolicies_Compile
=== PAUSE TestStructs_ACLPolicies_Compile
=== RUN   TestCheckDefinition_Defaults
=== PAUSE TestCheckDefinition_Defaults
=== RUN   TestCheckDefinition_CheckType
=== PAUSE TestCheckDefinition_CheckType
=== RUN   TestCheckDefinitionToCheckType
=== PAUSE TestCheckDefinitionToCheckType
=== RUN   TestDecodeConfigEntry
=== PAUSE TestDecodeConfigEntry
=== RUN   TestServiceConfigResponse_MsgPack
--- PASS: TestServiceConfigResponse_MsgPack (0.00s)
=== RUN   TestConfigEntryResponseMarshalling
=== PAUSE TestConfigEntryResponseMarshalling
=== RUN   TestCAConfiguration_GetCommonConfig
=== RUN   TestCAConfiguration_GetCommonConfig/basic_defaults
=== RUN   TestCAConfiguration_GetCommonConfig/basic_defaults_after_encoding_fun
--- PASS: TestCAConfiguration_GetCommonConfig (0.00s)
    --- PASS: TestCAConfiguration_GetCommonConfig/basic_defaults (0.00s)
    --- PASS: TestCAConfiguration_GetCommonConfig/basic_defaults_after_encoding_fun (0.00s)
=== RUN   TestConnectProxyConfig_ToAPI
=== RUN   TestConnectProxyConfig_ToAPI/service
--- PASS: TestConnectProxyConfig_ToAPI (0.00s)
    --- PASS: TestConnectProxyConfig_ToAPI/service (0.00s)
=== RUN   TestUpstream_MarshalJSON
=== RUN   TestUpstream_MarshalJSON/service
=== RUN   TestUpstream_MarshalJSON/pq
--- PASS: TestUpstream_MarshalJSON (0.00s)
    --- PASS: TestUpstream_MarshalJSON/service (0.00s)
    --- PASS: TestUpstream_MarshalJSON/pq (0.00s)
=== RUN   TestUpstream_UnmarshalJSON
=== RUN   TestUpstream_UnmarshalJSON/service
=== RUN   TestUpstream_UnmarshalJSON/pq
--- PASS: TestUpstream_UnmarshalJSON (0.00s)
    --- PASS: TestUpstream_UnmarshalJSON/service (0.00s)
    --- PASS: TestUpstream_UnmarshalJSON/pq (0.00s)
=== RUN   TestConnectManagedProxy_ParseConfig
=== RUN   TestConnectManagedProxy_ParseConfig/empty
=== RUN   TestConnectManagedProxy_ParseConfig/specified
=== RUN   TestConnectManagedProxy_ParseConfig/stringy_port
=== RUN   TestConnectManagedProxy_ParseConfig/empty_addr
=== RUN   TestConnectManagedProxy_ParseConfig/empty_port
=== RUN   TestConnectManagedProxy_ParseConfig/junk_address
=== RUN   TestConnectManagedProxy_ParseConfig/zero_port,_missing_addr
=== RUN   TestConnectManagedProxy_ParseConfig/extra_fields_present
--- PASS: TestConnectManagedProxy_ParseConfig (0.00s)
    --- PASS: TestConnectManagedProxy_ParseConfig/empty (0.00s)
    --- PASS: TestConnectManagedProxy_ParseConfig/specified (0.00s)
    --- PASS: TestConnectManagedProxy_ParseConfig/stringy_port (0.00s)
    --- PASS: TestConnectManagedProxy_ParseConfig/empty_addr (0.00s)
    --- PASS: TestConnectManagedProxy_ParseConfig/empty_port (0.00s)
    --- PASS: TestConnectManagedProxy_ParseConfig/junk_address (0.00s)
    --- PASS: TestConnectManagedProxy_ParseConfig/zero_port,_missing_addr (0.00s)
    --- PASS: TestConnectManagedProxy_ParseConfig/extra_fields_present (0.00s)
=== RUN   TestIntentionGetACLPrefix
=== RUN   TestIntentionGetACLPrefix/unset_name
=== RUN   TestIntentionGetACLPrefix/set_name
--- PASS: TestIntentionGetACLPrefix (0.00s)
    --- PASS: TestIntentionGetACLPrefix/unset_name (0.00s)
    --- PASS: TestIntentionGetACLPrefix/set_name (0.00s)
=== RUN   TestIntentionValidate
=== RUN   TestIntentionValidate/long_description
=== RUN   TestIntentionValidate/no_action_set
=== RUN   TestIntentionValidate/no_SourceNS
=== RUN   TestIntentionValidate/no_SourceName
=== RUN   TestIntentionValidate/no_DestinationNS
=== RUN   TestIntentionValidate/no_DestinationName
=== RUN   TestIntentionValidate/SourceNS_partial_wildcard
=== RUN   TestIntentionValidate/SourceName_partial_wildcard
=== RUN   TestIntentionValidate/SourceName_exact_following_wildcard
=== RUN   TestIntentionValidate/DestinationNS_partial_wildcard
=== RUN   TestIntentionValidate/DestinationName_partial_wildcard
=== RUN   TestIntentionValidate/DestinationName_exact_following_wildcard
=== RUN   TestIntentionValidate/SourceType_is_not_set
=== RUN   TestIntentionValidate/SourceType_is_other
--- PASS: TestIntentionValidate (0.01s)
    --- PASS: TestIntentionValidate/long_description (0.00s)
    --- PASS: TestIntentionValidate/no_action_set (0.00s)
    --- PASS: TestIntentionValidate/no_SourceNS (0.00s)
    --- PASS: TestIntentionValidate/no_SourceName (0.00s)
    --- PASS: TestIntentionValidate/no_DestinationNS (0.00s)
    --- PASS: TestIntentionValidate/no_DestinationName (0.00s)
    --- PASS: TestIntentionValidate/SourceNS_partial_wildcard (0.00s)
    --- PASS: TestIntentionValidate/SourceName_partial_wildcard (0.00s)
    --- PASS: TestIntentionValidate/SourceName_exact_following_wildcard (0.00s)
    --- PASS: TestIntentionValidate/DestinationNS_partial_wildcard (0.00s)
    --- PASS: TestIntentionValidate/DestinationName_partial_wildcard (0.00s)
    --- PASS: TestIntentionValidate/DestinationName_exact_following_wildcard (0.00s)
    --- PASS: TestIntentionValidate/SourceType_is_not_set (0.00s)
    --- PASS: TestIntentionValidate/SourceType_is_other (0.00s)
=== RUN   TestIntentionPrecedenceSorter
=== RUN   TestIntentionPrecedenceSorter/exhaustive_list
=== RUN   TestIntentionPrecedenceSorter/tiebreak_deterministically
--- PASS: TestIntentionPrecedenceSorter (0.00s)
    --- PASS: TestIntentionPrecedenceSorter/exhaustive_list (0.00s)
    --- PASS: TestIntentionPrecedenceSorter/tiebreak_deterministically (0.00s)
=== RUN   TestStructs_PreparedQuery_GetACLPrefix
--- PASS: TestStructs_PreparedQuery_GetACLPrefix (0.00s)
=== RUN   TestAgentStructs_CheckTypes
=== PAUSE TestAgentStructs_CheckTypes
=== RUN   TestServiceDefinitionValidate
=== RUN   TestServiceDefinitionValidate/valid
=== RUN   TestServiceDefinitionValidate/managed_proxy_with_a_port_set
=== RUN   TestServiceDefinitionValidate/managed_proxy_with_no_port_set
=== RUN   TestServiceDefinitionValidate/managed_proxy_with_native_set
--- PASS: TestServiceDefinitionValidate (0.00s)
    --- PASS: TestServiceDefinitionValidate/valid (0.00s)
    --- PASS: TestServiceDefinitionValidate/managed_proxy_with_a_port_set (0.00s)
    --- PASS: TestServiceDefinitionValidate/managed_proxy_with_no_port_set (0.00s)
    --- PASS: TestServiceDefinitionValidate/managed_proxy_with_native_set (0.00s)
=== RUN   TestServiceDefinitionConnectProxy_json
=== RUN   TestServiceDefinitionConnectProxy_json/no_config
=== RUN   TestServiceDefinitionConnectProxy_json/basic_config
=== RUN   TestServiceDefinitionConnectProxy_json/config_with_upstreams
--- PASS: TestServiceDefinitionConnectProxy_json (0.00s)
    --- PASS: TestServiceDefinitionConnectProxy_json/no_config (0.00s)
        service_definition_test.go:196: error: %!s(<nil>)
    --- PASS: TestServiceDefinitionConnectProxy_json/basic_config (0.00s)
        service_definition_test.go:196: error: %!s(<nil>)
    --- PASS: TestServiceDefinitionConnectProxy_json/config_with_upstreams (0.00s)
        service_definition_test.go:196: error: %!s(<nil>)
=== RUN   TestStructs_FilterFieldConfigurations
=== PAUSE TestStructs_FilterFieldConfigurations
=== RUN   TestEncodeDecode
--- PASS: TestEncodeDecode (0.00s)
=== RUN   TestStructs_Implements
--- PASS: TestStructs_Implements (0.00s)
=== RUN   TestStructs_RegisterRequest_ChangesNode
--- PASS: TestStructs_RegisterRequest_ChangesNode (0.00s)
=== RUN   TestNode_IsSame
--- PASS: TestNode_IsSame (0.00s)
=== RUN   TestStructs_ServiceNode_IsSameService
--- PASS: TestStructs_ServiceNode_IsSameService (0.00s)
=== RUN   TestStructs_ServiceNode_PartialClone
--- PASS: TestStructs_ServiceNode_PartialClone (0.00s)
=== RUN   TestStructs_ServiceNode_Conversions
--- PASS: TestStructs_ServiceNode_Conversions (0.00s)
=== RUN   TestStructs_NodeService_ValidateConnectProxy
=== RUN   TestStructs_NodeService_ValidateConnectProxy/valid
=== RUN   TestStructs_NodeService_ValidateConnectProxy/connect-proxy:_no_ProxyDestination
=== RUN   TestStructs_NodeService_ValidateConnectProxy/connect-proxy:_whitespace_ProxyDestination
=== RUN   TestStructs_NodeService_ValidateConnectProxy/connect-proxy:_valid_ProxyDestination
=== RUN   TestStructs_NodeService_ValidateConnectProxy/connect-proxy:_no_port_set
=== RUN   TestStructs_NodeService_ValidateConnectProxy/connect-proxy:_ConnectNative_set
--- PASS: TestStructs_NodeService_ValidateConnectProxy (0.00s)
    --- PASS: TestStructs_NodeService_ValidateConnectProxy/valid (0.00s)
    --- PASS: TestStructs_NodeService_ValidateConnectProxy/connect-proxy:_no_ProxyDestination (0.00s)
    --- PASS: TestStructs_NodeService_ValidateConnectProxy/connect-proxy:_whitespace_ProxyDestination (0.00s)
    --- PASS: TestStructs_NodeService_ValidateConnectProxy/connect-proxy:_valid_ProxyDestination (0.00s)
    --- PASS: TestStructs_NodeService_ValidateConnectProxy/connect-proxy:_no_port_set (0.00s)
    --- PASS: TestStructs_NodeService_ValidateConnectProxy/connect-proxy:_ConnectNative_set (0.00s)
=== RUN   TestStructs_NodeService_ValidateSidecarService
=== RUN   TestStructs_NodeService_ValidateSidecarService/valid
=== RUN   TestStructs_NodeService_ValidateSidecarService/ID_can't_be_set
=== RUN   TestStructs_NodeService_ValidateSidecarService/Nested_sidecar_can't_be_set
=== RUN   TestStructs_NodeService_ValidateSidecarService/Sidecar_can't_have_managed_proxy
--- PASS: TestStructs_NodeService_ValidateSidecarService (0.00s)
    --- PASS: TestStructs_NodeService_ValidateSidecarService/valid (0.00s)
    --- PASS: TestStructs_NodeService_ValidateSidecarService/ID_can't_be_set (0.00s)
    --- PASS: TestStructs_NodeService_ValidateSidecarService/Nested_sidecar_can't_be_set (0.00s)
    --- PASS: TestStructs_NodeService_ValidateSidecarService/Sidecar_can't_have_managed_proxy (0.00s)
=== RUN   TestStructs_NodeService_IsSame
--- PASS: TestStructs_NodeService_IsSame (0.00s)
=== RUN   TestStructs_HealthCheck_IsSame
--- PASS: TestStructs_HealthCheck_IsSame (0.00s)
=== RUN   TestStructs_HealthCheck_Marshalling
--- PASS: TestStructs_HealthCheck_Marshalling (0.00s)
=== RUN   TestStructs_HealthCheck_Clone
--- PASS: TestStructs_HealthCheck_Clone (0.00s)
=== RUN   TestStructs_CheckServiceNodes_Shuffle
--- PASS: TestStructs_CheckServiceNodes_Shuffle (0.01s)
=== RUN   TestStructs_CheckServiceNodes_Filter
--- PASS: TestStructs_CheckServiceNodes_Filter (0.00s)
=== RUN   TestStructs_DirEntry_Clone
--- PASS: TestStructs_DirEntry_Clone (0.00s)
=== RUN   TestStructs_ValidateMetadata
--- PASS: TestStructs_ValidateMetadata (0.00s)
=== RUN   TestStructs_validateMetaPair
--- PASS: TestStructs_validateMetaPair (0.00s)
=== RUN   TestSpecificServiceRequest_CacheInfo
=== RUN   TestSpecificServiceRequest_CacheInfo/basic_params
=== RUN   TestSpecificServiceRequest_CacheInfo/name_should_be_considered
=== RUN   TestSpecificServiceRequest_CacheInfo/node_meta_should_be_considered
=== RUN   TestSpecificServiceRequest_CacheInfo/address_should_be_considered
=== RUN   TestSpecificServiceRequest_CacheInfo/tag_filter_should_be_considered
=== RUN   TestSpecificServiceRequest_CacheInfo/connect_should_be_considered
=== RUN   TestSpecificServiceRequest_CacheInfo/tags_should_be_different
=== RUN   TestSpecificServiceRequest_CacheInfo/tags_should_not_depend_on_order
=== RUN   TestSpecificServiceRequest_CacheInfo/legacy_requests_with_singular_tag_should_be_different
--- PASS: TestSpecificServiceRequest_CacheInfo (0.01s)
    --- PASS: TestSpecificServiceRequest_CacheInfo/basic_params (0.00s)
    --- PASS: TestSpecificServiceRequest_CacheInfo/name_should_be_considered (0.00s)
    --- PASS: TestSpecificServiceRequest_CacheInfo/node_meta_should_be_considered (0.00s)
    --- PASS: TestSpecificServiceRequest_CacheInfo/address_should_be_considered (0.00s)
    --- PASS: TestSpecificServiceRequest_CacheInfo/tag_filter_should_be_considered (0.00s)
    --- PASS: TestSpecificServiceRequest_CacheInfo/connect_should_be_considered (0.00s)
    --- PASS: TestSpecificServiceRequest_CacheInfo/tags_should_be_different (0.00s)
    --- PASS: TestSpecificServiceRequest_CacheInfo/tags_should_not_depend_on_order (0.00s)
    --- PASS: TestSpecificServiceRequest_CacheInfo/legacy_requests_with_singular_tag_should_be_different (0.00s)
=== CONT  TestStructs_ACLCaches
=== RUN   TestStructs_ACLCaches/New
=== PAUSE TestStructs_ACLCaches/New
=== CONT  TestStructs_ACLPolicy_EstimateSize
=== CONT  TestStructs_FilterFieldConfigurations
=== RUN   TestStructs_FilterFieldConfigurations/NodeService
=== CONT  TestAgentStructs_CheckTypes
=== PAUSE TestStructs_FilterFieldConfigurations/NodeService
=== RUN   TestStructs_ACLCaches/Identities
=== CONT  TestConfigEntryResponseMarshalling
=== RUN   TestConfigEntryResponseMarshalling/nil_entry
--- PASS: TestStructs_ACLPolicy_EstimateSize (0.00s)
=== RUN   TestStructs_FilterFieldConfigurations/ServiceNode
=== PAUSE TestStructs_FilterFieldConfigurations/ServiceNode
=== PAUSE TestStructs_ACLCaches/Identities
--- PASS: TestAgentStructs_CheckTypes (0.00s)
=== RUN   TestStructs_FilterFieldConfigurations/HealthCheck
=== PAUSE TestStructs_FilterFieldConfigurations/HealthCheck
=== CONT  TestDecodeConfigEntry
=== PAUSE TestConfigEntryResponseMarshalling/nil_entry
=== RUN   TestConfigEntryResponseMarshalling/proxy-default_entry
=== PAUSE TestConfigEntryResponseMarshalling/proxy-default_entry
=== RUN   TestStructs_ACLCaches/Policies
=== PAUSE TestStructs_ACLCaches/Policies
=== RUN   TestStructs_ACLCaches/ParsedPolicies
=== PAUSE TestStructs_ACLCaches/ParsedPolicies
=== RUN   TestStructs_FilterFieldConfigurations/CheckServiceNode
=== RUN   TestStructs_ACLCaches/Authorizers
=== PAUSE TestStructs_FilterFieldConfigurations/CheckServiceNode
=== RUN   TestConfigEntryResponseMarshalling/service-default_entry
=== RUN   TestStructs_FilterFieldConfigurations/NodeInfo
=== PAUSE TestConfigEntryResponseMarshalling/service-default_entry
=== RUN   TestDecodeConfigEntry/proxy-defaults
=== PAUSE TestStructs_ACLCaches/Authorizers
=== PAUSE TestStructs_FilterFieldConfigurations/NodeInfo
=== RUN   TestStructs_ACLCaches/Roles
=== RUN   TestStructs_FilterFieldConfigurations/api.AgentService
=== PAUSE TestDecodeConfigEntry/proxy-defaults
=== RUN   TestDecodeConfigEntry/proxy-defaults_translations
=== PAUSE TestDecodeConfigEntry/proxy-defaults_translations
=== RUN   TestDecodeConfigEntry/service-defaults
=== PAUSE TestDecodeConfigEntry/service-defaults
=== RUN   TestDecodeConfigEntry/service-defaults_translations
=== PAUSE TestDecodeConfigEntry/service-defaults_translations
=== CONT  TestCheckDefinitionToCheckType
=== PAUSE TestStructs_ACLCaches/Roles
--- PASS: TestCheckDefinitionToCheckType (0.00s)
=== CONT  TestCheckDefinition_Defaults
=== CONT  TestCheckDefinition_CheckType
--- PASS: TestCheckDefinition_Defaults (0.00s)
=== CONT  TestStructs_ACLPolicies_resolveWithCache
=== PAUSE TestStructs_FilterFieldConfigurations/api.AgentService
=== CONT  TestStructs_ACLPolicies_Compile
=== RUN   TestStructs_FilterFieldConfigurations/Node
=== PAUSE TestStructs_FilterFieldConfigurations/Node
=== CONT  TestStructs_ACLPolicyListStubs_Sort
=== RUN   TestStructs_ACLPolicies_Compile/Cache_Miss
--- PASS: TestStructs_ACLPolicyListStubs_Sort (0.00s)
=== CONT  TestStructs_ACLPolicies_Sort
--- PASS: TestStructs_ACLPolicies_Sort (0.00s)
=== CONT  TestStructs_ACLToken_EstimateSize
=== RUN   TestStructs_ACLPolicies_resolveWithCache/Cache_Misses
--- PASS: TestStructs_ACLToken_EstimateSize (0.00s)
=== CONT  TestStructs_ACLPolicy_SetHash
=== RUN   TestStructs_ACLPolicy_SetHash/Nil_Hash_-_Generate
=== RUN   TestStructs_ACLPolicy_SetHash/Hash_Set_-_Dont_Generate
=== RUN   TestStructs_ACLPolicy_SetHash/Hash_Set_-_Generate
--- PASS: TestStructs_ACLPolicy_SetHash (0.00s)
    --- PASS: TestStructs_ACLPolicy_SetHash/Nil_Hash_-_Generate (0.00s)
    --- PASS: TestStructs_ACLPolicy_SetHash/Hash_Set_-_Dont_Generate (0.00s)
    --- PASS: TestStructs_ACLPolicy_SetHash/Hash_Set_-_Generate (0.00s)
=== CONT  TestStructs_ACLPolicy_Stub
--- PASS: TestStructs_ACLPolicy_Stub (0.00s)
=== CONT  TestStructs_ACLTokenListStubs_Sort
--- PASS: TestStructs_ACLTokenListStubs_Sort (0.00s)
=== CONT  TestStructs_ACLTokens_Sort
--- PASS: TestStructs_ACLTokens_Sort (0.00s)
=== CONT  TestStructs_ACLToken_Stub
=== RUN   TestStructs_ACLToken_Stub/Basic
=== PAUSE TestStructs_ACLToken_Stub/Basic
=== RUN   TestStructs_ACLToken_Stub/Legacy
=== PAUSE TestStructs_ACLToken_Stub/Legacy
=== CONT  TestStructs_ACLToken_Convert
=== RUN   TestStructs_ACLToken_Convert/Management
=== PAUSE TestStructs_ACLToken_Convert/Management
=== RUN   TestStructs_ACLToken_Convert/Client
=== PAUSE TestStructs_ACLToken_Convert/Client
=== RUN   TestStructs_ACLToken_Convert/Unconvertible
=== PAUSE TestStructs_ACLToken_Convert/Unconvertible
=== CONT  TestStructs_ACLToken_PolicyIDs
=== RUN   TestStructs_ACLToken_PolicyIDs/Basic
=== PAUSE TestStructs_ACLToken_PolicyIDs/Basic
=== RUN   TestStructs_ACLToken_PolicyIDs/Legacy_Management
=== PAUSE TestStructs_ACLToken_PolicyIDs/Legacy_Management
=== RUN   TestStructs_ACLToken_PolicyIDs/Legacy_Management_With_Rules
=== PAUSE TestStructs_ACLToken_PolicyIDs/Legacy_Management_With_Rules
=== RUN   TestStructs_ACLToken_PolicyIDs/No_Policies
=== PAUSE TestStructs_ACLToken_PolicyIDs/No_Policies
=== CONT  TestStructs_ACL_Convert
--- PASS: TestStructs_ACL_Convert (0.00s)
=== CONT  TestStructs_ACLToken_SetHash
=== RUN   TestStructs_ACLToken_SetHash/Nil_Hash_-_Generate
=== RUN   TestStructs_ACLToken_SetHash/Hash_Set_-_Dont_Generate
=== RUN   TestStructs_ACLToken_SetHash/Hash_Set_-_Generate
=== RUN   TestStructs_ACLPolicies_Compile/Check_Cache
=== RUN   TestStructs_ACLPolicies_Compile/Cache_Hit
--- PASS: TestStructs_ACLPolicies_Compile (0.01s)
    --- PASS: TestStructs_ACLPolicies_Compile/Cache_Miss (0.01s)
    --- PASS: TestStructs_ACLPolicies_Compile/Check_Cache (0.00s)
    --- PASS: TestStructs_ACLPolicies_Compile/Cache_Hit (0.00s)
=== CONT  TestStructs_ACLServiceIdentity_SyntheticPolicy
=== RUN   TestStructs_ACLServiceIdentity_SyntheticPolicy/web
=== RUN   TestStructs_ACLServiceIdentity_SyntheticPolicy/companion-cube-99_[dc1,_dc2]
--- PASS: TestStructs_ACLToken_SetHash (0.00s)
    --- PASS: TestStructs_ACLToken_SetHash/Nil_Hash_-_Generate (0.00s)
    --- PASS: TestStructs_ACLToken_SetHash/Hash_Set_-_Dont_Generate (0.00s)
    --- PASS: TestStructs_ACLToken_SetHash/Hash_Set_-_Generate (0.00s)
=== CONT  TestStructs_ACLToken_EmbeddedPolicy
=== RUN   TestStructs_ACLToken_EmbeddedPolicy/No_Rules
=== PAUSE TestStructs_ACLToken_EmbeddedPolicy/No_Rules
=== RUN   TestStructs_ACLToken_EmbeddedPolicy/Legacy_Client
=== PAUSE TestStructs_ACLToken_EmbeddedPolicy/Legacy_Client
=== RUN   TestStructs_ACLToken_EmbeddedPolicy/Same_Policy_for_Tokens_with_same_Rules
=== PAUSE TestStructs_ACLToken_EmbeddedPolicy/Same_Policy_for_Tokens_with_same_Rules
=== CONT  TestConfigEntryResponseMarshalling/nil_entry
=== RUN   TestStructs_ACLPolicies_resolveWithCache/Check_Cache
--- PASS: TestStructs_ACLServiceIdentity_SyntheticPolicy (0.00s)
    --- PASS: TestStructs_ACLServiceIdentity_SyntheticPolicy/web (0.00s)
    --- PASS: TestStructs_ACLServiceIdentity_SyntheticPolicy/companion-cube-99_[dc1,_dc2] (0.00s)
=== CONT  TestConfigEntryResponseMarshalling/service-default_entry
=== CONT  TestDecodeConfigEntry/proxy-defaults
=== CONT  TestConfigEntryResponseMarshalling/proxy-default_entry
=== CONT  TestDecodeConfigEntry/service-defaults_translations
=== CONT  TestDecodeConfigEntry/service-defaults
=== CONT  TestDecodeConfigEntry/proxy-defaults_translations
--- PASS: TestConfigEntryResponseMarshalling (0.00s)
    --- PASS: TestConfigEntryResponseMarshalling/nil_entry (0.00s)
    --- PASS: TestConfigEntryResponseMarshalling/service-default_entry (0.00s)
    --- PASS: TestConfigEntryResponseMarshalling/proxy-default_entry (0.00s)
=== CONT  TestStructs_ACLCaches/New
=== CONT  TestStructs_ACLCaches/Authorizers
--- PASS: TestDecodeConfigEntry (0.00s)
    --- PASS: TestDecodeConfigEntry/proxy-defaults (0.00s)
    --- PASS: TestDecodeConfigEntry/service-defaults_translations (0.00s)
    --- PASS: TestDecodeConfigEntry/service-defaults (0.00s)
    --- PASS: TestDecodeConfigEntry/proxy-defaults_translations (0.00s)
=== CONT  TestStructs_ACLCaches/Roles
=== RUN   TestStructs_ACLPolicies_resolveWithCache/Cache_Hits
=== CONT  TestStructs_ACLCaches/Policies
=== CONT  TestStructs_ACLCaches/ParsedPolicies
=== RUN   TestStructs_ACLCaches/New/Valid_Sizes
=== PAUSE TestStructs_ACLCaches/New/Valid_Sizes
=== RUN   TestStructs_ACLCaches/New/Zero_Sizes
=== PAUSE TestStructs_ACLCaches/New/Zero_Sizes
=== CONT  TestStructs_FilterFieldConfigurations/NodeService
=== CONT  TestStructs_ACLCaches/Identities
=== CONT  TestStructs_FilterFieldConfigurations/Node
--- PASS: TestCheckDefinition_CheckType (0.01s)
=== CONT  TestStructs_FilterFieldConfigurations/api.AgentService
--- PASS: TestStructs_ACLPolicies_resolveWithCache (0.02s)
    --- PASS: TestStructs_ACLPolicies_resolveWithCache/Cache_Misses (0.00s)
    --- PASS: TestStructs_ACLPolicies_resolveWithCache/Check_Cache (0.00s)
    --- PASS: TestStructs_ACLPolicies_resolveWithCache/Cache_Hits (0.00s)
=== CONT  TestStructs_FilterFieldConfigurations/CheckServiceNode
=== CONT  TestStructs_ACLToken_Stub/Basic
=== CONT  TestStructs_FilterFieldConfigurations/HealthCheck
=== CONT  TestStructs_FilterFieldConfigurations/ServiceNode
=== CONT  TestStructs_ACLToken_Convert/Management
=== CONT  TestStructs_ACLToken_Stub/Legacy
--- PASS: TestStructs_ACLToken_Stub (0.00s)
    --- PASS: TestStructs_ACLToken_Stub/Basic (0.00s)
    --- PASS: TestStructs_ACLToken_Stub/Legacy (0.00s)
=== CONT  TestStructs_ACLToken_PolicyIDs/Basic
=== CONT  TestStructs_ACLToken_Convert/Unconvertible
=== CONT  TestStructs_ACLToken_Convert/Client
--- PASS: TestStructs_ACLToken_Convert (0.00s)
    --- PASS: TestStructs_ACLToken_Convert/Management (0.00s)
    --- PASS: TestStructs_ACLToken_Convert/Unconvertible (0.00s)
    --- PASS: TestStructs_ACLToken_Convert/Client (0.00s)
=== CONT  TestStructs_FilterFieldConfigurations/NodeInfo
=== CONT  TestStructs_ACLToken_PolicyIDs/No_Policies
=== CONT  TestStructs_ACLToken_PolicyIDs/Legacy_Management_With_Rules
=== CONT  TestStructs_ACLToken_PolicyIDs/Legacy_Management
--- PASS: TestStructs_ACLToken_PolicyIDs (0.00s)
    --- PASS: TestStructs_ACLToken_PolicyIDs/Basic (0.00s)
    --- PASS: TestStructs_ACLToken_PolicyIDs/No_Policies (0.00s)
    --- PASS: TestStructs_ACLToken_PolicyIDs/Legacy_Management_With_Rules (0.00s)
    --- PASS: TestStructs_ACLToken_PolicyIDs/Legacy_Management (0.00s)
=== CONT  TestStructs_ACLToken_EmbeddedPolicy/No_Rules
=== CONT  TestStructs_ACLToken_EmbeddedPolicy/Same_Policy_for_Tokens_with_same_Rules
=== CONT  TestStructs_ACLToken_EmbeddedPolicy/Legacy_Client
--- PASS: TestStructs_ACLToken_EmbeddedPolicy (0.00s)
    --- PASS: TestStructs_ACLToken_EmbeddedPolicy/No_Rules (0.00s)
    --- PASS: TestStructs_ACLToken_EmbeddedPolicy/Same_Policy_for_Tokens_with_same_Rules (0.00s)
    --- PASS: TestStructs_ACLToken_EmbeddedPolicy/Legacy_Client (0.00s)
=== CONT  TestStructs_ACLCaches/New/Valid_Sizes
=== CONT  TestStructs_ACLCaches/New/Zero_Sizes
--- PASS: TestStructs_ACLCaches (0.01s)
    --- PASS: TestStructs_ACLCaches/Authorizers (0.00s)
    --- PASS: TestStructs_ACLCaches/Roles (0.00s)
    --- PASS: TestStructs_ACLCaches/Policies (0.00s)
    --- PASS: TestStructs_ACLCaches/ParsedPolicies (0.00s)
    --- PASS: TestStructs_ACLCaches/Identities (0.00s)
    --- PASS: TestStructs_ACLCaches/New (0.00s)
        --- PASS: TestStructs_ACLCaches/New/Valid_Sizes (0.00s)
        --- PASS: TestStructs_ACLCaches/New/Zero_Sizes (0.00s)
--- PASS: TestStructs_FilterFieldConfigurations (0.01s)
    --- PASS: TestStructs_FilterFieldConfigurations/NodeService (0.00s)
    --- PASS: TestStructs_FilterFieldConfigurations/Node (0.00s)
    --- PASS: TestStructs_FilterFieldConfigurations/api.AgentService (0.00s)
    --- PASS: TestStructs_FilterFieldConfigurations/HealthCheck (0.00s)
    --- PASS: TestStructs_FilterFieldConfigurations/CheckServiceNode (0.01s)
    --- PASS: TestStructs_FilterFieldConfigurations/ServiceNode (0.01s)
    --- PASS: TestStructs_FilterFieldConfigurations/NodeInfo (0.01s)
PASS
ok  	github.com/hashicorp/consul/agent/structs	0.191s
?   	github.com/hashicorp/consul/agent/systemd	[no test files]
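Throughout these packages the interleaved === RUN, === PAUSE and === CONT markers come from go test -v running parallel tests: a (sub)test that calls t.Parallel() is paused until the serial part of its parent finishes and is then continued, which is why a --- PASS line can appear far from its === RUN line, as with TestStructs_FilterFieldConfigurations above. A minimal sketch with hypothetical test names (not part of consul) that produces the same marker pattern:

    package example

    import (
        "testing"
        "time"
    )

    // With "go test -v" each subtest below prints "=== RUN" and then
    // "=== PAUSE" when it calls t.Parallel(), and "=== CONT" once the
    // parent's serial phase has finished and it is resumed.
    func TestParallelMarkers(t *testing.T) {
        for _, name := range []string{"first", "second", "third"} {
            name := name // capture the loop variable for the closure
            t.Run(name, func(t *testing.T) {
                t.Parallel()
                time.Sleep(10 * time.Millisecond) // stand-in for real work
            })
        }
    }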
=== RUN   TestStore_RegularTokens
=== PAUSE TestStore_RegularTokens
=== RUN   TestStore_AgentMasterToken
=== PAUSE TestStore_AgentMasterToken
=== CONT  TestStore_RegularTokens
=== RUN   TestStore_RegularTokens/set_user_-_config
=== CONT  TestStore_AgentMasterToken
=== PAUSE TestStore_RegularTokens/set_user_-_config
=== RUN   TestStore_RegularTokens/set_user_-_api
=== PAUSE TestStore_RegularTokens/set_user_-_api
=== RUN   TestStore_RegularTokens/set_agent_-_config
=== PAUSE TestStore_RegularTokens/set_agent_-_config
=== RUN   TestStore_RegularTokens/set_agent_-_api
=== PAUSE TestStore_RegularTokens/set_agent_-_api
--- PASS: TestStore_AgentMasterToken (0.00s)
=== RUN   TestStore_RegularTokens/set_user_and_agent
=== PAUSE TestStore_RegularTokens/set_user_and_agent
=== RUN   TestStore_RegularTokens/set_repl_-_config
=== PAUSE TestStore_RegularTokens/set_repl_-_config
=== RUN   TestStore_RegularTokens/set_repl_-_api
=== PAUSE TestStore_RegularTokens/set_repl_-_api
=== RUN   TestStore_RegularTokens/set_master_-_config
=== PAUSE TestStore_RegularTokens/set_master_-_config
=== RUN   TestStore_RegularTokens/set_master_-_api
=== PAUSE TestStore_RegularTokens/set_master_-_api
=== RUN   TestStore_RegularTokens/set_all
=== PAUSE TestStore_RegularTokens/set_all
=== CONT  TestStore_RegularTokens/set_user_-_config
=== CONT  TestStore_RegularTokens/set_all
=== CONT  TestStore_RegularTokens/set_master_-_api
=== CONT  TestStore_RegularTokens/set_master_-_config
=== CONT  TestStore_RegularTokens/set_repl_-_api
=== CONT  TestStore_RegularTokens/set_repl_-_config
=== CONT  TestStore_RegularTokens/set_user_and_agent
=== CONT  TestStore_RegularTokens/set_user_-_api
=== CONT  TestStore_RegularTokens/set_agent_-_api
=== CONT  TestStore_RegularTokens/set_agent_-_config
--- PASS: TestStore_RegularTokens (0.00s)
    --- PASS: TestStore_RegularTokens/set_user_-_config (0.00s)
    --- PASS: TestStore_RegularTokens/set_all (0.00s)
    --- PASS: TestStore_RegularTokens/set_master_-_api (0.00s)
    --- PASS: TestStore_RegularTokens/set_master_-_config (0.00s)
    --- PASS: TestStore_RegularTokens/set_repl_-_api (0.00s)
    --- PASS: TestStore_RegularTokens/set_repl_-_config (0.00s)
    --- PASS: TestStore_RegularTokens/set_user_and_agent (0.00s)
    --- PASS: TestStore_RegularTokens/set_user_-_api (0.00s)
    --- PASS: TestStore_RegularTokens/set_agent_-_api (0.00s)
    --- PASS: TestStore_RegularTokens/set_agent_-_config (0.00s)
PASS
ok  	github.com/hashicorp/consul/agent/token	0.047s
=== RUN   TestClustersFromSnapshot
=== RUN   TestClustersFromSnapshot/defaults
=== RUN   TestClustersFromSnapshot/custom-local-app
=== RUN   TestClustersFromSnapshot/custom-local-app-typed
=== RUN   TestClustersFromSnapshot/custom-upstream
=== RUN   TestClustersFromSnapshot/custom-upstream-typed
=== RUN   TestClustersFromSnapshot/custom-upstream-ignores-tls
=== RUN   TestClustersFromSnapshot/custom-timeouts
--- PASS: TestClustersFromSnapshot (0.73s)
    --- PASS: TestClustersFromSnapshot/defaults (0.17s)
    --- PASS: TestClustersFromSnapshot/custom-local-app (0.09s)
    --- PASS: TestClustersFromSnapshot/custom-local-app-typed (0.05s)
    --- PASS: TestClustersFromSnapshot/custom-upstream (0.22s)
    --- PASS: TestClustersFromSnapshot/custom-upstream-typed (0.04s)
    --- PASS: TestClustersFromSnapshot/custom-upstream-ignores-tls (0.03s)
    --- PASS: TestClustersFromSnapshot/custom-timeouts (0.13s)
=== RUN   TestParseProxyConfig
=== RUN   TestParseProxyConfig/defaults_-_nil
=== RUN   TestParseProxyConfig/defaults_-_empty
=== RUN   TestParseProxyConfig/defaults_-_other_stuff
=== RUN   TestParseProxyConfig/protocol_override
=== RUN   TestParseProxyConfig/protocol_uppercase_override
=== RUN   TestParseProxyConfig/local_connect_timeout_override,_string
=== RUN   TestParseProxyConfig/local_connect_timeout_override,_float_
=== RUN   TestParseProxyConfig/local_connect_timeout_override,_int_
--- PASS: TestParseProxyConfig (0.00s)
    --- PASS: TestParseProxyConfig/defaults_-_nil (0.00s)
    --- PASS: TestParseProxyConfig/defaults_-_empty (0.00s)
    --- PASS: TestParseProxyConfig/defaults_-_other_stuff (0.00s)
    --- PASS: TestParseProxyConfig/protocol_override (0.00s)
    --- PASS: TestParseProxyConfig/protocol_uppercase_override (0.00s)
    --- PASS: TestParseProxyConfig/local_connect_timeout_override,_string (0.00s)
    --- PASS: TestParseProxyConfig/local_connect_timeout_override,_float_ (0.00s)
    --- PASS: TestParseProxyConfig/local_connect_timeout_override,_int_ (0.00s)
=== RUN   TestParseUpstreamConfig
=== RUN   TestParseUpstreamConfig/defaults_-_nil
=== RUN   TestParseUpstreamConfig/defaults_-_empty
=== RUN   TestParseUpstreamConfig/defaults_-_other_stuff
=== RUN   TestParseUpstreamConfig/protocol_override
=== RUN   TestParseUpstreamConfig/connect_timeout_override,_string
=== RUN   TestParseUpstreamConfig/connect_timeout_override,_float_
=== RUN   TestParseUpstreamConfig/connect_timeout_override,_int_
--- PASS: TestParseUpstreamConfig (0.00s)
    --- PASS: TestParseUpstreamConfig/defaults_-_nil (0.00s)
    --- PASS: TestParseUpstreamConfig/defaults_-_empty (0.00s)
    --- PASS: TestParseUpstreamConfig/defaults_-_other_stuff (0.00s)
    --- PASS: TestParseUpstreamConfig/protocol_override (0.00s)
    --- PASS: TestParseUpstreamConfig/connect_timeout_override,_string (0.00s)
    --- PASS: TestParseUpstreamConfig/connect_timeout_override,_float_ (0.00s)
    --- PASS: TestParseUpstreamConfig/connect_timeout_override,_int_ (0.00s)
=== RUN   Test_makeLoadAssignment
=== RUN   Test_makeLoadAssignment/no_instances
=== RUN   Test_makeLoadAssignment/instances,_no_weights
=== RUN   Test_makeLoadAssignment/instances,_healthy_weights
=== RUN   Test_makeLoadAssignment/instances,_warning_weights
--- PASS: Test_makeLoadAssignment (0.01s)
    --- PASS: Test_makeLoadAssignment/no_instances (0.00s)
    --- PASS: Test_makeLoadAssignment/instances,_no_weights (0.00s)
    --- PASS: Test_makeLoadAssignment/instances,_healthy_weights (0.00s)
    --- PASS: Test_makeLoadAssignment/instances,_warning_weights (0.00s)
=== RUN   TestListenersFromSnapshot
=== RUN   TestListenersFromSnapshot/defaults
=== RUN   TestListenersFromSnapshot/http-public-listener
=== RUN   TestListenersFromSnapshot/http-upstream
=== RUN   TestListenersFromSnapshot/custom-public-listener
=== RUN   TestListenersFromSnapshot/custom-public-listener-typed
=== RUN   TestListenersFromSnapshot/custom-public-listener-ignores-tls
=== RUN   TestListenersFromSnapshot/custom-upstream
=== RUN   TestListenersFromSnapshot/custom-upstream-typed
--- PASS: TestListenersFromSnapshot (0.60s)
    --- PASS: TestListenersFromSnapshot/defaults (0.20s)
    --- PASS: TestListenersFromSnapshot/http-public-listener (0.09s)
    --- PASS: TestListenersFromSnapshot/http-upstream (0.06s)
    --- PASS: TestListenersFromSnapshot/custom-public-listener (0.06s)
    --- PASS: TestListenersFromSnapshot/custom-public-listener-typed (0.04s)
    --- PASS: TestListenersFromSnapshot/custom-public-listener-ignores-tls (0.05s)
    --- PASS: TestListenersFromSnapshot/custom-upstream (0.05s)
    --- PASS: TestListenersFromSnapshot/custom-upstream-typed (0.05s)
=== RUN   TestServer_StreamAggregatedResources_BasicProtocol
--- PASS: TestServer_StreamAggregatedResources_BasicProtocol (0.19s)
=== RUN   TestServer_StreamAggregatedResources_ACLEnforcement
--- SKIP: TestServer_StreamAggregatedResources_ACLEnforcement (0.00s)
    server_test.go:347: DM-skipped
=== RUN   TestServer_StreamAggregatedResources_ACLTokenDeleted_StreamTerminatedDuringDiscoveryRequest
2019/12/06 06:01:45 [DEBUG] Error handling ADS stream: rpc error: code = Unauthenticated desc = unauthenticated: ACL not found
--- PASS: TestServer_StreamAggregatedResources_ACLTokenDeleted_StreamTerminatedDuringDiscoveryRequest (0.06s)
=== RUN   TestServer_StreamAggregatedResources_ACLTokenDeleted_StreamTerminatedInBackground
2019/12/06 06:01:45 [DEBUG] Error handling ADS stream: rpc error: code = Unauthenticated desc = unauthenticated: ACL not found
--- PASS: TestServer_StreamAggregatedResources_ACLTokenDeleted_StreamTerminatedInBackground (0.16s)
=== RUN   TestServer_Check
=== RUN   TestServer_Check/auth_allowed
2019/12/06 06:01:45 [DEBUG] grpc: Connect AuthZ ALLOWED: src=spiffe://11111111-2222-3333-4444-555555555555.consul/ns/default/dc/dc1/svc/web dest=spiffe://11111111-2222-3333-4444-555555555555.consul/ns/default/dc/dc1/svc/db reason=default allow
=== RUN   TestServer_Check/auth_denied
2019/12/06 06:01:45 [DEBUG] grpc: Connect AuthZ DENIED: src=spiffe://11111111-2222-3333-4444-555555555555.consul/ns/default/dc/dc1/svc/web dest=spiffe://11111111-2222-3333-4444-555555555555.consul/ns/default/dc/dc1/svc/db reason=default deny
=== RUN   TestServer_Check/no_source
=== RUN   TestServer_Check/no_dest
=== RUN   TestServer_Check/dest_invalid_format
2019/12/06 06:01:45 [DEBUG] grpc: Connect AuthZ DENIED: bad destination URI: src=spiffe://11111111-2222-3333-4444-555555555555.consul/ns/default/dc/dc1/svc/web dest=not-a-spiffe-id
=== RUN   TestServer_Check/dest_not_a_service_URI
2019/12/06 06:01:45 [DEBUG] grpc: Connect AuthZ DENIED: bad destination service ID: src=spiffe://11111111-2222-3333-4444-555555555555.consul/ns/default/dc/dc1/svc/web dest=spiffe://trust-domain.consul
=== RUN   TestServer_Check/ACL_not_got_permission_for_authz_call
2019/12/06 06:01:45 [DEBUG] grpc: Connect AuthZ failed ACL check: Permission denied: src=spiffe://11111111-2222-3333-4444-555555555555.consul/ns/default/dc/dc1/svc/web dest=spiffe://11111111-2222-3333-4444-555555555555.consul/ns/default/dc/dc1/svc/db
=== RUN   TestServer_Check/Random_error_running_authz
2019/12/06 06:01:45 [DEBUG] grpc: Connect AuthZ failed: gremlin attack: src=spiffe://11111111-2222-3333-4444-555555555555.consul/ns/default/dc/dc1/svc/web dest=spiffe://11111111-2222-3333-4444-555555555555.consul/ns/default/dc/dc1/svc/db
--- PASS: TestServer_Check (0.01s)
    --- PASS: TestServer_Check/auth_allowed (0.00s)
    --- PASS: TestServer_Check/auth_denied (0.00s)
    --- PASS: TestServer_Check/no_source (0.00s)
    --- PASS: TestServer_Check/no_dest (0.00s)
    --- PASS: TestServer_Check/dest_invalid_format (0.00s)
    --- PASS: TestServer_Check/dest_not_a_service_URI (0.00s)
    --- PASS: TestServer_Check/ACL_not_got_permission_for_authz_call (0.00s)
    --- PASS: TestServer_Check/Random_error_running_authz (0.00s)
PASS
ok  	github.com/hashicorp/consul/agent/xds	2.038s
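The single --- SKIP in the xds output above comes from a t.Skip call: go test -v reports the test as skipped rather than failed and prints the file, line and message of the call ("server_test.go:347: DM-skipped"), which suggests the test was deliberately disabled for this build. A minimal sketch (hypothetical test name, same message) of how such a skip is expressed:

    package example

    import "testing"

    // Skipping inside the test body produces a "--- SKIP" result with the
    // file:line of the t.Skip call and its message, as in the log above.
    func TestSkipped(t *testing.T) {
        t.Skip("DM-skipped")
    }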
?   	github.com/hashicorp/consul/command	[no test files]
?   	github.com/hashicorp/consul/command/acl	[no test files]
=== RUN   TestAgentTokensCommand_noTabs
=== PAUSE TestAgentTokensCommand_noTabs
=== RUN   TestAgentTokensCommand
=== PAUSE TestAgentTokensCommand
=== CONT  TestAgentTokensCommand_noTabs
=== CONT  TestAgentTokensCommand
--- PASS: TestAgentTokensCommand_noTabs (0.00s)
WARNING: bootstrap = true: do not enable unless necessary
TestAgentTokensCommand - 2019/12/06 06:01:56.453009 [WARN] agent: Node name "Node 1ff2b3f6-45ce-ef4e-93d0-c9ff8ef3da3b" will not be discoverable via DNS due to invalid characters. Valid characters include all alpha-numerics and dashes.
TestAgentTokensCommand - 2019/12/06 06:01:56.454044 [DEBUG] tlsutil: Update with version 1
TestAgentTokensCommand - 2019/12/06 06:01:56.460479 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
2019/12/06 06:01:58 [INFO]  raft: Initial configuration (index=1): [{Suffrage:Voter ID:1ff2b3f6-45ce-ef4e-93d0-c9ff8ef3da3b Address:127.0.0.1:17506}]
2019/12/06 06:01:58 [INFO]  raft: Node at 127.0.0.1:17506 [Follower] entering Follower state (Leader: "")
TestAgentTokensCommand - 2019/12/06 06:01:58.247408 [INFO] serf: EventMemberJoin: Node 1ff2b3f6-45ce-ef4e-93d0-c9ff8ef3da3b.dc1 127.0.0.1
TestAgentTokensCommand - 2019/12/06 06:01:58.259455 [INFO] serf: EventMemberJoin: Node 1ff2b3f6-45ce-ef4e-93d0-c9ff8ef3da3b 127.0.0.1
TestAgentTokensCommand - 2019/12/06 06:01:58.263498 [INFO] consul: Adding LAN server Node 1ff2b3f6-45ce-ef4e-93d0-c9ff8ef3da3b (Addr: tcp/127.0.0.1:17506) (DC: dc1)
TestAgentTokensCommand - 2019/12/06 06:01:58.263968 [INFO] consul: Handled member-join event for server "Node 1ff2b3f6-45ce-ef4e-93d0-c9ff8ef3da3b.dc1" in area "wan"
TestAgentTokensCommand - 2019/12/06 06:01:58.269781 [INFO] agent: Started DNS server 127.0.0.1:17501 (tcp)
TestAgentTokensCommand - 2019/12/06 06:01:58.270532 [INFO] agent: Started DNS server 127.0.0.1:17501 (udp)
TestAgentTokensCommand - 2019/12/06 06:01:58.272992 [INFO] agent: Started HTTP server on 127.0.0.1:17502 (tcp)
TestAgentTokensCommand - 2019/12/06 06:01:58.273167 [INFO] agent: started state syncer
2019/12/06 06:01:58 [WARN]  raft: Heartbeat timeout from "" reached, starting election
2019/12/06 06:01:58 [INFO]  raft: Node at 127.0.0.1:17506 [Candidate] entering Candidate state in term 2
2019/12/06 06:01:58 [INFO]  raft: Election won. Tally: 1
2019/12/06 06:01:58 [INFO]  raft: Node at 127.0.0.1:17506 [Leader] entering Leader state
TestAgentTokensCommand - 2019/12/06 06:01:58.867337 [INFO] consul: cluster leadership acquired
TestAgentTokensCommand - 2019/12/06 06:01:58.868012 [INFO] consul: New leader elected: Node 1ff2b3f6-45ce-ef4e-93d0-c9ff8ef3da3b
TestAgentTokensCommand - 2019/12/06 06:01:59.007006 [ERR] agent: failed to sync remote state: ACL not found
TestAgentTokensCommand - 2019/12/06 06:01:59.015998 [INFO] acl: initializing acls
TestAgentTokensCommand - 2019/12/06 06:01:59.184133 [INFO] acl: initializing acls
TestAgentTokensCommand - 2019/12/06 06:01:59.184467 [INFO] consul: Created ACL 'global-management' policy
TestAgentTokensCommand - 2019/12/06 06:01:59.184530 [WARN] consul: Configuring a non-UUID master token is deprecated
TestAgentTokensCommand - 2019/12/06 06:01:59.587211 [INFO] consul: Bootstrapped ACL master token from configuration
TestAgentTokensCommand - 2019/12/06 06:01:59.588851 [INFO] consul: Created ACL 'global-management' policy
TestAgentTokensCommand - 2019/12/06 06:01:59.588937 [WARN] consul: Configuring a non-UUID master token is deprecated
TestAgentTokensCommand - 2019/12/06 06:01:59.760171 [INFO] consul: Created ACL anonymous token from configuration
TestAgentTokensCommand - 2019/12/06 06:01:59.760300 [DEBUG] acl: transitioning out of legacy ACL mode
TestAgentTokensCommand - 2019/12/06 06:01:59.761225 [INFO] serf: EventMemberUpdate: Node 1ff2b3f6-45ce-ef4e-93d0-c9ff8ef3da3b
TestAgentTokensCommand - 2019/12/06 06:01:59.761996 [INFO] serf: EventMemberUpdate: Node 1ff2b3f6-45ce-ef4e-93d0-c9ff8ef3da3b.dc1
TestAgentTokensCommand - 2019/12/06 06:01:59.926763 [INFO] consul: Created ACL anonymous token from configuration
TestAgentTokensCommand - 2019/12/06 06:01:59.927600 [INFO] serf: EventMemberUpdate: Node 1ff2b3f6-45ce-ef4e-93d0-c9ff8ef3da3b
TestAgentTokensCommand - 2019/12/06 06:01:59.928222 [INFO] serf: EventMemberUpdate: Node 1ff2b3f6-45ce-ef4e-93d0-c9ff8ef3da3b.dc1
TestAgentTokensCommand - 2019/12/06 06:02:01.180585 [INFO] agent: Synced node info
TestAgentTokensCommand - 2019/12/06 06:02:01.180718 [DEBUG] agent: Node info in sync
TestAgentTokensCommand - 2019/12/06 06:02:01.529313 [DEBUG] http: Request PUT /v1/acl/token (286.933645ms) from=127.0.0.1:42846
TestAgentTokensCommand - 2019/12/06 06:02:01.535791 [INFO] agent: Updated agent's ACL token "default"
TestAgentTokensCommand - 2019/12/06 06:02:01.535912 [DEBUG] http: Request PUT /v1/agent/token/default (749.35µs) from=127.0.0.1:42848
TestAgentTokensCommand - 2019/12/06 06:02:01.540032 [INFO] agent: Updated agent's ACL token "agent"
TestAgentTokensCommand - 2019/12/06 06:02:01.540211 [DEBUG] http: Request PUT /v1/agent/token/agent (1.438033ms) from=127.0.0.1:42850
TestAgentTokensCommand - 2019/12/06 06:02:01.544342 [INFO] agent: Updated agent's ACL token "agent_master"
TestAgentTokensCommand - 2019/12/06 06:02:01.544428 [DEBUG] http: Request PUT /v1/agent/token/agent_master (726.017µs) from=127.0.0.1:42852
TestAgentTokensCommand - 2019/12/06 06:02:01.547953 [INFO] agent: Updated agent's ACL token "replication"
TestAgentTokensCommand - 2019/12/06 06:02:01.548048 [DEBUG] http: Request PUT /v1/agent/token/replication (536.679µs) from=127.0.0.1:42854
TestAgentTokensCommand - 2019/12/06 06:02:01.549067 [INFO] agent: Requesting shutdown
TestAgentTokensCommand - 2019/12/06 06:02:01.549143 [INFO] consul: shutting down server
TestAgentTokensCommand - 2019/12/06 06:02:01.549280 [WARN] serf: Shutdown without a Leave
TestAgentTokensCommand - 2019/12/06 06:02:01.683602 [WARN] serf: Shutdown without a Leave
TestAgentTokensCommand - 2019/12/06 06:02:01.775241 [INFO] manager: shutting down
TestAgentTokensCommand - 2019/12/06 06:02:01.779547 [ERR] consul: failed to establish leadership: error generating CA root certificate: leadership lost while committing log
TestAgentTokensCommand - 2019/12/06 06:02:01.779700 [INFO] agent: consul server down
TestAgentTokensCommand - 2019/12/06 06:02:01.780050 [INFO] agent: shutdown complete
TestAgentTokensCommand - 2019/12/06 06:02:01.780112 [INFO] agent: Stopping DNS server 127.0.0.1:17501 (tcp)
TestAgentTokensCommand - 2019/12/06 06:02:01.780269 [INFO] agent: Stopping DNS server 127.0.0.1:17501 (udp)
TestAgentTokensCommand - 2019/12/06 06:02:01.780916 [INFO] agent: Stopping HTTP server 127.0.0.1:17502 (tcp)
TestAgentTokensCommand - 2019/12/06 06:02:01.782272 [INFO] agent: Waiting for endpoints to shut down
TestAgentTokensCommand - 2019/12/06 06:02:01.782415 [INFO] agent: Endpoints down
--- PASS: TestAgentTokensCommand (5.41s)
PASS
ok  	github.com/hashicorp/consul/command/acl/agenttokens	5.650s
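The agenttokens test above drives the agent's token endpoints, visible in the log as PUT /v1/agent/token/default, /agent, /agent_master and /replication. A hedged sketch of issuing one of those requests directly with the standard library, assuming a local agent on 127.0.0.1:8500 and a JSON body of the form {"Token": "..."} as accepted by the public Consul HTTP API; none of this is taken from the build itself:

    package main

    import (
        "bytes"
        "encoding/json"
        "fmt"
        "net/http"
    )

    // setAgentToken updates one of the agent's ACL tokens ("default",
    // "agent", "agent_master" or "replication") via the endpoint seen in
    // the log. Address and body format are assumptions, not build output.
    func setAgentToken(agentAddr, kind, token string) error {
        body, err := json.Marshal(map[string]string{"Token": token})
        if err != nil {
            return err
        }
        url := fmt.Sprintf("http://%s/v1/agent/token/%s", agentAddr, kind)
        req, err := http.NewRequest(http.MethodPut, url, bytes.NewReader(body))
        if err != nil {
            return err
        }
        resp, err := http.DefaultClient.Do(req)
        if err != nil {
            return err
        }
        defer resp.Body.Close()
        if resp.StatusCode != http.StatusOK {
            return fmt.Errorf("unexpected status: %s", resp.Status)
        }
        return nil
    }

    func main() {
        // Example usage against a hypothetical local agent.
        if err := setAgentToken("127.0.0.1:8500", "agent", "example-token"); err != nil {
            fmt.Println("error:", err)
        }
    }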
?   	github.com/hashicorp/consul/command/acl/authmethod	[no test files]
=== RUN   TestAuthMethodCreateCommand_noTabs
=== PAUSE TestAuthMethodCreateCommand_noTabs
=== RUN   TestAuthMethodCreateCommand
=== PAUSE TestAuthMethodCreateCommand
=== RUN   TestAuthMethodCreateCommand_k8s
=== PAUSE TestAuthMethodCreateCommand_k8s
=== CONT  TestAuthMethodCreateCommand_noTabs
=== CONT  TestAuthMethodCreateCommand_k8s
=== CONT  TestAuthMethodCreateCommand
--- PASS: TestAuthMethodCreateCommand_noTabs (0.02s)
WARNING: bootstrap = true: do not enable unless necessary
TestAuthMethodCreateCommand_k8s - 2019/12/06 06:02:08.216263 [WARN] agent: Node name "Node 6891c055-2176-eb77-378a-3876ad5d6b8f" will not be discoverable via DNS due to invalid characters. Valid characters include all alpha-numerics and dashes.
TestAuthMethodCreateCommand_k8s - 2019/12/06 06:02:08.218055 [DEBUG] tlsutil: Update with version 1
TestAuthMethodCreateCommand_k8s - 2019/12/06 06:02:08.228964 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
WARNING: bootstrap = true: do not enable unless necessary
TestAuthMethodCreateCommand - 2019/12/06 06:02:08.278329 [WARN] agent: Node name "Node 682f53fd-6139-8c22-8833-dafbfb5f57a5" will not be discoverable via DNS due to invalid characters. Valid characters include all alpha-numerics and dashes.
TestAuthMethodCreateCommand - 2019/12/06 06:02:08.278825 [DEBUG] tlsutil: Update with version 1
TestAuthMethodCreateCommand - 2019/12/06 06:02:08.290990 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
2019/12/06 06:02:09 [INFO]  raft: Initial configuration (index=1): [{Suffrage:Voter ID:682f53fd-6139-8c22-8833-dafbfb5f57a5 Address:127.0.0.1:10012}]
2019/12/06 06:02:09 [INFO]  raft: Initial configuration (index=1): [{Suffrage:Voter ID:6891c055-2176-eb77-378a-3876ad5d6b8f Address:127.0.0.1:10006}]
2019/12/06 06:02:09 [INFO]  raft: Node at 127.0.0.1:10006 [Follower] entering Follower state (Leader: "")
TestAuthMethodCreateCommand_k8s - 2019/12/06 06:02:09.455408 [INFO] serf: EventMemberJoin: Node 6891c055-2176-eb77-378a-3876ad5d6b8f.dc1 127.0.0.1
TestAuthMethodCreateCommand - 2019/12/06 06:02:09.455875 [INFO] serf: EventMemberJoin: Node 682f53fd-6139-8c22-8833-dafbfb5f57a5.dc1 127.0.0.1
2019/12/06 06:02:09 [INFO]  raft: Node at 127.0.0.1:10012 [Follower] entering Follower state (Leader: "")
TestAuthMethodCreateCommand_k8s - 2019/12/06 06:02:09.464818 [INFO] serf: EventMemberJoin: Node 6891c055-2176-eb77-378a-3876ad5d6b8f 127.0.0.1
TestAuthMethodCreateCommand - 2019/12/06 06:02:09.466014 [INFO] serf: EventMemberJoin: Node 682f53fd-6139-8c22-8833-dafbfb5f57a5 127.0.0.1
TestAuthMethodCreateCommand_k8s - 2019/12/06 06:02:09.467385 [INFO] agent: Started DNS server 127.0.0.1:10001 (udp)
TestAuthMethodCreateCommand - 2019/12/06 06:02:09.467471 [INFO] agent: Started DNS server 127.0.0.1:10007 (udp)
TestAuthMethodCreateCommand_k8s - 2019/12/06 06:02:09.468103 [INFO] consul: Adding LAN server Node 6891c055-2176-eb77-378a-3876ad5d6b8f (Addr: tcp/127.0.0.1:10006) (DC: dc1)
TestAuthMethodCreateCommand - 2019/12/06 06:02:09.468402 [INFO] consul: Adding LAN server Node 682f53fd-6139-8c22-8833-dafbfb5f57a5 (Addr: tcp/127.0.0.1:10012) (DC: dc1)
TestAuthMethodCreateCommand_k8s - 2019/12/06 06:02:09.468455 [INFO] consul: Handled member-join event for server "Node 6891c055-2176-eb77-378a-3876ad5d6b8f.dc1" in area "wan"
TestAuthMethodCreateCommand - 2019/12/06 06:02:09.468620 [INFO] consul: Handled member-join event for server "Node 682f53fd-6139-8c22-8833-dafbfb5f57a5.dc1" in area "wan"
TestAuthMethodCreateCommand - 2019/12/06 06:02:09.469090 [INFO] agent: Started DNS server 127.0.0.1:10007 (tcp)
TestAuthMethodCreateCommand_k8s - 2019/12/06 06:02:09.469091 [INFO] agent: Started DNS server 127.0.0.1:10001 (tcp)
TestAuthMethodCreateCommand_k8s - 2019/12/06 06:02:09.471995 [INFO] agent: Started HTTP server on 127.0.0.1:10002 (tcp)
TestAuthMethodCreateCommand_k8s - 2019/12/06 06:02:09.472309 [INFO] agent: started state syncer
TestAuthMethodCreateCommand - 2019/12/06 06:02:09.475063 [INFO] agent: Started HTTP server on 127.0.0.1:10008 (tcp)
TestAuthMethodCreateCommand - 2019/12/06 06:02:09.475607 [INFO] agent: started state syncer
2019/12/06 06:02:09 [WARN]  raft: Heartbeat timeout from "" reached, starting election
2019/12/06 06:02:09 [INFO]  raft: Node at 127.0.0.1:10006 [Candidate] entering Candidate state in term 2
2019/12/06 06:02:09 [WARN]  raft: Heartbeat timeout from "" reached, starting election
2019/12/06 06:02:09 [INFO]  raft: Node at 127.0.0.1:10012 [Candidate] entering Candidate state in term 2
2019/12/06 06:02:10 [INFO]  raft: Election won. Tally: 1
2019/12/06 06:02:10 [INFO]  raft: Node at 127.0.0.1:10006 [Leader] entering Leader state
TestAuthMethodCreateCommand_k8s - 2019/12/06 06:02:10.100782 [INFO] consul: cluster leadership acquired
TestAuthMethodCreateCommand_k8s - 2019/12/06 06:02:10.101339 [INFO] consul: New leader elected: Node 6891c055-2176-eb77-378a-3876ad5d6b8f
2019/12/06 06:02:10 [INFO]  raft: Election won. Tally: 1
2019/12/06 06:02:10 [INFO]  raft: Node at 127.0.0.1:10012 [Leader] entering Leader state
TestAuthMethodCreateCommand - 2019/12/06 06:02:10.102649 [INFO] consul: cluster leadership acquired
TestAuthMethodCreateCommand - 2019/12/06 06:02:10.103077 [INFO] consul: New leader elected: Node 682f53fd-6139-8c22-8833-dafbfb5f57a5
TestAuthMethodCreateCommand_k8s - 2019/12/06 06:02:10.199723 [ERR] agent: failed to sync remote state: ACL not found
TestAuthMethodCreateCommand_k8s - 2019/12/06 06:02:10.220381 [INFO] acl: initializing acls
TestAuthMethodCreateCommand - 2019/12/06 06:02:10.222815 [INFO] acl: initializing acls
TestAuthMethodCreateCommand - 2019/12/06 06:02:10.320947 [ERR] agent: failed to sync remote state: ACL not found
TestAuthMethodCreateCommand_k8s - 2019/12/06 06:02:10.551753 [INFO] consul: Created ACL 'global-management' policy
TestAuthMethodCreateCommand_k8s - 2019/12/06 06:02:10.551853 [WARN] consul: Configuring a non-UUID master token is deprecated
TestAuthMethodCreateCommand_k8s - 2019/12/06 06:02:10.553610 [INFO] acl: initializing acls
TestAuthMethodCreateCommand_k8s - 2019/12/06 06:02:10.553732 [WARN] consul: Configuring a non-UUID master token is deprecated
TestAuthMethodCreateCommand - 2019/12/06 06:02:10.554662 [INFO] acl: initializing acls
TestAuthMethodCreateCommand - 2019/12/06 06:02:10.555015 [INFO] consul: Created ACL 'global-management' policy
TestAuthMethodCreateCommand - 2019/12/06 06:02:10.555085 [WARN] consul: Configuring a non-UUID master token is deprecated
TestAuthMethodCreateCommand - 2019/12/06 06:02:10.836287 [INFO] consul: Bootstrapped ACL master token from configuration
TestAuthMethodCreateCommand_k8s - 2019/12/06 06:02:10.839063 [INFO] consul: Bootstrapped ACL master token from configuration
TestAuthMethodCreateCommand_k8s - 2019/12/06 06:02:11.187079 [INFO] consul: Bootstrapped ACL master token from configuration
TestAuthMethodCreateCommand - 2019/12/06 06:02:11.189318 [INFO] consul: Created ACL 'global-management' policy
TestAuthMethodCreateCommand - 2019/12/06 06:02:11.189528 [WARN] consul: Configuring a non-UUID master token is deprecated
TestAuthMethodCreateCommand_k8s - 2019/12/06 06:02:11.353631 [INFO] consul: Created ACL anonymous token from configuration
TestAuthMethodCreateCommand_k8s - 2019/12/06 06:02:11.353767 [DEBUG] acl: transitioning out of legacy ACL mode
TestAuthMethodCreateCommand_k8s - 2019/12/06 06:02:11.354705 [INFO] serf: EventMemberUpdate: Node 6891c055-2176-eb77-378a-3876ad5d6b8f
TestAuthMethodCreateCommand_k8s - 2019/12/06 06:02:11.355465 [INFO] serf: EventMemberUpdate: Node 6891c055-2176-eb77-378a-3876ad5d6b8f.dc1
TestAuthMethodCreateCommand_k8s - 2019/12/06 06:02:11.534835 [INFO] consul: Created ACL anonymous token from configuration
TestAuthMethodCreateCommand - 2019/12/06 06:02:11.535763 [INFO] consul: Created ACL anonymous token from configuration
TestAuthMethodCreateCommand_k8s - 2019/12/06 06:02:11.536366 [INFO] serf: EventMemberUpdate: Node 6891c055-2176-eb77-378a-3876ad5d6b8f
TestAuthMethodCreateCommand - 2019/12/06 06:02:11.536609 [INFO] serf: EventMemberUpdate: Node 682f53fd-6139-8c22-8833-dafbfb5f57a5
TestAuthMethodCreateCommand - 2019/12/06 06:02:11.537220 [INFO] serf: EventMemberUpdate: Node 682f53fd-6139-8c22-8833-dafbfb5f57a5.dc1
TestAuthMethodCreateCommand - 2019/12/06 06:02:11.537712 [INFO] consul: Created ACL anonymous token from configuration
TestAuthMethodCreateCommand - 2019/12/06 06:02:11.538035 [DEBUG] acl: transitioning out of legacy ACL mode
TestAuthMethodCreateCommand - 2019/12/06 06:02:11.541244 [INFO] serf: EventMemberUpdate: Node 682f53fd-6139-8c22-8833-dafbfb5f57a5
TestAuthMethodCreateCommand - 2019/12/06 06:02:11.541922 [INFO] serf: EventMemberUpdate: Node 682f53fd-6139-8c22-8833-dafbfb5f57a5.dc1
TestAuthMethodCreateCommand_k8s - 2019/12/06 06:02:11.545461 [INFO] serf: EventMemberUpdate: Node 6891c055-2176-eb77-378a-3876ad5d6b8f.dc1
TestAuthMethodCreateCommand - 2019/12/06 06:02:12.234812 [INFO] agent: Synced node info
TestAuthMethodCreateCommand - 2019/12/06 06:02:12.234930 [DEBUG] agent: Node info in sync
=== RUN   TestAuthMethodCreateCommand/type_required
=== RUN   TestAuthMethodCreateCommand/name_required
=== RUN   TestAuthMethodCreateCommand/invalid_type
TestAuthMethodCreateCommand - 2019/12/06 06:02:12.276705 [ERR] http: Request PUT /v1/acl/auth-method, error: Invalid Auth Method: Type should be one of: [kubernetes testing] from=127.0.0.1:36478
TestAuthMethodCreateCommand - 2019/12/06 06:02:12.281273 [DEBUG] http: Request PUT /v1/acl/auth-method (8.241858ms) from=127.0.0.1:36478
=== RUN   TestAuthMethodCreateCommand/create_testing
TestAuthMethodCreateCommand_k8s - 2019/12/06 06:02:12.917945 [INFO] connect: initialized primary datacenter CA with provider "consul"
TestAuthMethodCreateCommand_k8s - 2019/12/06 06:02:12.918506 [DEBUG] consul: Skipping self join check for "Node 6891c055-2176-eb77-378a-3876ad5d6b8f" since the cluster is too small
TestAuthMethodCreateCommand_k8s - 2019/12/06 06:02:12.918617 [INFO] consul: member 'Node 6891c055-2176-eb77-378a-3876ad5d6b8f' joined, marking health alive
TestAuthMethodCreateCommand - 2019/12/06 06:02:12.926874 [DEBUG] http: Request PUT /v1/acl/auth-method (615.280583ms) from=127.0.0.1:36480
TestAuthMethodCreateCommand - 2019/12/06 06:02:12.929659 [INFO] agent: Requesting shutdown
TestAuthMethodCreateCommand - 2019/12/06 06:02:12.929748 [INFO] consul: shutting down server
TestAuthMethodCreateCommand - 2019/12/06 06:02:12.929805 [WARN] serf: Shutdown without a Leave
TestAuthMethodCreateCommand - 2019/12/06 06:02:13.118734 [WARN] serf: Shutdown without a Leave
TestAuthMethodCreateCommand - 2019/12/06 06:02:13.210497 [INFO] manager: shutting down
TestAuthMethodCreateCommand - 2019/12/06 06:02:13.210723 [ERR] connect: Apply failed leadership lost while committing log
TestAuthMethodCreateCommand - 2019/12/06 06:02:13.211176 [ERR] consul: failed to establish leadership: leadership lost while committing log
TestAuthMethodCreateCommand - 2019/12/06 06:02:13.211046 [INFO] agent: consul server down
TestAuthMethodCreateCommand - 2019/12/06 06:02:13.211323 [INFO] agent: shutdown complete
TestAuthMethodCreateCommand - 2019/12/06 06:02:13.211371 [INFO] agent: Stopping DNS server 127.0.0.1:10007 (tcp)
TestAuthMethodCreateCommand - 2019/12/06 06:02:13.211509 [INFO] agent: Stopping DNS server 127.0.0.1:10007 (udp)
TestAuthMethodCreateCommand - 2019/12/06 06:02:13.211652 [INFO] agent: Stopping HTTP server 127.0.0.1:10008 (tcp)
TestAuthMethodCreateCommand - 2019/12/06 06:02:13.212707 [ERR] consul: failed to transfer leadership attempt 0/3: raft is already shutdown
TestAuthMethodCreateCommand - 2019/12/06 06:02:13.212730 [INFO] agent: Waiting for endpoints to shut down
TestAuthMethodCreateCommand - 2019/12/06 06:02:13.212826 [INFO] agent: Endpoints down
--- PASS: TestAuthMethodCreateCommand (5.09s)
    --- PASS: TestAuthMethodCreateCommand/type_required (0.01s)
    --- PASS: TestAuthMethodCreateCommand/name_required (0.00s)
    --- PASS: TestAuthMethodCreateCommand/invalid_type (0.03s)
    --- PASS: TestAuthMethodCreateCommand/create_testing (0.64s)
=== RUN   TestAuthMethodCreateCommand_k8s/k8s_host_required
TestAuthMethodCreateCommand_k8s - 2019/12/06 06:02:13.546404 [INFO] agent: Synced node info
TestAuthMethodCreateCommand_k8s - 2019/12/06 06:02:13.546752 [DEBUG] agent: Node info in sync
TestAuthMethodCreateCommand_k8s - 2019/12/06 06:02:13.547245 [DEBUG] consul: Skipping self join check for "Node 6891c055-2176-eb77-378a-3876ad5d6b8f" since the cluster is too small
TestAuthMethodCreateCommand_k8s - 2019/12/06 06:02:13.548084 [DEBUG] consul: Skipping self join check for "Node 6891c055-2176-eb77-378a-3876ad5d6b8f" since the cluster is too small
=== RUN   TestAuthMethodCreateCommand_k8s/k8s_ca_cert_required
=== RUN   TestAuthMethodCreateCommand_k8s/k8s_jwt_required
=== RUN   TestAuthMethodCreateCommand_k8s/create_k8s
TestAuthMethodCreateCommand_k8s - 2019/12/06 06:02:13.752964 [DEBUG] http: Request PUT /v1/acl/auth-method (180.836522ms) from=127.0.0.1:45868
=== RUN   TestAuthMethodCreateCommand_k8s/create_k8s_with_cert_file
TestAuthMethodCreateCommand_k8s - 2019/12/06 06:02:14.206961 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
TestAuthMethodCreateCommand_k8s - 2019/12/06 06:02:14.208240 [DEBUG] http: Request PUT /v1/acl/auth-method (440.704207ms) from=127.0.0.1:45870
TestAuthMethodCreateCommand_k8s - 2019/12/06 06:02:14.225121 [INFO] agent: Requesting shutdown
TestAuthMethodCreateCommand_k8s - 2019/12/06 06:02:14.225226 [INFO] consul: shutting down server
TestAuthMethodCreateCommand_k8s - 2019/12/06 06:02:14.225546 [WARN] serf: Shutdown without a Leave
TestAuthMethodCreateCommand_k8s - 2019/12/06 06:02:14.326909 [WARN] serf: Shutdown without a Leave
TestAuthMethodCreateCommand_k8s - 2019/12/06 06:02:14.475446 [INFO] manager: shutting down
TestAuthMethodCreateCommand_k8s - 2019/12/06 06:02:14.476301 [INFO] agent: consul server down
TestAuthMethodCreateCommand_k8s - 2019/12/06 06:02:14.476387 [INFO] agent: shutdown complete
TestAuthMethodCreateCommand_k8s - 2019/12/06 06:02:14.476453 [INFO] agent: Stopping DNS server 127.0.0.1:10001 (tcp)
TestAuthMethodCreateCommand_k8s - 2019/12/06 06:02:14.476602 [INFO] agent: Stopping DNS server 127.0.0.1:10001 (udp)
TestAuthMethodCreateCommand_k8s - 2019/12/06 06:02:14.476744 [INFO] agent: Stopping HTTP server 127.0.0.1:10002 (tcp)
TestAuthMethodCreateCommand_k8s - 2019/12/06 06:02:14.477327 [INFO] agent: Waiting for endpoints to shut down
TestAuthMethodCreateCommand_k8s - 2019/12/06 06:02:14.477441 [INFO] agent: Endpoints down
--- PASS: TestAuthMethodCreateCommand_k8s (6.36s)
    --- PASS: TestAuthMethodCreateCommand_k8s/k8s_host_required (0.00s)
    --- PASS: TestAuthMethodCreateCommand_k8s/k8s_ca_cert_required (0.00s)
    --- PASS: TestAuthMethodCreateCommand_k8s/k8s_jwt_required (0.00s)
    --- PASS: TestAuthMethodCreateCommand_k8s/create_k8s (0.19s)
    --- PASS: TestAuthMethodCreateCommand_k8s/create_k8s_with_cert_file (0.46s)
PASS
ok  	github.com/hashicorp/consul/command/acl/authmethod/create	6.635s
=== RUN   TestAuthMethodDeleteCommand_noTabs
=== PAUSE TestAuthMethodDeleteCommand_noTabs
=== RUN   TestAuthMethodDeleteCommand
=== PAUSE TestAuthMethodDeleteCommand
=== CONT  TestAuthMethodDeleteCommand_noTabs
=== CONT  TestAuthMethodDeleteCommand
--- PASS: TestAuthMethodDeleteCommand_noTabs (0.00s)
WARNING: bootstrap = true: do not enable unless necessary
TestAuthMethodDeleteCommand - 2019/12/06 06:02:30.141633 [WARN] agent: Node name "Node 9dbd406b-3f3c-fcf0-5c9f-4f75649033d6" will not be discoverable via DNS due to invalid characters. Valid characters include all alpha-numerics and dashes.
TestAuthMethodDeleteCommand - 2019/12/06 06:02:30.142714 [DEBUG] tlsutil: Update with version 1
TestAuthMethodDeleteCommand - 2019/12/06 06:02:30.163934 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
2019/12/06 06:02:31 [INFO]  raft: Initial configuration (index=1): [{Suffrage:Voter ID:9dbd406b-3f3c-fcf0-5c9f-4f75649033d6 Address:127.0.0.1:41506}]
2019/12/06 06:02:31 [INFO]  raft: Node at 127.0.0.1:41506 [Follower] entering Follower state (Leader: "")
TestAuthMethodDeleteCommand - 2019/12/06 06:02:31.518605 [INFO] serf: EventMemberJoin: Node 9dbd406b-3f3c-fcf0-5c9f-4f75649033d6.dc1 127.0.0.1
TestAuthMethodDeleteCommand - 2019/12/06 06:02:31.524415 [INFO] serf: EventMemberJoin: Node 9dbd406b-3f3c-fcf0-5c9f-4f75649033d6 127.0.0.1
TestAuthMethodDeleteCommand - 2019/12/06 06:02:31.525783 [INFO] consul: Adding LAN server Node 9dbd406b-3f3c-fcf0-5c9f-4f75649033d6 (Addr: tcp/127.0.0.1:41506) (DC: dc1)
TestAuthMethodDeleteCommand - 2019/12/06 06:02:31.527119 [INFO] consul: Handled member-join event for server "Node 9dbd406b-3f3c-fcf0-5c9f-4f75649033d6.dc1" in area "wan"
TestAuthMethodDeleteCommand - 2019/12/06 06:02:31.529516 [INFO] agent: Started DNS server 127.0.0.1:41501 (tcp)
TestAuthMethodDeleteCommand - 2019/12/06 06:02:31.530483 [INFO] agent: Started DNS server 127.0.0.1:41501 (udp)
TestAuthMethodDeleteCommand - 2019/12/06 06:02:31.533518 [INFO] agent: Started HTTP server on 127.0.0.1:41502 (tcp)
TestAuthMethodDeleteCommand - 2019/12/06 06:02:31.533693 [INFO] agent: started state syncer
2019/12/06 06:02:31 [WARN]  raft: Heartbeat timeout from "" reached, starting election
2019/12/06 06:02:31 [INFO]  raft: Node at 127.0.0.1:41506 [Candidate] entering Candidate state in term 2
2019/12/06 06:02:32 [INFO]  raft: Election won. Tally: 1
2019/12/06 06:02:32 [INFO]  raft: Node at 127.0.0.1:41506 [Leader] entering Leader state
TestAuthMethodDeleteCommand - 2019/12/06 06:02:32.667639 [INFO] consul: cluster leadership acquired
TestAuthMethodDeleteCommand - 2019/12/06 06:02:32.668147 [INFO] consul: New leader elected: Node 9dbd406b-3f3c-fcf0-5c9f-4f75649033d6
TestAuthMethodDeleteCommand - 2019/12/06 06:02:32.722393 [ERR] agent: failed to sync remote state: ACL not found
TestAuthMethodDeleteCommand - 2019/12/06 06:02:33.017548 [INFO] acl: initializing acls
TestAuthMethodDeleteCommand - 2019/12/06 06:02:33.081721 [INFO] acl: initializing acls
TestAuthMethodDeleteCommand - 2019/12/06 06:02:33.429967 [INFO] consul: Created ACL 'global-management' policy
TestAuthMethodDeleteCommand - 2019/12/06 06:02:33.430079 [WARN] consul: Configuring a non-UUID master token is deprecated
TestAuthMethodDeleteCommand - 2019/12/06 06:02:33.430380 [INFO] consul: Created ACL 'global-management' policy
TestAuthMethodDeleteCommand - 2019/12/06 06:02:33.430454 [WARN] consul: Configuring a non-UUID master token is deprecated
TestAuthMethodDeleteCommand - 2019/12/06 06:02:34.143760 [INFO] consul: Bootstrapped ACL master token from configuration
TestAuthMethodDeleteCommand - 2019/12/06 06:02:34.144005 [INFO] consul: Bootstrapped ACL master token from configuration
TestAuthMethodDeleteCommand - 2019/12/06 06:02:34.436772 [INFO] consul: Created ACL anonymous token from configuration
TestAuthMethodDeleteCommand - 2019/12/06 06:02:34.436880 [DEBUG] acl: transitioning out of legacy ACL mode
TestAuthMethodDeleteCommand - 2019/12/06 06:02:34.437733 [INFO] serf: EventMemberUpdate: Node 9dbd406b-3f3c-fcf0-5c9f-4f75649033d6
TestAuthMethodDeleteCommand - 2019/12/06 06:02:34.438405 [INFO] serf: EventMemberUpdate: Node 9dbd406b-3f3c-fcf0-5c9f-4f75649033d6.dc1
TestAuthMethodDeleteCommand - 2019/12/06 06:02:34.677030 [INFO] consul: Created ACL anonymous token from configuration
TestAuthMethodDeleteCommand - 2019/12/06 06:02:34.678055 [INFO] serf: EventMemberUpdate: Node 9dbd406b-3f3c-fcf0-5c9f-4f75649033d6
TestAuthMethodDeleteCommand - 2019/12/06 06:02:34.678741 [INFO] serf: EventMemberUpdate: Node 9dbd406b-3f3c-fcf0-5c9f-4f75649033d6.dc1
TestAuthMethodDeleteCommand - 2019/12/06 06:02:35.893922 [INFO] agent: Synced node info
TestAuthMethodDeleteCommand - 2019/12/06 06:02:35.894091 [DEBUG] agent: Node info in sync
=== RUN   TestAuthMethodDeleteCommand/name_required
=== RUN   TestAuthMethodDeleteCommand/delete_notfound
TestAuthMethodDeleteCommand - 2019/12/06 06:02:35.916070 [DEBUG] http: Request DELETE /v1/acl/auth-method/notfound (4.592773ms) from=127.0.0.1:36000
=== RUN   TestAuthMethodDeleteCommand/delete_works
TestAuthMethodDeleteCommand - 2019/12/06 06:02:36.968112 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
TestAuthMethodDeleteCommand - 2019/12/06 06:02:36.970457 [INFO] connect: initialized primary datacenter CA with provider "consul"
TestAuthMethodDeleteCommand - 2019/12/06 06:02:36.971029 [DEBUG] consul: Skipping self join check for "Node 9dbd406b-3f3c-fcf0-5c9f-4f75649033d6" since the cluster is too small
TestAuthMethodDeleteCommand - 2019/12/06 06:02:36.971217 [INFO] consul: member 'Node 9dbd406b-3f3c-fcf0-5c9f-4f75649033d6' joined, marking health alive
TestAuthMethodDeleteCommand - 2019/12/06 06:02:36.979593 [DEBUG] http: Request PUT /v1/acl/auth-method (1.05894086s) from=127.0.0.1:36002
TestAuthMethodDeleteCommand - 2019/12/06 06:02:37.968142 [WARN] consul: error getting server health from "Node 9dbd406b-3f3c-fcf0-5c9f-4f75649033d6": context deadline exceeded
TestAuthMethodDeleteCommand - 2019/12/06 06:02:38.697701 [DEBUG] http: Request DELETE /v1/acl/auth-method/test-795912b0-7620-9e78-2b43-d801e467b34e (1.702572768s) from=127.0.0.1:36004
TestAuthMethodDeleteCommand - 2019/12/06 06:02:38.698847 [DEBUG] consul: Skipping self join check for "Node 9dbd406b-3f3c-fcf0-5c9f-4f75649033d6" since the cluster is too small
TestAuthMethodDeleteCommand - 2019/12/06 06:02:38.701275 [DEBUG] http: Request GET /v1/acl/auth-method/test-795912b0-7620-9e78-2b43-d801e467b34e (465.677µs) from=127.0.0.1:36002
TestAuthMethodDeleteCommand - 2019/12/06 06:02:38.702242 [INFO] agent: Requesting shutdown
TestAuthMethodDeleteCommand - 2019/12/06 06:02:38.702320 [INFO] consul: shutting down server
TestAuthMethodDeleteCommand - 2019/12/06 06:02:38.702367 [WARN] serf: Shutdown without a Leave
TestAuthMethodDeleteCommand - 2019/12/06 06:02:38.850733 [WARN] serf: Shutdown without a Leave
TestAuthMethodDeleteCommand - 2019/12/06 06:02:38.926153 [INFO] manager: shutting down
TestAuthMethodDeleteCommand - 2019/12/06 06:02:38.927126 [INFO] agent: consul server down
TestAuthMethodDeleteCommand - 2019/12/06 06:02:38.927196 [INFO] agent: shutdown complete
TestAuthMethodDeleteCommand - 2019/12/06 06:02:38.927257 [INFO] agent: Stopping DNS server 127.0.0.1:41501 (tcp)
TestAuthMethodDeleteCommand - 2019/12/06 06:02:38.927426 [INFO] agent: Stopping DNS server 127.0.0.1:41501 (udp)
TestAuthMethodDeleteCommand - 2019/12/06 06:02:38.927695 [INFO] agent: Stopping HTTP server 127.0.0.1:41502 (tcp)
TestAuthMethodDeleteCommand - 2019/12/06 06:02:38.928297 [INFO] agent: Waiting for endpoints to shut down
TestAuthMethodDeleteCommand - 2019/12/06 06:02:38.928415 [INFO] agent: Endpoints down
--- PASS: TestAuthMethodDeleteCommand (8.87s)
    --- PASS: TestAuthMethodDeleteCommand/name_required (0.00s)
    --- PASS: TestAuthMethodDeleteCommand/delete_notfound (0.01s)
    --- PASS: TestAuthMethodDeleteCommand/delete_works (2.78s)
PASS
ok  	github.com/hashicorp/consul/command/acl/authmethod/delete	9.145s
=== RUN   TestAuthMethodListCommand_noTabs
=== PAUSE TestAuthMethodListCommand_noTabs
=== RUN   TestAuthMethodListCommand
=== PAUSE TestAuthMethodListCommand
=== CONT  TestAuthMethodListCommand_noTabs
=== CONT  TestAuthMethodListCommand
--- PASS: TestAuthMethodListCommand_noTabs (0.00s)
WARNING: bootstrap = true: do not enable unless necessary
TestAuthMethodListCommand - 2019/12/06 06:02:45.438340 [WARN] agent: Node name "Node a7231cd2-56ef-e0a8-2d6b-45ae6fcf0383" will not be discoverable via DNS due to invalid characters. Valid characters include all alpha-numerics and dashes.
TestAuthMethodListCommand - 2019/12/06 06:02:45.439882 [DEBUG] tlsutil: Update with version 1
TestAuthMethodListCommand - 2019/12/06 06:02:45.452463 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
2019/12/06 06:02:47 [INFO]  raft: Initial configuration (index=1): [{Suffrage:Voter ID:a7231cd2-56ef-e0a8-2d6b-45ae6fcf0383 Address:127.0.0.1:16006}]
2019/12/06 06:02:47 [INFO]  raft: Node at 127.0.0.1:16006 [Follower] entering Follower state (Leader: "")
TestAuthMethodListCommand - 2019/12/06 06:02:47.198096 [INFO] serf: EventMemberJoin: Node a7231cd2-56ef-e0a8-2d6b-45ae6fcf0383.dc1 127.0.0.1
TestAuthMethodListCommand - 2019/12/06 06:02:47.201915 [INFO] serf: EventMemberJoin: Node a7231cd2-56ef-e0a8-2d6b-45ae6fcf0383 127.0.0.1
TestAuthMethodListCommand - 2019/12/06 06:02:47.203182 [INFO] consul: Adding LAN server Node a7231cd2-56ef-e0a8-2d6b-45ae6fcf0383 (Addr: tcp/127.0.0.1:16006) (DC: dc1)
TestAuthMethodListCommand - 2019/12/06 06:02:47.204114 [INFO] consul: Handled member-join event for server "Node a7231cd2-56ef-e0a8-2d6b-45ae6fcf0383.dc1" in area "wan"
TestAuthMethodListCommand - 2019/12/06 06:02:47.204871 [INFO] agent: Started DNS server 127.0.0.1:16001 (udp)
TestAuthMethodListCommand - 2019/12/06 06:02:47.204953 [INFO] agent: Started DNS server 127.0.0.1:16001 (tcp)
TestAuthMethodListCommand - 2019/12/06 06:02:47.207926 [INFO] agent: Started HTTP server on 127.0.0.1:16002 (tcp)
TestAuthMethodListCommand - 2019/12/06 06:02:47.208413 [INFO] agent: started state syncer
2019/12/06 06:02:47 [WARN]  raft: Heartbeat timeout from "" reached, starting election
2019/12/06 06:02:47 [INFO]  raft: Node at 127.0.0.1:16006 [Candidate] entering Candidate state in term 2
2019/12/06 06:02:48 [INFO]  raft: Election won. Tally: 1
2019/12/06 06:02:48 [INFO]  raft: Node at 127.0.0.1:16006 [Leader] entering Leader state
TestAuthMethodListCommand - 2019/12/06 06:02:48.935532 [INFO] consul: cluster leadership acquired
TestAuthMethodListCommand - 2019/12/06 06:02:48.935978 [INFO] consul: New leader elected: Node a7231cd2-56ef-e0a8-2d6b-45ae6fcf0383
TestAuthMethodListCommand - 2019/12/06 06:02:49.017953 [ERR] agent: failed to sync remote state: ACL not found
TestAuthMethodListCommand - 2019/12/06 06:02:49.377742 [INFO] acl: initializing acls
TestAuthMethodListCommand - 2019/12/06 06:02:49.610167 [INFO] consul: Created ACL 'global-management' policy
TestAuthMethodListCommand - 2019/12/06 06:02:49.610263 [WARN] consul: Configuring a non-UUID master token is deprecated
TestAuthMethodListCommand - 2019/12/06 06:02:49.993656 [INFO] consul: Bootstrapped ACL master token from configuration
TestAuthMethodListCommand - 2019/12/06 06:02:50.035794 [ERR] agent: failed to sync remote state: ACL not found
TestAuthMethodListCommand - 2019/12/06 06:02:50.335657 [INFO] consul: Created ACL anonymous token from configuration
TestAuthMethodListCommand - 2019/12/06 06:02:50.336656 [INFO] serf: EventMemberUpdate: Node a7231cd2-56ef-e0a8-2d6b-45ae6fcf0383
TestAuthMethodListCommand - 2019/12/06 06:02:50.337259 [INFO] serf: EventMemberUpdate: Node a7231cd2-56ef-e0a8-2d6b-45ae6fcf0383.dc1
TestAuthMethodListCommand - 2019/12/06 06:02:51.737422 [INFO] connect: initialized primary datacenter CA with provider "consul"
TestAuthMethodListCommand - 2019/12/06 06:02:51.737924 [DEBUG] consul: Skipping self join check for "Node a7231cd2-56ef-e0a8-2d6b-45ae6fcf0383" since the cluster is too small
TestAuthMethodListCommand - 2019/12/06 06:02:51.738017 [INFO] consul: member 'Node a7231cd2-56ef-e0a8-2d6b-45ae6fcf0383' joined, marking health alive
TestAuthMethodListCommand - 2019/12/06 06:02:51.928156 [DEBUG] consul: Skipping self join check for "Node a7231cd2-56ef-e0a8-2d6b-45ae6fcf0383" since the cluster is too small
=== RUN   TestAuthMethodListCommand/found_none
TestAuthMethodListCommand - 2019/12/06 06:02:51.961466 [DEBUG] http: Request GET /v1/acl/auth-methods (4.099428ms) from=127.0.0.1:59264
TestAuthMethodListCommand - 2019/12/06 06:02:52.320892 [DEBUG] http: Request PUT /v1/acl/auth-method (354.572546ms) from=127.0.0.1:59266
TestAuthMethodListCommand - 2019/12/06 06:02:52.601492 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
TestAuthMethodListCommand - 2019/12/06 06:02:52.606378 [DEBUG] http: Request PUT /v1/acl/auth-method (270.380596ms) from=127.0.0.1:59266
TestAuthMethodListCommand - 2019/12/06 06:02:52.895439 [DEBUG] http: Request PUT /v1/acl/auth-method (285.175938ms) from=127.0.0.1:59266
TestAuthMethodListCommand - 2019/12/06 06:02:53.237601 [DEBUG] http: Request PUT /v1/acl/auth-method (337.994828ms) from=127.0.0.1:59266
TestAuthMethodListCommand - 2019/12/06 06:02:53.570218 [DEBUG] http: Request PUT /v1/acl/auth-method (329.506632ms) from=127.0.0.1:59266
=== RUN   TestAuthMethodListCommand/found_some
TestAuthMethodListCommand - 2019/12/06 06:02:53.578728 [DEBUG] http: Request GET /v1/acl/auth-methods (1.514368ms) from=127.0.0.1:59268
TestAuthMethodListCommand - 2019/12/06 06:02:53.582460 [INFO] agent: Requesting shutdown
TestAuthMethodListCommand - 2019/12/06 06:02:53.582629 [INFO] consul: shutting down server
TestAuthMethodListCommand - 2019/12/06 06:02:53.583009 [WARN] serf: Shutdown without a Leave
TestAuthMethodListCommand - 2019/12/06 06:02:53.709318 [WARN] serf: Shutdown without a Leave
TestAuthMethodListCommand - 2019/12/06 06:02:53.809436 [INFO] manager: shutting down
TestAuthMethodListCommand - 2019/12/06 06:02:53.810329 [INFO] agent: consul server down
TestAuthMethodListCommand - 2019/12/06 06:02:53.810392 [INFO] agent: shutdown complete
TestAuthMethodListCommand - 2019/12/06 06:02:53.810445 [INFO] agent: Stopping DNS server 127.0.0.1:16001 (tcp)
TestAuthMethodListCommand - 2019/12/06 06:02:53.810574 [INFO] agent: Stopping DNS server 127.0.0.1:16001 (udp)
TestAuthMethodListCommand - 2019/12/06 06:02:53.810718 [INFO] agent: Stopping HTTP server 127.0.0.1:16002 (tcp)
TestAuthMethodListCommand - 2019/12/06 06:02:53.811499 [INFO] agent: Waiting for endpoints to shut down
TestAuthMethodListCommand - 2019/12/06 06:02:53.811623 [INFO] agent: Endpoints down
--- PASS: TestAuthMethodListCommand (8.46s)
    --- PASS: TestAuthMethodListCommand/found_none (0.01s)
    --- PASS: TestAuthMethodListCommand/found_some (0.01s)
PASS
ok  	github.com/hashicorp/consul/command/acl/authmethod/list	8.742s
=== RUN   TestAuthMethodReadCommand_noTabs
=== PAUSE TestAuthMethodReadCommand_noTabs
=== RUN   TestAuthMethodReadCommand
=== PAUSE TestAuthMethodReadCommand
=== CONT  TestAuthMethodReadCommand_noTabs
--- PASS: TestAuthMethodReadCommand_noTabs (0.00s)
=== CONT  TestAuthMethodReadCommand
WARNING: bootstrap = true: do not enable unless necessary
TestAuthMethodReadCommand - 2019/12/06 06:02:55.777363 [WARN] agent: Node name "Node ffca46a1-2401-5f68-d734-5cd4cfdcab2e" will not be discoverable via DNS due to invalid characters. Valid characters include all alpha-numerics and dashes.
TestAuthMethodReadCommand - 2019/12/06 06:02:55.779597 [DEBUG] tlsutil: Update with version 1
TestAuthMethodReadCommand - 2019/12/06 06:02:55.790436 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
2019/12/06 06:02:57 [INFO]  raft: Initial configuration (index=1): [{Suffrage:Voter ID:ffca46a1-2401-5f68-d734-5cd4cfdcab2e Address:127.0.0.1:32506}]
2019/12/06 06:02:57 [INFO]  raft: Node at 127.0.0.1:32506 [Follower] entering Follower state (Leader: "")
TestAuthMethodReadCommand - 2019/12/06 06:02:57.065795 [INFO] serf: EventMemberJoin: Node ffca46a1-2401-5f68-d734-5cd4cfdcab2e.dc1 127.0.0.1
TestAuthMethodReadCommand - 2019/12/06 06:02:57.085868 [INFO] serf: EventMemberJoin: Node ffca46a1-2401-5f68-d734-5cd4cfdcab2e 127.0.0.1
TestAuthMethodReadCommand - 2019/12/06 06:02:57.087043 [INFO] consul: Adding LAN server Node ffca46a1-2401-5f68-d734-5cd4cfdcab2e (Addr: tcp/127.0.0.1:32506) (DC: dc1)
TestAuthMethodReadCommand - 2019/12/06 06:02:57.087256 [INFO] consul: Handled member-join event for server "Node ffca46a1-2401-5f68-d734-5cd4cfdcab2e.dc1" in area "wan"
TestAuthMethodReadCommand - 2019/12/06 06:02:57.087767 [INFO] agent: Started DNS server 127.0.0.1:32501 (tcp)
TestAuthMethodReadCommand - 2019/12/06 06:02:57.087844 [INFO] agent: Started DNS server 127.0.0.1:32501 (udp)
TestAuthMethodReadCommand - 2019/12/06 06:02:57.091082 [INFO] agent: Started HTTP server on 127.0.0.1:32502 (tcp)
TestAuthMethodReadCommand - 2019/12/06 06:02:57.091251 [INFO] agent: started state syncer
2019/12/06 06:02:57 [WARN]  raft: Heartbeat timeout from "" reached, starting election
2019/12/06 06:02:57 [INFO]  raft: Node at 127.0.0.1:32506 [Candidate] entering Candidate state in term 2
2019/12/06 06:02:57 [INFO]  raft: Election won. Tally: 1
2019/12/06 06:02:57 [INFO]  raft: Node at 127.0.0.1:32506 [Leader] entering Leader state
TestAuthMethodReadCommand - 2019/12/06 06:02:57.909951 [INFO] consul: cluster leadership acquired
TestAuthMethodReadCommand - 2019/12/06 06:02:57.910943 [INFO] consul: New leader elected: Node ffca46a1-2401-5f68-d734-5cd4cfdcab2e
TestAuthMethodReadCommand - 2019/12/06 06:02:57.943812 [ERR] agent: failed to sync remote state: ACL not found
TestAuthMethodReadCommand - 2019/12/06 06:02:58.393511 [INFO] acl: initializing acls
TestAuthMethodReadCommand - 2019/12/06 06:02:58.610514 [INFO] consul: Created ACL 'global-management' policy
TestAuthMethodReadCommand - 2019/12/06 06:02:58.610602 [WARN] consul: Configuring a non-UUID master token is deprecated
TestAuthMethodReadCommand - 2019/12/06 06:02:58.640758 [INFO] acl: initializing acls
TestAuthMethodReadCommand - 2019/12/06 06:02:58.641106 [WARN] consul: Configuring a non-UUID master token is deprecated
TestAuthMethodReadCommand - 2019/12/06 06:02:58.893931 [INFO] consul: Bootstrapped ACL master token from configuration
TestAuthMethodReadCommand - 2019/12/06 06:02:59.277193 [INFO] consul: Created ACL anonymous token from configuration
TestAuthMethodReadCommand - 2019/12/06 06:02:59.278072 [INFO] serf: EventMemberUpdate: Node ffca46a1-2401-5f68-d734-5cd4cfdcab2e
TestAuthMethodReadCommand - 2019/12/06 06:02:59.278655 [INFO] serf: EventMemberUpdate: Node ffca46a1-2401-5f68-d734-5cd4cfdcab2e.dc1
TestAuthMethodReadCommand - 2019/12/06 06:02:59.279354 [INFO] consul: Bootstrapped ACL master token from configuration
TestAuthMethodReadCommand - 2019/12/06 06:02:59.279544 [DEBUG] acl: transitioning out of legacy ACL mode
TestAuthMethodReadCommand - 2019/12/06 06:02:59.280390 [INFO] serf: EventMemberUpdate: Node ffca46a1-2401-5f68-d734-5cd4cfdcab2e
TestAuthMethodReadCommand - 2019/12/06 06:02:59.281055 [INFO] serf: EventMemberUpdate: Node ffca46a1-2401-5f68-d734-5cd4cfdcab2e.dc1
TestAuthMethodReadCommand - 2019/12/06 06:03:00.168339 [INFO] connect: initialized primary datacenter CA with provider "consul"
TestAuthMethodReadCommand - 2019/12/06 06:03:00.168814 [DEBUG] consul: Skipping self join check for "Node ffca46a1-2401-5f68-d734-5cd4cfdcab2e" since the cluster is too small
TestAuthMethodReadCommand - 2019/12/06 06:03:00.168916 [INFO] consul: member 'Node ffca46a1-2401-5f68-d734-5cd4cfdcab2e' joined, marking health alive
TestAuthMethodReadCommand - 2019/12/06 06:03:00.337201 [DEBUG] consul: Skipping self join check for "Node ffca46a1-2401-5f68-d734-5cd4cfdcab2e" since the cluster is too small
TestAuthMethodReadCommand - 2019/12/06 06:03:00.337705 [DEBUG] consul: Skipping self join check for "Node ffca46a1-2401-5f68-d734-5cd4cfdcab2e" since the cluster is too small
=== RUN   TestAuthMethodReadCommand/name_required
=== RUN   TestAuthMethodReadCommand/not_found
TestAuthMethodReadCommand - 2019/12/06 06:03:00.368147 [DEBUG] http: Request GET /v1/acl/auth-method/notfound (2.846066ms) from=127.0.0.1:47392
=== RUN   TestAuthMethodReadCommand/read_by_name
TestAuthMethodReadCommand - 2019/12/06 06:03:00.585250 [DEBUG] http: Request PUT /v1/acl/auth-method (212.973933ms) from=127.0.0.1:47394
TestAuthMethodReadCommand - 2019/12/06 06:03:00.595932 [DEBUG] http: Request GET /v1/acl/auth-method/test-40b40e34-5add-2e8e-4b04-675eec5d3ca1 (1.716373ms) from=127.0.0.1:47396
TestAuthMethodReadCommand - 2019/12/06 06:03:00.606044 [INFO] agent: Requesting shutdown
TestAuthMethodReadCommand - 2019/12/06 06:03:00.606292 [INFO] consul: shutting down server
TestAuthMethodReadCommand - 2019/12/06 06:03:00.606438 [WARN] serf: Shutdown without a Leave
TestAuthMethodReadCommand - 2019/12/06 06:03:00.667682 [WARN] serf: Shutdown without a Leave
TestAuthMethodReadCommand - 2019/12/06 06:03:00.726088 [INFO] manager: shutting down
TestAuthMethodReadCommand - 2019/12/06 06:03:00.726604 [INFO] agent: consul server down
TestAuthMethodReadCommand - 2019/12/06 06:03:00.726666 [INFO] agent: shutdown complete
TestAuthMethodReadCommand - 2019/12/06 06:03:00.726723 [INFO] agent: Stopping DNS server 127.0.0.1:32501 (tcp)
TestAuthMethodReadCommand - 2019/12/06 06:03:00.726859 [INFO] agent: Stopping DNS server 127.0.0.1:32501 (udp)
TestAuthMethodReadCommand - 2019/12/06 06:03:00.727014 [INFO] agent: Stopping HTTP server 127.0.0.1:32502 (tcp)
TestAuthMethodReadCommand - 2019/12/06 06:03:00.727864 [INFO] agent: Waiting for endpoints to shut down
TestAuthMethodReadCommand - 2019/12/06 06:03:00.728063 [INFO] agent: Endpoints down
--- PASS: TestAuthMethodReadCommand (5.05s)
    --- PASS: TestAuthMethodReadCommand/name_required (0.00s)
    --- PASS: TestAuthMethodReadCommand/not_found (0.02s)
    --- PASS: TestAuthMethodReadCommand/read_by_name (0.24s)
PASS
ok  	github.com/hashicorp/consul/command/acl/authmethod/read	5.355s
=== RUN   TestAuthMethodUpdateCommand_noTabs
=== PAUSE TestAuthMethodUpdateCommand_noTabs
=== RUN   TestAuthMethodUpdateCommand
=== PAUSE TestAuthMethodUpdateCommand
=== RUN   TestAuthMethodUpdateCommand_noMerge
=== PAUSE TestAuthMethodUpdateCommand_noMerge
=== RUN   TestAuthMethodUpdateCommand_k8s
=== PAUSE TestAuthMethodUpdateCommand_k8s
=== RUN   TestAuthMethodUpdateCommand_k8s_noMerge
=== PAUSE TestAuthMethodUpdateCommand_k8s_noMerge
=== CONT  TestAuthMethodUpdateCommand_k8s_noMerge
=== CONT  TestAuthMethodUpdateCommand_noMerge
=== CONT  TestAuthMethodUpdateCommand
=== CONT  TestAuthMethodUpdateCommand_noTabs
=== CONT  TestAuthMethodUpdateCommand_k8s
--- PASS: TestAuthMethodUpdateCommand_noTabs (0.00s)
WARNING: bootstrap = true: do not enable unless necessary
TestAuthMethodUpdateCommand_noMerge - 2019/12/06 06:03:33.746425 [WARN] agent: Node name "Node 4ff02853-3203-cef1-3fdc-c5cea0f0a6e7" will not be discoverable via DNS due to invalid characters. Valid characters include all alpha-numerics and dashes.
TestAuthMethodUpdateCommand_noMerge - 2019/12/06 06:03:33.747373 [DEBUG] tlsutil: Update with version 1
TestAuthMethodUpdateCommand_noMerge - 2019/12/06 06:03:33.756145 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
WARNING: bootstrap = true: do not enable unless necessary
TestAuthMethodUpdateCommand_k8s_noMerge - 2019/12/06 06:03:33.767290 [WARN] agent: Node name "Node d9c294a3-e60d-0bd1-fbbe-26c9f4794492" will not be discoverable via DNS due to invalid characters. Valid characters include all alpha-numerics and dashes.
TestAuthMethodUpdateCommand_k8s_noMerge - 2019/12/06 06:03:33.767737 [DEBUG] tlsutil: Update with version 1
TestAuthMethodUpdateCommand_k8s_noMerge - 2019/12/06 06:03:33.793003 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
WARNING: bootstrap = true: do not enable unless necessary
TestAuthMethodUpdateCommand - 2019/12/06 06:03:33.794455 [WARN] agent: Node name "Node d0af5be7-023a-21c5-586c-c90bce328081" will not be discoverable via DNS due to invalid characters. Valid characters include all alpha-numerics and dashes.
TestAuthMethodUpdateCommand - 2019/12/06 06:03:33.796818 [DEBUG] tlsutil: Update with version 1
TestAuthMethodUpdateCommand - 2019/12/06 06:03:33.801264 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
WARNING: bootstrap = true: do not enable unless necessary
TestAuthMethodUpdateCommand_k8s - 2019/12/06 06:03:33.823328 [WARN] agent: Node name "Node fcbd233f-d69b-b621-92eb-3f39066375f4" will not be discoverable via DNS due to invalid characters. Valid characters include all alpha-numerics and dashes.
TestAuthMethodUpdateCommand_k8s - 2019/12/06 06:03:33.823919 [DEBUG] tlsutil: Update with version 1
TestAuthMethodUpdateCommand_k8s - 2019/12/06 06:03:33.829400 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
2019/12/06 06:03:35 [INFO]  raft: Initial configuration (index=1): [{Suffrage:Voter ID:fcbd233f-d69b-b621-92eb-3f39066375f4 Address:127.0.0.1:53524}]
2019/12/06 06:03:35 [INFO]  raft: Node at 127.0.0.1:53524 [Follower] entering Follower state (Leader: "")
TestAuthMethodUpdateCommand_k8s - 2019/12/06 06:03:35.237602 [INFO] serf: EventMemberJoin: Node fcbd233f-d69b-b621-92eb-3f39066375f4.dc1 127.0.0.1
TestAuthMethodUpdateCommand_k8s - 2019/12/06 06:03:35.241092 [INFO] serf: EventMemberJoin: Node fcbd233f-d69b-b621-92eb-3f39066375f4 127.0.0.1
TestAuthMethodUpdateCommand_k8s - 2019/12/06 06:03:35.242655 [INFO] consul: Handled member-join event for server "Node fcbd233f-d69b-b621-92eb-3f39066375f4.dc1" in area "wan"
TestAuthMethodUpdateCommand_k8s - 2019/12/06 06:03:35.242926 [INFO] consul: Adding LAN server Node fcbd233f-d69b-b621-92eb-3f39066375f4 (Addr: tcp/127.0.0.1:53524) (DC: dc1)
TestAuthMethodUpdateCommand_k8s - 2019/12/06 06:03:35.244080 [INFO] agent: Started DNS server 127.0.0.1:53519 (udp)
TestAuthMethodUpdateCommand_k8s - 2019/12/06 06:03:35.244225 [INFO] agent: Started DNS server 127.0.0.1:53519 (tcp)
TestAuthMethodUpdateCommand_k8s - 2019/12/06 06:03:35.253103 [INFO] agent: Started HTTP server on 127.0.0.1:53520 (tcp)
TestAuthMethodUpdateCommand_k8s - 2019/12/06 06:03:35.253274 [INFO] agent: started state syncer
2019/12/06 06:03:35 [WARN]  raft: Heartbeat timeout from "" reached, starting election
2019/12/06 06:03:35 [INFO]  raft: Node at 127.0.0.1:53524 [Candidate] entering Candidate state in term 2
2019/12/06 06:03:35 [INFO]  raft: Initial configuration (index=1): [{Suffrage:Voter ID:d9c294a3-e60d-0bd1-fbbe-26c9f4794492 Address:127.0.0.1:53506}]
2019/12/06 06:03:35 [INFO]  raft: Initial configuration (index=1): [{Suffrage:Voter ID:4ff02853-3203-cef1-3fdc-c5cea0f0a6e7 Address:127.0.0.1:53512}]
2019/12/06 06:03:35 [INFO]  raft: Initial configuration (index=1): [{Suffrage:Voter ID:d0af5be7-023a-21c5-586c-c90bce328081 Address:127.0.0.1:53518}]
TestAuthMethodUpdateCommand - 2019/12/06 06:03:35.367443 [INFO] serf: EventMemberJoin: Node d0af5be7-023a-21c5-586c-c90bce328081.dc1 127.0.0.1
2019/12/06 06:03:35 [INFO]  raft: Node at 127.0.0.1:53506 [Follower] entering Follower state (Leader: "")
TestAuthMethodUpdateCommand_k8s_noMerge - 2019/12/06 06:03:35.372244 [INFO] serf: EventMemberJoin: Node d9c294a3-e60d-0bd1-fbbe-26c9f4794492.dc1 127.0.0.1
TestAuthMethodUpdateCommand_noMerge - 2019/12/06 06:03:35.377580 [INFO] serf: EventMemberJoin: Node 4ff02853-3203-cef1-3fdc-c5cea0f0a6e7.dc1 127.0.0.1
2019/12/06 06:03:35 [INFO]  raft: Node at 127.0.0.1:53518 [Follower] entering Follower state (Leader: "")
2019/12/06 06:03:35 [INFO]  raft: Node at 127.0.0.1:53512 [Follower] entering Follower state (Leader: "")
TestAuthMethodUpdateCommand_k8s_noMerge - 2019/12/06 06:03:35.406496 [INFO] serf: EventMemberJoin: Node d9c294a3-e60d-0bd1-fbbe-26c9f4794492 127.0.0.1
TestAuthMethodUpdateCommand_k8s_noMerge - 2019/12/06 06:03:35.407862 [INFO] agent: Started DNS server 127.0.0.1:53501 (udp)
TestAuthMethodUpdateCommand_noMerge - 2019/12/06 06:03:35.410928 [INFO] serf: EventMemberJoin: Node 4ff02853-3203-cef1-3fdc-c5cea0f0a6e7 127.0.0.1
TestAuthMethodUpdateCommand_noMerge - 2019/12/06 06:03:35.412218 [INFO] agent: Started DNS server 127.0.0.1:53507 (udp)
TestAuthMethodUpdateCommand_noMerge - 2019/12/06 06:03:35.413141 [INFO] consul: Adding LAN server Node 4ff02853-3203-cef1-3fdc-c5cea0f0a6e7 (Addr: tcp/127.0.0.1:53512) (DC: dc1)
TestAuthMethodUpdateCommand_noMerge - 2019/12/06 06:03:35.413359 [INFO] consul: Handled member-join event for server "Node 4ff02853-3203-cef1-3fdc-c5cea0f0a6e7.dc1" in area "wan"
TestAuthMethodUpdateCommand_noMerge - 2019/12/06 06:03:35.413873 [INFO] agent: Started DNS server 127.0.0.1:53507 (tcp)
TestAuthMethodUpdateCommand_k8s_noMerge - 2019/12/06 06:03:35.414816 [INFO] consul: Adding LAN server Node d9c294a3-e60d-0bd1-fbbe-26c9f4794492 (Addr: tcp/127.0.0.1:53506) (DC: dc1)
TestAuthMethodUpdateCommand_k8s_noMerge - 2019/12/06 06:03:35.415498 [INFO] consul: Handled member-join event for server "Node d9c294a3-e60d-0bd1-fbbe-26c9f4794492.dc1" in area "wan"
TestAuthMethodUpdateCommand_k8s_noMerge - 2019/12/06 06:03:35.417411 [INFO] agent: Started DNS server 127.0.0.1:53501 (tcp)
TestAuthMethodUpdateCommand - 2019/12/06 06:03:35.418235 [INFO] serf: EventMemberJoin: Node d0af5be7-023a-21c5-586c-c90bce328081 127.0.0.1
TestAuthMethodUpdateCommand - 2019/12/06 06:03:35.422953 [INFO] agent: Started DNS server 127.0.0.1:53513 (udp)
TestAuthMethodUpdateCommand_k8s_noMerge - 2019/12/06 06:03:35.426987 [INFO] agent: Started HTTP server on 127.0.0.1:53502 (tcp)
TestAuthMethodUpdateCommand_k8s_noMerge - 2019/12/06 06:03:35.437493 [INFO] agent: started state syncer
TestAuthMethodUpdateCommand - 2019/12/06 06:03:35.454367 [INFO] consul: Adding LAN server Node d0af5be7-023a-21c5-586c-c90bce328081 (Addr: tcp/127.0.0.1:53518) (DC: dc1)
2019/12/06 06:03:35 [WARN]  raft: Heartbeat timeout from "" reached, starting election
2019/12/06 06:03:35 [INFO]  raft: Node at 127.0.0.1:53506 [Candidate] entering Candidate state in term 2
TestAuthMethodUpdateCommand - 2019/12/06 06:03:35.467786 [INFO] consul: Handled member-join event for server "Node d0af5be7-023a-21c5-586c-c90bce328081.dc1" in area "wan"
2019/12/06 06:03:35 [WARN]  raft: Heartbeat timeout from "" reached, starting election
2019/12/06 06:03:35 [INFO]  raft: Node at 127.0.0.1:53518 [Candidate] entering Candidate state in term 2
TestAuthMethodUpdateCommand - 2019/12/06 06:03:35.458518 [INFO] agent: Started DNS server 127.0.0.1:53513 (tcp)
2019/12/06 06:03:35 [WARN]  raft: Heartbeat timeout from "" reached, starting election
2019/12/06 06:03:35 [INFO]  raft: Node at 127.0.0.1:53512 [Candidate] entering Candidate state in term 2
TestAuthMethodUpdateCommand_noMerge - 2019/12/06 06:03:35.523774 [INFO] agent: Started HTTP server on 127.0.0.1:53508 (tcp)
TestAuthMethodUpdateCommand_noMerge - 2019/12/06 06:03:35.523933 [INFO] agent: started state syncer
TestAuthMethodUpdateCommand - 2019/12/06 06:03:35.541997 [INFO] agent: Started HTTP server on 127.0.0.1:53514 (tcp)
TestAuthMethodUpdateCommand - 2019/12/06 06:03:35.542318 [INFO] agent: started state syncer
2019/12/06 06:03:36 [INFO]  raft: Election won. Tally: 1
2019/12/06 06:03:36 [INFO]  raft: Node at 127.0.0.1:53524 [Leader] entering Leader state
TestAuthMethodUpdateCommand_k8s - 2019/12/06 06:03:36.230660 [INFO] consul: cluster leadership acquired
TestAuthMethodUpdateCommand_k8s - 2019/12/06 06:03:36.231173 [INFO] consul: New leader elected: Node fcbd233f-d69b-b621-92eb-3f39066375f4
2019/12/06 06:03:36 [INFO]  raft: Election won. Tally: 1
2019/12/06 06:03:36 [INFO]  raft: Node at 127.0.0.1:53512 [Leader] entering Leader state
TestAuthMethodUpdateCommand_noMerge - 2019/12/06 06:03:36.353406 [INFO] consul: cluster leadership acquired
TestAuthMethodUpdateCommand_noMerge - 2019/12/06 06:03:36.353857 [INFO] consul: New leader elected: Node 4ff02853-3203-cef1-3fdc-c5cea0f0a6e7
2019/12/06 06:03:36 [INFO]  raft: Election won. Tally: 1
2019/12/06 06:03:36 [INFO]  raft: Node at 127.0.0.1:53506 [Leader] entering Leader state
TestAuthMethodUpdateCommand_k8s_noMerge - 2019/12/06 06:03:36.356054 [INFO] consul: cluster leadership acquired
TestAuthMethodUpdateCommand_k8s_noMerge - 2019/12/06 06:03:36.356452 [INFO] consul: New leader elected: Node d9c294a3-e60d-0bd1-fbbe-26c9f4794492
2019/12/06 06:03:36 [INFO]  raft: Election won. Tally: 1
2019/12/06 06:03:36 [INFO]  raft: Node at 127.0.0.1:53518 [Leader] entering Leader state
TestAuthMethodUpdateCommand - 2019/12/06 06:03:36.361478 [INFO] consul: cluster leadership acquired
TestAuthMethodUpdateCommand - 2019/12/06 06:03:36.361857 [INFO] consul: New leader elected: Node d0af5be7-023a-21c5-586c-c90bce328081
TestAuthMethodUpdateCommand_k8s - 2019/12/06 06:03:36.413654 [ERR] agent: failed to sync remote state: ACL not found
TestAuthMethodUpdateCommand_noMerge - 2019/12/06 06:03:36.430778 [ERR] agent: failed to sync remote state: ACL not found
TestAuthMethodUpdateCommand - 2019/12/06 06:03:36.507042 [ERR] agent: failed to sync remote state: ACL not found
TestAuthMethodUpdateCommand_k8s_noMerge - 2019/12/06 06:03:36.699372 [ERR] agent: failed to sync remote state: ACL not found
TestAuthMethodUpdateCommand_k8s - 2019/12/06 06:03:36.714941 [INFO] acl: initializing acls
TestAuthMethodUpdateCommand_k8s - 2019/12/06 06:03:36.794891 [INFO] acl: initializing acls
TestAuthMethodUpdateCommand_noMerge - 2019/12/06 06:03:36.818598 [INFO] acl: initializing acls
TestAuthMethodUpdateCommand - 2019/12/06 06:03:36.827619 [INFO] acl: initializing acls
TestAuthMethodUpdateCommand_k8s_noMerge - 2019/12/06 06:03:36.830583 [INFO] acl: initializing acls
TestAuthMethodUpdateCommand_k8s - 2019/12/06 06:03:36.942693 [INFO] consul: Created ACL 'global-management' policy
TestAuthMethodUpdateCommand_k8s - 2019/12/06 06:03:36.942819 [WARN] consul: Configuring a non-UUID master token is deprecated
TestAuthMethodUpdateCommand_noMerge - 2019/12/06 06:03:36.975390 [INFO] acl: initializing acls
TestAuthMethodUpdateCommand_k8s_noMerge - 2019/12/06 06:03:36.977315 [INFO] acl: initializing acls
TestAuthMethodUpdateCommand - 2019/12/06 06:03:37.010207 [INFO] acl: initializing acls
TestAuthMethodUpdateCommand_k8s - 2019/12/06 06:03:37.024556 [ERR] agent: failed to sync remote state: ACL not found
TestAuthMethodUpdateCommand_noMerge - 2019/12/06 06:03:37.069033 [INFO] consul: Created ACL 'global-management' policy
TestAuthMethodUpdateCommand_noMerge - 2019/12/06 06:03:37.069141 [WARN] consul: Configuring a non-UUID master token is deprecated
TestAuthMethodUpdateCommand - 2019/12/06 06:03:37.257875 [INFO] consul: Created ACL 'global-management' policy
TestAuthMethodUpdateCommand - 2019/12/06 06:03:37.258025 [WARN] consul: Configuring a non-UUID master token is deprecated
TestAuthMethodUpdateCommand_k8s_noMerge - 2019/12/06 06:03:37.263960 [INFO] consul: Created ACL 'global-management' policy
TestAuthMethodUpdateCommand_k8s_noMerge - 2019/12/06 06:03:37.264100 [WARN] consul: Configuring a non-UUID master token is deprecated
TestAuthMethodUpdateCommand - 2019/12/06 06:03:37.267319 [INFO] consul: Created ACL 'global-management' policy
TestAuthMethodUpdateCommand - 2019/12/06 06:03:37.267444 [WARN] consul: Configuring a non-UUID master token is deprecated
TestAuthMethodUpdateCommand_k8s_noMerge - 2019/12/06 06:03:37.268107 [INFO] consul: Created ACL 'global-management' policy
TestAuthMethodUpdateCommand_k8s_noMerge - 2019/12/06 06:03:37.268210 [WARN] consul: Configuring a non-UUID master token is deprecated
TestAuthMethodUpdateCommand_k8s - 2019/12/06 06:03:37.690438 [INFO] consul: Bootstrapped ACL master token from configuration
TestAuthMethodUpdateCommand_k8s - 2019/12/06 06:03:37.695748 [INFO] consul: Created ACL 'global-management' policy
TestAuthMethodUpdateCommand_k8s - 2019/12/06 06:03:37.695840 [WARN] consul: Configuring a non-UUID master token is deprecated
TestAuthMethodUpdateCommand_k8s_noMerge - 2019/12/06 06:03:37.899117 [INFO] consul: Bootstrapped ACL master token from configuration
TestAuthMethodUpdateCommand_noMerge - 2019/12/06 06:03:37.901791 [INFO] consul: Created ACL 'global-management' policy
TestAuthMethodUpdateCommand_noMerge - 2019/12/06 06:03:37.901883 [WARN] consul: Configuring a non-UUID master token is deprecated
TestAuthMethodUpdateCommand_noMerge - 2019/12/06 06:03:37.901912 [INFO] consul: Bootstrapped ACL master token from configuration
TestAuthMethodUpdateCommand_k8s - 2019/12/06 06:03:37.996787 [INFO] consul: Created ACL anonymous token from configuration
TestAuthMethodUpdateCommand_k8s - 2019/12/06 06:03:37.997758 [INFO] serf: EventMemberUpdate: Node fcbd233f-d69b-b621-92eb-3f39066375f4
TestAuthMethodUpdateCommand_k8s - 2019/12/06 06:03:38.001167 [INFO] serf: EventMemberUpdate: Node fcbd233f-d69b-b621-92eb-3f39066375f4.dc1
TestAuthMethodUpdateCommand_k8s_noMerge - 2019/12/06 06:03:38.108256 [INFO] consul: Bootstrapped ACL master token from configuration
TestAuthMethodUpdateCommand - 2019/12/06 06:03:38.117275 [INFO] consul: Bootstrapped ACL master token from configuration
TestAuthMethodUpdateCommand - 2019/12/06 06:03:38.117482 [INFO] consul: Bootstrapped ACL master token from configuration
TestAuthMethodUpdateCommand_noMerge - 2019/12/06 06:03:38.279375 [INFO] consul: Created ACL anonymous token from configuration
TestAuthMethodUpdateCommand_noMerge - 2019/12/06 06:03:38.279794 [INFO] consul: Created ACL anonymous token from configuration
TestAuthMethodUpdateCommand_noMerge - 2019/12/06 06:03:38.279859 [DEBUG] acl: transitioning out of legacy ACL mode
TestAuthMethodUpdateCommand_noMerge - 2019/12/06 06:03:38.280259 [INFO] serf: EventMemberUpdate: Node 4ff02853-3203-cef1-3fdc-c5cea0f0a6e7
TestAuthMethodUpdateCommand - 2019/12/06 06:03:38.280429 [INFO] consul: Created ACL anonymous token from configuration
TestAuthMethodUpdateCommand_noMerge - 2019/12/06 06:03:38.280898 [INFO] serf: EventMemberUpdate: Node 4ff02853-3203-cef1-3fdc-c5cea0f0a6e7.dc1
TestAuthMethodUpdateCommand - 2019/12/06 06:03:38.280927 [INFO] consul: Created ACL anonymous token from configuration
TestAuthMethodUpdateCommand - 2019/12/06 06:03:38.280979 [DEBUG] acl: transitioning out of legacy ACL mode
TestAuthMethodUpdateCommand - 2019/12/06 06:03:38.281197 [INFO] serf: EventMemberUpdate: Node d0af5be7-023a-21c5-586c-c90bce328081
TestAuthMethodUpdateCommand_noMerge - 2019/12/06 06:03:38.281731 [INFO] serf: EventMemberUpdate: Node 4ff02853-3203-cef1-3fdc-c5cea0f0a6e7
TestAuthMethodUpdateCommand - 2019/12/06 06:03:38.281823 [INFO] serf: EventMemberUpdate: Node d0af5be7-023a-21c5-586c-c90bce328081.dc1
TestAuthMethodUpdateCommand_noMerge - 2019/12/06 06:03:38.282425 [INFO] serf: EventMemberUpdate: Node 4ff02853-3203-cef1-3fdc-c5cea0f0a6e7.dc1
TestAuthMethodUpdateCommand - 2019/12/06 06:03:38.283680 [INFO] serf: EventMemberUpdate: Node d0af5be7-023a-21c5-586c-c90bce328081
TestAuthMethodUpdateCommand - 2019/12/06 06:03:38.284385 [INFO] serf: EventMemberUpdate: Node d0af5be7-023a-21c5-586c-c90bce328081.dc1
TestAuthMethodUpdateCommand_k8s_noMerge - 2019/12/06 06:03:38.314787 [ERR] agent: failed to sync remote state: ACL not found
TestAuthMethodUpdateCommand_k8s - 2019/12/06 06:03:38.379363 [INFO] consul: Created ACL anonymous token from configuration
TestAuthMethodUpdateCommand_k8s - 2019/12/06 06:03:38.379489 [DEBUG] acl: transitioning out of legacy ACL mode
TestAuthMethodUpdateCommand_k8s - 2019/12/06 06:03:38.380494 [INFO] serf: EventMemberUpdate: Node fcbd233f-d69b-b621-92eb-3f39066375f4
TestAuthMethodUpdateCommand_k8s - 2019/12/06 06:03:38.381200 [INFO] serf: EventMemberUpdate: Node fcbd233f-d69b-b621-92eb-3f39066375f4.dc1
TestAuthMethodUpdateCommand_k8s_noMerge - 2019/12/06 06:03:38.570966 [INFO] consul: Created ACL anonymous token from configuration
TestAuthMethodUpdateCommand_k8s_noMerge - 2019/12/06 06:03:38.571983 [INFO] serf: EventMemberUpdate: Node d9c294a3-e60d-0bd1-fbbe-26c9f4794492
TestAuthMethodUpdateCommand_k8s_noMerge - 2019/12/06 06:03:38.572705 [INFO] serf: EventMemberUpdate: Node d9c294a3-e60d-0bd1-fbbe-26c9f4794492.dc1
TestAuthMethodUpdateCommand_k8s_noMerge - 2019/12/06 06:03:38.574972 [INFO] consul: Created ACL anonymous token from configuration
TestAuthMethodUpdateCommand_k8s_noMerge - 2019/12/06 06:03:38.575055 [DEBUG] acl: transitioning out of legacy ACL mode
TestAuthMethodUpdateCommand_k8s_noMerge - 2019/12/06 06:03:38.576088 [INFO] serf: EventMemberUpdate: Node d9c294a3-e60d-0bd1-fbbe-26c9f4794492
TestAuthMethodUpdateCommand_k8s_noMerge - 2019/12/06 06:03:38.580780 [INFO] serf: EventMemberUpdate: Node d9c294a3-e60d-0bd1-fbbe-26c9f4794492.dc1
TestAuthMethodUpdateCommand_noMerge - 2019/12/06 06:03:38.678054 [INFO] agent: Synced node info
TestAuthMethodUpdateCommand_noMerge - 2019/12/06 06:03:38.678220 [DEBUG] agent: Node info in sync
=== RUN   TestAuthMethodUpdateCommand_noMerge/update_without_name
=== RUN   TestAuthMethodUpdateCommand_noMerge/update_nonexistent_method
TestAuthMethodUpdateCommand_noMerge - 2019/12/06 06:03:38.712080 [DEBUG] http: Request GET /v1/acl/auth-method/test (3.702419ms) from=127.0.0.1:57616
=== RUN   TestAuthMethodUpdateCommand_noMerge/update_all_fields
TestAuthMethodUpdateCommand - 2019/12/06 06:03:39.389918 [INFO] agent: Synced node info
TestAuthMethodUpdateCommand - 2019/12/06 06:03:39.390136 [DEBUG] agent: Node info in sync
=== RUN   TestAuthMethodUpdateCommand/update_without_name
=== RUN   TestAuthMethodUpdateCommand/update_nonexistent_method
TestAuthMethodUpdateCommand_noMerge - 2019/12/06 06:03:39.421466 [DEBUG] http: Request PUT /v1/acl/auth-method (705.782349ms) from=127.0.0.1:57618
TestAuthMethodUpdateCommand - 2019/12/06 06:03:39.445595 [DEBUG] http: Request GET /v1/acl/auth-method/test (4.89278ms) from=127.0.0.1:48828
=== RUN   TestAuthMethodUpdateCommand/update_all_fields
TestAuthMethodUpdateCommand_noMerge - 2019/12/06 06:03:39.472849 [DEBUG] http: Request GET /v1/acl/auth-method/test-1af1de3a-f72f-d185-e376-05aaf47c8c05 (3.860089ms) from=127.0.0.1:57624
TestAuthMethodUpdateCommand_k8s - 2019/12/06 06:03:39.870095 [INFO] connect: initialized primary datacenter CA with provider "consul"
TestAuthMethodUpdateCommand_k8s - 2019/12/06 06:03:39.870700 [DEBUG] consul: Skipping self join check for "Node fcbd233f-d69b-b621-92eb-3f39066375f4" since the cluster is too small
TestAuthMethodUpdateCommand_k8s - 2019/12/06 06:03:39.870829 [INFO] consul: member 'Node fcbd233f-d69b-b621-92eb-3f39066375f4' joined, marking health alive
TestAuthMethodUpdateCommand_noMerge - 2019/12/06 06:03:39.997687 [DEBUG] http: Request PUT /v1/acl/auth-method/test-1af1de3a-f72f-d185-e376-05aaf47c8c05 (520.502724ms) from=127.0.0.1:57624
TestAuthMethodUpdateCommand_noMerge - 2019/12/06 06:03:40.001799 [DEBUG] http: Request GET /v1/acl/auth-method/test-1af1de3a-f72f-d185-e376-05aaf47c8c05 (1.091692ms) from=127.0.0.1:57618
TestAuthMethodUpdateCommand_noMerge - 2019/12/06 06:03:40.005203 [INFO] agent: Requesting shutdown
TestAuthMethodUpdateCommand_noMerge - 2019/12/06 06:03:40.005341 [INFO] consul: shutting down server
TestAuthMethodUpdateCommand_noMerge - 2019/12/06 06:03:40.005398 [WARN] serf: Shutdown without a Leave
TestAuthMethodUpdateCommand_noMerge - 2019/12/06 06:03:40.085011 [WARN] serf: Shutdown without a Leave
TestAuthMethodUpdateCommand_k8s - 2019/12/06 06:03:40.092540 [DEBUG] consul: Skipping self join check for "Node fcbd233f-d69b-b621-92eb-3f39066375f4" since the cluster is too small
TestAuthMethodUpdateCommand_k8s - 2019/12/06 06:03:40.092999 [DEBUG] consul: Skipping self join check for "Node fcbd233f-d69b-b621-92eb-3f39066375f4" since the cluster is too small
=== RUN   TestAuthMethodUpdateCommand_k8s/update_all_fields
TestAuthMethodUpdateCommand_noMerge - 2019/12/06 06:03:40.268500 [INFO] manager: shutting down
TestAuthMethodUpdateCommand - 2019/12/06 06:03:40.278237 [DEBUG] http: Request PUT /v1/acl/auth-method (825.810129ms) from=127.0.0.1:48830
TestAuthMethodUpdateCommand - 2019/12/06 06:03:40.287360 [DEBUG] http: Request GET /v1/acl/auth-method/test-3b864c46-c2a4-48a0-cda1-9489ab6d8200 (1.764707ms) from=127.0.0.1:48836
TestAuthMethodUpdateCommand_noMerge - 2019/12/06 06:03:40.377020 [INFO] agent: consul server down
TestAuthMethodUpdateCommand_noMerge - 2019/12/06 06:03:40.377118 [INFO] agent: shutdown complete
TestAuthMethodUpdateCommand_noMerge - 2019/12/06 06:03:40.377193 [INFO] agent: Stopping DNS server 127.0.0.1:53507 (tcp)
TestAuthMethodUpdateCommand_noMerge - 2019/12/06 06:03:40.377380 [INFO] agent: Stopping DNS server 127.0.0.1:53507 (udp)
TestAuthMethodUpdateCommand_noMerge - 2019/12/06 06:03:40.377573 [INFO] agent: Stopping HTTP server 127.0.0.1:53508 (tcp)
TestAuthMethodUpdateCommand_noMerge - 2019/12/06 06:03:40.378130 [ERR] connect: Apply failed leadership lost while committing log
TestAuthMethodUpdateCommand_noMerge - 2019/12/06 06:03:40.378202 [ERR] consul: failed to establish leadership: leadership lost while committing log
TestAuthMethodUpdateCommand_noMerge - 2019/12/06 06:03:40.378506 [ERR] consul: failed to transfer leadership attempt 0/3: raft is already shutdown
TestAuthMethodUpdateCommand_noMerge - 2019/12/06 06:03:40.379392 [INFO] agent: Waiting for endpoints to shut down
TestAuthMethodUpdateCommand_noMerge - 2019/12/06 06:03:40.379649 [INFO] agent: Endpoints down
--- PASS: TestAuthMethodUpdateCommand_noMerge (6.90s)
    --- PASS: TestAuthMethodUpdateCommand_noMerge/update_without_name (0.00s)
    --- PASS: TestAuthMethodUpdateCommand_noMerge/update_nonexistent_method (0.01s)
    --- PASS: TestAuthMethodUpdateCommand_noMerge/update_all_fields (1.29s)
TestAuthMethodUpdateCommand_k8s_noMerge - 2019/12/06 06:03:40.384370 [INFO] connect: initialized primary datacenter CA with provider "consul"
TestAuthMethodUpdateCommand_k8s_noMerge - 2019/12/06 06:03:40.384860 [DEBUG] consul: Skipping self join check for "Node d9c294a3-e60d-0bd1-fbbe-26c9f4794492" since the cluster is too small
TestAuthMethodUpdateCommand_k8s_noMerge - 2019/12/06 06:03:40.384970 [INFO] consul: member 'Node d9c294a3-e60d-0bd1-fbbe-26c9f4794492' joined, marking health alive
TestAuthMethodUpdateCommand_k8s - 2019/12/06 06:03:40.387529 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
TestAuthMethodUpdateCommand_k8s - 2019/12/06 06:03:40.393817 [DEBUG] http: Request PUT /v1/acl/auth-method (257.5943ms) from=127.0.0.1:50932
TestAuthMethodUpdateCommand_k8s - 2019/12/06 06:03:40.403979 [DEBUG] http: Request GET /v1/acl/auth-method/k8s-bc906bcb-70e1-6928-3d2d-b9b72b34c3a9 (1.444033ms) from=127.0.0.1:50936
TestAuthMethodUpdateCommand - 2019/12/06 06:03:40.630731 [DEBUG] http: Request PUT /v1/acl/auth-method/test-3b864c46-c2a4-48a0-cda1-9489ab6d8200 (338.838183ms) from=127.0.0.1:48836
TestAuthMethodUpdateCommand - 2019/12/06 06:03:40.631000 [INFO] connect: initialized primary datacenter CA with provider "consul"
TestAuthMethodUpdateCommand - 2019/12/06 06:03:40.631566 [DEBUG] consul: Skipping self join check for "Node d0af5be7-023a-21c5-586c-c90bce328081" since the cluster is too small
TestAuthMethodUpdateCommand - 2019/12/06 06:03:40.631739 [INFO] consul: member 'Node d0af5be7-023a-21c5-586c-c90bce328081' joined, marking health alive
TestAuthMethodUpdateCommand_k8s_noMerge - 2019/12/06 06:03:40.632955 [DEBUG] consul: Skipping self join check for "Node d9c294a3-e60d-0bd1-fbbe-26c9f4794492" since the cluster is too small
TestAuthMethodUpdateCommand_k8s - 2019/12/06 06:03:40.633561 [DEBUG] http: Request PUT /v1/acl/auth-method/k8s-bc906bcb-70e1-6928-3d2d-b9b72b34c3a9 (223.372174ms) from=127.0.0.1:50936
TestAuthMethodUpdateCommand_k8s_noMerge - 2019/12/06 06:03:40.633730 [DEBUG] consul: Skipping self join check for "Node d9c294a3-e60d-0bd1-fbbe-26c9f4794492" since the cluster is too small
TestAuthMethodUpdateCommand - 2019/12/06 06:03:40.636844 [DEBUG] http: Request GET /v1/acl/auth-method/test-3b864c46-c2a4-48a0-cda1-9489ab6d8200 (1.281363ms) from=127.0.0.1:48830
TestAuthMethodUpdateCommand - 2019/12/06 06:03:40.647841 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
TestAuthMethodUpdateCommand - 2019/12/06 06:03:40.665117 [INFO] agent: Requesting shutdown
TestAuthMethodUpdateCommand - 2019/12/06 06:03:40.665238 [INFO] consul: shutting down server
TestAuthMethodUpdateCommand - 2019/12/06 06:03:40.665321 [WARN] serf: Shutdown without a Leave
TestAuthMethodUpdateCommand_k8s - 2019/12/06 06:03:40.673297 [DEBUG] http: Request GET /v1/acl/auth-method/k8s-bc906bcb-70e1-6928-3d2d-b9b72b34c3a9 (2.118049ms) from=127.0.0.1:50932
TestAuthMethodUpdateCommand - 2019/12/06 06:03:40.676065 [WARN] consul: error getting server health from "Node d0af5be7-023a-21c5-586c-c90bce328081": rpc error making call: EOF
=== RUN   TestAuthMethodUpdateCommand_k8s/update_all_fields_with_cert_file
=== RUN   TestAuthMethodUpdateCommand_k8s_noMerge/update_missing_k8s_host
TestAuthMethodUpdateCommand - 2019/12/06 06:03:40.744922 [WARN] serf: Shutdown without a Leave
TestAuthMethodUpdateCommand - 2019/12/06 06:03:40.846125 [INFO] manager: shutting down
TestAuthMethodUpdateCommand - 2019/12/06 06:03:40.846706 [INFO] agent: consul server down
TestAuthMethodUpdateCommand - 2019/12/06 06:03:40.846763 [INFO] agent: shutdown complete
TestAuthMethodUpdateCommand - 2019/12/06 06:03:40.846832 [INFO] agent: Stopping DNS server 127.0.0.1:53513 (tcp)
TestAuthMethodUpdateCommand - 2019/12/06 06:03:40.846972 [INFO] agent: Stopping DNS server 127.0.0.1:53513 (udp)
TestAuthMethodUpdateCommand - 2019/12/06 06:03:40.847128 [INFO] agent: Stopping HTTP server 127.0.0.1:53514 (tcp)
TestAuthMethodUpdateCommand - 2019/12/06 06:03:40.848089 [INFO] agent: Waiting for endpoints to shut down
TestAuthMethodUpdateCommand - 2019/12/06 06:03:40.848523 [ERR] consul: failed to reconcile member: {Node d0af5be7-023a-21c5-586c-c90bce328081 127.0.0.1 53516 map[acls:1 bootstrap:1 build:1.5.2: dc:dc1 id:d0af5be7-023a-21c5-586c-c90bce328081 port:53518 raft_vsn:3 role:consul segment: vsn:2 vsn_max:3 vsn_min:2 wan_join_port:53517] alive 1 5 2 2 5 4}: leadership lost while committing log
TestAuthMethodUpdateCommand - 2019/12/06 06:03:40.848708 [INFO] agent: Endpoints down
--- PASS: TestAuthMethodUpdateCommand (7.37s)
    --- PASS: TestAuthMethodUpdateCommand/update_without_name (0.01s)
    --- PASS: TestAuthMethodUpdateCommand/update_nonexistent_method (0.02s)
    --- PASS: TestAuthMethodUpdateCommand/update_all_fields (1.22s)
TestAuthMethodUpdateCommand_k8s_noMerge - 2019/12/06 06:03:41.171019 [DEBUG] http: Request PUT /v1/acl/auth-method (465.553784ms) from=127.0.0.1:37776
TestAuthMethodUpdateCommand_k8s_noMerge - 2019/12/06 06:03:41.171965 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
TestAuthMethodUpdateCommand_k8s_noMerge - 2019/12/06 06:03:41.180674 [DEBUG] http: Request GET /v1/acl/auth-method/k8s-1e9bf9fd-ba3e-e5d5-c102-bc7f39797de8 (1.72404ms) from=127.0.0.1:37778
=== RUN   TestAuthMethodUpdateCommand_k8s_noMerge/update_missing_k8s_ca_cert
TestAuthMethodUpdateCommand_k8s - 2019/12/06 06:03:41.444567 [DEBUG] http: Request PUT /v1/acl/auth-method (751.678746ms) from=127.0.0.1:50932
TestAuthMethodUpdateCommand_k8s - 2019/12/06 06:03:41.459503 [DEBUG] http: Request GET /v1/acl/auth-method/k8s-51aa1bbd-c562-d10a-56d3-13f75b621b41 (2.250718ms) from=127.0.0.1:50942
TestAuthMethodUpdateCommand - 2019/12/06 06:03:41.638008 [WARN] consul: error getting server health from "Node d0af5be7-023a-21c5-586c-c90bce328081": context deadline exceeded
TestAuthMethodUpdateCommand_k8s_noMerge - 2019/12/06 06:03:42.353742 [DEBUG] http: Request PUT /v1/acl/auth-method (1.163298281s) from=127.0.0.1:37776
TestAuthMethodUpdateCommand_k8s - 2019/12/06 06:03:42.360394 [DEBUG] http: Request PUT /v1/acl/auth-method/k8s-51aa1bbd-c562-d10a-56d3-13f75b621b41 (872.922554ms) from=127.0.0.1:50942
TestAuthMethodUpdateCommand_k8s - 2019/12/06 06:03:42.367926 [DEBUG] http: Request GET /v1/acl/auth-method/k8s-51aa1bbd-c562-d10a-56d3-13f75b621b41 (1.404699ms) from=127.0.0.1:50932
TestAuthMethodUpdateCommand_k8s_noMerge - 2019/12/06 06:03:42.370870 [DEBUG] http: Request GET /v1/acl/auth-method/k8s-e8f7c309-2126-f6f8-07ff-28176134a096 (1.557036ms) from=127.0.0.1:37782
=== RUN   TestAuthMethodUpdateCommand_k8s_noMerge/update_missing_k8s_jwt
=== RUN   TestAuthMethodUpdateCommand_k8s/update_all_fields_but_k8s_host
TestAuthMethodUpdateCommand_k8s - 2019/12/06 06:03:42.703991 [DEBUG] http: Request PUT /v1/acl/auth-method (321.514781ms) from=127.0.0.1:50932
TestAuthMethodUpdateCommand_k8s - 2019/12/06 06:03:42.716933 [DEBUG] http: Request GET /v1/acl/auth-method/k8s-e3273520-6759-fe58-8fff-f984a9d3ced7 (1.247362ms) from=127.0.0.1:50946
TestAuthMethodUpdateCommand_k8s_noMerge - 2019/12/06 06:03:42.820496 [DEBUG] http: Request PUT /v1/acl/auth-method (444.377294ms) from=127.0.0.1:37776
TestAuthMethodUpdateCommand_k8s_noMerge - 2019/12/06 06:03:42.831295 [DEBUG] http: Request GET /v1/acl/auth-method/k8s-1c3936e1-31be-2979-82dd-dff5e9b75bc3 (1.548369ms) from=127.0.0.1:37786
=== RUN   TestAuthMethodUpdateCommand_k8s_noMerge/update_all_fields
TestAuthMethodUpdateCommand_k8s - 2019/12/06 06:03:43.006708 [DEBUG] http: Request PUT /v1/acl/auth-method/k8s-e3273520-6759-fe58-8fff-f984a9d3ced7 (285.306276ms) from=127.0.0.1:50946
TestAuthMethodUpdateCommand_k8s - 2019/12/06 06:03:43.012355 [DEBUG] http: Request GET /v1/acl/auth-method/k8s-e3273520-6759-fe58-8fff-f984a9d3ced7 (973.356µs) from=127.0.0.1:50932
=== RUN   TestAuthMethodUpdateCommand_k8s/update_all_fields_but_k8s_ca_cert
TestAuthMethodUpdateCommand_k8s_noMerge - 2019/12/06 06:03:43.112168 [DEBUG] http: Request PUT /v1/acl/auth-method (275.065371ms) from=127.0.0.1:37776
TestAuthMethodUpdateCommand_k8s_noMerge - 2019/12/06 06:03:43.123868 [DEBUG] http: Request GET /v1/acl/auth-method/k8s-418a1570-d93c-756b-2e60-12a152f94a56 (1.182694ms) from=127.0.0.1:37788
TestAuthMethodUpdateCommand_k8s - 2019/12/06 06:03:43.213780 [DEBUG] http: Request PUT /v1/acl/auth-method (196.58522ms) from=127.0.0.1:50932
TestAuthMethodUpdateCommand_k8s - 2019/12/06 06:03:43.226841 [DEBUG] http: Request GET /v1/acl/auth-method/k8s-e184894c-118f-cd03-da8a-c6f121c81e69 (1.721373ms) from=127.0.0.1:50952
TestAuthMethodUpdateCommand_k8s_noMerge - 2019/12/06 06:03:43.380244 [DEBUG] http: Request PUT /v1/acl/auth-method/k8s-418a1570-d93c-756b-2e60-12a152f94a56 (252.194509ms) from=127.0.0.1:37788
TestAuthMethodUpdateCommand_k8s_noMerge - 2019/12/06 06:03:43.387595 [DEBUG] http: Request GET /v1/acl/auth-method/k8s-418a1570-d93c-756b-2e60-12a152f94a56 (1.903044ms) from=127.0.0.1:37776
=== RUN   TestAuthMethodUpdateCommand_k8s_noMerge/update_all_fields_with_cert_file
TestAuthMethodUpdateCommand_k8s - 2019/12/06 06:03:43.572842 [DEBUG] http: Request PUT /v1/acl/auth-method/k8s-e184894c-118f-cd03-da8a-c6f121c81e69 (340.548222ms) from=127.0.0.1:50952
TestAuthMethodUpdateCommand_k8s - 2019/12/06 06:03:43.583553 [DEBUG] http: Request GET /v1/acl/auth-method/k8s-e184894c-118f-cd03-da8a-c6f121c81e69 (1.543369ms) from=127.0.0.1:50932
=== RUN   TestAuthMethodUpdateCommand_k8s/update_all_fields_but_k8s_jwt
TestAuthMethodUpdateCommand_k8s_noMerge - 2019/12/06 06:03:43.695684 [DEBUG] http: Request PUT /v1/acl/auth-method (300.307956ms) from=127.0.0.1:37776
TestAuthMethodUpdateCommand_k8s_noMerge - 2019/12/06 06:03:43.707255 [DEBUG] http: Request GET /v1/acl/auth-method/k8s-9beeaf2e-b592-6436-ab00-ec2682ea70cd (1.451367ms) from=127.0.0.1:37792
TestAuthMethodUpdateCommand_k8s - 2019/12/06 06:03:44.265091 [DEBUG] http: Request PUT /v1/acl/auth-method (676.009659ms) from=127.0.0.1:50932
TestAuthMethodUpdateCommand_k8s - 2019/12/06 06:03:44.276271 [DEBUG] http: Request GET /v1/acl/auth-method/k8s-62ceb9cb-1a49-897b-95a0-920786879435 (1.151026ms) from=127.0.0.1:50956
TestAuthMethodUpdateCommand_k8s_noMerge - 2019/12/06 06:03:44.445312 [DEBUG] http: Request PUT /v1/acl/auth-method/k8s-9beeaf2e-b592-6436-ab00-ec2682ea70cd (733.262653ms) from=127.0.0.1:37792
TestAuthMethodUpdateCommand_k8s_noMerge - 2019/12/06 06:03:44.459336 [DEBUG] http: Request GET /v1/acl/auth-method/k8s-9beeaf2e-b592-6436-ab00-ec2682ea70cd (3.750421ms) from=127.0.0.1:37776
TestAuthMethodUpdateCommand_k8s_noMerge - 2019/12/06 06:03:44.464510 [INFO] agent: Requesting shutdown
TestAuthMethodUpdateCommand_k8s_noMerge - 2019/12/06 06:03:44.464598 [INFO] consul: shutting down server
TestAuthMethodUpdateCommand_k8s_noMerge - 2019/12/06 06:03:44.464647 [WARN] serf: Shutdown without a Leave
TestAuthMethodUpdateCommand_k8s_noMerge - 2019/12/06 06:03:44.587183 [WARN] serf: Shutdown without a Leave
TestAuthMethodUpdateCommand_k8s - 2019/12/06 06:03:44.607540 [DEBUG] http: Request PUT /v1/acl/auth-method/k8s-62ceb9cb-1a49-897b-95a0-920786879435 (326.688567ms) from=127.0.0.1:50956
TestAuthMethodUpdateCommand_k8s - 2019/12/06 06:03:44.624406 [DEBUG] http: Request GET /v1/acl/auth-method/k8s-62ceb9cb-1a49-897b-95a0-920786879435 (3.410412ms) from=127.0.0.1:50932
TestAuthMethodUpdateCommand_k8s - 2019/12/06 06:03:44.632045 [INFO] agent: Requesting shutdown
TestAuthMethodUpdateCommand_k8s - 2019/12/06 06:03:44.632148 [INFO] consul: shutting down server
TestAuthMethodUpdateCommand_k8s - 2019/12/06 06:03:44.632198 [WARN] serf: Shutdown without a Leave
TestAuthMethodUpdateCommand_k8s_noMerge - 2019/12/06 06:03:44.693905 [INFO] manager: shutting down
TestAuthMethodUpdateCommand_k8s_noMerge - 2019/12/06 06:03:44.694930 [INFO] agent: consul server down
TestAuthMethodUpdateCommand_k8s_noMerge - 2019/12/06 06:03:44.694997 [INFO] agent: shutdown complete
TestAuthMethodUpdateCommand_k8s_noMerge - 2019/12/06 06:03:44.695059 [INFO] agent: Stopping DNS server 127.0.0.1:53501 (tcp)
TestAuthMethodUpdateCommand_k8s_noMerge - 2019/12/06 06:03:44.695228 [INFO] agent: Stopping DNS server 127.0.0.1:53501 (udp)
TestAuthMethodUpdateCommand_k8s_noMerge - 2019/12/06 06:03:44.695395 [INFO] agent: Stopping HTTP server 127.0.0.1:53502 (tcp)
TestAuthMethodUpdateCommand_k8s_noMerge - 2019/12/06 06:03:44.696867 [INFO] agent: Waiting for endpoints to shut down
TestAuthMethodUpdateCommand_k8s_noMerge - 2019/12/06 06:03:44.697236 [INFO] agent: Endpoints down
--- PASS: TestAuthMethodUpdateCommand_k8s_noMerge (11.22s)
    --- PASS: TestAuthMethodUpdateCommand_k8s_noMerge/update_missing_k8s_host (0.49s)
    --- PASS: TestAuthMethodUpdateCommand_k8s_noMerge/update_missing_k8s_ca_cert (1.19s)
    --- PASS: TestAuthMethodUpdateCommand_k8s_noMerge/update_missing_k8s_jwt (0.46s)
    --- PASS: TestAuthMethodUpdateCommand_k8s_noMerge/update_all_fields (0.56s)
    --- PASS: TestAuthMethodUpdateCommand_k8s_noMerge/update_all_fields_with_cert_file (1.07s)
TestAuthMethodUpdateCommand_k8s - 2019/12/06 06:03:44.812914 [WARN] serf: Shutdown without a Leave
TestAuthMethodUpdateCommand_k8s - 2019/12/06 06:03:44.928566 [INFO] manager: shutting down
TestAuthMethodUpdateCommand_k8s - 2019/12/06 06:03:44.929305 [INFO] agent: consul server down
TestAuthMethodUpdateCommand_k8s - 2019/12/06 06:03:44.929359 [INFO] agent: shutdown complete
TestAuthMethodUpdateCommand_k8s - 2019/12/06 06:03:44.929410 [INFO] agent: Stopping DNS server 127.0.0.1:53519 (tcp)
TestAuthMethodUpdateCommand_k8s - 2019/12/06 06:03:44.929654 [INFO] agent: Stopping DNS server 127.0.0.1:53519 (udp)
TestAuthMethodUpdateCommand_k8s - 2019/12/06 06:03:44.929993 [INFO] agent: Stopping HTTP server 127.0.0.1:53520 (tcp)
TestAuthMethodUpdateCommand_k8s - 2019/12/06 06:03:44.931574 [INFO] agent: Waiting for endpoints to shut down
TestAuthMethodUpdateCommand_k8s - 2019/12/06 06:03:44.931750 [INFO] agent: Endpoints down
--- PASS: TestAuthMethodUpdateCommand_k8s (11.45s)
    --- PASS: TestAuthMethodUpdateCommand_k8s/update_all_fields (0.54s)
    --- PASS: TestAuthMethodUpdateCommand_k8s/update_all_fields_with_cert_file (1.69s)
    --- PASS: TestAuthMethodUpdateCommand_k8s/update_all_fields_but_k8s_host (0.64s)
    --- PASS: TestAuthMethodUpdateCommand_k8s/update_all_fields_but_k8s_ca_cert (0.57s)
    --- PASS: TestAuthMethodUpdateCommand_k8s/update_all_fields_but_k8s_jwt (1.04s)
PASS
ok  	github.com/hashicorp/consul/command/acl/authmethod/update	11.808s
?   	github.com/hashicorp/consul/command/acl/bindingrule	[no test files]
=== RUN   TestBindingRuleCreateCommand_noTabs
=== PAUSE TestBindingRuleCreateCommand_noTabs
=== RUN   TestBindingRuleCreateCommand
=== PAUSE TestBindingRuleCreateCommand
=== CONT  TestBindingRuleCreateCommand_noTabs
=== CONT  TestBindingRuleCreateCommand
--- PASS: TestBindingRuleCreateCommand_noTabs (0.01s)
WARNING: bootstrap = true: do not enable unless necessary
TestBindingRuleCreateCommand - 2019/12/06 06:03:48.966656 [WARN] agent: Node name "Node 8277a70e-5c64-04e4-dda7-11bedf7fd571" will not be discoverable via DNS due to invalid characters. Valid characters include all alpha-numerics and dashes.
TestBindingRuleCreateCommand - 2019/12/06 06:03:48.967528 [DEBUG] tlsutil: Update with version 1
TestBindingRuleCreateCommand - 2019/12/06 06:03:48.974400 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
2019/12/06 06:03:50 [INFO]  raft: Initial configuration (index=1): [{Suffrage:Voter ID:8277a70e-5c64-04e4-dda7-11bedf7fd571 Address:127.0.0.1:47506}]
2019/12/06 06:03:50 [INFO]  raft: Node at 127.0.0.1:47506 [Follower] entering Follower state (Leader: "")
2019/12/06 06:03:50 [WARN]  raft: Heartbeat timeout from "" reached, starting election
2019/12/06 06:03:50 [INFO]  raft: Node at 127.0.0.1:47506 [Candidate] entering Candidate state in term 2
TestBindingRuleCreateCommand - 2019/12/06 06:03:50.693590 [WARN] raft: Unable to get address for server id 8277a70e-5c64-04e4-dda7-11bedf7fd571, using fallback address 127.0.0.1:47506: Could not find address for server id 8277a70e-5c64-04e4-dda7-11bedf7fd571
TestBindingRuleCreateCommand - 2019/12/06 06:03:50.712666 [INFO] serf: EventMemberJoin: Node 8277a70e-5c64-04e4-dda7-11bedf7fd571.dc1 127.0.0.1
TestBindingRuleCreateCommand - 2019/12/06 06:03:50.716340 [INFO] serf: EventMemberJoin: Node 8277a70e-5c64-04e4-dda7-11bedf7fd571 127.0.0.1
TestBindingRuleCreateCommand - 2019/12/06 06:03:50.718402 [INFO] agent: Started DNS server 127.0.0.1:47501 (udp)
TestBindingRuleCreateCommand - 2019/12/06 06:03:50.718930 [INFO] consul: Adding LAN server Node 8277a70e-5c64-04e4-dda7-11bedf7fd571 (Addr: tcp/127.0.0.1:47506) (DC: dc1)
TestBindingRuleCreateCommand - 2019/12/06 06:03:50.719856 [INFO] agent: Started DNS server 127.0.0.1:47501 (tcp)
TestBindingRuleCreateCommand - 2019/12/06 06:03:50.720069 [INFO] consul: Handled member-join event for server "Node 8277a70e-5c64-04e4-dda7-11bedf7fd571.dc1" in area "wan"
TestBindingRuleCreateCommand - 2019/12/06 06:03:50.722792 [INFO] agent: Started HTTP server on 127.0.0.1:47502 (tcp)
TestBindingRuleCreateCommand - 2019/12/06 06:03:50.722919 [INFO] agent: started state syncer
2019/12/06 06:03:51 [INFO]  raft: Election won. Tally: 1
2019/12/06 06:03:51 [INFO]  raft: Node at 127.0.0.1:47506 [Leader] entering Leader state
TestBindingRuleCreateCommand - 2019/12/06 06:03:51.186913 [INFO] consul: cluster leadership acquired
TestBindingRuleCreateCommand - 2019/12/06 06:03:51.187555 [INFO] consul: New leader elected: Node 8277a70e-5c64-04e4-dda7-11bedf7fd571
TestBindingRuleCreateCommand - 2019/12/06 06:03:51.348621 [ERR] agent: failed to sync remote state: ACL not found
TestBindingRuleCreateCommand - 2019/12/06 06:03:51.470674 [INFO] acl: initializing acls
TestBindingRuleCreateCommand - 2019/12/06 06:03:51.720315 [INFO] consul: Created ACL 'global-management' policy
TestBindingRuleCreateCommand - 2019/12/06 06:03:51.720396 [WARN] consul: Configuring a non-UUID master token is deprecated
TestBindingRuleCreateCommand - 2019/12/06 06:03:51.721525 [INFO] acl: initializing acls
TestBindingRuleCreateCommand - 2019/12/06 06:03:51.721643 [WARN] consul: Configuring a non-UUID master token is deprecated
TestBindingRuleCreateCommand - 2019/12/06 06:03:51.897293 [INFO] consul: Bootstrapped ACL master token from configuration
TestBindingRuleCreateCommand - 2019/12/06 06:03:51.899334 [INFO] consul: Bootstrapped ACL master token from configuration
TestBindingRuleCreateCommand - 2019/12/06 06:03:52.280512 [ERR] agent: failed to sync remote state: ACL not found
TestBindingRuleCreateCommand - 2019/12/06 06:03:52.369665 [INFO] consul: Created ACL anonymous token from configuration
TestBindingRuleCreateCommand - 2019/12/06 06:03:52.369809 [DEBUG] acl: transitioning out of legacy ACL mode
TestBindingRuleCreateCommand - 2019/12/06 06:03:52.370446 [INFO] consul: Created ACL anonymous token from configuration
TestBindingRuleCreateCommand - 2019/12/06 06:03:52.370541 [INFO] serf: EventMemberUpdate: Node 8277a70e-5c64-04e4-dda7-11bedf7fd571
TestBindingRuleCreateCommand - 2019/12/06 06:03:52.371252 [INFO] serf: EventMemberUpdate: Node 8277a70e-5c64-04e4-dda7-11bedf7fd571.dc1
TestBindingRuleCreateCommand - 2019/12/06 06:03:52.372131 [INFO] serf: EventMemberUpdate: Node 8277a70e-5c64-04e4-dda7-11bedf7fd571
TestBindingRuleCreateCommand - 2019/12/06 06:03:52.373005 [INFO] serf: EventMemberUpdate: Node 8277a70e-5c64-04e4-dda7-11bedf7fd571.dc1
TestBindingRuleCreateCommand - 2019/12/06 06:03:53.607733 [INFO] connect: initialized primary datacenter CA with provider "consul"
TestBindingRuleCreateCommand - 2019/12/06 06:03:53.608338 [DEBUG] consul: Skipping self join check for "Node 8277a70e-5c64-04e4-dda7-11bedf7fd571" since the cluster is too small
TestBindingRuleCreateCommand - 2019/12/06 06:03:53.608452 [INFO] consul: member 'Node 8277a70e-5c64-04e4-dda7-11bedf7fd571' joined, marking health alive
TestBindingRuleCreateCommand - 2019/12/06 06:03:53.888038 [DEBUG] consul: Skipping self join check for "Node 8277a70e-5c64-04e4-dda7-11bedf7fd571" since the cluster is too small
TestBindingRuleCreateCommand - 2019/12/06 06:03:53.888630 [DEBUG] consul: Skipping self join check for "Node 8277a70e-5c64-04e4-dda7-11bedf7fd571" since the cluster is too small
TestBindingRuleCreateCommand - 2019/12/06 06:03:54.204309 [DEBUG] http: Request PUT /v1/acl/auth-method (291.321416ms) from=127.0.0.1:54858
=== RUN   TestBindingRuleCreateCommand/method_is_required
=== RUN   TestBindingRuleCreateCommand/bind_type_required
=== RUN   TestBindingRuleCreateCommand/bind_name_required
=== RUN   TestBindingRuleCreateCommand/must_use_roughly_valid_selector
TestBindingRuleCreateCommand - 2019/12/06 06:03:54.222016 [DEBUG] acl: updating cached auth method validator for "test"
TestBindingRuleCreateCommand - 2019/12/06 06:03:54.224627 [ERR] http: Request PUT /v1/acl/binding-rule, error: invalid Binding Rule: Selector is invalid: 1:4 (3): no match found, expected: "!=", ".", "==", "[", [ \t\r\n] or [a-zA-Z0-9_] from=127.0.0.1:54860
TestBindingRuleCreateCommand - 2019/12/06 06:03:54.238259 [DEBUG] http: Request PUT /v1/acl/binding-rule (17.392403ms) from=127.0.0.1:54860
=== RUN   TestBindingRuleCreateCommand/create_it_with_no_selector
TestBindingRuleCreateCommand - 2019/12/06 06:03:55.810677 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
TestBindingRuleCreateCommand - 2019/12/06 06:03:55.812504 [DEBUG] http: Request PUT /v1/acl/binding-rule (1.546648828s) from=127.0.0.1:54862
=== RUN   TestBindingRuleCreateCommand/create_it_with_a_match_selector
TestBindingRuleCreateCommand - 2019/12/06 06:03:56.285268 [DEBUG] http: Request PUT /v1/acl/binding-rule (464.855435ms) from=127.0.0.1:54864
=== RUN   TestBindingRuleCreateCommand/create_it_with_type_role
TestBindingRuleCreateCommand - 2019/12/06 06:03:56.620985 [DEBUG] http: Request PUT /v1/acl/binding-rule (326.761237ms) from=127.0.0.1:54866
TestBindingRuleCreateCommand - 2019/12/06 06:03:56.626396 [INFO] agent: Requesting shutdown
TestBindingRuleCreateCommand - 2019/12/06 06:03:56.626498 [INFO] consul: shutting down server
TestBindingRuleCreateCommand - 2019/12/06 06:03:56.626564 [WARN] serf: Shutdown without a Leave
TestBindingRuleCreateCommand - 2019/12/06 06:03:56.718577 [WARN] serf: Shutdown without a Leave
TestBindingRuleCreateCommand - 2019/12/06 06:03:56.818905 [INFO] manager: shutting down
TestBindingRuleCreateCommand - 2019/12/06 06:03:56.819791 [INFO] agent: consul server down
TestBindingRuleCreateCommand - 2019/12/06 06:03:56.819867 [INFO] agent: shutdown complete
TestBindingRuleCreateCommand - 2019/12/06 06:03:56.819928 [INFO] agent: Stopping DNS server 127.0.0.1:47501 (tcp)
TestBindingRuleCreateCommand - 2019/12/06 06:03:56.820091 [INFO] agent: Stopping DNS server 127.0.0.1:47501 (udp)
TestBindingRuleCreateCommand - 2019/12/06 06:03:56.820260 [INFO] agent: Stopping HTTP server 127.0.0.1:47502 (tcp)
TestBindingRuleCreateCommand - 2019/12/06 06:03:56.821416 [INFO] agent: Waiting for endpoints to shut down
TestBindingRuleCreateCommand - 2019/12/06 06:03:56.823006 [INFO] agent: Endpoints down
--- PASS: TestBindingRuleCreateCommand (7.94s)
    --- PASS: TestBindingRuleCreateCommand/method_is_required (0.00s)
    --- PASS: TestBindingRuleCreateCommand/bind_type_required (0.00s)
    --- PASS: TestBindingRuleCreateCommand/bind_name_required (0.00s)
    --- PASS: TestBindingRuleCreateCommand/must_use_roughly_valid_selector (0.03s)
    --- PASS: TestBindingRuleCreateCommand/create_it_with_no_selector (1.57s)
    --- PASS: TestBindingRuleCreateCommand/create_it_with_a_match_selector (0.47s)
    --- PASS: TestBindingRuleCreateCommand/create_it_with_type_role (0.34s)
PASS
ok  	github.com/hashicorp/consul/command/acl/bindingrule/create	8.201s
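For reference, the create subtests above drive the agent's HTTP API at PUT /v1/acl/binding-rule (the path appears in the request lines), with and without a selector and with bind type "role" as well as the default. Below is a minimal Go sketch of such a request; the agent address, token value, and JSON field names (AuthMethod, BindType, BindName, Selector) are assumptions for illustration, not taken from the build itself.

```go
// Minimal sketch of the call the bindingrule/create subtests exercise:
// PUT /v1/acl/binding-rule (endpoint as seen in the request lines above).
// Agent address, token, and JSON field names are illustrative assumptions.
package main

import (
	"bytes"
	"encoding/json"
	"fmt"
	"io/ioutil"
	"net/http"
)

func main() {
	// Fields mirror the subtest names: an auth method, a bind type
	// ("service" or "role"), a bind name, and an optional selector.
	rule := map[string]string{
		"AuthMethod": "test",
		"BindType":   "service",
		"BindName":   "demo-svc",
		"Selector":   "serviceaccount.namespace==default", // omit for the "no selector" case
	}
	body, _ := json.Marshal(rule)

	req, err := http.NewRequest(http.MethodPut,
		"http://127.0.0.1:8500/v1/acl/binding-rule", bytes.NewReader(body))
	if err != nil {
		panic(err)
	}
	req.Header.Set("X-Consul-Token", "placeholder-master-token") // illustrative token

	resp, err := http.DefaultClient.Do(req)
	if err != nil {
		panic(err)
	}
	defer resp.Body.Close()
	out, _ := ioutil.ReadAll(resp.Body)
	fmt.Println(resp.Status, string(out))
}
```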
=== RUN   TestBindingRuleDeleteCommand_noTabs
=== PAUSE TestBindingRuleDeleteCommand_noTabs
=== RUN   TestBindingRuleDeleteCommand
=== PAUSE TestBindingRuleDeleteCommand
=== CONT  TestBindingRuleDeleteCommand_noTabs
=== CONT  TestBindingRuleDeleteCommand
--- PASS: TestBindingRuleDeleteCommand_noTabs (0.01s)
WARNING: bootstrap = true: do not enable unless necessary
TestBindingRuleDeleteCommand - 2019/12/06 06:03:55.603953 [WARN] agent: Node name "Node d8ae0547-eb5c-023f-284d-ece431507eb9" will not be discoverable via DNS due to invalid characters. Valid characters include all alpha-numerics and dashes.
TestBindingRuleDeleteCommand - 2019/12/06 06:03:55.604734 [DEBUG] tlsutil: Update with version 1
TestBindingRuleDeleteCommand - 2019/12/06 06:03:55.626148 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
2019/12/06 06:03:57 [INFO]  raft: Initial configuration (index=1): [{Suffrage:Voter ID:d8ae0547-eb5c-023f-284d-ece431507eb9 Address:127.0.0.1:46006}]
2019/12/06 06:03:57 [INFO]  raft: Node at 127.0.0.1:46006 [Follower] entering Follower state (Leader: "")
TestBindingRuleDeleteCommand - 2019/12/06 06:03:57.154254 [INFO] serf: EventMemberJoin: Node d8ae0547-eb5c-023f-284d-ece431507eb9.dc1 127.0.0.1
TestBindingRuleDeleteCommand - 2019/12/06 06:03:57.159089 [INFO] serf: EventMemberJoin: Node d8ae0547-eb5c-023f-284d-ece431507eb9 127.0.0.1
TestBindingRuleDeleteCommand - 2019/12/06 06:03:57.161929 [INFO] consul: Adding LAN server Node d8ae0547-eb5c-023f-284d-ece431507eb9 (Addr: tcp/127.0.0.1:46006) (DC: dc1)
TestBindingRuleDeleteCommand - 2019/12/06 06:03:57.162180 [INFO] agent: Started DNS server 127.0.0.1:46001 (udp)
TestBindingRuleDeleteCommand - 2019/12/06 06:03:57.162232 [INFO] consul: Handled member-join event for server "Node d8ae0547-eb5c-023f-284d-ece431507eb9.dc1" in area "wan"
TestBindingRuleDeleteCommand - 2019/12/06 06:03:57.163008 [INFO] agent: Started DNS server 127.0.0.1:46001 (tcp)
TestBindingRuleDeleteCommand - 2019/12/06 06:03:57.166135 [INFO] agent: Started HTTP server on 127.0.0.1:46002 (tcp)
TestBindingRuleDeleteCommand - 2019/12/06 06:03:57.166330 [INFO] agent: started state syncer
2019/12/06 06:03:57 [WARN]  raft: Heartbeat timeout from "" reached, starting election
2019/12/06 06:03:57 [INFO]  raft: Node at 127.0.0.1:46006 [Candidate] entering Candidate state in term 2
2019/12/06 06:03:57 [INFO]  raft: Election won. Tally: 1
2019/12/06 06:03:57 [INFO]  raft: Node at 127.0.0.1:46006 [Leader] entering Leader state
TestBindingRuleDeleteCommand - 2019/12/06 06:03:57.745180 [INFO] consul: cluster leadership acquired
TestBindingRuleDeleteCommand - 2019/12/06 06:03:57.745875 [INFO] consul: New leader elected: Node d8ae0547-eb5c-023f-284d-ece431507eb9
TestBindingRuleDeleteCommand - 2019/12/06 06:03:57.897299 [ERR] agent: failed to sync remote state: ACL not found
TestBindingRuleDeleteCommand - 2019/12/06 06:03:57.913673 [INFO] acl: initializing acls
TestBindingRuleDeleteCommand - 2019/12/06 06:03:58.267280 [ERR] agent: failed to sync remote state: ACL not found
TestBindingRuleDeleteCommand - 2019/12/06 06:03:58.423688 [INFO] consul: Created ACL 'global-management' policy
TestBindingRuleDeleteCommand - 2019/12/06 06:03:58.423777 [WARN] consul: Configuring a non-UUID master token is deprecated
TestBindingRuleDeleteCommand - 2019/12/06 06:03:58.425850 [INFO] acl: initializing acls
TestBindingRuleDeleteCommand - 2019/12/06 06:03:58.425995 [WARN] consul: Configuring a non-UUID master token is deprecated
TestBindingRuleDeleteCommand - 2019/12/06 06:03:59.203742 [INFO] consul: Bootstrapped ACL master token from configuration
TestBindingRuleDeleteCommand - 2019/12/06 06:03:59.203821 [INFO] consul: Bootstrapped ACL master token from configuration
TestBindingRuleDeleteCommand - 2019/12/06 06:03:59.455392 [INFO] consul: Created ACL anonymous token from configuration
TestBindingRuleDeleteCommand - 2019/12/06 06:03:59.456385 [INFO] serf: EventMemberUpdate: Node d8ae0547-eb5c-023f-284d-ece431507eb9
TestBindingRuleDeleteCommand - 2019/12/06 06:03:59.457168 [INFO] serf: EventMemberUpdate: Node d8ae0547-eb5c-023f-284d-ece431507eb9.dc1
TestBindingRuleDeleteCommand - 2019/12/06 06:03:59.813077 [INFO] consul: Created ACL anonymous token from configuration
TestBindingRuleDeleteCommand - 2019/12/06 06:03:59.813163 [DEBUG] acl: transitioning out of legacy ACL mode
TestBindingRuleDeleteCommand - 2019/12/06 06:03:59.816136 [INFO] serf: EventMemberUpdate: Node d8ae0547-eb5c-023f-284d-ece431507eb9
TestBindingRuleDeleteCommand - 2019/12/06 06:03:59.816950 [INFO] serf: EventMemberUpdate: Node d8ae0547-eb5c-023f-284d-ece431507eb9.dc1
TestBindingRuleDeleteCommand - 2019/12/06 06:04:01.194612 [INFO] connect: initialized primary datacenter CA with provider "consul"
TestBindingRuleDeleteCommand - 2019/12/06 06:04:01.195225 [DEBUG] consul: Skipping self join check for "Node d8ae0547-eb5c-023f-284d-ece431507eb9" since the cluster is too small
TestBindingRuleDeleteCommand - 2019/12/06 06:04:01.195344 [INFO] consul: member 'Node d8ae0547-eb5c-023f-284d-ece431507eb9' joined, marking health alive
TestBindingRuleDeleteCommand - 2019/12/06 06:04:01.439692 [DEBUG] consul: Skipping self join check for "Node d8ae0547-eb5c-023f-284d-ece431507eb9" since the cluster is too small
TestBindingRuleDeleteCommand - 2019/12/06 06:04:01.440476 [DEBUG] consul: Skipping self join check for "Node d8ae0547-eb5c-023f-284d-ece431507eb9" since the cluster is too small
TestBindingRuleDeleteCommand - 2019/12/06 06:04:01.805627 [DEBUG] http: Request PUT /v1/acl/auth-method (362.188057ms) from=127.0.0.1:55418
=== RUN   TestBindingRuleDeleteCommand/id_required
=== RUN   TestBindingRuleDeleteCommand/delete_works
TestBindingRuleDeleteCommand - 2019/12/06 06:04:01.812898 [DEBUG] acl: updating cached auth method validator for "test"
TestBindingRuleDeleteCommand - 2019/12/06 06:04:02.187170 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
TestBindingRuleDeleteCommand - 2019/12/06 06:04:02.193609 [DEBUG] http: Request PUT /v1/acl/binding-rule (381.889847ms) from=127.0.0.1:55418
TestBindingRuleDeleteCommand - 2019/12/06 06:04:02.585404 [DEBUG] http: Request DELETE /v1/acl/binding-rule/036ebbe0-d964-e022-0ebf-d1a27ac6f16e (364.020766ms) from=127.0.0.1:55420
TestBindingRuleDeleteCommand - 2019/12/06 06:04:02.588745 [DEBUG] http: Request GET /v1/acl/binding-rule/036ebbe0-d964-e022-0ebf-d1a27ac6f16e (525.346µs) from=127.0.0.1:55418
=== RUN   TestBindingRuleDeleteCommand/delete_works_via_prefixes
TestBindingRuleDeleteCommand - 2019/12/06 06:04:02.929087 [DEBUG] http: Request PUT /v1/acl/binding-rule (337.670489ms) from=127.0.0.1:55418
TestBindingRuleDeleteCommand - 2019/12/06 06:04:02.940594 [DEBUG] http: Request GET /v1/acl/binding-rules (2.745063ms) from=127.0.0.1:55422
TestBindingRuleDeleteCommand - 2019/12/06 06:04:03.213519 [DEBUG] http: Request DELETE /v1/acl/binding-rule/b37bdbdb-341e-b88f-7362-9b8c39958eb6 (268.289882ms) from=127.0.0.1:55422
TestBindingRuleDeleteCommand - 2019/12/06 06:04:03.222794 [DEBUG] http: Request GET /v1/acl/binding-rule/b37bdbdb-341e-b88f-7362-9b8c39958eb6 (604.014µs) from=127.0.0.1:55418
=== RUN   TestBindingRuleDeleteCommand/delete_fails_when_prefix_matches_more_than_one_rule
TestBindingRuleDeleteCommand - 2019/12/06 06:04:03.232943 [DEBUG] http: Request GET /v1/acl/binding-rules?authmethod=test (1.941045ms) from=127.0.0.1:55418
TestBindingRuleDeleteCommand - 2019/12/06 06:04:03.603289 [DEBUG] http: Request PUT /v1/acl/binding-rule (359.155987ms) from=127.0.0.1:55418
TestBindingRuleDeleteCommand - 2019/12/06 06:04:03.607237 [DEBUG] http: Request GET /v1/acl/binding-rules?authmethod=test (1.109025ms) from=127.0.0.1:55418
TestBindingRuleDeleteCommand - 2019/12/06 06:04:04.005638 [DEBUG] http: Request PUT /v1/acl/binding-rule (395.470495ms) from=127.0.0.1:55418
TestBindingRuleDeleteCommand - 2019/12/06 06:04:04.009923 [DEBUG] http: Request GET /v1/acl/binding-rules?authmethod=test (1.524035ms) from=127.0.0.1:55418
TestBindingRuleDeleteCommand - 2019/12/06 06:04:04.338710 [DEBUG] http: Request PUT /v1/acl/binding-rule (325.470206ms) from=127.0.0.1:55418
TestBindingRuleDeleteCommand - 2019/12/06 06:04:04.343814 [DEBUG] http: Request GET /v1/acl/binding-rules?authmethod=test (1.299697ms) from=127.0.0.1:55418
TestBindingRuleDeleteCommand - 2019/12/06 06:04:04.671366 [DEBUG] http: Request PUT /v1/acl/binding-rule (324.339847ms) from=127.0.0.1:55418
TestBindingRuleDeleteCommand - 2019/12/06 06:04:04.675763 [DEBUG] http: Request GET /v1/acl/binding-rules?authmethod=test (1.230362ms) from=127.0.0.1:55418
TestBindingRuleDeleteCommand - 2019/12/06 06:04:04.684819 [DEBUG] http: Request GET /v1/acl/binding-rules (1.922711ms) from=127.0.0.1:55424
TestBindingRuleDeleteCommand - 2019/12/06 06:04:04.687514 [INFO] agent: Requesting shutdown
TestBindingRuleDeleteCommand - 2019/12/06 06:04:04.687845 [INFO] consul: shutting down server
TestBindingRuleDeleteCommand - 2019/12/06 06:04:04.688117 [WARN] serf: Shutdown without a Leave
TestBindingRuleDeleteCommand - 2019/12/06 06:04:04.920921 [WARN] serf: Shutdown without a Leave
TestBindingRuleDeleteCommand - 2019/12/06 06:04:05.029724 [INFO] manager: shutting down
TestBindingRuleDeleteCommand - 2019/12/06 06:04:05.030496 [INFO] agent: consul server down
TestBindingRuleDeleteCommand - 2019/12/06 06:04:05.030556 [INFO] agent: shutdown complete
TestBindingRuleDeleteCommand - 2019/12/06 06:04:05.030609 [INFO] agent: Stopping DNS server 127.0.0.1:46001 (tcp)
TestBindingRuleDeleteCommand - 2019/12/06 06:04:05.030748 [INFO] agent: Stopping DNS server 127.0.0.1:46001 (udp)
TestBindingRuleDeleteCommand - 2019/12/06 06:04:05.030900 [INFO] agent: Stopping HTTP server 127.0.0.1:46002 (tcp)
TestBindingRuleDeleteCommand - 2019/12/06 06:04:05.031645 [INFO] agent: Waiting for endpoints to shut down
TestBindingRuleDeleteCommand - 2019/12/06 06:04:05.031851 [INFO] agent: Endpoints down
--- PASS: TestBindingRuleDeleteCommand (9.50s)
    --- PASS: TestBindingRuleDeleteCommand/id_required (0.00s)
    --- PASS: TestBindingRuleDeleteCommand/delete_works (0.78s)
    --- PASS: TestBindingRuleDeleteCommand/delete_works_via_prefixes (0.63s)
    --- PASS: TestBindingRuleDeleteCommand/delete_fails_when_prefix_matches_more_than_one_rule (1.46s)
PASS
ok  	github.com/hashicorp/consul/command/acl/bindingrule/delete	9.761s
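The delete subtests show the prefix flow in their request lines: a GET /v1/acl/binding-rules listing followed by a DELETE /v1/acl/binding-rule/<id> once exactly one rule matches, and a failure when the prefix is ambiguous. The sketch below reproduces that resolution step under assumptions; the response shape (objects carrying an "ID" field) is assumed for illustration and this is not the package's actual helper.

```go
// Sketch of the prefix-resolution step the delete subtests exercise:
// list all rules, keep those whose ID starts with the prefix, and refuse
// to delete unless exactly one matches. Response shape is an assumption.
package main

import (
	"encoding/json"
	"fmt"
	"net/http"
	"strings"
)

type rule struct {
	ID string
}

func resolvePrefix(agent, prefix string) (string, error) {
	resp, err := http.Get(agent + "/v1/acl/binding-rules") // listing seen in the log
	if err != nil {
		return "", err
	}
	defer resp.Body.Close()

	var rules []rule
	if err := json.NewDecoder(resp.Body).Decode(&rules); err != nil {
		return "", err
	}

	var matches []string
	for _, r := range rules {
		if strings.HasPrefix(r.ID, prefix) {
			matches = append(matches, r.ID)
		}
	}
	switch len(matches) {
	case 1:
		return matches[0], nil
	case 0:
		return "", fmt.Errorf("no binding rule matches prefix %q", prefix)
	default:
		// mirrors the "delete_fails_when_prefix_matches_more_than_one_rule" case
		return "", fmt.Errorf("prefix %q matches %d rules", prefix, len(matches))
	}
}

func main() {
	id, err := resolvePrefix("http://127.0.0.1:8500", "b37b")
	if err != nil {
		fmt.Println("error:", err)
		return
	}
	req, _ := http.NewRequest(http.MethodDelete,
		"http://127.0.0.1:8500/v1/acl/binding-rule/"+id, nil)
	resp, err := http.DefaultClient.Do(req)
	if err != nil {
		fmt.Println("error:", err)
		return
	}
	resp.Body.Close()
	fmt.Println("delete:", resp.Status)
}
```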
=== RUN   TestBindingRuleListCommand_noTabs
=== PAUSE TestBindingRuleListCommand_noTabs
=== RUN   TestBindingRuleListCommand
=== PAUSE TestBindingRuleListCommand
=== CONT  TestBindingRuleListCommand
=== CONT  TestBindingRuleListCommand_noTabs
--- PASS: TestBindingRuleListCommand_noTabs (0.00s)
WARNING: bootstrap = true: do not enable unless necessary
TestBindingRuleListCommand - 2019/12/06 06:04:31.067415 [WARN] agent: Node name "Node 11b92520-8131-5246-7d52-a95ee8d94c8c" will not be discoverable via DNS due to invalid characters. Valid characters include all alpha-numerics and dashes.
TestBindingRuleListCommand - 2019/12/06 06:04:31.068198 [DEBUG] tlsutil: Update with version 1
TestBindingRuleListCommand - 2019/12/06 06:04:31.094612 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
2019/12/06 06:04:32 [INFO]  raft: Initial configuration (index=1): [{Suffrage:Voter ID:11b92520-8131-5246-7d52-a95ee8d94c8c Address:127.0.0.1:14506}]
2019/12/06 06:04:32 [INFO]  raft: Node at 127.0.0.1:14506 [Follower] entering Follower state (Leader: "")
TestBindingRuleListCommand - 2019/12/06 06:04:32.566480 [INFO] serf: EventMemberJoin: Node 11b92520-8131-5246-7d52-a95ee8d94c8c.dc1 127.0.0.1
TestBindingRuleListCommand - 2019/12/06 06:04:32.571203 [INFO] serf: EventMemberJoin: Node 11b92520-8131-5246-7d52-a95ee8d94c8c 127.0.0.1
TestBindingRuleListCommand - 2019/12/06 06:04:32.572870 [INFO] consul: Adding LAN server Node 11b92520-8131-5246-7d52-a95ee8d94c8c (Addr: tcp/127.0.0.1:14506) (DC: dc1)
TestBindingRuleListCommand - 2019/12/06 06:04:32.573651 [INFO] consul: Handled member-join event for server "Node 11b92520-8131-5246-7d52-a95ee8d94c8c.dc1" in area "wan"
TestBindingRuleListCommand - 2019/12/06 06:04:32.576360 [INFO] agent: Started DNS server 127.0.0.1:14501 (udp)
TestBindingRuleListCommand - 2019/12/06 06:04:32.576706 [INFO] agent: Started DNS server 127.0.0.1:14501 (tcp)
TestBindingRuleListCommand - 2019/12/06 06:04:32.579804 [INFO] agent: Started HTTP server on 127.0.0.1:14502 (tcp)
TestBindingRuleListCommand - 2019/12/06 06:04:32.580011 [INFO] agent: started state syncer
2019/12/06 06:04:32 [WARN]  raft: Heartbeat timeout from "" reached, starting election
2019/12/06 06:04:32 [INFO]  raft: Node at 127.0.0.1:14506 [Candidate] entering Candidate state in term 2
2019/12/06 06:04:33 [INFO]  raft: Election won. Tally: 1
2019/12/06 06:04:33 [INFO]  raft: Node at 127.0.0.1:14506 [Leader] entering Leader state
TestBindingRuleListCommand - 2019/12/06 06:04:33.961367 [INFO] consul: cluster leadership acquired
TestBindingRuleListCommand - 2019/12/06 06:04:33.961979 [INFO] consul: New leader elected: Node 11b92520-8131-5246-7d52-a95ee8d94c8c
TestBindingRuleListCommand - 2019/12/06 06:04:34.031593 [ERR] agent: failed to sync remote state: ACL not found
TestBindingRuleListCommand - 2019/12/06 06:04:34.126155 [INFO] acl: initializing acls
TestBindingRuleListCommand - 2019/12/06 06:04:34.446525 [INFO] consul: Created ACL 'global-management' policy
TestBindingRuleListCommand - 2019/12/06 06:04:34.446634 [WARN] consul: Configuring a non-UUID master token is deprecated
TestBindingRuleListCommand - 2019/12/06 06:04:34.450315 [INFO] acl: initializing acls
TestBindingRuleListCommand - 2019/12/06 06:04:34.450473 [WARN] consul: Configuring a non-UUID master token is deprecated
TestBindingRuleListCommand - 2019/12/06 06:04:34.875497 [ERR] agent: failed to sync remote state: ACL not found
TestBindingRuleListCommand - 2019/12/06 06:04:36.038815 [INFO] consul: Bootstrapped ACL master token from configuration
TestBindingRuleListCommand - 2019/12/06 06:04:36.038960 [INFO] consul: Bootstrapped ACL master token from configuration
TestBindingRuleListCommand - 2019/12/06 06:04:36.500388 [INFO] consul: Created ACL anonymous token from configuration
TestBindingRuleListCommand - 2019/12/06 06:04:36.500713 [INFO] consul: Created ACL anonymous token from configuration
TestBindingRuleListCommand - 2019/12/06 06:04:36.500774 [DEBUG] acl: transitioning out of legacy ACL mode
TestBindingRuleListCommand - 2019/12/06 06:04:36.501607 [INFO] serf: EventMemberUpdate: Node 11b92520-8131-5246-7d52-a95ee8d94c8c
TestBindingRuleListCommand - 2019/12/06 06:04:36.502200 [INFO] serf: EventMemberUpdate: Node 11b92520-8131-5246-7d52-a95ee8d94c8c.dc1
TestBindingRuleListCommand - 2019/12/06 06:04:36.503342 [INFO] serf: EventMemberUpdate: Node 11b92520-8131-5246-7d52-a95ee8d94c8c
TestBindingRuleListCommand - 2019/12/06 06:04:36.504165 [INFO] serf: EventMemberUpdate: Node 11b92520-8131-5246-7d52-a95ee8d94c8c.dc1
TestBindingRuleListCommand - 2019/12/06 06:04:38.020387 [INFO] connect: initialized primary datacenter CA with provider "consul"
TestBindingRuleListCommand - 2019/12/06 06:04:38.023767 [DEBUG] consul: Skipping self join check for "Node 11b92520-8131-5246-7d52-a95ee8d94c8c" since the cluster is too small
TestBindingRuleListCommand - 2019/12/06 06:04:38.023915 [INFO] consul: member 'Node 11b92520-8131-5246-7d52-a95ee8d94c8c' joined, marking health alive
TestBindingRuleListCommand - 2019/12/06 06:04:38.321852 [DEBUG] consul: Skipping self join check for "Node 11b92520-8131-5246-7d52-a95ee8d94c8c" since the cluster is too small
TestBindingRuleListCommand - 2019/12/06 06:04:38.322381 [DEBUG] consul: Skipping self join check for "Node 11b92520-8131-5246-7d52-a95ee8d94c8c" since the cluster is too small
TestBindingRuleListCommand - 2019/12/06 06:04:38.705268 [DEBUG] http: Request PUT /v1/acl/auth-method (378.020425ms) from=127.0.0.1:40330
TestBindingRuleListCommand - 2019/12/06 06:04:38.970059 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
TestBindingRuleListCommand - 2019/12/06 06:04:38.975691 [DEBUG] http: Request PUT /v1/acl/auth-method (267.030853ms) from=127.0.0.1:40330
TestBindingRuleListCommand - 2019/12/06 06:04:38.983100 [DEBUG] acl: updating cached auth method validator for "test-1"
TestBindingRuleListCommand - 2019/12/06 06:04:39.296894 [DEBUG] http: Request PUT /v1/acl/binding-rule (314.94363ms) from=127.0.0.1:40330
TestBindingRuleListCommand - 2019/12/06 06:04:39.301721 [DEBUG] acl: updating cached auth method validator for "test-2"
TestBindingRuleListCommand - 2019/12/06 06:04:39.630321 [DEBUG] http: Request PUT /v1/acl/binding-rule (329.685638ms) from=127.0.0.1:40330
TestBindingRuleListCommand - 2019/12/06 06:04:40.223833 [DEBUG] http: Request PUT /v1/acl/binding-rule (589.225984ms) from=127.0.0.1:40330
TestBindingRuleListCommand - 2019/12/06 06:04:40.630699 [DEBUG] http: Request PUT /v1/acl/binding-rule (401.708307ms) from=127.0.0.1:40330
TestBindingRuleListCommand - 2019/12/06 06:04:40.957531 [DEBUG] http: Request PUT /v1/acl/binding-rule (321.387779ms) from=127.0.0.1:40330
TestBindingRuleListCommand - 2019/12/06 06:04:41.446383 [DEBUG] http: Request PUT /v1/acl/binding-rule (484.316553ms) from=127.0.0.1:40330
TestBindingRuleListCommand - 2019/12/06 06:04:41.798741 [DEBUG] http: Request PUT /v1/acl/binding-rule (349.169089ms) from=127.0.0.1:40330
TestBindingRuleListCommand - 2019/12/06 06:04:42.162874 [DEBUG] http: Request PUT /v1/acl/binding-rule (358.523306ms) from=127.0.0.1:40330
TestBindingRuleListCommand - 2019/12/06 06:04:42.379458 [DEBUG] http: Request PUT /v1/acl/binding-rule (210.998888ms) from=127.0.0.1:40330
TestBindingRuleListCommand - 2019/12/06 06:04:42.570481 [DEBUG] http: Request PUT /v1/acl/binding-rule (187.934354ms) from=127.0.0.1:40330
=== RUN   TestBindingRuleListCommand/normal
TestBindingRuleListCommand - 2019/12/06 06:04:42.581660 [DEBUG] http: Request GET /v1/acl/binding-rules (2.813399ms) from=127.0.0.1:40332
=== RUN   TestBindingRuleListCommand/filter_by_method_1
TestBindingRuleListCommand - 2019/12/06 06:04:42.593221 [DEBUG] http: Request GET /v1/acl/binding-rules?authmethod=test-1 (2.400722ms) from=127.0.0.1:40334
=== RUN   TestBindingRuleListCommand/filter_by_method_2
TestBindingRuleListCommand - 2019/12/06 06:04:42.606376 [DEBUG] http: Request GET /v1/acl/binding-rules?authmethod=test-2 (1.751041ms) from=127.0.0.1:40336
TestBindingRuleListCommand - 2019/12/06 06:04:42.611176 [INFO] agent: Requesting shutdown
TestBindingRuleListCommand - 2019/12/06 06:04:42.611277 [INFO] consul: shutting down server
TestBindingRuleListCommand - 2019/12/06 06:04:42.614072 [WARN] serf: Shutdown without a Leave
TestBindingRuleListCommand - 2019/12/06 06:04:42.769791 [WARN] serf: Shutdown without a Leave
TestBindingRuleListCommand - 2019/12/06 06:04:42.844443 [INFO] manager: shutting down
TestBindingRuleListCommand - 2019/12/06 06:04:42.847155 [INFO] agent: consul server down
TestBindingRuleListCommand - 2019/12/06 06:04:42.847227 [INFO] agent: shutdown complete
TestBindingRuleListCommand - 2019/12/06 06:04:42.847288 [INFO] agent: Stopping DNS server 127.0.0.1:14501 (tcp)
TestBindingRuleListCommand - 2019/12/06 06:04:42.847458 [INFO] agent: Stopping DNS server 127.0.0.1:14501 (udp)
TestBindingRuleListCommand - 2019/12/06 06:04:42.847617 [INFO] agent: Stopping HTTP server 127.0.0.1:14502 (tcp)
TestBindingRuleListCommand - 2019/12/06 06:04:42.848761 [INFO] agent: Waiting for endpoints to shut down
TestBindingRuleListCommand - 2019/12/06 06:04:42.848878 [INFO] agent: Endpoints down
--- PASS: TestBindingRuleListCommand (11.86s)
    --- PASS: TestBindingRuleListCommand/normal (0.01s)
    --- PASS: TestBindingRuleListCommand/filter_by_method_1 (0.01s)
    --- PASS: TestBindingRuleListCommand/filter_by_method_2 (0.01s)
PASS
ok  	github.com/hashicorp/consul/command/acl/bindingrule/list	12.131s
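The list subtests differ only in the query string: an unfiltered GET /v1/acl/binding-rules for "normal" and GET /v1/acl/binding-rules?authmethod=test-1 or test-2 for the filtered cases, exactly as the request lines show. A small sketch of those two calls follows; the agent address is an assumption.

```go
// Sketch of the listing calls the list subtests make (paths taken from
// the request lines above); the agent address is an assumption.
package main

import (
	"fmt"
	"io/ioutil"
	"net/http"
	"net/url"
)

func listBindingRules(agent, authMethod string) (string, error) {
	u := agent + "/v1/acl/binding-rules"
	if authMethod != "" {
		u += "?authmethod=" + url.QueryEscape(authMethod) // filter_by_method_* cases
	}
	resp, err := http.Get(u)
	if err != nil {
		return "", err
	}
	defer resp.Body.Close()
	body, err := ioutil.ReadAll(resp.Body)
	return string(body), err
}

func main() {
	all, _ := listBindingRules("http://127.0.0.1:8500", "")       // "normal"
	one, _ := listBindingRules("http://127.0.0.1:8500", "test-1") // "filter_by_method_1"
	fmt.Println(len(all), len(one))
}
```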
=== RUN   TestBindingRuleReadCommand_noTabs
=== PAUSE TestBindingRuleReadCommand_noTabs
=== RUN   TestBindingRuleReadCommand
=== PAUSE TestBindingRuleReadCommand
=== CONT  TestBindingRuleReadCommand_noTabs
=== CONT  TestBindingRuleReadCommand
--- PASS: TestBindingRuleReadCommand_noTabs (0.00s)
WARNING: bootstrap = true: do not enable unless necessary
TestBindingRuleReadCommand - 2019/12/06 06:04:42.055772 [WARN] agent: Node name "Node 598adfe2-1684-462b-726f-13ebecdd5f95" will not be discoverable via DNS due to invalid characters. Valid characters include all alpha-numerics and dashes.
TestBindingRuleReadCommand - 2019/12/06 06:04:42.058289 [DEBUG] tlsutil: Update with version 1
TestBindingRuleReadCommand - 2019/12/06 06:04:42.074253 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
2019/12/06 06:04:43 [INFO]  raft: Initial configuration (index=1): [{Suffrage:Voter ID:598adfe2-1684-462b-726f-13ebecdd5f95 Address:127.0.0.1:20506}]
2019/12/06 06:04:43 [INFO]  raft: Node at 127.0.0.1:20506 [Follower] entering Follower state (Leader: "")
TestBindingRuleReadCommand - 2019/12/06 06:04:43.168648 [INFO] serf: EventMemberJoin: Node 598adfe2-1684-462b-726f-13ebecdd5f95.dc1 127.0.0.1
TestBindingRuleReadCommand - 2019/12/06 06:04:43.172836 [INFO] serf: EventMemberJoin: Node 598adfe2-1684-462b-726f-13ebecdd5f95 127.0.0.1
TestBindingRuleReadCommand - 2019/12/06 06:04:43.173561 [INFO] consul: Adding LAN server Node 598adfe2-1684-462b-726f-13ebecdd5f95 (Addr: tcp/127.0.0.1:20506) (DC: dc1)
TestBindingRuleReadCommand - 2019/12/06 06:04:43.174593 [INFO] consul: Handled member-join event for server "Node 598adfe2-1684-462b-726f-13ebecdd5f95.dc1" in area "wan"
TestBindingRuleReadCommand - 2019/12/06 06:04:43.176160 [INFO] agent: Started DNS server 127.0.0.1:20501 (tcp)
TestBindingRuleReadCommand - 2019/12/06 06:04:43.176425 [INFO] agent: Started DNS server 127.0.0.1:20501 (udp)
TestBindingRuleReadCommand - 2019/12/06 06:04:43.180936 [INFO] agent: Started HTTP server on 127.0.0.1:20502 (tcp)
TestBindingRuleReadCommand - 2019/12/06 06:04:43.181116 [INFO] agent: started state syncer
2019/12/06 06:04:43 [WARN]  raft: Heartbeat timeout from "" reached, starting election
2019/12/06 06:04:43 [INFO]  raft: Node at 127.0.0.1:20506 [Candidate] entering Candidate state in term 2
2019/12/06 06:04:44 [INFO]  raft: Election won. Tally: 1
2019/12/06 06:04:44 [INFO]  raft: Node at 127.0.0.1:20506 [Leader] entering Leader state
TestBindingRuleReadCommand - 2019/12/06 06:04:44.939135 [INFO] consul: cluster leadership acquired
TestBindingRuleReadCommand - 2019/12/06 06:04:44.939882 [INFO] consul: New leader elected: Node 598adfe2-1684-462b-726f-13ebecdd5f95
TestBindingRuleReadCommand - 2019/12/06 06:04:45.046037 [ERR] agent: failed to sync remote state: ACL not found
TestBindingRuleReadCommand - 2019/12/06 06:04:45.344821 [INFO] acl: initializing acls
TestBindingRuleReadCommand - 2019/12/06 06:04:45.538349 [INFO] consul: Created ACL 'global-management' policy
TestBindingRuleReadCommand - 2019/12/06 06:04:45.538438 [WARN] consul: Configuring a non-UUID master token is deprecated
TestBindingRuleReadCommand - 2019/12/06 06:04:45.795410 [INFO] consul: Bootstrapped ACL master token from configuration
TestBindingRuleReadCommand - 2019/12/06 06:04:46.070767 [INFO] consul: Created ACL anonymous token from configuration
TestBindingRuleReadCommand - 2019/12/06 06:04:46.071634 [INFO] serf: EventMemberUpdate: Node 598adfe2-1684-462b-726f-13ebecdd5f95
TestBindingRuleReadCommand - 2019/12/06 06:04:46.072266 [INFO] serf: EventMemberUpdate: Node 598adfe2-1684-462b-726f-13ebecdd5f95.dc1
TestBindingRuleReadCommand - 2019/12/06 06:04:47.380254 [INFO] connect: initialized primary datacenter CA with provider "consul"
TestBindingRuleReadCommand - 2019/12/06 06:04:47.544519 [DEBUG] consul: Skipping self join check for "Node 598adfe2-1684-462b-726f-13ebecdd5f95" since the cluster is too small
TestBindingRuleReadCommand - 2019/12/06 06:04:47.544811 [INFO] consul: member 'Node 598adfe2-1684-462b-726f-13ebecdd5f95' joined, marking health alive
TestBindingRuleReadCommand - 2019/12/06 06:04:47.548095 [INFO] agent: Synced node info
TestBindingRuleReadCommand - 2019/12/06 06:04:47.548233 [DEBUG] agent: Node info in sync
TestBindingRuleReadCommand - 2019/12/06 06:04:48.032680 [DEBUG] http: Request PUT /v1/acl/auth-method (474.112651ms) from=127.0.0.1:47990
TestBindingRuleReadCommand - 2019/12/06 06:04:48.034774 [DEBUG] consul: Skipping self join check for "Node 598adfe2-1684-462b-726f-13ebecdd5f95" since the cluster is too small
=== RUN   TestBindingRuleReadCommand/id_required
=== RUN   TestBindingRuleReadCommand/read_by_id_not_found
TestBindingRuleReadCommand - 2019/12/06 06:04:48.043482 [DEBUG] http: Request GET /v1/acl/binding-rule/7be85858-13b2-b4d6-2879-b9e271b222a6 (675.016µs) from=127.0.0.1:47992
=== RUN   TestBindingRuleReadCommand/read_by_id
TestBindingRuleReadCommand - 2019/12/06 06:04:48.047701 [DEBUG] acl: updating cached auth method validator for "test"
TestBindingRuleReadCommand - 2019/12/06 06:04:48.296885 [DEBUG] http: Request PUT /v1/acl/binding-rule (250.177463ms) from=127.0.0.1:47990
TestBindingRuleReadCommand - 2019/12/06 06:04:48.297194 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
TestBindingRuleReadCommand - 2019/12/06 06:04:48.313371 [DEBUG] http: Request GET /v1/acl/binding-rule/174da741-51a8-7134-f06c-ec6e48c37109 (1.251029ms) from=127.0.0.1:47994
=== RUN   TestBindingRuleReadCommand/read_by_id_prefix
TestBindingRuleReadCommand - 2019/12/06 06:04:48.554063 [DEBUG] http: Request PUT /v1/acl/binding-rule (227.820945ms) from=127.0.0.1:47990
TestBindingRuleReadCommand - 2019/12/06 06:04:48.563806 [DEBUG] http: Request GET /v1/acl/binding-rules (1.590371ms) from=127.0.0.1:47996
TestBindingRuleReadCommand - 2019/12/06 06:04:48.570206 [DEBUG] http: Request GET /v1/acl/binding-rule/f164b3c2-1c13-ab33-9932-ccf05035bed0 (2.200385ms) from=127.0.0.1:47996
TestBindingRuleReadCommand - 2019/12/06 06:04:48.572495 [INFO] agent: Requesting shutdown
TestBindingRuleReadCommand - 2019/12/06 06:04:48.572585 [INFO] consul: shutting down server
TestBindingRuleReadCommand - 2019/12/06 06:04:48.572631 [WARN] serf: Shutdown without a Leave
TestBindingRuleReadCommand - 2019/12/06 06:04:48.761172 [WARN] serf: Shutdown without a Leave
TestBindingRuleReadCommand - 2019/12/06 06:04:48.836134 [INFO] manager: shutting down
TestBindingRuleReadCommand - 2019/12/06 06:04:48.837104 [INFO] agent: consul server down
TestBindingRuleReadCommand - 2019/12/06 06:04:48.837167 [INFO] agent: shutdown complete
TestBindingRuleReadCommand - 2019/12/06 06:04:48.837249 [INFO] agent: Stopping DNS server 127.0.0.1:20501 (tcp)
TestBindingRuleReadCommand - 2019/12/06 06:04:48.837409 [INFO] agent: Stopping DNS server 127.0.0.1:20501 (udp)
TestBindingRuleReadCommand - 2019/12/06 06:04:48.837567 [INFO] agent: Stopping HTTP server 127.0.0.1:20502 (tcp)
TestBindingRuleReadCommand - 2019/12/06 06:04:48.838449 [INFO] agent: Waiting for endpoints to shut down
TestBindingRuleReadCommand - 2019/12/06 06:04:48.838577 [INFO] agent: Endpoints down
--- PASS: TestBindingRuleReadCommand (6.87s)
    --- PASS: TestBindingRuleReadCommand/id_required (0.00s)
    --- PASS: TestBindingRuleReadCommand/read_by_id_not_found (0.01s)
    --- PASS: TestBindingRuleReadCommand/read_by_id (0.28s)
    --- PASS: TestBindingRuleReadCommand/read_by_id_prefix (0.25s)
PASS
ok  	github.com/hashicorp/consul/command/acl/bindingrule/read	7.140s
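The read subtests hit GET /v1/acl/binding-rule/<id> directly (returning quickly for the unknown UUID in "read_by_id_not_found") and otherwise resolve prefixes via the same listing used by the delete command. A sketch of the direct read is below; the struct field names (ID, Description, AuthMethod, BindType, BindName, Selector) and the non-200 handling are assumptions for illustration.

```go
// Sketch of the direct read the read subtests perform:
// GET /v1/acl/binding-rule/<id> (path as in the log). The struct field
// names and error handling are assumptions for illustration only.
package main

import (
	"encoding/json"
	"fmt"
	"net/http"
)

type bindingRule struct {
	ID          string
	Description string
	AuthMethod  string
	BindType    string
	BindName    string
	Selector    string
}

func readBindingRule(agent, id string) (*bindingRule, error) {
	resp, err := http.Get(agent + "/v1/acl/binding-rule/" + id)
	if err != nil {
		return nil, err
	}
	defer resp.Body.Close()
	if resp.StatusCode != http.StatusOK {
		// mirrors the "read_by_id_not_found" case
		return nil, fmt.Errorf("binding rule %q not found (HTTP %s)", id, resp.Status)
	}
	var r bindingRule
	if err := json.NewDecoder(resp.Body).Decode(&r); err != nil {
		return nil, err
	}
	return &r, nil
}

func main() {
	// ID taken from the "read_by_id" request line above, purely as an example.
	r, err := readBindingRule("http://127.0.0.1:8500", "174da741-51a8-7134-f06c-ec6e48c37109")
	if err != nil {
		fmt.Println("error:", err)
		return
	}
	fmt.Printf("%+v\n", *r)
}
```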
=== RUN   TestBindingRuleUpdateCommand_noTabs
=== PAUSE TestBindingRuleUpdateCommand_noTabs
=== RUN   TestBindingRuleUpdateCommand
=== PAUSE TestBindingRuleUpdateCommand
=== RUN   TestBindingRuleUpdateCommand_noMerge
=== PAUSE TestBindingRuleUpdateCommand_noMerge
=== CONT  TestBindingRuleUpdateCommand_noTabs
=== CONT  TestBindingRuleUpdateCommand_noMerge
=== CONT  TestBindingRuleUpdateCommand
--- PASS: TestBindingRuleUpdateCommand_noTabs (0.01s)
WARNING: bootstrap = true: do not enable unless necessary
TestBindingRuleUpdateCommand_noMerge - 2019/12/06 06:04:51.943925 [WARN] agent: Node name "Node bdd0d84e-8270-1e56-c9a0-9c500a6262d7" will not be discoverable via DNS due to invalid characters. Valid characters include all alpha-numerics and dashes.
WARNING: bootstrap = true: do not enable unless necessary
TestBindingRuleUpdateCommand_noMerge - 2019/12/06 06:04:51.949857 [DEBUG] tlsutil: Update with version 1
TestBindingRuleUpdateCommand - 2019/12/06 06:04:51.952071 [WARN] agent: Node name "Node 58430de2-9568-8dcb-9591-bcb2872cbb26" will not be discoverable via DNS due to invalid characters. Valid characters include all alpha-numerics and dashes.
TestBindingRuleUpdateCommand - 2019/12/06 06:04:51.952701 [DEBUG] tlsutil: Update with version 1
TestBindingRuleUpdateCommand_noMerge - 2019/12/06 06:04:51.961128 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
TestBindingRuleUpdateCommand - 2019/12/06 06:04:51.962529 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
2019/12/06 06:04:52 [INFO]  raft: Initial configuration (index=1): [{Suffrage:Voter ID:58430de2-9568-8dcb-9591-bcb2872cbb26 Address:127.0.0.1:26512}]
2019/12/06 06:04:52 [INFO]  raft: Node at 127.0.0.1:26512 [Follower] entering Follower state (Leader: "")
2019/12/06 06:04:52 [INFO]  raft: Initial configuration (index=1): [{Suffrage:Voter ID:bdd0d84e-8270-1e56-c9a0-9c500a6262d7 Address:127.0.0.1:26506}]
TestBindingRuleUpdateCommand - 2019/12/06 06:04:52.817545 [INFO] serf: EventMemberJoin: Node 58430de2-9568-8dcb-9591-bcb2872cbb26.dc1 127.0.0.1
2019/12/06 06:04:52 [INFO]  raft: Node at 127.0.0.1:26506 [Follower] entering Follower state (Leader: "")
TestBindingRuleUpdateCommand_noMerge - 2019/12/06 06:04:52.834515 [INFO] serf: EventMemberJoin: Node bdd0d84e-8270-1e56-c9a0-9c500a6262d7.dc1 127.0.0.1
TestBindingRuleUpdateCommand_noMerge - 2019/12/06 06:04:52.849027 [INFO] serf: EventMemberJoin: Node bdd0d84e-8270-1e56-c9a0-9c500a6262d7 127.0.0.1
TestBindingRuleUpdateCommand - 2019/12/06 06:04:52.851588 [INFO] serf: EventMemberJoin: Node 58430de2-9568-8dcb-9591-bcb2872cbb26 127.0.0.1
TestBindingRuleUpdateCommand_noMerge - 2019/12/06 06:04:52.851654 [INFO] consul: Adding LAN server Node bdd0d84e-8270-1e56-c9a0-9c500a6262d7 (Addr: tcp/127.0.0.1:26506) (DC: dc1)
TestBindingRuleUpdateCommand_noMerge - 2019/12/06 06:04:52.853468 [INFO] agent: Started DNS server 127.0.0.1:26501 (udp)
TestBindingRuleUpdateCommand - 2019/12/06 06:04:52.853469 [INFO] agent: Started DNS server 127.0.0.1:26507 (udp)
TestBindingRuleUpdateCommand - 2019/12/06 06:04:52.853915 [INFO] consul: Adding LAN server Node 58430de2-9568-8dcb-9591-bcb2872cbb26 (Addr: tcp/127.0.0.1:26512) (DC: dc1)
TestBindingRuleUpdateCommand - 2019/12/06 06:04:52.854272 [INFO] consul: Handled member-join event for server "Node 58430de2-9568-8dcb-9591-bcb2872cbb26.dc1" in area "wan"
TestBindingRuleUpdateCommand - 2019/12/06 06:04:52.854640 [INFO] agent: Started DNS server 127.0.0.1:26507 (tcp)
TestBindingRuleUpdateCommand_noMerge - 2019/12/06 06:04:52.855397 [INFO] agent: Started DNS server 127.0.0.1:26501 (tcp)
TestBindingRuleUpdateCommand_noMerge - 2019/12/06 06:04:52.856546 [INFO] consul: Handled member-join event for server "Node bdd0d84e-8270-1e56-c9a0-9c500a6262d7.dc1" in area "wan"
TestBindingRuleUpdateCommand - 2019/12/06 06:04:52.857707 [INFO] agent: Started HTTP server on 127.0.0.1:26508 (tcp)
TestBindingRuleUpdateCommand - 2019/12/06 06:04:52.857857 [INFO] agent: started state syncer
TestBindingRuleUpdateCommand_noMerge - 2019/12/06 06:04:52.857707 [INFO] agent: Started HTTP server on 127.0.0.1:26502 (tcp)
TestBindingRuleUpdateCommand_noMerge - 2019/12/06 06:04:52.859749 [INFO] agent: started state syncer
2019/12/06 06:04:52 [WARN]  raft: Heartbeat timeout from "" reached, starting election
2019/12/06 06:04:52 [INFO]  raft: Node at 127.0.0.1:26512 [Candidate] entering Candidate state in term 2
2019/12/06 06:04:52 [WARN]  raft: Heartbeat timeout from "" reached, starting election
2019/12/06 06:04:52 [INFO]  raft: Node at 127.0.0.1:26506 [Candidate] entering Candidate state in term 2
2019/12/06 06:04:53 [INFO]  raft: Election won. Tally: 1
2019/12/06 06:04:53 [INFO]  raft: Node at 127.0.0.1:26512 [Leader] entering Leader state
2019/12/06 06:04:53 [INFO]  raft: Election won. Tally: 1
2019/12/06 06:04:53 [INFO]  raft: Node at 127.0.0.1:26506 [Leader] entering Leader state
TestBindingRuleUpdateCommand - 2019/12/06 06:04:53.423087 [INFO] consul: cluster leadership acquired
TestBindingRuleUpdateCommand - 2019/12/06 06:04:53.423596 [INFO] consul: New leader elected: Node 58430de2-9568-8dcb-9591-bcb2872cbb26
TestBindingRuleUpdateCommand_noMerge - 2019/12/06 06:04:53.423924 [INFO] consul: cluster leadership acquired
TestBindingRuleUpdateCommand_noMerge - 2019/12/06 06:04:53.424381 [INFO] consul: New leader elected: Node bdd0d84e-8270-1e56-c9a0-9c500a6262d7
TestBindingRuleUpdateCommand - 2019/12/06 06:04:53.590142 [ERR] agent: failed to sync remote state: ACL not found
TestBindingRuleUpdateCommand_noMerge - 2019/12/06 06:04:53.605517 [INFO] acl: initializing acls
TestBindingRuleUpdateCommand - 2019/12/06 06:04:53.605554 [INFO] acl: initializing acls
TestBindingRuleUpdateCommand_noMerge - 2019/12/06 06:04:53.737186 [ERR] agent: failed to sync remote state: ACL not found
TestBindingRuleUpdateCommand - 2019/12/06 06:04:53.970317 [INFO] consul: Created ACL 'global-management' policy
TestBindingRuleUpdateCommand - 2019/12/06 06:04:53.970412 [WARN] consul: Configuring a non-UUID master token is deprecated
TestBindingRuleUpdateCommand - 2019/12/06 06:04:53.973597 [INFO] acl: initializing acls
TestBindingRuleUpdateCommand - 2019/12/06 06:04:53.973717 [WARN] consul: Configuring a non-UUID master token is deprecated
TestBindingRuleUpdateCommand_noMerge - 2019/12/06 06:04:53.973816 [INFO] consul: Created ACL 'global-management' policy
TestBindingRuleUpdateCommand_noMerge - 2019/12/06 06:04:53.973874 [WARN] consul: Configuring a non-UUID master token is deprecated
TestBindingRuleUpdateCommand_noMerge - 2019/12/06 06:04:53.976135 [INFO] acl: initializing acls
TestBindingRuleUpdateCommand_noMerge - 2019/12/06 06:04:53.976285 [WARN] consul: Configuring a non-UUID master token is deprecated
TestBindingRuleUpdateCommand_noMerge - 2019/12/06 06:04:54.462745 [INFO] consul: Bootstrapped ACL master token from configuration
TestBindingRuleUpdateCommand_noMerge - 2019/12/06 06:04:54.463897 [INFO] consul: Bootstrapped ACL master token from configuration
TestBindingRuleUpdateCommand - 2019/12/06 06:04:54.465369 [INFO] consul: Bootstrapped ACL master token from configuration
TestBindingRuleUpdateCommand - 2019/12/06 06:04:54.470718 [INFO] consul: Bootstrapped ACL master token from configuration
TestBindingRuleUpdateCommand_noMerge - 2019/12/06 06:04:54.871465 [ERR] agent: failed to sync remote state: ACL not found
TestBindingRuleUpdateCommand - 2019/12/06 06:04:54.879948 [INFO] consul: Created ACL anonymous token from configuration
TestBindingRuleUpdateCommand - 2019/12/06 06:04:54.880074 [DEBUG] acl: transitioning out of legacy ACL mode
TestBindingRuleUpdateCommand - 2019/12/06 06:04:54.880958 [INFO] serf: EventMemberUpdate: Node 58430de2-9568-8dcb-9591-bcb2872cbb26
TestBindingRuleUpdateCommand - 2019/12/06 06:04:54.881712 [INFO] serf: EventMemberUpdate: Node 58430de2-9568-8dcb-9591-bcb2872cbb26.dc1
TestBindingRuleUpdateCommand_noMerge - 2019/12/06 06:04:54.883632 [INFO] consul: Created ACL anonymous token from configuration
TestBindingRuleUpdateCommand_noMerge - 2019/12/06 06:04:54.884074 [INFO] consul: Created ACL anonymous token from configuration
TestBindingRuleUpdateCommand_noMerge - 2019/12/06 06:04:54.884132 [DEBUG] acl: transitioning out of legacy ACL mode
TestBindingRuleUpdateCommand_noMerge - 2019/12/06 06:04:54.884536 [INFO] serf: EventMemberUpdate: Node bdd0d84e-8270-1e56-c9a0-9c500a6262d7
TestBindingRuleUpdateCommand - 2019/12/06 06:04:54.881858 [INFO] consul: Created ACL anonymous token from configuration
TestBindingRuleUpdateCommand_noMerge - 2019/12/06 06:04:54.885232 [INFO] serf: EventMemberUpdate: Node bdd0d84e-8270-1e56-c9a0-9c500a6262d7.dc1
TestBindingRuleUpdateCommand - 2019/12/06 06:04:54.885941 [INFO] serf: EventMemberUpdate: Node 58430de2-9568-8dcb-9591-bcb2872cbb26
TestBindingRuleUpdateCommand - 2019/12/06 06:04:54.886607 [INFO] serf: EventMemberUpdate: Node 58430de2-9568-8dcb-9591-bcb2872cbb26.dc1
TestBindingRuleUpdateCommand_noMerge - 2019/12/06 06:04:54.890000 [INFO] serf: EventMemberUpdate: Node bdd0d84e-8270-1e56-c9a0-9c500a6262d7
TestBindingRuleUpdateCommand_noMerge - 2019/12/06 06:04:54.890835 [INFO] serf: EventMemberUpdate: Node bdd0d84e-8270-1e56-c9a0-9c500a6262d7.dc1
TestBindingRuleUpdateCommand_noMerge - 2019/12/06 06:04:56.112301 [INFO] connect: initialized primary datacenter CA with provider "consul"
TestBindingRuleUpdateCommand_noMerge - 2019/12/06 06:04:56.112812 [DEBUG] consul: Skipping self join check for "Node bdd0d84e-8270-1e56-c9a0-9c500a6262d7" since the cluster is too small
TestBindingRuleUpdateCommand_noMerge - 2019/12/06 06:04:56.112923 [INFO] consul: member 'Node bdd0d84e-8270-1e56-c9a0-9c500a6262d7' joined, marking health alive
TestBindingRuleUpdateCommand - 2019/12/06 06:04:56.113539 [INFO] agent: Synced node info
TestBindingRuleUpdateCommand - 2019/12/06 06:04:56.113719 [DEBUG] agent: Node info in sync
TestBindingRuleUpdateCommand_noMerge - 2019/12/06 06:04:56.313914 [DEBUG] consul: Skipping self join check for "Node bdd0d84e-8270-1e56-c9a0-9c500a6262d7" since the cluster is too small
TestBindingRuleUpdateCommand_noMerge - 2019/12/06 06:04:56.314856 [DEBUG] consul: Skipping self join check for "Node bdd0d84e-8270-1e56-c9a0-9c500a6262d7" since the cluster is too small
TestBindingRuleUpdateCommand - 2019/12/06 06:04:56.315985 [INFO] connect: initialized primary datacenter CA with provider "consul"
TestBindingRuleUpdateCommand - 2019/12/06 06:04:56.520607 [DEBUG] consul: Skipping self join check for "Node 58430de2-9568-8dcb-9591-bcb2872cbb26" since the cluster is too small
TestBindingRuleUpdateCommand - 2019/12/06 06:04:56.520849 [INFO] consul: member 'Node 58430de2-9568-8dcb-9591-bcb2872cbb26' joined, marking health alive
TestBindingRuleUpdateCommand - 2019/12/06 06:04:56.522326 [DEBUG] http: Request PUT /v1/acl/auth-method (395.09682ms) from=127.0.0.1:57070
=== RUN   TestBindingRuleUpdateCommand/rule_id_required
=== RUN   TestBindingRuleUpdateCommand/rule_id_partial_matches_nothing
TestBindingRuleUpdateCommand - 2019/12/06 06:04:56.562385 [DEBUG] http: Request GET /v1/acl/binding-rules (2.13805ms) from=127.0.0.1:57074
=== RUN   TestBindingRuleUpdateCommand/rule_id_exact_match_doesn't_exist
TestBindingRuleUpdateCommand - 2019/12/06 06:04:56.569392 [DEBUG] http: Request GET /v1/acl/binding-rule/78f5b473-7c93-7473-4c7b-be9dec24f64a (751.684µs) from=127.0.0.1:57076
=== RUN   TestBindingRuleUpdateCommand/rule_id_partial_matches_multiple
TestBindingRuleUpdateCommand - 2019/12/06 06:04:56.574066 [DEBUG] http: Request GET /v1/acl/binding-rules?authmethod=test (1.952712ms) from=127.0.0.1:57070
TestBindingRuleUpdateCommand - 2019/12/06 06:04:56.577819 [DEBUG] acl: updating cached auth method validator for "test"
TestBindingRuleUpdateCommand_noMerge - 2019/12/06 06:04:56.646381 [DEBUG] http: Request PUT /v1/acl/auth-method (317.005345ms) from=127.0.0.1:48616
=== RUN   TestBindingRuleUpdateCommand_noMerge/rule_id_required
=== RUN   TestBindingRuleUpdateCommand_noMerge/rule_id_partial_matches_nothing
TestBindingRuleUpdateCommand_noMerge - 2019/12/06 06:04:56.706605 [DEBUG] http: Request GET /v1/acl/binding-rules (1.332031ms) from=127.0.0.1:48622
=== RUN   TestBindingRuleUpdateCommand_noMerge/rule_id_exact_match_doesn't_exist
TestBindingRuleUpdateCommand_noMerge - 2019/12/06 06:04:56.719672 [DEBUG] http: Request GET /v1/acl/binding-rule/6d0c1a3c-882c-451f-6c8c-d863dd1a5505 (778.018µs) from=127.0.0.1:48624
=== RUN   TestBindingRuleUpdateCommand_noMerge/rule_id_partial_matches_multiple
TestBindingRuleUpdateCommand_noMerge - 2019/12/06 06:04:56.724992 [DEBUG] http: Request GET /v1/acl/binding-rules?authmethod=test (1.360698ms) from=127.0.0.1:48616
TestBindingRuleUpdateCommand_noMerge - 2019/12/06 06:04:56.728879 [DEBUG] acl: updating cached auth method validator for "test"
TestBindingRuleUpdateCommand_noMerge - 2019/12/06 06:04:57.040076 [DEBUG] http: Request PUT /v1/acl/binding-rule (312.163233ms) from=127.0.0.1:48616
TestBindingRuleUpdateCommand - 2019/12/06 06:04:57.040786 [DEBUG] consul: Skipping self join check for "Node 58430de2-9568-8dcb-9591-bcb2872cbb26" since the cluster is too small
TestBindingRuleUpdateCommand - 2019/12/06 06:04:57.041448 [DEBUG] consul: Skipping self join check for "Node 58430de2-9568-8dcb-9591-bcb2872cbb26" since the cluster is too small
TestBindingRuleUpdateCommand_noMerge - 2019/12/06 06:04:57.047522 [DEBUG] http: Request GET /v1/acl/binding-rules?authmethod=test (1.461034ms) from=127.0.0.1:48616
TestBindingRuleUpdateCommand - 2019/12/06 06:04:57.044155 [DEBUG] http: Request PUT /v1/acl/binding-rule (467.495164ms) from=127.0.0.1:57070
TestBindingRuleUpdateCommand - 2019/12/06 06:04:57.053255 [DEBUG] http: Request GET /v1/acl/binding-rules?authmethod=test (1.070025ms) from=127.0.0.1:57070
TestBindingRuleUpdateCommand - 2019/12/06 06:04:57.215935 [DEBUG] http: Request PUT /v1/acl/binding-rule (156.632296ms) from=127.0.0.1:57070
TestBindingRuleUpdateCommand_noMerge - 2019/12/06 06:04:57.220885 [DEBUG] http: Request PUT /v1/acl/binding-rule (170.137942ms) from=127.0.0.1:48616
TestBindingRuleUpdateCommand - 2019/12/06 06:04:57.221868 [DEBUG] http: Request GET /v1/acl/binding-rules?authmethod=test (1.224029ms) from=127.0.0.1:57070
TestBindingRuleUpdateCommand_noMerge - 2019/12/06 06:04:57.225516 [DEBUG] http: Request GET /v1/acl/binding-rules?authmethod=test (1.007023ms) from=127.0.0.1:48616
TestBindingRuleUpdateCommand_noMerge - 2019/12/06 06:04:57.230310 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
TestBindingRuleUpdateCommand - 2019/12/06 06:04:57.455742 [DEBUG] http: Request PUT /v1/acl/binding-rule (231.202689ms) from=127.0.0.1:57070
TestBindingRuleUpdateCommand - 2019/12/06 06:04:57.456987 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
TestBindingRuleUpdateCommand - 2019/12/06 06:04:57.460067 [DEBUG] http: Request GET /v1/acl/binding-rules?authmethod=test (1.470034ms) from=127.0.0.1:57070
TestBindingRuleUpdateCommand_noMerge - 2019/12/06 06:04:57.467831 [DEBUG] http: Request PUT /v1/acl/binding-rule (239.497548ms) from=127.0.0.1:48616
TestBindingRuleUpdateCommand_noMerge - 2019/12/06 06:04:57.472157 [DEBUG] http: Request GET /v1/acl/binding-rules?authmethod=test (1.103026ms) from=127.0.0.1:48616
TestBindingRuleUpdateCommand - 2019/12/06 06:04:57.638558 [DEBUG] http: Request PUT /v1/acl/binding-rule (173.882362ms) from=127.0.0.1:57070
TestBindingRuleUpdateCommand_noMerge - 2019/12/06 06:04:57.638516 [DEBUG] http: Request PUT /v1/acl/binding-rule (162.633101ms) from=127.0.0.1:48616
TestBindingRuleUpdateCommand - 2019/12/06 06:04:57.644839 [DEBUG] http: Request GET /v1/acl/binding-rules?authmethod=test (1.30703ms) from=127.0.0.1:57070
TestBindingRuleUpdateCommand_noMerge - 2019/12/06 06:04:57.658908 [DEBUG] http: Request GET /v1/acl/binding-rules?authmethod=test (1.29803ms) from=127.0.0.1:48616
TestBindingRuleUpdateCommand - 2019/12/06 06:04:57.663727 [DEBUG] http: Request GET /v1/acl/binding-rules (1.476367ms) from=127.0.0.1:57082
=== RUN   TestBindingRuleUpdateCommand/must_use_roughly_valid_selector
TestBindingRuleUpdateCommand_noMerge - 2019/12/06 06:04:57.929529 [DEBUG] http: Request PUT /v1/acl/binding-rule (266.514842ms) from=127.0.0.1:48616
TestBindingRuleUpdateCommand - 2019/12/06 06:04:57.932548 [DEBUG] http: Request PUT /v1/acl/binding-rule (258.518322ms) from=127.0.0.1:57070
TestBindingRuleUpdateCommand - 2019/12/06 06:04:57.942463 [DEBUG] http: Request GET /v1/acl/binding-rule/ccd26afd-0ff7-1f67-5ac9-142f3cff2924 (1.307364ms) from=127.0.0.1:57084
TestBindingRuleUpdateCommand_noMerge - 2019/12/06 06:04:57.951836 [DEBUG] http: Request GET /v1/acl/binding-rules?authmethod=test (2.422723ms) from=127.0.0.1:48616
TestBindingRuleUpdateCommand - 2019/12/06 06:04:57.961802 [ERR] http: Request PUT /v1/acl/binding-rule/ccd26afd-0ff7-1f67-5ac9-142f3cff2924, error: invalid Binding Rule: Selector is invalid: 1:4 (3): no match found, expected: "!=", ".", "==", "[", [ \t\r\n] or [a-zA-Z0-9_] from=127.0.0.1:57084
TestBindingRuleUpdateCommand - 2019/12/06 06:04:57.962600 [DEBUG] http: Request PUT /v1/acl/binding-rule/ccd26afd-0ff7-1f67-5ac9-142f3cff2924 (11.67227ms) from=127.0.0.1:57084
=== RUN   TestBindingRuleUpdateCommand/update_all_fields
TestBindingRuleUpdateCommand - 2019/12/06 06:04:58.171991 [DEBUG] http: Request PUT /v1/acl/binding-rule (205.815101ms) from=127.0.0.1:57070
TestBindingRuleUpdateCommand_noMerge - 2019/12/06 06:04:58.171969 [DEBUG] http: Request PUT /v1/acl/binding-rule (213.781286ms) from=127.0.0.1:48616
TestBindingRuleUpdateCommand_noMerge - 2019/12/06 06:04:58.181690 [DEBUG] http: Request GET /v1/acl/binding-rules?authmethod=test (3.818422ms) from=127.0.0.1:48616
TestBindingRuleUpdateCommand - 2019/12/06 06:04:58.187471 [DEBUG] http: Request GET /v1/acl/binding-rule/9a4582ca-87e1-c6e8-51ca-67f910c681f7 (1.312697ms) from=127.0.0.1:57086
TestBindingRuleUpdateCommand_noMerge - 2019/12/06 06:04:58.345788 [DEBUG] http: Request PUT /v1/acl/binding-rule (159.406694ms) from=127.0.0.1:48616
TestBindingRuleUpdateCommand_noMerge - 2019/12/06 06:04:58.350008 [DEBUG] http: Request GET /v1/acl/binding-rules?authmethod=test (1.434367ms) from=127.0.0.1:48616
TestBindingRuleUpdateCommand - 2019/12/06 06:04:58.366255 [DEBUG] http: Request PUT /v1/acl/binding-rule/9a4582ca-87e1-c6e8-51ca-67f910c681f7 (174.218037ms) from=127.0.0.1:57086
TestBindingRuleUpdateCommand - 2019/12/06 06:04:58.381121 [DEBUG] http: Request GET /v1/acl/binding-rule/9a4582ca-87e1-c6e8-51ca-67f910c681f7 (11.747273ms) from=127.0.0.1:57070
=== RUN   TestBindingRuleUpdateCommand/update_all_fields_-_partial
TestBindingRuleUpdateCommand - 2019/12/06 06:04:58.386225 [DEBUG] http: Request GET /v1/acl/binding-rules?authmethod=test (1.144693ms) from=127.0.0.1:57070
TestBindingRuleUpdateCommand - 2019/12/06 06:04:58.615493 [DEBUG] http: Request DELETE /v1/acl/binding-rule/504e935e-99a0-7434-f6bd-0d8a4e2ac486 (225.626894ms) from=127.0.0.1:57070
TestBindingRuleUpdateCommand_noMerge - 2019/12/06 06:04:58.621568 [DEBUG] http: Request PUT /v1/acl/binding-rule (267.088188ms) from=127.0.0.1:48616
TestBindingRuleUpdateCommand_noMerge - 2019/12/06 06:04:58.628665 [DEBUG] http: Request GET /v1/acl/binding-rules?authmethod=test (1.385699ms) from=127.0.0.1:48616
TestBindingRuleUpdateCommand_noMerge - 2019/12/06 06:04:58.649828 [DEBUG] http: Request GET /v1/acl/binding-rules (1.445367ms) from=127.0.0.1:48634
=== RUN   TestBindingRuleUpdateCommand_noMerge/must_use_roughly_valid_selector
TestBindingRuleUpdateCommand - 2019/12/06 06:04:58.904778 [DEBUG] http: Request DELETE /v1/acl/binding-rule/514d14fa-ad1b-27e5-25fa-a59d7692b532 (277.925439ms) from=127.0.0.1:57088
TestBindingRuleUpdateCommand_noMerge - 2019/12/06 06:04:58.904995 [DEBUG] http: Request PUT /v1/acl/binding-rule (248.604427ms) from=127.0.0.1:48616
TestBindingRuleUpdateCommand_noMerge - 2019/12/06 06:04:58.915761 [DEBUG] http: Request GET /v1/acl/binding-rule/06ea3778-5df7-3dab-2088-a87c477e8c29 (1.226362ms) from=127.0.0.1:48638
TestBindingRuleUpdateCommand_noMerge - 2019/12/06 06:04:58.922322 [ERR] http: Request PUT /v1/acl/binding-rule/06ea3778-5df7-3dab-2088-a87c477e8c29, error: invalid Binding Rule: Selector is invalid: 1:4 (3): no match found, expected: "!=", ".", "==", "[", [ \t\r\n] or [a-zA-Z0-9_] from=127.0.0.1:48638
TestBindingRuleUpdateCommand_noMerge - 2019/12/06 06:04:58.922992 [DEBUG] http: Request PUT /v1/acl/binding-rule/06ea3778-5df7-3dab-2088-a87c477e8c29 (3.87409ms) from=127.0.0.1:48638
=== RUN   TestBindingRuleUpdateCommand_noMerge/update_all_fields
TestBindingRuleUpdateCommand_noMerge - 2019/12/06 06:04:59.154846 [DEBUG] http: Request PUT /v1/acl/binding-rule (228.750966ms) from=127.0.0.1:48616
TestBindingRuleUpdateCommand - 2019/12/06 06:04:59.160642 [DEBUG] http: Request DELETE /v1/acl/binding-rule/9a4582ca-87e1-c6e8-51ca-67f910c681f7 (249.227108ms) from=127.0.0.1:57092
TestBindingRuleUpdateCommand_noMerge - 2019/12/06 06:04:59.168236 [DEBUG] http: Request GET /v1/acl/binding-rule/d550c4fe-dd1d-4de7-db89-de516df40153 (2.894734ms) from=127.0.0.1:48640
TestBindingRuleUpdateCommand - 2019/12/06 06:04:59.397190 [DEBUG] http: Request DELETE /v1/acl/binding-rule/c947f47c-46e1-c12e-ce9a-97ee5e36a9e6 (230.503007ms) from=127.0.0.1:57098
TestBindingRuleUpdateCommand_noMerge - 2019/12/06 06:04:59.405026 [DEBUG] http: Request PUT /v1/acl/binding-rule/d550c4fe-dd1d-4de7-db89-de516df40153 (231.417695ms) from=127.0.0.1:48640
TestBindingRuleUpdateCommand_noMerge - 2019/12/06 06:04:59.409473 [DEBUG] http: Request GET /v1/acl/binding-rule/d550c4fe-dd1d-4de7-db89-de516df40153 (1.272029ms) from=127.0.0.1:48616
=== RUN   TestBindingRuleUpdateCommand_noMerge/update_all_fields_-_partial
TestBindingRuleUpdateCommand_noMerge - 2019/12/06 06:04:59.413895 [DEBUG] http: Request GET /v1/acl/binding-rules?authmethod=test (1.324364ms) from=127.0.0.1:48616
TestBindingRuleUpdateCommand_noMerge - 2019/12/06 06:04:59.637683 [DEBUG] http: Request DELETE /v1/acl/binding-rule/06ea3778-5df7-3dab-2088-a87c477e8c29 (220.00443ms) from=127.0.0.1:48616
TestBindingRuleUpdateCommand - 2019/12/06 06:04:59.639954 [DEBUG] http: Request DELETE /v1/acl/binding-rule/ccd26afd-0ff7-1f67-5ac9-142f3cff2924 (238.735864ms) from=127.0.0.1:57100
TestBindingRuleUpdateCommand_noMerge - 2019/12/06 06:04:59.804528 [DEBUG] http: Request DELETE /v1/acl/binding-rule/17774e6b-f53c-9535-dc2b-33bf96a1a348 (160.752058ms) from=127.0.0.1:48646
TestBindingRuleUpdateCommand - 2019/12/06 06:04:59.807619 [DEBUG] http: Request DELETE /v1/acl/binding-rule/d872a620-05c1-33c2-555a-bb443fefcc8a (159.548363ms) from=127.0.0.1:57104
TestBindingRuleUpdateCommand - 2019/12/06 06:04:59.963428 [DEBUG] http: Request PUT /v1/acl/binding-rule (148.681112ms) from=127.0.0.1:57108
TestBindingRuleUpdateCommand_noMerge - 2019/12/06 06:04:59.963441 [DEBUG] http: Request DELETE /v1/acl/binding-rule/45a14f41-5ea8-0f09-6147-cc37a623abb2 (151.687515ms) from=127.0.0.1:48650
TestBindingRuleUpdateCommand - 2019/12/06 06:04:59.972162 [DEBUG] http: Request GET /v1/acl/binding-rules (1.402699ms) from=127.0.0.1:57112
TestBindingRuleUpdateCommand - 2019/12/06 06:04:59.976524 [DEBUG] http: Request GET /v1/acl/binding-rule/53cf4647-8f2d-7442-8a69-5ac011ab84ad (1.013356ms) from=127.0.0.1:57112
TestBindingRuleUpdateCommand_noMerge - 2019/12/06 06:05:00.187932 [DEBUG] http: Request DELETE /v1/acl/binding-rule/63b92744-1a0a-eef7-c0c7-568e91e71dab (220.853451ms) from=127.0.0.1:48654
TestBindingRuleUpdateCommand - 2019/12/06 06:05:00.331201 [DEBUG] http: Request PUT /v1/acl/binding-rule/53cf4647-8f2d-7442-8a69-5ac011ab84ad (351.618813ms) from=127.0.0.1:57112
TestBindingRuleUpdateCommand - 2019/12/06 06:05:00.337921 [DEBUG] http: Request GET /v1/acl/binding-rule/53cf4647-8f2d-7442-8a69-5ac011ab84ad (961.356µs) from=127.0.0.1:57108
=== RUN   TestBindingRuleUpdateCommand/update_all_fields_but_description
TestBindingRuleUpdateCommand_noMerge - 2019/12/06 06:05:00.421147 [DEBUG] http: Request DELETE /v1/acl/binding-rule/a1f99280-a5fe-2d51-1659-ee69dd23630c (229.534318ms) from=127.0.0.1:48658
TestBindingRuleUpdateCommand - 2019/12/06 06:05:00.574709 [DEBUG] http: Request PUT /v1/acl/binding-rule (229.632987ms) from=127.0.0.1:57108
TestBindingRuleUpdateCommand - 2019/12/06 06:05:00.583154 [DEBUG] http: Request GET /v1/acl/binding-rule/5ed5c7d4-b546-15cf-cd16-fa43325f469e (1.691705ms) from=127.0.0.1:57118
TestBindingRuleUpdateCommand_noMerge - 2019/12/06 06:05:00.662630 [DEBUG] http: Request DELETE /v1/acl/binding-rule/c6d95e80-75d3-ff50-dde7-8d5b5cf853a0 (237.434835ms) from=127.0.0.1:48660
TestBindingRuleUpdateCommand_noMerge - 2019/12/06 06:05:00.954384 [DEBUG] http: Request DELETE /v1/acl/binding-rule/c7ea4f87-091f-ded5-316d-164d93b8a200 (287.913337ms) from=127.0.0.1:48664
TestBindingRuleUpdateCommand - 2019/12/06 06:05:00.956484 [DEBUG] http: Request PUT /v1/acl/binding-rule/5ed5c7d4-b546-15cf-cd16-fa43325f469e (369.805567ms) from=127.0.0.1:57118
TestBindingRuleUpdateCommand - 2019/12/06 06:05:00.971675 [DEBUG] http: Request GET /v1/acl/binding-rule/5ed5c7d4-b546-15cf-cd16-fa43325f469e (1.142693ms) from=127.0.0.1:57108
=== RUN   TestBindingRuleUpdateCommand/update_all_fields_but_bind_name
TestBindingRuleUpdateCommand_noMerge - 2019/12/06 06:05:01.322999 [DEBUG] http: Request DELETE /v1/acl/binding-rule/d4ec88ce-c83a-ed7d-be41-fd87395ced3d (356.962937ms) from=127.0.0.1:48666
TestBindingRuleUpdateCommand - 2019/12/06 06:05:01.323101 [DEBUG] http: Request PUT /v1/acl/binding-rule (344.365312ms) from=127.0.0.1:57108
TestBindingRuleUpdateCommand - 2019/12/06 06:05:01.333401 [DEBUG] http: Request GET /v1/acl/binding-rule/815e9fee-1d73-cc41-5692-f0db9c188e2d (1.337031ms) from=127.0.0.1:57126
TestBindingRuleUpdateCommand_noMerge - 2019/12/06 06:05:02.324550 [WARN] consul: error getting server health from "Node bdd0d84e-8270-1e56-c9a0-9c500a6262d7": context deadline exceeded
TestBindingRuleUpdateCommand_noMerge - 2019/12/06 06:05:02.538525 [DEBUG] http: Request DELETE /v1/acl/binding-rule/d550c4fe-dd1d-4de7-db89-de516df40153 (1.208846674s) from=127.0.0.1:48668
TestBindingRuleUpdateCommand - 2019/12/06 06:05:02.538999 [DEBUG] http: Request PUT /v1/acl/binding-rule/815e9fee-1d73-cc41-5692-f0db9c188e2d (1.200454146s) from=127.0.0.1:57126
TestBindingRuleUpdateCommand - 2019/12/06 06:05:02.545444 [DEBUG] http: Request GET /v1/acl/binding-rule/815e9fee-1d73-cc41-5692-f0db9c188e2d (1.4457ms) from=127.0.0.1:57108
=== RUN   TestBindingRuleUpdateCommand/update_all_fields_but_must_exist
TestBindingRuleUpdateCommand_noMerge - 2019/12/06 06:05:02.722135 [DEBUG] http: Request DELETE /v1/acl/binding-rule/f4275170-cf5f-e06d-3680-abacf13666fd (179.8435ms) from=127.0.0.1:48672
TestBindingRuleUpdateCommand - 2019/12/06 06:05:02.723180 [DEBUG] http: Request PUT /v1/acl/binding-rule (172.864338ms) from=127.0.0.1:57108
TestBindingRuleUpdateCommand - 2019/12/06 06:05:02.737716 [DEBUG] http: Request GET /v1/acl/binding-rule/9e4bed42-849e-d003-a62a-ed6604c21d2a (4.106096ms) from=127.0.0.1:57132
TestBindingRuleUpdateCommand_noMerge - 2019/12/06 06:05:02.906839 [DEBUG] http: Request PUT /v1/acl/binding-rule (177.84612ms) from=127.0.0.1:48674
TestBindingRuleUpdateCommand_noMerge - 2019/12/06 06:05:02.919547 [DEBUG] http: Request GET /v1/acl/binding-rules (1.612037ms) from=127.0.0.1:48678
TestBindingRuleUpdateCommand_noMerge - 2019/12/06 06:05:02.925094 [DEBUG] http: Request GET /v1/acl/binding-rule/a668dca3-c573-9766-078a-140001cb080e (1.344365ms) from=127.0.0.1:48678
TestBindingRuleUpdateCommand - 2019/12/06 06:05:03.021513 [DEBUG] http: Request PUT /v1/acl/binding-rule/9e4bed42-849e-d003-a62a-ed6604c21d2a (278.954463ms) from=127.0.0.1:57132
TestBindingRuleUpdateCommand - 2019/12/06 06:05:03.031516 [DEBUG] http: Request GET /v1/acl/binding-rule/9e4bed42-849e-d003-a62a-ed6604c21d2a (2.374055ms) from=127.0.0.1:57108
=== RUN   TestBindingRuleUpdateCommand/update_all_fields_but_selector
TestBindingRuleUpdateCommand_noMerge - 2019/12/06 06:05:03.213967 [DEBUG] http: Request PUT /v1/acl/binding-rule/a668dca3-c573-9766-078a-140001cb080e (280.381829ms) from=127.0.0.1:48678
TestBindingRuleUpdateCommand_noMerge - 2019/12/06 06:05:03.221741 [DEBUG] http: Request GET /v1/acl/binding-rule/a668dca3-c573-9766-078a-140001cb080e (1.224028ms) from=127.0.0.1:48674
=== RUN   TestBindingRuleUpdateCommand_noMerge/update_all_fields_but_description
TestBindingRuleUpdateCommand - 2019/12/06 06:05:03.398875 [DEBUG] http: Request PUT /v1/acl/binding-rule (361.785382ms) from=127.0.0.1:57108
TestBindingRuleUpdateCommand - 2019/12/06 06:05:03.407545 [DEBUG] http: Request GET /v1/acl/binding-rule/d9ae1a6e-1805-52c5-d55b-267d679e5a76 (1.414366ms) from=127.0.0.1:57136
TestBindingRuleUpdateCommand_noMerge - 2019/12/06 06:05:03.546251 [DEBUG] http: Request PUT /v1/acl/binding-rule (318.210039ms) from=127.0.0.1:48674
TestBindingRuleUpdateCommand_noMerge - 2019/12/06 06:05:03.555554 [DEBUG] http: Request GET /v1/acl/binding-rule/9c903f6a-1399-7f3e-0910-41d9fe0976d0 (1.446367ms) from=127.0.0.1:48682
TestBindingRuleUpdateCommand - 2019/12/06 06:05:03.788227 [DEBUG] http: Request PUT /v1/acl/binding-rule/d9ae1a6e-1805-52c5-d55b-267d679e5a76 (377.850755ms) from=127.0.0.1:57136
TestBindingRuleUpdateCommand - 2019/12/06 06:05:03.793770 [DEBUG] http: Request GET /v1/acl/binding-rule/d9ae1a6e-1805-52c5-d55b-267d679e5a76 (2.16105ms) from=127.0.0.1:57108
=== RUN   TestBindingRuleUpdateCommand/update_all_fields_clear_selector
TestBindingRuleUpdateCommand_noMerge - 2019/12/06 06:05:03.905515 [DEBUG] http: Request PUT /v1/acl/binding-rule/9c903f6a-1399-7f3e-0910-41d9fe0976d0 (346.853036ms) from=127.0.0.1:48682
TestBindingRuleUpdateCommand_noMerge - 2019/12/06 06:05:03.909390 [DEBUG] http: Request GET /v1/acl/binding-rule/9c903f6a-1399-7f3e-0910-41d9fe0976d0 (996.356µs) from=127.0.0.1:48674
=== RUN   TestBindingRuleUpdateCommand_noMerge/missing_bind_name
TestBindingRuleUpdateCommand - 2019/12/06 06:05:04.046296 [DEBUG] http: Request PUT /v1/acl/binding-rule (249.428446ms) from=127.0.0.1:57108
TestBindingRuleUpdateCommand - 2019/12/06 06:05:04.058868 [DEBUG] http: Request GET /v1/acl/binding-rule/850065ca-c941-3155-4042-3f9c729be47a (1.04369ms) from=127.0.0.1:57140
TestBindingRuleUpdateCommand - 2019/12/06 06:05:04.379636 [DEBUG] http: Request PUT /v1/acl/binding-rule/850065ca-c941-3155-4042-3f9c729be47a (317.444688ms) from=127.0.0.1:57140
TestBindingRuleUpdateCommand_noMerge - 2019/12/06 06:05:04.381501 [DEBUG] http: Request PUT /v1/acl/binding-rule (468.533189ms) from=127.0.0.1:48674
TestBindingRuleUpdateCommand - 2019/12/06 06:05:04.393246 [DEBUG] http: Request GET /v1/acl/binding-rule/850065ca-c941-3155-4042-3f9c729be47a (7.367837ms) from=127.0.0.1:57108
TestBindingRuleUpdateCommand - 2019/12/06 06:05:04.395759 [INFO] agent: Requesting shutdown
TestBindingRuleUpdateCommand - 2019/12/06 06:05:04.395877 [INFO] consul: shutting down server
TestBindingRuleUpdateCommand - 2019/12/06 06:05:04.395935 [WARN] serf: Shutdown without a Leave
TestBindingRuleUpdateCommand_noMerge - 2019/12/06 06:05:04.397908 [DEBUG] http: Request GET /v1/acl/binding-rule/a9d32311-b2e0-b157-3b00-1f1b76b74bc9 (1.147027ms) from=127.0.0.1:48686
=== RUN   TestBindingRuleUpdateCommand_noMerge/update_all_fields_but_selector
TestBindingRuleUpdateCommand - 2019/12/06 06:05:04.604087 [WARN] serf: Shutdown without a Leave
TestBindingRuleUpdateCommand - 2019/12/06 06:05:04.736362 [INFO] manager: shutting down
TestBindingRuleUpdateCommand - 2019/12/06 06:05:04.737268 [INFO] agent: consul server down
TestBindingRuleUpdateCommand - 2019/12/06 06:05:04.737348 [INFO] agent: shutdown complete
TestBindingRuleUpdateCommand - 2019/12/06 06:05:04.737407 [INFO] agent: Stopping DNS server 127.0.0.1:26507 (tcp)
TestBindingRuleUpdateCommand - 2019/12/06 06:05:04.737604 [INFO] agent: Stopping DNS server 127.0.0.1:26507 (udp)
TestBindingRuleUpdateCommand - 2019/12/06 06:05:04.737808 [INFO] agent: Stopping HTTP server 127.0.0.1:26508 (tcp)
TestBindingRuleUpdateCommand_noMerge - 2019/12/06 06:05:04.739714 [DEBUG] http: Request PUT /v1/acl/binding-rule (338.167835ms) from=127.0.0.1:48674
TestBindingRuleUpdateCommand - 2019/12/06 06:05:04.745486 [INFO] agent: Waiting for endpoints to shut down
TestBindingRuleUpdateCommand - 2019/12/06 06:05:04.745958 [INFO] agent: Endpoints down
--- PASS: TestBindingRuleUpdateCommand (12.95s)
    --- PASS: TestBindingRuleUpdateCommand/rule_id_required (0.00s)
    --- PASS: TestBindingRuleUpdateCommand/rule_id_partial_matches_nothing (0.03s)
    --- PASS: TestBindingRuleUpdateCommand/rule_id_exact_match_doesn't_exist (0.01s)
    --- PASS: TestBindingRuleUpdateCommand/rule_id_partial_matches_multiple (1.10s)
    --- PASS: TestBindingRuleUpdateCommand/must_use_roughly_valid_selector (0.29s)
    --- PASS: TestBindingRuleUpdateCommand/update_all_fields (0.42s)
    --- PASS: TestBindingRuleUpdateCommand/update_all_fields_-_partial (1.96s)
    --- PASS: TestBindingRuleUpdateCommand/update_all_fields_but_description (0.63s)
    --- PASS: TestBindingRuleUpdateCommand/update_all_fields_but_bind_name (1.57s)
    --- PASS: TestBindingRuleUpdateCommand/update_all_fields_but_must_exist (0.49s)
    --- PASS: TestBindingRuleUpdateCommand/update_all_fields_but_selector (0.76s)
    --- PASS: TestBindingRuleUpdateCommand/update_all_fields_clear_selector (0.60s)
TestBindingRuleUpdateCommand_noMerge - 2019/12/06 06:05:04.751834 [DEBUG] http: Request GET /v1/acl/binding-rule/71ecd04f-374b-1155-fe4c-f3bf32795a1e (3.305744ms) from=127.0.0.1:48688
TestBindingRuleUpdateCommand_noMerge - 2019/12/06 06:05:05.088024 [DEBUG] http: Request PUT /v1/acl/binding-rule/71ecd04f-374b-1155-fe4c-f3bf32795a1e (333.033049ms) from=127.0.0.1:48688
TestBindingRuleUpdateCommand_noMerge - 2019/12/06 06:05:05.092738 [DEBUG] http: Request GET /v1/acl/binding-rule/71ecd04f-374b-1155-fe4c-f3bf32795a1e (1.051358ms) from=127.0.0.1:48674
TestBindingRuleUpdateCommand_noMerge - 2019/12/06 06:05:05.094764 [INFO] agent: Requesting shutdown
TestBindingRuleUpdateCommand_noMerge - 2019/12/06 06:05:05.094864 [INFO] consul: shutting down server
TestBindingRuleUpdateCommand_noMerge - 2019/12/06 06:05:05.094918 [WARN] serf: Shutdown without a Leave
TestBindingRuleUpdateCommand_noMerge - 2019/12/06 06:05:05.202949 [WARN] serf: Shutdown without a Leave
TestBindingRuleUpdateCommand_noMerge - 2019/12/06 06:05:05.329393 [INFO] manager: shutting down
TestBindingRuleUpdateCommand_noMerge - 2019/12/06 06:05:05.330174 [INFO] agent: consul server down
TestBindingRuleUpdateCommand_noMerge - 2019/12/06 06:05:05.330246 [INFO] agent: shutdown complete
TestBindingRuleUpdateCommand_noMerge - 2019/12/06 06:05:05.330322 [INFO] agent: Stopping DNS server 127.0.0.1:26501 (tcp)
TestBindingRuleUpdateCommand_noMerge - 2019/12/06 06:05:05.330507 [INFO] agent: Stopping DNS server 127.0.0.1:26501 (udp)
TestBindingRuleUpdateCommand_noMerge - 2019/12/06 06:05:05.330699 [INFO] agent: Stopping HTTP server 127.0.0.1:26502 (tcp)
TestBindingRuleUpdateCommand_noMerge - 2019/12/06 06:05:05.334830 [INFO] agent: Waiting for endpoints to shut down
TestBindingRuleUpdateCommand_noMerge - 2019/12/06 06:05:05.334941 [INFO] agent: Endpoints down
--- PASS: TestBindingRuleUpdateCommand_noMerge (13.54s)
    --- PASS: TestBindingRuleUpdateCommand_noMerge/rule_id_required (0.03s)
    --- PASS: TestBindingRuleUpdateCommand_noMerge/rule_id_partial_matches_nothing (0.02s)
    --- PASS: TestBindingRuleUpdateCommand_noMerge/rule_id_exact_match_doesn't_exist (0.01s)
    --- PASS: TestBindingRuleUpdateCommand_noMerge/rule_id_partial_matches_multiple (1.93s)
    --- PASS: TestBindingRuleUpdateCommand_noMerge/must_use_roughly_valid_selector (0.27s)
    --- PASS: TestBindingRuleUpdateCommand_noMerge/update_all_fields (0.49s)
    --- PASS: TestBindingRuleUpdateCommand_noMerge/update_all_fields_-_partial (3.81s)
    --- PASS: TestBindingRuleUpdateCommand_noMerge/update_all_fields_but_description (0.68s)
    --- PASS: TestBindingRuleUpdateCommand_noMerge/missing_bind_name (0.49s)
    --- PASS: TestBindingRuleUpdateCommand_noMerge/update_all_fields_but_selector (0.69s)
PASS
ok  	github.com/hashicorp/consul/command/acl/bindingrule/update	13.864s
=== RUN   TestBootstrapCommand_noTabs
=== PAUSE TestBootstrapCommand_noTabs
=== RUN   TestBootstrapCommand
=== PAUSE TestBootstrapCommand
=== CONT  TestBootstrapCommand_noTabs
=== CONT  TestBootstrapCommand
--- PASS: TestBootstrapCommand_noTabs (0.00s)
WARNING: bootstrap = true: do not enable unless necessary
TestBootstrapCommand - 2019/12/06 06:06:15.505719 [WARN] agent: Node name "Node 56b6f251-7a2e-04f1-6c72-ed4291a1f569" will not be discoverable via DNS due to invalid characters. Valid characters include all alpha-numerics and dashes.
TestBootstrapCommand - 2019/12/06 06:06:16.240577 [DEBUG] tlsutil: Update with version 1
TestBootstrapCommand - 2019/12/06 06:06:16.246685 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
2019/12/06 06:06:19 [INFO]  raft: Initial configuration (index=1): [{Suffrage:Voter ID:56b6f251-7a2e-04f1-6c72-ed4291a1f569 Address:127.0.0.1:17506}]
TestBootstrapCommand - 2019/12/06 06:06:19.394742 [INFO] serf: EventMemberJoin: Node 56b6f251-7a2e-04f1-6c72-ed4291a1f569.dc1 127.0.0.1
2019/12/06 06:06:19 [INFO]  raft: Node at 127.0.0.1:17506 [Follower] entering Follower state (Leader: "")
TestBootstrapCommand - 2019/12/06 06:06:19.401545 [INFO] serf: EventMemberJoin: Node 56b6f251-7a2e-04f1-6c72-ed4291a1f569 127.0.0.1
TestBootstrapCommand - 2019/12/06 06:06:19.403869 [INFO] consul: Adding LAN server Node 56b6f251-7a2e-04f1-6c72-ed4291a1f569 (Addr: tcp/127.0.0.1:17506) (DC: dc1)
TestBootstrapCommand - 2019/12/06 06:06:19.404147 [INFO] consul: Handled member-join event for server "Node 56b6f251-7a2e-04f1-6c72-ed4291a1f569.dc1" in area "wan"
TestBootstrapCommand - 2019/12/06 06:06:19.404390 [INFO] agent: Started DNS server 127.0.0.1:17501 (tcp)
TestBootstrapCommand - 2019/12/06 06:06:19.404930 [INFO] agent: Started DNS server 127.0.0.1:17501 (udp)
TestBootstrapCommand - 2019/12/06 06:06:19.407529 [INFO] agent: Started HTTP server on 127.0.0.1:17502 (tcp)
TestBootstrapCommand - 2019/12/06 06:06:19.407682 [INFO] agent: started state syncer
2019/12/06 06:06:19 [WARN]  raft: Heartbeat timeout from "" reached, starting election
2019/12/06 06:06:19 [INFO]  raft: Node at 127.0.0.1:17506 [Candidate] entering Candidate state in term 2
2019/12/06 06:06:20 [INFO]  raft: Election won. Tally: 1
2019/12/06 06:06:20 [INFO]  raft: Node at 127.0.0.1:17506 [Leader] entering Leader state
TestBootstrapCommand - 2019/12/06 06:06:20.795400 [INFO] consul: cluster leadership acquired
TestBootstrapCommand - 2019/12/06 06:06:20.795925 [INFO] consul: New leader elected: Node 56b6f251-7a2e-04f1-6c72-ed4291a1f569
TestBootstrapCommand - 2019/12/06 06:06:20.954706 [INFO] acl: initializing acls
TestBootstrapCommand - 2019/12/06 06:06:20.979656 [ERR] agent: failed to sync remote state: ACL not found
TestBootstrapCommand - 2019/12/06 06:06:21.049825 [ERR] agent: failed to sync remote state: ACL not found
TestBootstrapCommand - 2019/12/06 06:06:21.746913 [INFO] consul: Created ACL 'global-management' policy
TestBootstrapCommand - 2019/12/06 06:06:21.752689 [INFO] acl: initializing acls
TestBootstrapCommand - 2019/12/06 06:06:22.890551 [INFO] consul: Created ACL anonymous token from configuration
TestBootstrapCommand - 2019/12/06 06:06:22.890831 [INFO] consul: Created ACL anonymous token from configuration
TestBootstrapCommand - 2019/12/06 06:06:22.890898 [DEBUG] acl: transitioning out of legacy ACL mode
TestBootstrapCommand - 2019/12/06 06:06:22.891417 [INFO] serf: EventMemberUpdate: Node 56b6f251-7a2e-04f1-6c72-ed4291a1f569
TestBootstrapCommand - 2019/12/06 06:06:22.892054 [INFO] serf: EventMemberUpdate: Node 56b6f251-7a2e-04f1-6c72-ed4291a1f569.dc1
TestBootstrapCommand - 2019/12/06 06:06:22.894867 [INFO] serf: EventMemberUpdate: Node 56b6f251-7a2e-04f1-6c72-ed4291a1f569
TestBootstrapCommand - 2019/12/06 06:06:22.895538 [INFO] serf: EventMemberUpdate: Node 56b6f251-7a2e-04f1-6c72-ed4291a1f569.dc1
TestBootstrapCommand - 2019/12/06 06:06:24.900302 [INFO] connect: initialized primary datacenter CA with provider "consul"
TestBootstrapCommand - 2019/12/06 06:06:24.900799 [DEBUG] consul: Skipping self join check for "Node 56b6f251-7a2e-04f1-6c72-ed4291a1f569" since the cluster is too small
TestBootstrapCommand - 2019/12/06 06:06:24.900900 [INFO] consul: member 'Node 56b6f251-7a2e-04f1-6c72-ed4291a1f569' joined, marking health alive
TestBootstrapCommand - 2019/12/06 06:06:25.208680 [DEBUG] consul: Skipping self join check for "Node 56b6f251-7a2e-04f1-6c72-ed4291a1f569" since the cluster is too small
TestBootstrapCommand - 2019/12/06 06:06:25.218060 [DEBUG] consul: Skipping self join check for "Node 56b6f251-7a2e-04f1-6c72-ed4291a1f569" since the cluster is too small
TestBootstrapCommand - 2019/12/06 06:06:25.225754 [WARN] acl.bootstrap: failed to remove bootstrap file: remove /tmp/TestBootstrapCommand-agent222654984/acl-bootstrap-reset: no such file or directory
TestBootstrapCommand - 2019/12/06 06:06:25.621543 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
TestBootstrapCommand - 2019/12/06 06:06:25.623492 [INFO] consul.acl: ACL bootstrap completed
TestBootstrapCommand - 2019/12/06 06:06:25.625960 [DEBUG] http: Request PUT /v1/acl/bootstrap (401.027293ms) from=127.0.0.1:43044
TestBootstrapCommand - 2019/12/06 06:06:25.635644 [INFO] agent: Requesting shutdown
TestBootstrapCommand - 2019/12/06 06:06:25.635745 [INFO] consul: shutting down server
TestBootstrapCommand - 2019/12/06 06:06:25.635801 [WARN] serf: Shutdown without a Leave
TestBootstrapCommand - 2019/12/06 06:06:25.755960 [WARN] serf: Shutdown without a Leave
TestBootstrapCommand - 2019/12/06 06:06:25.888743 [INFO] manager: shutting down
TestBootstrapCommand - 2019/12/06 06:06:25.889678 [INFO] agent: consul server down
TestBootstrapCommand - 2019/12/06 06:06:25.889750 [INFO] agent: shutdown complete
TestBootstrapCommand - 2019/12/06 06:06:25.889828 [INFO] agent: Stopping DNS server 127.0.0.1:17501 (tcp)
TestBootstrapCommand - 2019/12/06 06:06:25.890003 [INFO] agent: Stopping DNS server 127.0.0.1:17501 (udp)
TestBootstrapCommand - 2019/12/06 06:06:25.890189 [INFO] agent: Stopping HTTP server 127.0.0.1:17502 (tcp)
TestBootstrapCommand - 2019/12/06 06:06:25.890730 [INFO] agent: Waiting for endpoints to shut down
TestBootstrapCommand - 2019/12/06 06:06:25.890839 [INFO] agent: Endpoints down
--- PASS: TestBootstrapCommand (10.68s)
PASS
ok  	github.com/hashicorp/consul/command/acl/bootstrap	11.531s
?   	github.com/hashicorp/consul/command/acl/policy	[no test files]
=== RUN   TestPolicyCreateCommand_noTabs
=== PAUSE TestPolicyCreateCommand_noTabs
=== RUN   TestPolicyCreateCommand
=== PAUSE TestPolicyCreateCommand
=== CONT  TestPolicyCreateCommand_noTabs
=== CONT  TestPolicyCreateCommand
--- PASS: TestPolicyCreateCommand_noTabs (0.00s)
WARNING: bootstrap = true: do not enable unless necessary
TestPolicyCreateCommand - 2019/12/06 06:06:21.853884 [WARN] agent: Node name "Node ad083ae6-ca6c-57ab-b6b0-cb70a39828dc" will not be discoverable via DNS due to invalid characters. Valid characters include all alpha-numerics and dashes.
TestPolicyCreateCommand - 2019/12/06 06:06:21.854703 [DEBUG] tlsutil: Update with version 1
TestPolicyCreateCommand - 2019/12/06 06:06:21.864451 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
2019/12/06 06:06:23 [INFO]  raft: Initial configuration (index=1): [{Suffrage:Voter ID:ad083ae6-ca6c-57ab-b6b0-cb70a39828dc Address:127.0.0.1:20506}]
2019/12/06 06:06:23 [INFO]  raft: Node at 127.0.0.1:20506 [Follower] entering Follower state (Leader: "")
TestPolicyCreateCommand - 2019/12/06 06:06:23.693207 [INFO] serf: EventMemberJoin: Node ad083ae6-ca6c-57ab-b6b0-cb70a39828dc.dc1 127.0.0.1
TestPolicyCreateCommand - 2019/12/06 06:06:23.696539 [INFO] serf: EventMemberJoin: Node ad083ae6-ca6c-57ab-b6b0-cb70a39828dc 127.0.0.1
TestPolicyCreateCommand - 2019/12/06 06:06:23.697353 [INFO] consul: Adding LAN server Node ad083ae6-ca6c-57ab-b6b0-cb70a39828dc (Addr: tcp/127.0.0.1:20506) (DC: dc1)
TestPolicyCreateCommand - 2019/12/06 06:06:23.697627 [INFO] consul: Handled member-join event for server "Node ad083ae6-ca6c-57ab-b6b0-cb70a39828dc.dc1" in area "wan"
TestPolicyCreateCommand - 2019/12/06 06:06:23.698113 [INFO] agent: Started DNS server 127.0.0.1:20501 (tcp)
TestPolicyCreateCommand - 2019/12/06 06:06:23.698178 [INFO] agent: Started DNS server 127.0.0.1:20501 (udp)
TestPolicyCreateCommand - 2019/12/06 06:06:23.701371 [INFO] agent: Started HTTP server on 127.0.0.1:20502 (tcp)
TestPolicyCreateCommand - 2019/12/06 06:06:23.701562 [INFO] agent: started state syncer
2019/12/06 06:06:23 [WARN]  raft: Heartbeat timeout from "" reached, starting election
2019/12/06 06:06:23 [INFO]  raft: Node at 127.0.0.1:20506 [Candidate] entering Candidate state in term 2
2019/12/06 06:06:25 [INFO]  raft: Election won. Tally: 1
2019/12/06 06:06:25 [INFO]  raft: Node at 127.0.0.1:20506 [Leader] entering Leader state
TestPolicyCreateCommand - 2019/12/06 06:06:25.021501 [INFO] consul: cluster leadership acquired
TestPolicyCreateCommand - 2019/12/06 06:06:25.022050 [INFO] consul: New leader elected: Node ad083ae6-ca6c-57ab-b6b0-cb70a39828dc
TestPolicyCreateCommand - 2019/12/06 06:06:25.161193 [ERR] agent: failed to sync remote state: ACL not found
TestPolicyCreateCommand - 2019/12/06 06:06:25.249330 [INFO] acl: initializing acls
TestPolicyCreateCommand - 2019/12/06 06:06:25.755809 [INFO] acl: initializing acls
TestPolicyCreateCommand - 2019/12/06 06:06:25.757416 [INFO] consul: Created ACL 'global-management' policy
TestPolicyCreateCommand - 2019/12/06 06:06:25.757501 [WARN] consul: Configuring a non-UUID master token is deprecated
TestPolicyCreateCommand - 2019/12/06 06:06:26.689507 [INFO] consul: Bootstrapped ACL master token from configuration
TestPolicyCreateCommand - 2019/12/06 06:06:26.690387 [INFO] consul: Created ACL 'global-management' policy
TestPolicyCreateCommand - 2019/12/06 06:06:26.690546 [WARN] consul: Configuring a non-UUID master token is deprecated
TestPolicyCreateCommand - 2019/12/06 06:06:26.900287 [ERR] agent: failed to sync remote state: ACL not found
TestPolicyCreateCommand - 2019/12/06 06:06:27.790034 [INFO] consul: Created ACL anonymous token from configuration
TestPolicyCreateCommand - 2019/12/06 06:06:27.790177 [DEBUG] acl: transitioning out of legacy ACL mode
TestPolicyCreateCommand - 2019/12/06 06:06:27.790060 [INFO] consul: Created ACL anonymous token from configuration
TestPolicyCreateCommand - 2019/12/06 06:06:27.791045 [INFO] serf: EventMemberUpdate: Node ad083ae6-ca6c-57ab-b6b0-cb70a39828dc
TestPolicyCreateCommand - 2019/12/06 06:06:27.791637 [INFO] serf: EventMemberUpdate: Node ad083ae6-ca6c-57ab-b6b0-cb70a39828dc.dc1
TestPolicyCreateCommand - 2019/12/06 06:06:27.791957 [INFO] serf: EventMemberUpdate: Node ad083ae6-ca6c-57ab-b6b0-cb70a39828dc
TestPolicyCreateCommand - 2019/12/06 06:06:27.794992 [INFO] serf: EventMemberUpdate: Node ad083ae6-ca6c-57ab-b6b0-cb70a39828dc.dc1
TestPolicyCreateCommand - 2019/12/06 06:06:30.180102 [INFO] connect: initialized primary datacenter CA with provider "consul"
TestPolicyCreateCommand - 2019/12/06 06:06:30.180652 [DEBUG] consul: Skipping self join check for "Node ad083ae6-ca6c-57ab-b6b0-cb70a39828dc" since the cluster is too small
TestPolicyCreateCommand - 2019/12/06 06:06:30.180773 [INFO] consul: member 'Node ad083ae6-ca6c-57ab-b6b0-cb70a39828dc' joined, marking health alive
TestPolicyCreateCommand - 2019/12/06 06:06:30.596144 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
TestPolicyCreateCommand - 2019/12/06 06:06:30.599698 [DEBUG] consul: Skipping self join check for "Node ad083ae6-ca6c-57ab-b6b0-cb70a39828dc" since the cluster is too small
TestPolicyCreateCommand - 2019/12/06 06:06:30.600313 [DEBUG] consul: Skipping self join check for "Node ad083ae6-ca6c-57ab-b6b0-cb70a39828dc" since the cluster is too small
TestPolicyCreateCommand - 2019/12/06 06:06:31.139898 [DEBUG] http: Request PUT /v1/acl/policy (533.97504ms) from=127.0.0.1:48086
TestPolicyCreateCommand - 2019/12/06 06:06:31.146486 [INFO] agent: Requesting shutdown
TestPolicyCreateCommand - 2019/12/06 06:06:31.146589 [INFO] consul: shutting down server
TestPolicyCreateCommand - 2019/12/06 06:06:31.146678 [WARN] serf: Shutdown without a Leave
TestPolicyCreateCommand - 2019/12/06 06:06:31.320876 [WARN] serf: Shutdown without a Leave
TestPolicyCreateCommand - 2019/12/06 06:06:31.479254 [INFO] manager: shutting down
TestPolicyCreateCommand - 2019/12/06 06:06:31.481308 [INFO] agent: consul server down
TestPolicyCreateCommand - 2019/12/06 06:06:31.481377 [INFO] agent: shutdown complete
TestPolicyCreateCommand - 2019/12/06 06:06:31.481450 [INFO] agent: Stopping DNS server 127.0.0.1:20501 (tcp)
TestPolicyCreateCommand - 2019/12/06 06:06:31.481775 [INFO] agent: Stopping DNS server 127.0.0.1:20501 (udp)
TestPolicyCreateCommand - 2019/12/06 06:06:31.482113 [INFO] agent: Stopping HTTP server 127.0.0.1:20502 (tcp)
TestPolicyCreateCommand - 2019/12/06 06:06:31.482843 [INFO] agent: Waiting for endpoints to shut down
TestPolicyCreateCommand - 2019/12/06 06:06:31.482931 [INFO] agent: Endpoints down
--- PASS: TestPolicyCreateCommand (9.70s)
PASS
ok  	github.com/hashicorp/consul/command/acl/policy/create	9.965s
=== RUN   TestPolicyDeleteCommand_noTabs
=== PAUSE TestPolicyDeleteCommand_noTabs
=== RUN   TestPolicyDeleteCommand
=== PAUSE TestPolicyDeleteCommand
=== CONT  TestPolicyDeleteCommand_noTabs
=== CONT  TestPolicyDeleteCommand
--- PASS: TestPolicyDeleteCommand_noTabs (0.00s)
WARNING: bootstrap = true: do not enable unless necessary
TestPolicyDeleteCommand - 2019/12/06 06:06:25.348537 [WARN] agent: Node name "Node cd28d93f-e18e-8408-1e96-7b16d860f0c8" will not be discoverable via DNS due to invalid characters. Valid characters include all alpha-numerics and dashes.
TestPolicyDeleteCommand - 2019/12/06 06:06:25.349315 [DEBUG] tlsutil: Update with version 1
TestPolicyDeleteCommand - 2019/12/06 06:06:25.370231 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
2019/12/06 06:06:27 [INFO]  raft: Initial configuration (index=1): [{Suffrage:Voter ID:cd28d93f-e18e-8408-1e96-7b16d860f0c8 Address:127.0.0.1:31006}]
2019/12/06 06:06:27 [INFO]  raft: Node at 127.0.0.1:31006 [Follower] entering Follower state (Leader: "")
TestPolicyDeleteCommand - 2019/12/06 06:06:27.208736 [INFO] serf: EventMemberJoin: Node cd28d93f-e18e-8408-1e96-7b16d860f0c8.dc1 127.0.0.1
TestPolicyDeleteCommand - 2019/12/06 06:06:27.212231 [INFO] serf: EventMemberJoin: Node cd28d93f-e18e-8408-1e96-7b16d860f0c8 127.0.0.1
TestPolicyDeleteCommand - 2019/12/06 06:06:27.213189 [INFO] consul: Adding LAN server Node cd28d93f-e18e-8408-1e96-7b16d860f0c8 (Addr: tcp/127.0.0.1:31006) (DC: dc1)
TestPolicyDeleteCommand - 2019/12/06 06:06:27.213637 [INFO] consul: Handled member-join event for server "Node cd28d93f-e18e-8408-1e96-7b16d860f0c8.dc1" in area "wan"
TestPolicyDeleteCommand - 2019/12/06 06:06:27.214362 [INFO] agent: Started DNS server 127.0.0.1:31001 (udp)
TestPolicyDeleteCommand - 2019/12/06 06:06:27.214463 [INFO] agent: Started DNS server 127.0.0.1:31001 (tcp)
TestPolicyDeleteCommand - 2019/12/06 06:06:27.217037 [INFO] agent: Started HTTP server on 127.0.0.1:31002 (tcp)
TestPolicyDeleteCommand - 2019/12/06 06:06:27.217158 [INFO] agent: started state syncer
2019/12/06 06:06:27 [WARN]  raft: Heartbeat timeout from "" reached, starting election
2019/12/06 06:06:27 [INFO]  raft: Node at 127.0.0.1:31006 [Candidate] entering Candidate state in term 2
2019/12/06 06:06:28 [INFO]  raft: Election won. Tally: 1
2019/12/06 06:06:28 [INFO]  raft: Node at 127.0.0.1:31006 [Leader] entering Leader state
TestPolicyDeleteCommand - 2019/12/06 06:06:28.912980 [INFO] consul: cluster leadership acquired
TestPolicyDeleteCommand - 2019/12/06 06:06:28.913537 [INFO] consul: New leader elected: Node cd28d93f-e18e-8408-1e96-7b16d860f0c8
TestPolicyDeleteCommand - 2019/12/06 06:06:29.144839 [ERR] agent: failed to sync remote state: ACL not found
TestPolicyDeleteCommand - 2019/12/06 06:06:29.842183 [INFO] acl: initializing acls
TestPolicyDeleteCommand - 2019/12/06 06:06:30.321706 [INFO] consul: Created ACL 'global-management' policy
TestPolicyDeleteCommand - 2019/12/06 06:06:30.321780 [WARN] consul: Configuring a non-UUID master token is deprecated
TestPolicyDeleteCommand - 2019/12/06 06:06:30.365600 [INFO] acl: initializing acls
TestPolicyDeleteCommand - 2019/12/06 06:06:30.365733 [WARN] consul: Configuring a non-UUID master token is deprecated
TestPolicyDeleteCommand - 2019/12/06 06:06:30.692325 [INFO] consul: Bootstrapped ACL master token from configuration
TestPolicyDeleteCommand - 2019/12/06 06:06:31.336760 [ERR] agent: failed to sync remote state: ACL not found
TestPolicyDeleteCommand - 2019/12/06 06:06:31.404714 [INFO] consul: Bootstrapped ACL master token from configuration
TestPolicyDeleteCommand - 2019/12/06 06:06:31.832181 [INFO] consul: Created ACL anonymous token from configuration
TestPolicyDeleteCommand - 2019/12/06 06:06:31.832886 [INFO] consul: Created ACL anonymous token from configuration
TestPolicyDeleteCommand - 2019/12/06 06:06:31.832947 [DEBUG] acl: transitioning out of legacy ACL mode
TestPolicyDeleteCommand - 2019/12/06 06:06:31.833088 [INFO] serf: EventMemberUpdate: Node cd28d93f-e18e-8408-1e96-7b16d860f0c8
TestPolicyDeleteCommand - 2019/12/06 06:06:31.833730 [INFO] serf: EventMemberUpdate: Node cd28d93f-e18e-8408-1e96-7b16d860f0c8.dc1
TestPolicyDeleteCommand - 2019/12/06 06:06:31.835765 [INFO] serf: EventMemberUpdate: Node cd28d93f-e18e-8408-1e96-7b16d860f0c8
TestPolicyDeleteCommand - 2019/12/06 06:06:31.836604 [INFO] serf: EventMemberUpdate: Node cd28d93f-e18e-8408-1e96-7b16d860f0c8.dc1
TestPolicyDeleteCommand - 2019/12/06 06:06:32.954976 [INFO] connect: initialized primary datacenter CA with provider "consul"
TestPolicyDeleteCommand - 2019/12/06 06:06:32.955721 [DEBUG] consul: Skipping self join check for "Node cd28d93f-e18e-8408-1e96-7b16d860f0c8" since the cluster is too small
TestPolicyDeleteCommand - 2019/12/06 06:06:32.955928 [INFO] consul: member 'Node cd28d93f-e18e-8408-1e96-7b16d860f0c8' joined, marking health alive
TestPolicyDeleteCommand - 2019/12/06 06:06:33.173301 [DEBUG] consul: Skipping self join check for "Node cd28d93f-e18e-8408-1e96-7b16d860f0c8" since the cluster is too small
TestPolicyDeleteCommand - 2019/12/06 06:06:33.174053 [DEBUG] consul: Skipping self join check for "Node cd28d93f-e18e-8408-1e96-7b16d860f0c8" since the cluster is too small
TestPolicyDeleteCommand - 2019/12/06 06:06:33.381234 [DEBUG] http: Request PUT /v1/acl/policy (199.762296ms) from=127.0.0.1:60218
TestPolicyDeleteCommand - 2019/12/06 06:06:33.694637 [DEBUG] http: Request DELETE /v1/acl/policy/94d989e3-7ff5-439f-34bc-472abc490e11 (308.146474ms) from=127.0.0.1:60220
TestPolicyDeleteCommand - 2019/12/06 06:06:33.697914 [ERR] http: Request GET /v1/acl/policy/94d989e3-7ff5-439f-34bc-472abc490e11, error: ACL not found from=127.0.0.1:60218
TestPolicyDeleteCommand - 2019/12/06 06:06:33.699873 [DEBUG] http: Request GET /v1/acl/policy/94d989e3-7ff5-439f-34bc-472abc490e11 (2.418722ms) from=127.0.0.1:60218
TestPolicyDeleteCommand - 2019/12/06 06:06:33.701753 [INFO] agent: Requesting shutdown
TestPolicyDeleteCommand - 2019/12/06 06:06:33.701838 [INFO] consul: shutting down server
TestPolicyDeleteCommand - 2019/12/06 06:06:33.701886 [WARN] serf: Shutdown without a Leave
TestPolicyDeleteCommand - 2019/12/06 06:06:33.787923 [WARN] serf: Shutdown without a Leave
TestPolicyDeleteCommand - 2019/12/06 06:06:33.862810 [INFO] manager: shutting down
TestPolicyDeleteCommand - 2019/12/06 06:06:33.863175 [INFO] agent: consul server down
TestPolicyDeleteCommand - 2019/12/06 06:06:33.863230 [INFO] agent: shutdown complete
TestPolicyDeleteCommand - 2019/12/06 06:06:33.863287 [INFO] agent: Stopping DNS server 127.0.0.1:31001 (tcp)
TestPolicyDeleteCommand - 2019/12/06 06:06:33.863440 [INFO] agent: Stopping DNS server 127.0.0.1:31001 (udp)
TestPolicyDeleteCommand - 2019/12/06 06:06:33.863594 [INFO] agent: Stopping HTTP server 127.0.0.1:31002 (tcp)
TestPolicyDeleteCommand - 2019/12/06 06:06:33.864243 [INFO] agent: Waiting for endpoints to shut down
TestPolicyDeleteCommand - 2019/12/06 06:06:33.864363 [INFO] agent: Endpoints down
--- PASS: TestPolicyDeleteCommand (8.61s)
PASS
ok  	github.com/hashicorp/consul/command/acl/policy/delete	8.863s
=== RUN   TestPolicyListCommand_noTabs
=== PAUSE TestPolicyListCommand_noTabs
=== RUN   TestPolicyListCommand
=== PAUSE TestPolicyListCommand
=== CONT  TestPolicyListCommand_noTabs
=== CONT  TestPolicyListCommand
--- PASS: TestPolicyListCommand_noTabs (0.00s)
WARNING: bootstrap = true: do not enable unless necessary
TestPolicyListCommand - 2019/12/06 06:12:22.482269 [WARN] agent: Node name "Node 5835cd8b-547f-319e-f77b-af478e23c155" will not be discoverable via DNS due to invalid characters. Valid characters include all alpha-numerics and dashes.
TestPolicyListCommand - 2019/12/06 06:12:22.626048 [DEBUG] tlsutil: Update with version 1
TestPolicyListCommand - 2019/12/06 06:12:22.651502 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
2019/12/06 06:12:25 [INFO]  raft: Initial configuration (index=1): [{Suffrage:Voter ID:5835cd8b-547f-319e-f77b-af478e23c155 Address:127.0.0.1:22006}]
2019/12/06 06:12:25 [INFO]  raft: Node at 127.0.0.1:22006 [Follower] entering Follower state (Leader: "")
TestPolicyListCommand - 2019/12/06 06:12:25.124367 [INFO] serf: EventMemberJoin: Node 5835cd8b-547f-319e-f77b-af478e23c155.dc1 127.0.0.1
TestPolicyListCommand - 2019/12/06 06:12:25.128100 [INFO] serf: EventMemberJoin: Node 5835cd8b-547f-319e-f77b-af478e23c155 127.0.0.1
TestPolicyListCommand - 2019/12/06 06:12:25.129636 [INFO] consul: Handled member-join event for server "Node 5835cd8b-547f-319e-f77b-af478e23c155.dc1" in area "wan"
TestPolicyListCommand - 2019/12/06 06:12:25.130073 [INFO] consul: Adding LAN server Node 5835cd8b-547f-319e-f77b-af478e23c155 (Addr: tcp/127.0.0.1:22006) (DC: dc1)
TestPolicyListCommand - 2019/12/06 06:12:25.130596 [INFO] agent: Started DNS server 127.0.0.1:22001 (udp)
TestPolicyListCommand - 2019/12/06 06:12:25.130687 [INFO] agent: Started DNS server 127.0.0.1:22001 (tcp)
TestPolicyListCommand - 2019/12/06 06:12:25.133186 [INFO] agent: Started HTTP server on 127.0.0.1:22002 (tcp)
TestPolicyListCommand - 2019/12/06 06:12:25.133322 [INFO] agent: started state syncer
2019/12/06 06:12:25 [WARN]  raft: Heartbeat timeout from "" reached, starting election
2019/12/06 06:12:25 [INFO]  raft: Node at 127.0.0.1:22006 [Candidate] entering Candidate state in term 2
2019/12/06 06:12:26 [INFO]  raft: Election won. Tally: 1
2019/12/06 06:12:26 [INFO]  raft: Node at 127.0.0.1:22006 [Leader] entering Leader state
TestPolicyListCommand - 2019/12/06 06:12:26.018999 [INFO] consul: cluster leadership acquired
TestPolicyListCommand - 2019/12/06 06:12:26.019793 [INFO] consul: New leader elected: Node 5835cd8b-547f-319e-f77b-af478e23c155
TestPolicyListCommand - 2019/12/06 06:12:26.100926 [ERR] agent: failed to sync remote state: ACL not found
TestPolicyListCommand - 2019/12/06 06:12:26.385827 [INFO] acl: initializing acls
TestPolicyListCommand - 2019/12/06 06:12:26.681414 [INFO] acl: initializing acls
TestPolicyListCommand - 2019/12/06 06:12:26.852565 [INFO] consul: Created ACL 'global-management' policy
TestPolicyListCommand - 2019/12/06 06:12:26.852673 [WARN] consul: Configuring a non-UUID master token is deprecated
TestPolicyListCommand - 2019/12/06 06:12:26.853002 [INFO] consul: Created ACL 'global-management' policy
TestPolicyListCommand - 2019/12/06 06:12:26.853062 [WARN] consul: Configuring a non-UUID master token is deprecated
TestPolicyListCommand - 2019/12/06 06:12:27.123348 [INFO] consul: Bootstrapped ACL master token from configuration
TestPolicyListCommand - 2019/12/06 06:12:27.350121 [INFO] consul: Bootstrapped ACL master token from configuration
TestPolicyListCommand - 2019/12/06 06:12:28.045527 [INFO] consul: Created ACL anonymous token from configuration
TestPolicyListCommand - 2019/12/06 06:12:28.045645 [DEBUG] acl: transitioning out of legacy ACL mode
TestPolicyListCommand - 2019/12/06 06:12:28.046926 [INFO] serf: EventMemberUpdate: Node 5835cd8b-547f-319e-f77b-af478e23c155
TestPolicyListCommand - 2019/12/06 06:12:28.047048 [INFO] consul: Created ACL anonymous token from configuration
TestPolicyListCommand - 2019/12/06 06:12:28.048254 [INFO] serf: EventMemberUpdate: Node 5835cd8b-547f-319e-f77b-af478e23c155
TestPolicyListCommand - 2019/12/06 06:12:28.049363 [INFO] serf: EventMemberUpdate: Node 5835cd8b-547f-319e-f77b-af478e23c155.dc1
TestPolicyListCommand - 2019/12/06 06:12:28.050153 [INFO] serf: EventMemberUpdate: Node 5835cd8b-547f-319e-f77b-af478e23c155.dc1
TestPolicyListCommand - 2019/12/06 06:12:29.177556 [INFO] agent: Synced node info
TestPolicyListCommand - 2019/12/06 06:12:29.177700 [DEBUG] agent: Node info in sync
TestPolicyListCommand - 2019/12/06 06:12:30.145671 [DEBUG] http: Request PUT /v1/acl/policy (941.911176ms) from=127.0.0.1:43298
TestPolicyListCommand - 2019/12/06 06:12:30.511196 [INFO] connect: initialized primary datacenter CA with provider "consul"
TestPolicyListCommand - 2019/12/06 06:12:30.801878 [DEBUG] consul: Skipping self join check for "Node 5835cd8b-547f-319e-f77b-af478e23c155" since the cluster is too small
TestPolicyListCommand - 2019/12/06 06:12:30.802004 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
TestPolicyListCommand - 2019/12/06 06:12:30.802085 [INFO] consul: member 'Node 5835cd8b-547f-319e-f77b-af478e23c155' joined, marking health alive
TestPolicyListCommand - 2019/12/06 06:12:30.809652 [DEBUG] http: Request PUT /v1/acl/policy (660.093641ms) from=127.0.0.1:43298
TestPolicyListCommand - 2019/12/06 06:12:31.735182 [DEBUG] consul: Skipping self join check for "Node 5835cd8b-547f-319e-f77b-af478e23c155" since the cluster is too small
TestPolicyListCommand - 2019/12/06 06:12:31.735798 [DEBUG] consul: Skipping self join check for "Node 5835cd8b-547f-319e-f77b-af478e23c155" since the cluster is too small
TestPolicyListCommand - 2019/12/06 06:12:31.737531 [DEBUG] http: Request PUT /v1/acl/policy (919.418988ms) from=127.0.0.1:43298
TestPolicyListCommand - 2019/12/06 06:12:32.045251 [DEBUG] http: Request PUT /v1/acl/policy (304.637398ms) from=127.0.0.1:43298
TestPolicyListCommand - 2019/12/06 06:12:32.294422 [DEBUG] http: Request PUT /v1/acl/policy (243.593315ms) from=127.0.0.1:43298
TestPolicyListCommand - 2019/12/06 06:12:32.299457 [DEBUG] http: Request GET /v1/acl/policies (1.814709ms) from=127.0.0.1:43306
TestPolicyListCommand - 2019/12/06 06:12:32.302834 [INFO] agent: Requesting shutdown
TestPolicyListCommand - 2019/12/06 06:12:32.302934 [INFO] consul: shutting down server
TestPolicyListCommand - 2019/12/06 06:12:32.302996 [WARN] serf: Shutdown without a Leave
TestPolicyListCommand - 2019/12/06 06:12:32.426609 [WARN] serf: Shutdown without a Leave
TestPolicyListCommand - 2019/12/06 06:12:32.493416 [INFO] manager: shutting down
TestPolicyListCommand - 2019/12/06 06:12:32.494353 [INFO] agent: consul server down
TestPolicyListCommand - 2019/12/06 06:12:32.494423 [INFO] agent: shutdown complete
TestPolicyListCommand - 2019/12/06 06:12:32.494477 [INFO] agent: Stopping DNS server 127.0.0.1:22001 (tcp)
TestPolicyListCommand - 2019/12/06 06:12:32.494615 [INFO] agent: Stopping DNS server 127.0.0.1:22001 (udp)
TestPolicyListCommand - 2019/12/06 06:12:32.494782 [INFO] agent: Stopping HTTP server 127.0.0.1:22002 (tcp)
TestPolicyListCommand - 2019/12/06 06:12:32.495378 [INFO] agent: Waiting for endpoints to shut down
TestPolicyListCommand - 2019/12/06 06:12:32.495456 [INFO] agent: Endpoints down
--- PASS: TestPolicyListCommand (10.37s)
PASS
ok  	github.com/hashicorp/consul/command/acl/policy/list	11.116s
=== RUN   TestPolicyReadCommand_noTabs
=== PAUSE TestPolicyReadCommand_noTabs
=== RUN   TestPolicyReadCommand
=== PAUSE TestPolicyReadCommand
=== CONT  TestPolicyReadCommand_noTabs
=== CONT  TestPolicyReadCommand
--- PASS: TestPolicyReadCommand_noTabs (0.01s)
WARNING: bootstrap = true: do not enable unless necessary
TestPolicyReadCommand - 2019/12/06 06:12:22.532843 [WARN] agent: Node name "Node 2b47994c-24f4-7eb3-e654-85c7a2856379" will not be discoverable via DNS due to invalid characters. Valid characters include all alpha-numerics and dashes.
TestPolicyReadCommand - 2019/12/06 06:12:22.624922 [DEBUG] tlsutil: Update with version 1
TestPolicyReadCommand - 2019/12/06 06:12:22.639666 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
2019/12/06 06:12:24 [INFO]  raft: Initial configuration (index=1): [{Suffrage:Voter ID:2b47994c-24f4-7eb3-e654-85c7a2856379 Address:127.0.0.1:14506}]
2019/12/06 06:12:24 [INFO]  raft: Node at 127.0.0.1:14506 [Follower] entering Follower state (Leader: "")
TestPolicyReadCommand - 2019/12/06 06:12:24.917542 [INFO] serf: EventMemberJoin: Node 2b47994c-24f4-7eb3-e654-85c7a2856379.dc1 127.0.0.1
TestPolicyReadCommand - 2019/12/06 06:12:24.920907 [INFO] serf: EventMemberJoin: Node 2b47994c-24f4-7eb3-e654-85c7a2856379 127.0.0.1
TestPolicyReadCommand - 2019/12/06 06:12:24.922141 [INFO] consul: Adding LAN server Node 2b47994c-24f4-7eb3-e654-85c7a2856379 (Addr: tcp/127.0.0.1:14506) (DC: dc1)
TestPolicyReadCommand - 2019/12/06 06:12:24.922395 [INFO] consul: Handled member-join event for server "Node 2b47994c-24f4-7eb3-e654-85c7a2856379.dc1" in area "wan"
TestPolicyReadCommand - 2019/12/06 06:12:24.923795 [INFO] agent: Started DNS server 127.0.0.1:14501 (tcp)
TestPolicyReadCommand - 2019/12/06 06:12:24.924248 [INFO] agent: Started DNS server 127.0.0.1:14501 (udp)
TestPolicyReadCommand - 2019/12/06 06:12:24.926567 [INFO] agent: Started HTTP server on 127.0.0.1:14502 (tcp)
TestPolicyReadCommand - 2019/12/06 06:12:24.926691 [INFO] agent: started state syncer
2019/12/06 06:12:24 [WARN]  raft: Heartbeat timeout from "" reached, starting election
2019/12/06 06:12:24 [INFO]  raft: Node at 127.0.0.1:14506 [Candidate] entering Candidate state in term 2
2019/12/06 06:12:25 [INFO]  raft: Election won. Tally: 1
2019/12/06 06:12:25 [INFO]  raft: Node at 127.0.0.1:14506 [Leader] entering Leader state
TestPolicyReadCommand - 2019/12/06 06:12:25.847601 [INFO] consul: cluster leadership acquired
TestPolicyReadCommand - 2019/12/06 06:12:25.848223 [INFO] consul: New leader elected: Node 2b47994c-24f4-7eb3-e654-85c7a2856379
TestPolicyReadCommand - 2019/12/06 06:12:26.146918 [ERR] agent: failed to sync remote state: ACL not found
TestPolicyReadCommand - 2019/12/06 06:12:26.152716 [INFO] acl: initializing acls
TestPolicyReadCommand - 2019/12/06 06:12:26.386476 [INFO] consul: Created ACL 'global-management' policy
TestPolicyReadCommand - 2019/12/06 06:12:26.386570 [WARN] consul: Configuring a non-UUID master token is deprecated
TestPolicyReadCommand - 2019/12/06 06:12:26.475624 [INFO] acl: initializing acls
TestPolicyReadCommand - 2019/12/06 06:12:26.475750 [WARN] consul: Configuring a non-UUID master token is deprecated
TestPolicyReadCommand - 2019/12/06 06:12:26.653632 [ERR] agent: failed to sync remote state: ACL not found
TestPolicyReadCommand - 2019/12/06 06:12:26.704896 [INFO] consul: Bootstrapped ACL master token from configuration
TestPolicyReadCommand - 2019/12/06 06:12:27.119260 [INFO] consul: Bootstrapped ACL master token from configuration
TestPolicyReadCommand - 2019/12/06 06:12:27.119974 [INFO] consul: Created ACL anonymous token from configuration
TestPolicyReadCommand - 2019/12/06 06:12:27.120975 [INFO] serf: EventMemberUpdate: Node 2b47994c-24f4-7eb3-e654-85c7a2856379
TestPolicyReadCommand - 2019/12/06 06:12:27.121795 [INFO] serf: EventMemberUpdate: Node 2b47994c-24f4-7eb3-e654-85c7a2856379.dc1
TestPolicyReadCommand - 2019/12/06 06:12:27.345996 [INFO] consul: Created ACL anonymous token from configuration
TestPolicyReadCommand - 2019/12/06 06:12:27.346081 [DEBUG] acl: transitioning out of legacy ACL mode
TestPolicyReadCommand - 2019/12/06 06:12:27.347039 [INFO] serf: EventMemberUpdate: Node 2b47994c-24f4-7eb3-e654-85c7a2856379
TestPolicyReadCommand - 2019/12/06 06:12:27.348320 [INFO] serf: EventMemberUpdate: Node 2b47994c-24f4-7eb3-e654-85c7a2856379.dc1
TestPolicyReadCommand - 2019/12/06 06:12:28.469276 [INFO] connect: initialized primary datacenter CA with provider "consul"
TestPolicyReadCommand - 2019/12/06 06:12:28.471233 [DEBUG] consul: Skipping self join check for "Node 2b47994c-24f4-7eb3-e654-85c7a2856379" since the cluster is too small
TestPolicyReadCommand - 2019/12/06 06:12:28.479367 [INFO] consul: member 'Node 2b47994c-24f4-7eb3-e654-85c7a2856379' joined, marking health alive
TestPolicyReadCommand - 2019/12/06 06:12:28.874951 [DEBUG] consul: Skipping self join check for "Node 2b47994c-24f4-7eb3-e654-85c7a2856379" since the cluster is too small
TestPolicyReadCommand - 2019/12/06 06:12:28.875617 [DEBUG] consul: Skipping self join check for "Node 2b47994c-24f4-7eb3-e654-85c7a2856379" since the cluster is too small
TestPolicyReadCommand - 2019/12/06 06:12:29.179736 [DEBUG] http: Request PUT /v1/acl/policy (295.085509ms) from=127.0.0.1:40458
TestPolicyReadCommand - 2019/12/06 06:12:29.199521 [DEBUG] http: Request GET /v1/acl/policy/8a85d68a-8672-0bea-f7be-97b3cfba27e0 (13.685318ms) from=127.0.0.1:40460
TestPolicyReadCommand - 2019/12/06 06:12:29.203703 [INFO] agent: Requesting shutdown
TestPolicyReadCommand - 2019/12/06 06:12:29.203934 [INFO] consul: shutting down server
TestPolicyReadCommand - 2019/12/06 06:12:29.204052 [WARN] serf: Shutdown without a Leave
TestPolicyReadCommand - 2019/12/06 06:12:29.526576 [WARN] serf: Shutdown without a Leave
TestPolicyReadCommand - 2019/12/06 06:12:29.707166 [INFO] manager: shutting down
TestPolicyReadCommand - 2019/12/06 06:12:29.707582 [INFO] agent: consul server down
TestPolicyReadCommand - 2019/12/06 06:12:29.707632 [INFO] agent: shutdown complete
TestPolicyReadCommand - 2019/12/06 06:12:29.707690 [INFO] agent: Stopping DNS server 127.0.0.1:14501 (tcp)
TestPolicyReadCommand - 2019/12/06 06:12:29.707811 [INFO] agent: Stopping DNS server 127.0.0.1:14501 (udp)
TestPolicyReadCommand - 2019/12/06 06:12:29.707947 [INFO] agent: Stopping HTTP server 127.0.0.1:14502 (tcp)
TestPolicyReadCommand - 2019/12/06 06:12:29.708585 [INFO] agent: Waiting for endpoints to shut down
TestPolicyReadCommand - 2019/12/06 06:12:29.708738 [INFO] agent: Endpoints down
--- PASS: TestPolicyReadCommand (7.59s)
PASS
ok  	github.com/hashicorp/consul/command/acl/policy/read	8.043s
=== RUN   TestPolicyUpdateCommand_noTabs
=== PAUSE TestPolicyUpdateCommand_noTabs
=== RUN   TestPolicyUpdateCommand
=== PAUSE TestPolicyUpdateCommand
=== CONT  TestPolicyUpdateCommand_noTabs
--- PASS: TestPolicyUpdateCommand_noTabs (0.00s)
=== CONT  TestPolicyUpdateCommand
WARNING: bootstrap = true: do not enable unless necessary
TestPolicyUpdateCommand - 2019/12/06 06:12:22.516534 [WARN] agent: Node name "Node 3313fe5c-8f5e-854c-9f5c-faf1e16c22be" will not be discoverable via DNS due to invalid characters. Valid characters include all alpha-numerics and dashes.
TestPolicyUpdateCommand - 2019/12/06 06:12:22.625590 [DEBUG] tlsutil: Update with version 1
TestPolicyUpdateCommand - 2019/12/06 06:12:22.662901 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
2019/12/06 06:12:25 [INFO]  raft: Initial configuration (index=1): [{Suffrage:Voter ID:3313fe5c-8f5e-854c-9f5c-faf1e16c22be Address:127.0.0.1:41506}]
2019/12/06 06:12:25 [INFO]  raft: Node at 127.0.0.1:41506 [Follower] entering Follower state (Leader: "")
TestPolicyUpdateCommand - 2019/12/06 06:12:25.536724 [INFO] serf: EventMemberJoin: Node 3313fe5c-8f5e-854c-9f5c-faf1e16c22be.dc1 127.0.0.1
TestPolicyUpdateCommand - 2019/12/06 06:12:25.541377 [INFO] serf: EventMemberJoin: Node 3313fe5c-8f5e-854c-9f5c-faf1e16c22be 127.0.0.1
TestPolicyUpdateCommand - 2019/12/06 06:12:25.542770 [INFO] consul: Adding LAN server Node 3313fe5c-8f5e-854c-9f5c-faf1e16c22be (Addr: tcp/127.0.0.1:41506) (DC: dc1)
TestPolicyUpdateCommand - 2019/12/06 06:12:25.543172 [INFO] consul: Handled member-join event for server "Node 3313fe5c-8f5e-854c-9f5c-faf1e16c22be.dc1" in area "wan"
TestPolicyUpdateCommand - 2019/12/06 06:12:25.543556 [INFO] agent: Started DNS server 127.0.0.1:41501 (udp)
TestPolicyUpdateCommand - 2019/12/06 06:12:25.543865 [INFO] agent: Started DNS server 127.0.0.1:41501 (tcp)
TestPolicyUpdateCommand - 2019/12/06 06:12:25.546731 [INFO] agent: Started HTTP server on 127.0.0.1:41502 (tcp)
TestPolicyUpdateCommand - 2019/12/06 06:12:25.546942 [INFO] agent: started state syncer
2019/12/06 06:12:25 [WARN]  raft: Heartbeat timeout from "" reached, starting election
2019/12/06 06:12:25 [INFO]  raft: Node at 127.0.0.1:41506 [Candidate] entering Candidate state in term 2
2019/12/06 06:12:26 [INFO]  raft: Election won. Tally: 1
2019/12/06 06:12:26 [INFO]  raft: Node at 127.0.0.1:41506 [Leader] entering Leader state
TestPolicyUpdateCommand - 2019/12/06 06:12:26.154725 [INFO] consul: cluster leadership acquired
TestPolicyUpdateCommand - 2019/12/06 06:12:26.155358 [INFO] consul: New leader elected: Node 3313fe5c-8f5e-854c-9f5c-faf1e16c22be
TestPolicyUpdateCommand - 2019/12/06 06:12:26.181279 [ERR] agent: failed to sync remote state: ACL not found
TestPolicyUpdateCommand - 2019/12/06 06:12:26.294390 [INFO] acl: initializing acls
TestPolicyUpdateCommand - 2019/12/06 06:12:26.627768 [INFO] acl: initializing acls
TestPolicyUpdateCommand - 2019/12/06 06:12:26.632496 [INFO] consul: Created ACL 'global-management' policy
TestPolicyUpdateCommand - 2019/12/06 06:12:26.632572 [WARN] consul: Configuring a non-UUID master token is deprecated
TestPolicyUpdateCommand - 2019/12/06 06:12:26.778720 [INFO] consul: Created ACL 'global-management' policy
TestPolicyUpdateCommand - 2019/12/06 06:12:26.778816 [WARN] consul: Configuring a non-UUID master token is deprecated
TestPolicyUpdateCommand - 2019/12/06 06:12:27.036950 [INFO] consul: Bootstrapped ACL master token from configuration
TestPolicyUpdateCommand - 2019/12/06 06:12:27.235569 [INFO] consul: Bootstrapped ACL master token from configuration
TestPolicyUpdateCommand - 2019/12/06 06:12:27.544842 [INFO] consul: Created ACL anonymous token from configuration
TestPolicyUpdateCommand - 2019/12/06 06:12:27.544978 [DEBUG] acl: transitioning out of legacy ACL mode
TestPolicyUpdateCommand - 2019/12/06 06:12:27.546107 [INFO] serf: EventMemberUpdate: Node 3313fe5c-8f5e-854c-9f5c-faf1e16c22be
TestPolicyUpdateCommand - 2019/12/06 06:12:27.546865 [INFO] serf: EventMemberUpdate: Node 3313fe5c-8f5e-854c-9f5c-faf1e16c22be.dc1
TestPolicyUpdateCommand - 2019/12/06 06:12:27.852882 [INFO] consul: Created ACL anonymous token from configuration
TestPolicyUpdateCommand - 2019/12/06 06:12:27.853839 [INFO] serf: EventMemberUpdate: Node 3313fe5c-8f5e-854c-9f5c-faf1e16c22be
TestPolicyUpdateCommand - 2019/12/06 06:12:27.855426 [INFO] serf: EventMemberUpdate: Node 3313fe5c-8f5e-854c-9f5c-faf1e16c22be.dc1
TestPolicyUpdateCommand - 2019/12/06 06:12:28.469635 [INFO] agent: Synced node info
TestPolicyUpdateCommand - 2019/12/06 06:12:28.469763 [DEBUG] agent: Node info in sync
TestPolicyUpdateCommand - 2019/12/06 06:12:29.180644 [DEBUG] http: Request PUT /v1/acl/policy (689.842664ms) from=127.0.0.1:36202
TestPolicyUpdateCommand - 2019/12/06 06:12:29.190573 [DEBUG] http: Request GET /v1/acl/policy/2a53d6eb-15e6-31f0-5ad3-c59c0e79d548 (1.680706ms) from=127.0.0.1:36210
TestPolicyUpdateCommand - 2019/12/06 06:12:29.954538 [DEBUG] http: Request PUT /v1/acl/policy/2a53d6eb-15e6-31f0-5ad3-c59c0e79d548 (745.795962ms) from=127.0.0.1:36210
TestPolicyUpdateCommand - 2019/12/06 06:12:29.958322 [INFO] agent: Requesting shutdown
TestPolicyUpdateCommand - 2019/12/06 06:12:29.958409 [INFO] consul: shutting down server
TestPolicyUpdateCommand - 2019/12/06 06:12:29.958459 [WARN] serf: Shutdown without a Leave
TestPolicyUpdateCommand - 2019/12/06 06:12:30.360025 [WARN] serf: Shutdown without a Leave
TestPolicyUpdateCommand - 2019/12/06 06:12:30.510036 [INFO] manager: shutting down
TestPolicyUpdateCommand - 2019/12/06 06:12:30.643500 [ERR] connect: Apply failed leadership lost while committing log
TestPolicyUpdateCommand - 2019/12/06 06:12:30.643598 [ERR] consul: failed to establish leadership: leadership lost while committing log
TestPolicyUpdateCommand - 2019/12/06 06:12:30.645081 [INFO] agent: consul server down
TestPolicyUpdateCommand - 2019/12/06 06:12:30.645443 [INFO] agent: shutdown complete
TestPolicyUpdateCommand - 2019/12/06 06:12:30.645654 [INFO] agent: Stopping DNS server 127.0.0.1:41501 (tcp)
TestPolicyUpdateCommand - 2019/12/06 06:12:30.646331 [INFO] agent: Stopping DNS server 127.0.0.1:41501 (udp)
TestPolicyUpdateCommand - 2019/12/06 06:12:30.646828 [INFO] agent: Stopping HTTP server 127.0.0.1:41502 (tcp)
TestPolicyUpdateCommand - 2019/12/06 06:12:30.647694 [INFO] agent: Waiting for endpoints to shut down
TestPolicyUpdateCommand - 2019/12/06 06:12:30.648031 [INFO] agent: Endpoints down
--- PASS: TestPolicyUpdateCommand (8.54s)
PASS
ok  	github.com/hashicorp/consul/command/acl/policy/update	8.993s
?   	github.com/hashicorp/consul/command/acl/role	[no test files]
=== RUN   TestRoleCreateCommand_noTabs
=== PAUSE TestRoleCreateCommand_noTabs
=== RUN   TestRoleCreateCommand
=== PAUSE TestRoleCreateCommand
=== CONT  TestRoleCreateCommand_noTabs
--- PASS: TestRoleCreateCommand_noTabs (0.00s)
=== CONT  TestRoleCreateCommand
WARNING: bootstrap = true: do not enable unless necessary
TestRoleCreateCommand - 2019/12/06 06:12:22.483200 [WARN] agent: Node name "Node d92e996a-0d57-cd8f-5933-f28b491f8788" will not be discoverable via DNS due to invalid characters. Valid characters include all alpha-numerics and dashes.
TestRoleCreateCommand - 2019/12/06 06:12:22.625544 [DEBUG] tlsutil: Update with version 1
TestRoleCreateCommand - 2019/12/06 06:12:22.648904 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
2019/12/06 06:12:24 [INFO]  raft: Initial configuration (index=1): [{Suffrage:Voter ID:d92e996a-0d57-cd8f-5933-f28b491f8788 Address:127.0.0.1:52006}]
2019/12/06 06:12:24 [INFO]  raft: Node at 127.0.0.1:52006 [Follower] entering Follower state (Leader: "")
TestRoleCreateCommand - 2019/12/06 06:12:24.824324 [INFO] serf: EventMemberJoin: Node d92e996a-0d57-cd8f-5933-f28b491f8788.dc1 127.0.0.1
TestRoleCreateCommand - 2019/12/06 06:12:24.828679 [INFO] serf: EventMemberJoin: Node d92e996a-0d57-cd8f-5933-f28b491f8788 127.0.0.1
TestRoleCreateCommand - 2019/12/06 06:12:24.830406 [INFO] consul: Handled member-join event for server "Node d92e996a-0d57-cd8f-5933-f28b491f8788.dc1" in area "wan"
TestRoleCreateCommand - 2019/12/06 06:12:24.830956 [INFO] consul: Adding LAN server Node d92e996a-0d57-cd8f-5933-f28b491f8788 (Addr: tcp/127.0.0.1:52006) (DC: dc1)
TestRoleCreateCommand - 2019/12/06 06:12:24.830981 [INFO] agent: Started DNS server 127.0.0.1:52001 (tcp)
TestRoleCreateCommand - 2019/12/06 06:12:24.831362 [INFO] agent: Started DNS server 127.0.0.1:52001 (udp)
TestRoleCreateCommand - 2019/12/06 06:12:24.834482 [INFO] agent: Started HTTP server on 127.0.0.1:52002 (tcp)
TestRoleCreateCommand - 2019/12/06 06:12:24.834710 [INFO] agent: started state syncer
2019/12/06 06:12:24 [WARN]  raft: Heartbeat timeout from "" reached, starting election
2019/12/06 06:12:24 [INFO]  raft: Node at 127.0.0.1:52006 [Candidate] entering Candidate state in term 2
2019/12/06 06:12:25 [INFO]  raft: Election won. Tally: 1
2019/12/06 06:12:25 [INFO]  raft: Node at 127.0.0.1:52006 [Leader] entering Leader state
TestRoleCreateCommand - 2019/12/06 06:12:25.935866 [INFO] consul: cluster leadership acquired
TestRoleCreateCommand - 2019/12/06 06:12:25.936668 [INFO] consul: New leader elected: Node d92e996a-0d57-cd8f-5933-f28b491f8788
TestRoleCreateCommand - 2019/12/06 06:12:26.059355 [ERR] agent: failed to sync remote state: ACL not found
TestRoleCreateCommand - 2019/12/06 06:12:26.243575 [INFO] acl: initializing acls
TestRoleCreateCommand - 2019/12/06 06:12:26.381697 [INFO] acl: initializing acls
TestRoleCreateCommand - 2019/12/06 06:12:26.707038 [INFO] consul: Created ACL 'global-management' policy
TestRoleCreateCommand - 2019/12/06 06:12:26.707139 [WARN] consul: Configuring a non-UUID master token is deprecated
TestRoleCreateCommand - 2019/12/06 06:12:26.707399 [INFO] consul: Created ACL 'global-management' policy
TestRoleCreateCommand - 2019/12/06 06:12:26.707460 [WARN] consul: Configuring a non-UUID master token is deprecated
TestRoleCreateCommand - 2019/12/06 06:12:27.120000 [INFO] consul: Bootstrapped ACL master token from configuration
TestRoleCreateCommand - 2019/12/06 06:12:27.123679 [INFO] consul: Bootstrapped ACL master token from configuration
TestRoleCreateCommand - 2019/12/06 06:12:27.344454 [INFO] consul: Created ACL anonymous token from configuration
TestRoleCreateCommand - 2019/12/06 06:12:27.344808 [DEBUG] acl: transitioning out of legacy ACL mode
TestRoleCreateCommand - 2019/12/06 06:12:27.345828 [INFO] serf: EventMemberUpdate: Node d92e996a-0d57-cd8f-5933-f28b491f8788
TestRoleCreateCommand - 2019/12/06 06:12:27.347314 [INFO] serf: EventMemberUpdate: Node d92e996a-0d57-cd8f-5933-f28b491f8788.dc1
TestRoleCreateCommand - 2019/12/06 06:12:27.713355 [INFO] consul: Created ACL anonymous token from configuration
TestRoleCreateCommand - 2019/12/06 06:12:27.714389 [INFO] serf: EventMemberUpdate: Node d92e996a-0d57-cd8f-5933-f28b491f8788
TestRoleCreateCommand - 2019/12/06 06:12:27.715289 [INFO] serf: EventMemberUpdate: Node d92e996a-0d57-cd8f-5933-f28b491f8788.dc1
TestRoleCreateCommand - 2019/12/06 06:12:28.361186 [INFO] agent: Synced node info
TestRoleCreateCommand - 2019/12/06 06:12:28.361318 [DEBUG] agent: Node info in sync
TestRoleCreateCommand - 2019/12/06 06:12:28.872112 [DEBUG] http: Request PUT /v1/acl/policy (485.555594ms) from=127.0.0.1:36742
TestRoleCreateCommand - 2019/12/06 06:12:29.531373 [DEBUG] http: Request PUT /v1/acl/role (651.113766ms) from=127.0.0.1:36746
TestRoleCreateCommand - 2019/12/06 06:12:29.965625 [DEBUG] http: Request PUT /v1/acl/role (430.144642ms) from=127.0.0.1:36756
TestRoleCreateCommand - 2019/12/06 06:12:30.643959 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
TestRoleCreateCommand - 2019/12/06 06:12:30.645915 [DEBUG] http: Request PUT /v1/acl/role (673.024274ms) from=127.0.0.1:36758
TestRoleCreateCommand - 2019/12/06 06:12:31.144854 [INFO] connect: initialized primary datacenter CA with provider "consul"
TestRoleCreateCommand - 2019/12/06 06:12:31.576873 [DEBUG] consul: Skipping self join check for "Node d92e996a-0d57-cd8f-5933-f28b491f8788" since the cluster is too small
TestRoleCreateCommand - 2019/12/06 06:12:31.577105 [INFO] consul: member 'Node d92e996a-0d57-cd8f-5933-f28b491f8788' joined, marking health alive
TestRoleCreateCommand - 2019/12/06 06:12:31.578519 [DEBUG] http: Request PUT /v1/acl/role (924.51944ms) from=127.0.0.1:36760
TestRoleCreateCommand - 2019/12/06 06:12:31.581345 [INFO] agent: Requesting shutdown
TestRoleCreateCommand - 2019/12/06 06:12:31.581445 [INFO] consul: shutting down server
TestRoleCreateCommand - 2019/12/06 06:12:31.581501 [WARN] serf: Shutdown without a Leave
TestRoleCreateCommand - 2019/12/06 06:12:31.926647 [WARN] serf: Shutdown without a Leave
TestRoleCreateCommand - 2019/12/06 06:12:32.043442 [INFO] manager: shutting down
TestRoleCreateCommand - 2019/12/06 06:12:32.046219 [INFO] agent: consul server down
TestRoleCreateCommand - 2019/12/06 06:12:32.046297 [INFO] agent: shutdown complete
TestRoleCreateCommand - 2019/12/06 06:12:32.046361 [INFO] agent: Stopping DNS server 127.0.0.1:52001 (tcp)
TestRoleCreateCommand - 2019/12/06 06:12:32.046509 [INFO] agent: Stopping DNS server 127.0.0.1:52001 (udp)
TestRoleCreateCommand - 2019/12/06 06:12:32.046693 [INFO] agent: Stopping HTTP server 127.0.0.1:52002 (tcp)
TestRoleCreateCommand - 2019/12/06 06:12:32.047948 [INFO] agent: Waiting for endpoints to shut down
TestRoleCreateCommand - 2019/12/06 06:12:32.048042 [INFO] agent: Endpoints down
--- PASS: TestRoleCreateCommand (9.93s)
PASS
ok  	github.com/hashicorp/consul/command/acl/role/create	11.805s
=== RUN   TestRoleDeleteCommand_noTabs
=== PAUSE TestRoleDeleteCommand_noTabs
=== RUN   TestRoleDeleteCommand
=== PAUSE TestRoleDeleteCommand
=== CONT  TestRoleDeleteCommand_noTabs
=== CONT  TestRoleDeleteCommand
--- PASS: TestRoleDeleteCommand_noTabs (0.00s)
WARNING: bootstrap = true: do not enable unless necessary
TestRoleDeleteCommand - 2019/12/06 06:14:33.878008 [WARN] agent: Node name "Node 0a47b6d2-d5d7-2da2-db01-265a7075d743" will not be discoverable via DNS due to invalid characters. Valid characters include all alpha-numerics and dashes.
TestRoleDeleteCommand - 2019/12/06 06:14:33.966021 [DEBUG] tlsutil: Update with version 1
TestRoleDeleteCommand - 2019/12/06 06:14:33.983318 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
2019/12/06 06:14:35 [INFO]  raft: Initial configuration (index=1): [{Suffrage:Voter ID:0a47b6d2-d5d7-2da2-db01-265a7075d743 Address:127.0.0.1:31006}]
2019/12/06 06:14:35 [INFO]  raft: Node at 127.0.0.1:31006 [Follower] entering Follower state (Leader: "")
TestRoleDeleteCommand - 2019/12/06 06:14:35.667461 [INFO] serf: EventMemberJoin: Node 0a47b6d2-d5d7-2da2-db01-265a7075d743.dc1 127.0.0.1
TestRoleDeleteCommand - 2019/12/06 06:14:35.671657 [INFO] serf: EventMemberJoin: Node 0a47b6d2-d5d7-2da2-db01-265a7075d743 127.0.0.1
TestRoleDeleteCommand - 2019/12/06 06:14:35.673069 [INFO] consul: Handled member-join event for server "Node 0a47b6d2-d5d7-2da2-db01-265a7075d743.dc1" in area "wan"
TestRoleDeleteCommand - 2019/12/06 06:14:35.673564 [INFO] consul: Adding LAN server Node 0a47b6d2-d5d7-2da2-db01-265a7075d743 (Addr: tcp/127.0.0.1:31006) (DC: dc1)
TestRoleDeleteCommand - 2019/12/06 06:14:35.673636 [INFO] agent: Started DNS server 127.0.0.1:31001 (udp)
TestRoleDeleteCommand - 2019/12/06 06:14:35.674079 [INFO] agent: Started DNS server 127.0.0.1:31001 (tcp)
TestRoleDeleteCommand - 2019/12/06 06:14:35.676981 [INFO] agent: Started HTTP server on 127.0.0.1:31002 (tcp)
TestRoleDeleteCommand - 2019/12/06 06:14:35.677130 [INFO] agent: started state syncer
2019/12/06 06:14:35 [WARN]  raft: Heartbeat timeout from "" reached, starting election
2019/12/06 06:14:35 [INFO]  raft: Node at 127.0.0.1:31006 [Candidate] entering Candidate state in term 2
2019/12/06 06:14:36 [INFO]  raft: Election won. Tally: 1
2019/12/06 06:14:36 [INFO]  raft: Node at 127.0.0.1:31006 [Leader] entering Leader state
TestRoleDeleteCommand - 2019/12/06 06:14:36.604150 [INFO] consul: cluster leadership acquired
TestRoleDeleteCommand - 2019/12/06 06:14:36.604762 [INFO] consul: New leader elected: Node 0a47b6d2-d5d7-2da2-db01-265a7075d743
TestRoleDeleteCommand - 2019/12/06 06:14:36.708305 [ERR] agent: failed to sync remote state: ACL not found
TestRoleDeleteCommand - 2019/12/06 06:14:37.038992 [INFO] acl: initializing acls
TestRoleDeleteCommand - 2019/12/06 06:14:37.215945 [INFO] consul: Created ACL 'global-management' policy
TestRoleDeleteCommand - 2019/12/06 06:14:37.216025 [WARN] consul: Configuring a non-UUID master token is deprecated
TestRoleDeleteCommand - 2019/12/06 06:14:37.225183 [INFO] acl: initializing acls
TestRoleDeleteCommand - 2019/12/06 06:14:37.225316 [WARN] consul: Configuring a non-UUID master token is deprecated
TestRoleDeleteCommand - 2019/12/06 06:14:37.897087 [INFO] consul: Bootstrapped ACL master token from configuration
TestRoleDeleteCommand - 2019/12/06 06:14:37.900475 [INFO] consul: Bootstrapped ACL master token from configuration
TestRoleDeleteCommand - 2019/12/06 06:14:38.097056 [INFO] consul: Created ACL anonymous token from configuration
TestRoleDeleteCommand - 2019/12/06 06:14:38.098177 [INFO] serf: EventMemberUpdate: Node 0a47b6d2-d5d7-2da2-db01-265a7075d743
TestRoleDeleteCommand - 2019/12/06 06:14:38.098817 [INFO] serf: EventMemberUpdate: Node 0a47b6d2-d5d7-2da2-db01-265a7075d743.dc1
TestRoleDeleteCommand - 2019/12/06 06:14:38.298320 [INFO] consul: Created ACL anonymous token from configuration
TestRoleDeleteCommand - 2019/12/06 06:14:38.298409 [DEBUG] acl: transitioning out of legacy ACL mode
TestRoleDeleteCommand - 2019/12/06 06:14:38.299356 [INFO] serf: EventMemberUpdate: Node 0a47b6d2-d5d7-2da2-db01-265a7075d743
TestRoleDeleteCommand - 2019/12/06 06:14:38.300070 [INFO] serf: EventMemberUpdate: Node 0a47b6d2-d5d7-2da2-db01-265a7075d743.dc1
TestRoleDeleteCommand - 2019/12/06 06:14:38.921538 [INFO] agent: Synced node info
TestRoleDeleteCommand - 2019/12/06 06:14:38.921685 [DEBUG] agent: Node info in sync
=== RUN   TestRoleDeleteCommand/id_or_name_required
=== RUN   TestRoleDeleteCommand/delete_works
TestRoleDeleteCommand - 2019/12/06 06:14:39.672645 [DEBUG] http: Request PUT /v1/acl/role (715.413594ms) from=127.0.0.1:60260
TestRoleDeleteCommand - 2019/12/06 06:14:40.342519 [DEBUG] http: Request DELETE /v1/acl/role/3b37da50-a484-db5d-6019-21a279b21398 (660.906329ms) from=127.0.0.1:60262
TestRoleDeleteCommand - 2019/12/06 06:14:40.346510 [DEBUG] http: Request GET /v1/acl/role/3b37da50-a484-db5d-6019-21a279b21398 (527.012µs) from=127.0.0.1:60260
=== RUN   TestRoleDeleteCommand/delete_works_via_prefixes
TestRoleDeleteCommand - 2019/12/06 06:14:40.954087 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
TestRoleDeleteCommand - 2019/12/06 06:14:40.956157 [DEBUG] http: Request PUT /v1/acl/role (607.077082ms) from=127.0.0.1:60260
TestRoleDeleteCommand - 2019/12/06 06:14:40.963949 [DEBUG] http: Request GET /v1/acl/roles (1.86471ms) from=127.0.0.1:60274
TestRoleDeleteCommand - 2019/12/06 06:14:41.572035 [INFO] connect: initialized primary datacenter CA with provider "consul"
TestRoleDeleteCommand - 2019/12/06 06:14:41.572688 [DEBUG] consul: Skipping self join check for "Node 0a47b6d2-d5d7-2da2-db01-265a7075d743" since the cluster is too small
TestRoleDeleteCommand - 2019/12/06 06:14:41.572899 [DEBUG] http: Request DELETE /v1/acl/role/cb557990-ec68-6fbe-0033-ff1299824670 (604.688359ms) from=127.0.0.1:60274
TestRoleDeleteCommand - 2019/12/06 06:14:41.572983 [INFO] consul: member 'Node 0a47b6d2-d5d7-2da2-db01-265a7075d743' joined, marking health alive
TestRoleDeleteCommand - 2019/12/06 06:14:41.580458 [DEBUG] http: Request GET /v1/acl/role/cb557990-ec68-6fbe-0033-ff1299824670 (448.344µs) from=127.0.0.1:60260
TestRoleDeleteCommand - 2019/12/06 06:14:41.581570 [INFO] agent: Requesting shutdown
TestRoleDeleteCommand - 2019/12/06 06:14:41.581673 [INFO] consul: shutting down server
TestRoleDeleteCommand - 2019/12/06 06:14:41.581753 [WARN] serf: Shutdown without a Leave
TestRoleDeleteCommand - 2019/12/06 06:14:41.995601 [WARN] serf: Shutdown without a Leave
TestRoleDeleteCommand - 2019/12/06 06:14:42.153949 [INFO] manager: shutting down
TestRoleDeleteCommand - 2019/12/06 06:14:42.154941 [INFO] agent: consul server down
TestRoleDeleteCommand - 2019/12/06 06:14:42.154999 [INFO] agent: shutdown complete
TestRoleDeleteCommand - 2019/12/06 06:14:42.155056 [INFO] agent: Stopping DNS server 127.0.0.1:31001 (tcp)
TestRoleDeleteCommand - 2019/12/06 06:14:42.155193 [INFO] agent: Stopping DNS server 127.0.0.1:31001 (udp)
TestRoleDeleteCommand - 2019/12/06 06:14:42.155341 [INFO] agent: Stopping HTTP server 127.0.0.1:31002 (tcp)
TestRoleDeleteCommand - 2019/12/06 06:14:42.155501 [ERR] consul: failed to reconcile member: {Node 0a47b6d2-d5d7-2da2-db01-265a7075d743 127.0.0.1 31004 map[acls:1 bootstrap:1 build:1.5.2: dc:dc1 id:0a47b6d2-d5d7-2da2-db01-265a7075d743 port:31006 raft_vsn:3 role:consul segment: vsn:2 vsn_max:3 vsn_min:2 wan_join_port:31005] alive 1 5 2 2 5 4}: leadership lost while committing log
TestRoleDeleteCommand - 2019/12/06 06:14:42.155810 [INFO] agent: Waiting for endpoints to shut down
TestRoleDeleteCommand - 2019/12/06 06:14:42.155885 [INFO] agent: Endpoints down
--- PASS: TestRoleDeleteCommand (8.42s)
    --- PASS: TestRoleDeleteCommand/id_or_name_required (0.00s)
    --- PASS: TestRoleDeleteCommand/delete_works (1.40s)
    --- PASS: TestRoleDeleteCommand/delete_works_via_prefixes (1.23s)
PASS
ok  	github.com/hashicorp/consul/command/acl/role/delete	9.317s
=== RUN   TestRoleListCommand_noTabs
=== PAUSE TestRoleListCommand_noTabs
=== RUN   TestRoleListCommand
=== PAUSE TestRoleListCommand
=== CONT  TestRoleListCommand_noTabs
=== CONT  TestRoleListCommand
--- PASS: TestRoleListCommand_noTabs (0.00s)
WARNING: bootstrap = true: do not enable unless necessary
TestRoleListCommand - 2019/12/06 06:14:33.876992 [WARN] agent: Node name "Node 2b8b83aa-6a5c-b2a3-6355-e840fc43af21" will not be discoverable via DNS due to invalid characters. Valid characters include all alpha-numerics and dashes.
TestRoleListCommand - 2019/12/06 06:14:33.951410 [DEBUG] tlsutil: Update with version 1
TestRoleListCommand - 2019/12/06 06:14:33.957856 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
2019/12/06 06:14:35 [INFO]  raft: Initial configuration (index=1): [{Suffrage:Voter ID:2b8b83aa-6a5c-b2a3-6355-e840fc43af21 Address:127.0.0.1:29506}]
2019/12/06 06:14:35 [INFO]  raft: Node at 127.0.0.1:29506 [Follower] entering Follower state (Leader: "")
TestRoleListCommand - 2019/12/06 06:14:35.391980 [INFO] serf: EventMemberJoin: Node 2b8b83aa-6a5c-b2a3-6355-e840fc43af21.dc1 127.0.0.1
TestRoleListCommand - 2019/12/06 06:14:35.401016 [INFO] serf: EventMemberJoin: Node 2b8b83aa-6a5c-b2a3-6355-e840fc43af21 127.0.0.1
TestRoleListCommand - 2019/12/06 06:14:35.402292 [INFO] consul: Adding LAN server Node 2b8b83aa-6a5c-b2a3-6355-e840fc43af21 (Addr: tcp/127.0.0.1:29506) (DC: dc1)
TestRoleListCommand - 2019/12/06 06:14:35.403136 [INFO] consul: Handled member-join event for server "Node 2b8b83aa-6a5c-b2a3-6355-e840fc43af21.dc1" in area "wan"
TestRoleListCommand - 2019/12/06 06:14:35.403928 [INFO] agent: Started DNS server 127.0.0.1:29501 (tcp)
TestRoleListCommand - 2019/12/06 06:14:35.405963 [INFO] agent: Started DNS server 127.0.0.1:29501 (udp)
TestRoleListCommand - 2019/12/06 06:14:35.408623 [INFO] agent: Started HTTP server on 127.0.0.1:29502 (tcp)
TestRoleListCommand - 2019/12/06 06:14:35.408774 [INFO] agent: started state syncer
2019/12/06 06:14:35 [WARN]  raft: Heartbeat timeout from "" reached, starting election
2019/12/06 06:14:35 [INFO]  raft: Node at 127.0.0.1:29506 [Candidate] entering Candidate state in term 2
2019/12/06 06:14:36 [INFO]  raft: Election won. Tally: 1
2019/12/06 06:14:36 [INFO]  raft: Node at 127.0.0.1:29506 [Leader] entering Leader state
TestRoleListCommand - 2019/12/06 06:14:36.363793 [INFO] consul: cluster leadership acquired
TestRoleListCommand - 2019/12/06 06:14:36.364441 [INFO] consul: New leader elected: Node 2b8b83aa-6a5c-b2a3-6355-e840fc43af21
TestRoleListCommand - 2019/12/06 06:14:36.465148 [ERR] agent: failed to sync remote state: ACL not found
TestRoleListCommand - 2019/12/06 06:14:36.845729 [INFO] acl: initializing acls
TestRoleListCommand - 2019/12/06 06:14:36.956542 [INFO] acl: initializing acls
TestRoleListCommand - 2019/12/06 06:14:37.206255 [INFO] consul: Created ACL 'global-management' policy
TestRoleListCommand - 2019/12/06 06:14:37.206353 [WARN] consul: Configuring a non-UUID master token is deprecated
TestRoleListCommand - 2019/12/06 06:14:37.209130 [INFO] consul: Created ACL 'global-management' policy
TestRoleListCommand - 2019/12/06 06:14:37.209299 [WARN] consul: Configuring a non-UUID master token is deprecated
TestRoleListCommand - 2019/12/06 06:14:37.705324 [INFO] consul: Bootstrapped ACL master token from configuration
TestRoleListCommand - 2019/12/06 06:14:37.705664 [INFO] consul: Bootstrapped ACL master token from configuration
TestRoleListCommand - 2019/12/06 06:14:37.896746 [INFO] consul: Created ACL anonymous token from configuration
TestRoleListCommand - 2019/12/06 06:14:37.897779 [INFO] serf: EventMemberUpdate: Node 2b8b83aa-6a5c-b2a3-6355-e840fc43af21
TestRoleListCommand - 2019/12/06 06:14:37.898489 [INFO] serf: EventMemberUpdate: Node 2b8b83aa-6a5c-b2a3-6355-e840fc43af21.dc1
TestRoleListCommand - 2019/12/06 06:14:38.592165 [INFO] consul: Created ACL anonymous token from configuration
TestRoleListCommand - 2019/12/06 06:14:38.592235 [DEBUG] acl: transitioning out of legacy ACL mode
TestRoleListCommand - 2019/12/06 06:14:38.592612 [INFO] agent: Synced node info
TestRoleListCommand - 2019/12/06 06:14:38.592818 [DEBUG] agent: Node info in sync
TestRoleListCommand - 2019/12/06 06:14:38.593129 [INFO] serf: EventMemberUpdate: Node 2b8b83aa-6a5c-b2a3-6355-e840fc43af21
TestRoleListCommand - 2019/12/06 06:14:38.598524 [INFO] serf: EventMemberUpdate: Node 2b8b83aa-6a5c-b2a3-6355-e840fc43af21.dc1
TestRoleListCommand - 2019/12/06 06:14:39.665308 [DEBUG] http: Request PUT /v1/acl/role (1.038390419s) from=127.0.0.1:60790
TestRoleListCommand - 2019/12/06 06:14:40.111773 [DEBUG] http: Request PUT /v1/acl/role (441.846916ms) from=127.0.0.1:60790
TestRoleListCommand - 2019/12/06 06:14:40.821154 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
TestRoleListCommand - 2019/12/06 06:14:40.825546 [DEBUG] http: Request PUT /v1/acl/role (710.213141ms) from=127.0.0.1:60790
TestRoleListCommand - 2019/12/06 06:14:41.480021 [INFO] connect: initialized primary datacenter CA with provider "consul"
TestRoleListCommand - 2019/12/06 06:14:41.480471 [DEBUG] consul: Skipping self join check for "Node 2b8b83aa-6a5c-b2a3-6355-e840fc43af21" since the cluster is too small
TestRoleListCommand - 2019/12/06 06:14:41.480640 [INFO] consul: member 'Node 2b8b83aa-6a5c-b2a3-6355-e840fc43af21' joined, marking health alive
TestRoleListCommand - 2019/12/06 06:14:41.481408 [DEBUG] http: Request PUT /v1/acl/role (645.210966ms) from=127.0.0.1:60790
TestRoleListCommand - 2019/12/06 06:14:42.154256 [DEBUG] consul: Skipping self join check for "Node 2b8b83aa-6a5c-b2a3-6355-e840fc43af21" since the cluster is too small
TestRoleListCommand - 2019/12/06 06:14:42.154929 [DEBUG] consul: Skipping self join check for "Node 2b8b83aa-6a5c-b2a3-6355-e840fc43af21" since the cluster is too small
TestRoleListCommand - 2019/12/06 06:14:42.157638 [DEBUG] http: Request PUT /v1/acl/role (671.078899ms) from=127.0.0.1:60790
TestRoleListCommand - 2019/12/06 06:14:42.164713 [DEBUG] http: Request GET /v1/acl/roles (1.866377ms) from=127.0.0.1:60810
TestRoleListCommand - 2019/12/06 06:14:42.171282 [INFO] agent: Requesting shutdown
TestRoleListCommand - 2019/12/06 06:14:42.171398 [INFO] consul: shutting down server
TestRoleListCommand - 2019/12/06 06:14:42.171451 [WARN] serf: Shutdown without a Leave
TestRoleListCommand - 2019/12/06 06:14:42.345412 [WARN] serf: Shutdown without a Leave
TestRoleListCommand - 2019/12/06 06:14:42.553967 [INFO] manager: shutting down
TestRoleListCommand - 2019/12/06 06:14:42.554757 [INFO] agent: consul server down
TestRoleListCommand - 2019/12/06 06:14:42.554825 [INFO] agent: shutdown complete
TestRoleListCommand - 2019/12/06 06:14:42.554890 [INFO] agent: Stopping DNS server 127.0.0.1:29501 (tcp)
TestRoleListCommand - 2019/12/06 06:14:42.555035 [INFO] agent: Stopping DNS server 127.0.0.1:29501 (udp)
TestRoleListCommand - 2019/12/06 06:14:42.555187 [INFO] agent: Stopping HTTP server 127.0.0.1:29502 (tcp)
TestRoleListCommand - 2019/12/06 06:14:42.556052 [INFO] agent: Waiting for endpoints to shut down
TestRoleListCommand - 2019/12/06 06:14:42.556149 [INFO] agent: Endpoints down
--- PASS: TestRoleListCommand (8.82s)
PASS
ok  	github.com/hashicorp/consul/command/acl/role/list	9.921s
=== RUN   TestRoleReadCommand_noTabs
=== PAUSE TestRoleReadCommand_noTabs
=== RUN   TestRoleReadCommand
=== PAUSE TestRoleReadCommand
=== CONT  TestRoleReadCommand_noTabs
=== CONT  TestRoleReadCommand
--- PASS: TestRoleReadCommand_noTabs (0.00s)
WARNING: bootstrap = true: do not enable unless necessary
TestRoleReadCommand - 2019/12/06 06:14:33.879628 [WARN] agent: Node name "Node bbf17b0d-fadc-75c4-ee0e-77a963528cf0" will not be discoverable via DNS due to invalid characters. Valid characters include all alpha-numerics and dashes.
TestRoleReadCommand - 2019/12/06 06:14:33.961509 [DEBUG] tlsutil: Update with version 1
TestRoleReadCommand - 2019/12/06 06:14:33.988709 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
2019/12/06 06:14:35 [INFO]  raft: Initial configuration (index=1): [{Suffrage:Voter ID:bbf17b0d-fadc-75c4-ee0e-77a963528cf0 Address:127.0.0.1:41506}]
2019/12/06 06:14:35 [INFO]  raft: Node at 127.0.0.1:41506 [Follower] entering Follower state (Leader: "")
TestRoleReadCommand - 2019/12/06 06:14:35.669265 [INFO] serf: EventMemberJoin: Node bbf17b0d-fadc-75c4-ee0e-77a963528cf0.dc1 127.0.0.1
TestRoleReadCommand - 2019/12/06 06:14:35.673478 [INFO] serf: EventMemberJoin: Node bbf17b0d-fadc-75c4-ee0e-77a963528cf0 127.0.0.1
TestRoleReadCommand - 2019/12/06 06:14:35.674988 [INFO] consul: Adding LAN server Node bbf17b0d-fadc-75c4-ee0e-77a963528cf0 (Addr: tcp/127.0.0.1:41506) (DC: dc1)
TestRoleReadCommand - 2019/12/06 06:14:35.675366 [INFO] consul: Handled member-join event for server "Node bbf17b0d-fadc-75c4-ee0e-77a963528cf0.dc1" in area "wan"
TestRoleReadCommand - 2019/12/06 06:14:35.676536 [INFO] agent: Started DNS server 127.0.0.1:41501 (udp)
TestRoleReadCommand - 2019/12/06 06:14:35.676995 [INFO] agent: Started DNS server 127.0.0.1:41501 (tcp)
TestRoleReadCommand - 2019/12/06 06:14:35.679681 [INFO] agent: Started HTTP server on 127.0.0.1:41502 (tcp)
TestRoleReadCommand - 2019/12/06 06:14:35.679871 [INFO] agent: started state syncer
2019/12/06 06:14:35 [WARN]  raft: Heartbeat timeout from "" reached, starting election
2019/12/06 06:14:35 [INFO]  raft: Node at 127.0.0.1:41506 [Candidate] entering Candidate state in term 2
2019/12/06 06:14:36 [INFO]  raft: Election won. Tally: 1
2019/12/06 06:14:36 [INFO]  raft: Node at 127.0.0.1:41506 [Leader] entering Leader state
TestRoleReadCommand - 2019/12/06 06:14:36.604630 [INFO] consul: cluster leadership acquired
TestRoleReadCommand - 2019/12/06 06:14:36.605336 [INFO] consul: New leader elected: Node bbf17b0d-fadc-75c4-ee0e-77a963528cf0
TestRoleReadCommand - 2019/12/06 06:14:36.617500 [ERR] agent: failed to sync remote state: ACL not found
TestRoleReadCommand - 2019/12/06 06:14:37.039725 [INFO] acl: initializing acls
TestRoleReadCommand - 2019/12/06 06:14:37.206757 [INFO] consul: Created ACL 'global-management' policy
TestRoleReadCommand - 2019/12/06 06:14:37.207033 [WARN] consul: Configuring a non-UUID master token is deprecated
TestRoleReadCommand - 2019/12/06 06:14:37.228108 [INFO] acl: initializing acls
TestRoleReadCommand - 2019/12/06 06:14:37.228237 [WARN] consul: Configuring a non-UUID master token is deprecated
TestRoleReadCommand - 2019/12/06 06:14:37.239898 [ERR] agent: failed to sync remote state: ACL not found
TestRoleReadCommand - 2019/12/06 06:14:37.799229 [INFO] consul: Bootstrapped ACL master token from configuration
TestRoleReadCommand - 2019/12/06 06:14:37.799297 [INFO] consul: Bootstrapped ACL master token from configuration
TestRoleReadCommand - 2019/12/06 06:14:37.992292 [INFO] consul: Created ACL anonymous token from configuration
TestRoleReadCommand - 2019/12/06 06:14:37.992608 [DEBUG] acl: transitioning out of legacy ACL mode
TestRoleReadCommand - 2019/12/06 06:14:37.992937 [INFO] consul: Created ACL anonymous token from configuration
TestRoleReadCommand - 2019/12/06 06:14:37.993707 [INFO] serf: EventMemberUpdate: Node bbf17b0d-fadc-75c4-ee0e-77a963528cf0
TestRoleReadCommand - 2019/12/06 06:14:37.994341 [INFO] serf: EventMemberUpdate: Node bbf17b0d-fadc-75c4-ee0e-77a963528cf0.dc1
TestRoleReadCommand - 2019/12/06 06:14:37.995480 [INFO] serf: EventMemberUpdate: Node bbf17b0d-fadc-75c4-ee0e-77a963528cf0
TestRoleReadCommand - 2019/12/06 06:14:37.996775 [INFO] serf: EventMemberUpdate: Node bbf17b0d-fadc-75c4-ee0e-77a963528cf0.dc1
TestRoleReadCommand - 2019/12/06 06:14:39.447053 [INFO] connect: initialized primary datacenter CA with provider "consul"
TestRoleReadCommand - 2019/12/06 06:14:39.447674 [DEBUG] consul: Skipping self join check for "Node bbf17b0d-fadc-75c4-ee0e-77a963528cf0" since the cluster is too small
TestRoleReadCommand - 2019/12/06 06:14:39.447808 [INFO] consul: member 'Node bbf17b0d-fadc-75c4-ee0e-77a963528cf0' joined, marking health alive
TestRoleReadCommand - 2019/12/06 06:14:39.933205 [DEBUG] consul: Skipping self join check for "Node bbf17b0d-fadc-75c4-ee0e-77a963528cf0" since the cluster is too small
TestRoleReadCommand - 2019/12/06 06:14:39.933752 [DEBUG] consul: Skipping self join check for "Node bbf17b0d-fadc-75c4-ee0e-77a963528cf0" since the cluster is too small
=== RUN   TestRoleReadCommand/id_or_name_required
=== RUN   TestRoleReadCommand/read_by_id_not_found
TestRoleReadCommand - 2019/12/06 06:14:39.952394 [DEBUG] http: Request GET /v1/acl/role/509232fb-676a-bf6b-54ce-20fbca6ac5b9 (2.852066ms) from=127.0.0.1:36230
=== RUN   TestRoleReadCommand/read_by_name_not_found
TestRoleReadCommand - 2019/12/06 06:14:39.958911 [DEBUG] http: Request GET /v1/acl/role/name/blah (598.681µs) from=127.0.0.1:36232
=== RUN   TestRoleReadCommand/read_by_id
TestRoleReadCommand - 2019/12/06 06:14:40.456986 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
TestRoleReadCommand - 2019/12/06 06:14:40.465738 [DEBUG] http: Request PUT /v1/acl/role (502.279317ms) from=127.0.0.1:36234
TestRoleReadCommand - 2019/12/06 06:14:40.478038 [DEBUG] http: Request GET /v1/acl/role/53614fef-3045-3074-9332-1f0b9c9b3de1 (1.594037ms) from=127.0.0.1:36236
=== RUN   TestRoleReadCommand/read_by_id_prefix
TestRoleReadCommand - 2019/12/06 06:14:40.829237 [DEBUG] http: Request PUT /v1/acl/role (346.128029ms) from=127.0.0.1:36234
TestRoleReadCommand - 2019/12/06 06:14:40.840901 [DEBUG] http: Request GET /v1/acl/roles (1.877377ms) from=127.0.0.1:36238
TestRoleReadCommand - 2019/12/06 06:14:40.845844 [DEBUG] http: Request GET /v1/acl/role/137cade4-e57c-52f1-02f8-c36e285ffbc0 (1.108026ms) from=127.0.0.1:36238
=== RUN   TestRoleReadCommand/read_by_name
TestRoleReadCommand - 2019/12/06 06:14:41.217185 [DEBUG] http: Request PUT /v1/acl/role (368.252542ms) from=127.0.0.1:36234
TestRoleReadCommand - 2019/12/06 06:14:41.224869 [DEBUG] http: Request GET /v1/acl/role/name/test-role-by-name (1.4057ms) from=127.0.0.1:36242
TestRoleReadCommand - 2019/12/06 06:14:41.226878 [INFO] agent: Requesting shutdown
TestRoleReadCommand - 2019/12/06 06:14:41.226959 [INFO] consul: shutting down server
TestRoleReadCommand - 2019/12/06 06:14:41.227005 [WARN] serf: Shutdown without a Leave
TestRoleReadCommand - 2019/12/06 06:14:41.478715 [WARN] serf: Shutdown without a Leave
TestRoleReadCommand - 2019/12/06 06:14:41.570561 [INFO] manager: shutting down
TestRoleReadCommand - 2019/12/06 06:14:41.571303 [INFO] agent: consul server down
TestRoleReadCommand - 2019/12/06 06:14:41.571367 [INFO] agent: shutdown complete
TestRoleReadCommand - 2019/12/06 06:14:41.571426 [INFO] agent: Stopping DNS server 127.0.0.1:41501 (tcp)
TestRoleReadCommand - 2019/12/06 06:14:41.571622 [INFO] agent: Stopping DNS server 127.0.0.1:41501 (udp)
TestRoleReadCommand - 2019/12/06 06:14:41.571849 [INFO] agent: Stopping HTTP server 127.0.0.1:41502 (tcp)
TestRoleReadCommand - 2019/12/06 06:14:41.575681 [INFO] agent: Waiting for endpoints to shut down
TestRoleReadCommand - 2019/12/06 06:14:41.575851 [INFO] agent: Endpoints down
--- PASS: TestRoleReadCommand (7.84s)
    --- PASS: TestRoleReadCommand/id_or_name_required (0.00s)
    --- PASS: TestRoleReadCommand/read_by_id_not_found (0.01s)
    --- PASS: TestRoleReadCommand/read_by_name_not_found (0.01s)
    --- PASS: TestRoleReadCommand/read_by_id (0.52s)
    --- PASS: TestRoleReadCommand/read_by_id_prefix (0.37s)
    --- PASS: TestRoleReadCommand/read_by_name (0.38s)
PASS
ok  	github.com/hashicorp/consul/command/acl/role/read	8.830s
=== RUN   TestRoleUpdateCommand_noTabs
=== PAUSE TestRoleUpdateCommand_noTabs
=== RUN   TestRoleUpdateCommand
=== PAUSE TestRoleUpdateCommand
=== RUN   TestRoleUpdateCommand_noMerge
=== PAUSE TestRoleUpdateCommand_noMerge
=== CONT  TestRoleUpdateCommand_noTabs
=== CONT  TestRoleUpdateCommand_noMerge
=== CONT  TestRoleUpdateCommand
--- PASS: TestRoleUpdateCommand_noTabs (0.00s)
WARNING: bootstrap = true: do not enable unless necessary
TestRoleUpdateCommand_noMerge - 2019/12/06 06:14:35.440645 [WARN] agent: Node name "Node 1296f61f-dd0c-dd39-18ba-4cb42062344e" will not be discoverable via DNS due to invalid characters. Valid characters include all alpha-numerics and dashes.
TestRoleUpdateCommand_noMerge - 2019/12/06 06:14:35.441521 [DEBUG] tlsutil: Update with version 1
TestRoleUpdateCommand_noMerge - 2019/12/06 06:14:35.447689 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
WARNING: bootstrap = true: do not enable unless necessary
TestRoleUpdateCommand - 2019/12/06 06:14:35.468644 [WARN] agent: Node name "Node c96a2e22-6d52-1270-aece-311b3c8f5185" will not be discoverable via DNS due to invalid characters. Valid characters include all alpha-numerics and dashes.
TestRoleUpdateCommand - 2019/12/06 06:14:35.469063 [DEBUG] tlsutil: Update with version 1
TestRoleUpdateCommand - 2019/12/06 06:14:35.472916 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
2019/12/06 06:14:37 [INFO]  raft: Initial configuration (index=1): [{Suffrage:Voter ID:c96a2e22-6d52-1270-aece-311b3c8f5185 Address:127.0.0.1:10012}]
2019/12/06 06:14:37 [INFO]  raft: Node at 127.0.0.1:10012 [Follower] entering Follower state (Leader: "")
TestRoleUpdateCommand - 2019/12/06 06:14:37.212979 [INFO] serf: EventMemberJoin: Node c96a2e22-6d52-1270-aece-311b3c8f5185.dc1 127.0.0.1
TestRoleUpdateCommand - 2019/12/06 06:14:37.217993 [INFO] serf: EventMemberJoin: Node c96a2e22-6d52-1270-aece-311b3c8f5185 127.0.0.1
2019/12/06 06:14:37 [INFO]  raft: Initial configuration (index=1): [{Suffrage:Voter ID:1296f61f-dd0c-dd39-18ba-4cb42062344e Address:127.0.0.1:10006}]
TestRoleUpdateCommand - 2019/12/06 06:14:37.223033 [INFO] agent: Started DNS server 127.0.0.1:10007 (udp)
2019/12/06 06:14:37 [INFO]  raft: Node at 127.0.0.1:10006 [Follower] entering Follower state (Leader: "")
TestRoleUpdateCommand - 2019/12/06 06:14:37.228559 [INFO] consul: Adding LAN server Node c96a2e22-6d52-1270-aece-311b3c8f5185 (Addr: tcp/127.0.0.1:10012) (DC: dc1)
TestRoleUpdateCommand - 2019/12/06 06:14:37.229472 [INFO] agent: Started DNS server 127.0.0.1:10007 (tcp)
TestRoleUpdateCommand - 2019/12/06 06:14:37.231638 [INFO] consul: Handled member-join event for server "Node c96a2e22-6d52-1270-aece-311b3c8f5185.dc1" in area "wan"
TestRoleUpdateCommand - 2019/12/06 06:14:37.232003 [INFO] agent: Started HTTP server on 127.0.0.1:10008 (tcp)
TestRoleUpdateCommand - 2019/12/06 06:14:37.232139 [INFO] agent: started state syncer
TestRoleUpdateCommand_noMerge - 2019/12/06 06:14:37.232640 [INFO] serf: EventMemberJoin: Node 1296f61f-dd0c-dd39-18ba-4cb42062344e.dc1 127.0.0.1
TestRoleUpdateCommand_noMerge - 2019/12/06 06:14:37.236815 [INFO] serf: EventMemberJoin: Node 1296f61f-dd0c-dd39-18ba-4cb42062344e 127.0.0.1
TestRoleUpdateCommand_noMerge - 2019/12/06 06:14:37.237808 [INFO] consul: Adding LAN server Node 1296f61f-dd0c-dd39-18ba-4cb42062344e (Addr: tcp/127.0.0.1:10006) (DC: dc1)
TestRoleUpdateCommand_noMerge - 2019/12/06 06:14:37.237837 [INFO] consul: Handled member-join event for server "Node 1296f61f-dd0c-dd39-18ba-4cb42062344e.dc1" in area "wan"
TestRoleUpdateCommand_noMerge - 2019/12/06 06:14:37.238313 [INFO] agent: Started DNS server 127.0.0.1:10001 (tcp)
TestRoleUpdateCommand_noMerge - 2019/12/06 06:14:37.238375 [INFO] agent: Started DNS server 127.0.0.1:10001 (udp)
TestRoleUpdateCommand_noMerge - 2019/12/06 06:14:37.240749 [INFO] agent: Started HTTP server on 127.0.0.1:10002 (tcp)
TestRoleUpdateCommand_noMerge - 2019/12/06 06:14:37.240858 [INFO] agent: started state syncer
2019/12/06 06:14:37 [WARN]  raft: Heartbeat timeout from "" reached, starting election
2019/12/06 06:14:37 [INFO]  raft: Node at 127.0.0.1:10006 [Candidate] entering Candidate state in term 2
2019/12/06 06:14:37 [WARN]  raft: Heartbeat timeout from "" reached, starting election
2019/12/06 06:14:37 [INFO]  raft: Node at 127.0.0.1:10012 [Candidate] entering Candidate state in term 2
2019/12/06 06:14:37 [INFO]  raft: Election won. Tally: 1
2019/12/06 06:14:37 [INFO]  raft: Node at 127.0.0.1:10006 [Leader] entering Leader state
2019/12/06 06:14:37 [INFO]  raft: Election won. Tally: 1
2019/12/06 06:14:37 [INFO]  raft: Node at 127.0.0.1:10012 [Leader] entering Leader state
TestRoleUpdateCommand_noMerge - 2019/12/06 06:14:37.987727 [INFO] consul: cluster leadership acquired
TestRoleUpdateCommand - 2019/12/06 06:14:37.987982 [INFO] consul: cluster leadership acquired
TestRoleUpdateCommand_noMerge - 2019/12/06 06:14:37.988583 [INFO] consul: New leader elected: Node 1296f61f-dd0c-dd39-18ba-4cb42062344e
TestRoleUpdateCommand - 2019/12/06 06:14:37.988999 [INFO] consul: New leader elected: Node c96a2e22-6d52-1270-aece-311b3c8f5185
TestRoleUpdateCommand_noMerge - 2019/12/06 06:14:37.989589 [INFO] acl: initializing acls
TestRoleUpdateCommand - 2019/12/06 06:14:38.058749 [ERR] agent: failed to sync remote state: ACL not found
TestRoleUpdateCommand_noMerge - 2019/12/06 06:14:38.104794 [ERR] agent: failed to sync remote state: ACL not found
TestRoleUpdateCommand - 2019/12/06 06:14:38.429257 [INFO] acl: initializing acls
TestRoleUpdateCommand_noMerge - 2019/12/06 06:14:38.429278 [INFO] acl: initializing acls
TestRoleUpdateCommand_noMerge - 2019/12/06 06:14:38.432037 [INFO] consul: Created ACL 'global-management' policy
TestRoleUpdateCommand_noMerge - 2019/12/06 06:14:38.432174 [WARN] consul: Configuring a non-UUID master token is deprecated
TestRoleUpdateCommand - 2019/12/06 06:14:38.757010 [INFO] consul: Created ACL 'global-management' policy
TestRoleUpdateCommand - 2019/12/06 06:14:38.757087 [WARN] consul: Configuring a non-UUID master token is deprecated
TestRoleUpdateCommand - 2019/12/06 06:14:38.783249 [INFO] acl: initializing acls
TestRoleUpdateCommand - 2019/12/06 06:14:38.783472 [WARN] consul: Configuring a non-UUID master token is deprecated
TestRoleUpdateCommand - 2019/12/06 06:14:39.097911 [INFO] consul: Bootstrapped ACL master token from configuration
TestRoleUpdateCommand_noMerge - 2019/12/06 06:14:39.448674 [INFO] consul: Created ACL 'global-management' policy
TestRoleUpdateCommand_noMerge - 2019/12/06 06:14:39.448772 [WARN] consul: Configuring a non-UUID master token is deprecated
TestRoleUpdateCommand_noMerge - 2019/12/06 06:14:39.449898 [INFO] consul: Bootstrapped ACL master token from configuration
TestRoleUpdateCommand - 2019/12/06 06:14:39.662972 [INFO] consul: Bootstrapped ACL master token from configuration
TestRoleUpdateCommand - 2019/12/06 06:14:39.828771 [ERR] agent: failed to sync remote state: ACL not found
TestRoleUpdateCommand - 2019/12/06 06:14:39.933721 [INFO] consul: Created ACL anonymous token from configuration
TestRoleUpdateCommand - 2019/12/06 06:14:39.934707 [INFO] serf: EventMemberUpdate: Node c96a2e22-6d52-1270-aece-311b3c8f5185
TestRoleUpdateCommand_noMerge - 2019/12/06 06:14:39.935671 [INFO] consul: Bootstrapped ACL master token from configuration
TestRoleUpdateCommand - 2019/12/06 06:14:39.936231 [INFO] serf: EventMemberUpdate: Node c96a2e22-6d52-1270-aece-311b3c8f5185.dc1
TestRoleUpdateCommand_noMerge - 2019/12/06 06:14:40.399326 [ERR] agent: failed to sync remote state: ACL not found
TestRoleUpdateCommand - 2019/12/06 06:14:40.459323 [INFO] consul: Created ACL anonymous token from configuration
TestRoleUpdateCommand - 2019/12/06 06:14:40.459420 [DEBUG] acl: transitioning out of legacy ACL mode
TestRoleUpdateCommand - 2019/12/06 06:14:40.460519 [INFO] serf: EventMemberUpdate: Node c96a2e22-6d52-1270-aece-311b3c8f5185
TestRoleUpdateCommand - 2019/12/06 06:14:40.461394 [INFO] serf: EventMemberUpdate: Node c96a2e22-6d52-1270-aece-311b3c8f5185.dc1
TestRoleUpdateCommand_noMerge - 2019/12/06 06:14:40.822513 [INFO] consul: Created ACL anonymous token from configuration
TestRoleUpdateCommand_noMerge - 2019/12/06 06:14:40.822821 [INFO] consul: Created ACL anonymous token from configuration
TestRoleUpdateCommand_noMerge - 2019/12/06 06:14:40.822888 [DEBUG] acl: transitioning out of legacy ACL mode
TestRoleUpdateCommand_noMerge - 2019/12/06 06:14:40.823797 [INFO] serf: EventMemberUpdate: Node 1296f61f-dd0c-dd39-18ba-4cb42062344e
TestRoleUpdateCommand_noMerge - 2019/12/06 06:14:40.828796 [INFO] serf: EventMemberUpdate: Node 1296f61f-dd0c-dd39-18ba-4cb42062344e
TestRoleUpdateCommand_noMerge - 2019/12/06 06:14:40.830042 [INFO] serf: EventMemberUpdate: Node 1296f61f-dd0c-dd39-18ba-4cb42062344e.dc1
TestRoleUpdateCommand_noMerge - 2019/12/06 06:14:40.831063 [INFO] serf: EventMemberUpdate: Node 1296f61f-dd0c-dd39-18ba-4cb42062344e.dc1
TestRoleUpdateCommand - 2019/12/06 06:14:42.555664 [INFO] connect: initialized primary datacenter CA with provider "consul"
TestRoleUpdateCommand - 2019/12/06 06:14:42.556695 [DEBUG] consul: Skipping self join check for "Node c96a2e22-6d52-1270-aece-311b3c8f5185" since the cluster is too small
TestRoleUpdateCommand - 2019/12/06 06:14:42.556909 [INFO] consul: member 'Node c96a2e22-6d52-1270-aece-311b3c8f5185' joined, marking health alive
TestRoleUpdateCommand_noMerge - 2019/12/06 06:14:43.046450 [INFO] connect: initialized primary datacenter CA with provider "consul"
TestRoleUpdateCommand_noMerge - 2019/12/06 06:14:43.046937 [DEBUG] consul: Skipping self join check for "Node 1296f61f-dd0c-dd39-18ba-4cb42062344e" since the cluster is too small
TestRoleUpdateCommand_noMerge - 2019/12/06 06:14:43.047024 [INFO] consul: member 'Node 1296f61f-dd0c-dd39-18ba-4cb42062344e' joined, marking health alive
TestRoleUpdateCommand - 2019/12/06 06:14:43.195729 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
TestRoleUpdateCommand - 2019/12/06 06:14:43.198864 [DEBUG] consul: Skipping self join check for "Node c96a2e22-6d52-1270-aece-311b3c8f5185" since the cluster is too small
TestRoleUpdateCommand - 2019/12/06 06:14:43.199657 [DEBUG] consul: Skipping self join check for "Node c96a2e22-6d52-1270-aece-311b3c8f5185" since the cluster is too small
TestRoleUpdateCommand_noMerge - 2019/12/06 06:14:43.620817 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
TestRoleUpdateCommand_noMerge - 2019/12/06 06:14:43.623224 [DEBUG] consul: Skipping self join check for "Node 1296f61f-dd0c-dd39-18ba-4cb42062344e" since the cluster is too small
TestRoleUpdateCommand_noMerge - 2019/12/06 06:14:43.623669 [DEBUG] consul: Skipping self join check for "Node 1296f61f-dd0c-dd39-18ba-4cb42062344e" since the cluster is too small
TestRoleUpdateCommand - 2019/12/06 06:14:43.756117 [DEBUG] http: Request PUT /v1/acl/policy (533.282703ms) from=127.0.0.1:36732
TestRoleUpdateCommand_noMerge - 2019/12/06 06:14:44.214718 [DEBUG] http: Request PUT /v1/acl/policy (573.046626ms) from=127.0.0.1:46120
TestRoleUpdateCommand - 2019/12/06 06:14:44.398740 [DEBUG] http: Request PUT /v1/acl/policy (639.108492ms) from=127.0.0.1:36732
TestRoleUpdateCommand_noMerge - 2019/12/06 06:14:44.905525 [DEBUG] http: Request PUT /v1/acl/policy (688.246965ms) from=127.0.0.1:46120
TestRoleUpdateCommand - 2019/12/06 06:14:45.073096 [DEBUG] http: Request PUT /v1/acl/role (670.51522ms) from=127.0.0.1:36732
=== RUN   TestRoleUpdateCommand/update_a_role_that_does_not_exist
TestRoleUpdateCommand - 2019/12/06 06:14:45.079660 [DEBUG] http: Request GET /v1/acl/role/625adbfd-7f52-ee1e-ea3f-73508e56e4f7 (593.014µs) from=127.0.0.1:36736
=== RUN   TestRoleUpdateCommand/update_with_policy_by_name
TestRoleUpdateCommand - 2019/12/06 06:14:45.086863 [DEBUG] http: Request GET /v1/acl/role/3de4e7a7-1fc3-245b-1145-740671f0102c (874.687µs) from=127.0.0.1:36738
TestRoleUpdateCommand_noMerge - 2019/12/06 06:14:45.572583 [DEBUG] http: Request PUT /v1/acl/policy (663.705062ms) from=127.0.0.1:46120
=== RUN   TestRoleUpdateCommand_noMerge/update_a_role_that_does_not_exist
TestRoleUpdateCommand_noMerge - 2019/12/06 06:14:45.580185 [DEBUG] http: Request GET /v1/acl/role/fe28a842-9bc9-7235-ee1a-168830178e1f (594.347µs) from=127.0.0.1:46126
=== RUN   TestRoleUpdateCommand_noMerge/update_with_policy_by_name
TestRoleUpdateCommand - 2019/12/06 06:14:45.710064 [DEBUG] http: Request PUT /v1/acl/role/3de4e7a7-1fc3-245b-1145-740671f0102c (620.548061ms) from=127.0.0.1:36738
TestRoleUpdateCommand - 2019/12/06 06:14:45.720788 [DEBUG] http: Request GET /v1/acl/role/3de4e7a7-1fc3-245b-1145-740671f0102c (1.122026ms) from=127.0.0.1:36732
=== RUN   TestRoleUpdateCommand/update_with_policy_by_id
TestRoleUpdateCommand - 2019/12/06 06:14:45.731031 [DEBUG] http: Request GET /v1/acl/role/3de4e7a7-1fc3-245b-1145-740671f0102c (963.355µs) from=127.0.0.1:36742
TestRoleUpdateCommand_noMerge - 2019/12/06 06:14:46.088529 [DEBUG] http: Request PUT /v1/acl/role (506.519749ms) from=127.0.0.1:46120
TestRoleUpdateCommand_noMerge - 2019/12/06 06:14:46.095823 [DEBUG] http: Request GET /v1/acl/role/19a00823-f471-575b-c83c-9216f5492861 (967.356µs) from=127.0.0.1:46130
TestRoleUpdateCommand - 2019/12/06 06:14:46.422866 [DEBUG] http: Request PUT /v1/acl/role/3de4e7a7-1fc3-245b-1145-740671f0102c (689.138652ms) from=127.0.0.1:36742
TestRoleUpdateCommand - 2019/12/06 06:14:46.433951 [DEBUG] http: Request GET /v1/acl/role/3de4e7a7-1fc3-245b-1145-740671f0102c (1.323364ms) from=127.0.0.1:36732
=== RUN   TestRoleUpdateCommand/update_with_service_identity
TestRoleUpdateCommand - 2019/12/06 06:14:46.444624 [DEBUG] http: Request GET /v1/acl/role/3de4e7a7-1fc3-245b-1145-740671f0102c (1.376699ms) from=127.0.0.1:36746
TestRoleUpdateCommand_noMerge - 2019/12/06 06:14:46.855338 [DEBUG] http: Request PUT /v1/acl/role/19a00823-f471-575b-c83c-9216f5492861 (756.767554ms) from=127.0.0.1:46130
TestRoleUpdateCommand_noMerge - 2019/12/06 06:14:46.858888 [DEBUG] http: Request GET /v1/acl/role/19a00823-f471-575b-c83c-9216f5492861 (787.684µs) from=127.0.0.1:46120
=== RUN   TestRoleUpdateCommand_noMerge/update_with_policy_by_id
TestRoleUpdateCommand - 2019/12/06 06:14:47.065230 [DEBUG] http: Request PUT /v1/acl/role/3de4e7a7-1fc3-245b-1145-740671f0102c (616.145626ms) from=127.0.0.1:36746
TestRoleUpdateCommand - 2019/12/06 06:14:47.072397 [DEBUG] http: Request GET /v1/acl/role/3de4e7a7-1fc3-245b-1145-740671f0102c (1.097026ms) from=127.0.0.1:36732
=== RUN   TestRoleUpdateCommand/update_with_service_identity_scoped_to_2_DCs
TestRoleUpdateCommand - 2019/12/06 06:14:47.081844 [DEBUG] http: Request GET /v1/acl/role/3de4e7a7-1fc3-245b-1145-740671f0102c (1.010357ms) from=127.0.0.1:36748
TestRoleUpdateCommand_noMerge - 2019/12/06 06:14:47.280989 [DEBUG] http: Request PUT /v1/acl/role (419.170723ms) from=127.0.0.1:46120
TestRoleUpdateCommand_noMerge - 2019/12/06 06:14:47.288212 [DEBUG] http: Request GET /v1/acl/role/e1b6fd1e-b72d-8418-1ecc-41b973ae9718 (936.355µs) from=127.0.0.1:46136
TestRoleUpdateCommand - 2019/12/06 06:14:47.422918 [DEBUG] http: Request PUT /v1/acl/role/3de4e7a7-1fc3-245b-1145-740671f0102c (338.335514ms) from=127.0.0.1:36748
TestRoleUpdateCommand - 2019/12/06 06:14:47.428010 [DEBUG] http: Request GET /v1/acl/role/3de4e7a7-1fc3-245b-1145-740671f0102c (1.15036ms) from=127.0.0.1:36732
TestRoleUpdateCommand - 2019/12/06 06:14:47.430562 [INFO] agent: Requesting shutdown
TestRoleUpdateCommand - 2019/12/06 06:14:47.430661 [INFO] consul: shutting down server
TestRoleUpdateCommand - 2019/12/06 06:14:47.430715 [WARN] serf: Shutdown without a Leave
TestRoleUpdateCommand - 2019/12/06 06:14:47.629002 [WARN] serf: Shutdown without a Leave
TestRoleUpdateCommand_noMerge - 2019/12/06 06:14:47.731261 [DEBUG] http: Request PUT /v1/acl/role/e1b6fd1e-b72d-8418-1ecc-41b973ae9718 (440.158876ms) from=127.0.0.1:46136
TestRoleUpdateCommand_noMerge - 2019/12/06 06:14:47.735442 [DEBUG] http: Request GET /v1/acl/role/e1b6fd1e-b72d-8418-1ecc-41b973ae9718 (1.037024ms) from=127.0.0.1:46120
=== RUN   TestRoleUpdateCommand_noMerge/update_with_service_identity
TestRoleUpdateCommand - 2019/12/06 06:14:47.845702 [INFO] manager: shutting down
TestRoleUpdateCommand - 2019/12/06 06:14:47.846492 [INFO] agent: consul server down
TestRoleUpdateCommand - 2019/12/06 06:14:47.846617 [INFO] agent: shutdown complete
TestRoleUpdateCommand - 2019/12/06 06:14:47.846684 [INFO] agent: Stopping DNS server 127.0.0.1:10007 (tcp)
TestRoleUpdateCommand - 2019/12/06 06:14:47.846836 [INFO] agent: Stopping DNS server 127.0.0.1:10007 (udp)
TestRoleUpdateCommand - 2019/12/06 06:14:47.846998 [INFO] agent: Stopping HTTP server 127.0.0.1:10008 (tcp)
TestRoleUpdateCommand - 2019/12/06 06:14:47.848354 [INFO] agent: Waiting for endpoints to shut down
TestRoleUpdateCommand - 2019/12/06 06:14:47.848525 [INFO] agent: Endpoints down
--- PASS: TestRoleUpdateCommand (12.52s)
    --- PASS: TestRoleUpdateCommand/update_a_role_that_does_not_exist (0.01s)
    --- PASS: TestRoleUpdateCommand/update_with_policy_by_name (0.64s)
    --- PASS: TestRoleUpdateCommand/update_with_policy_by_id (0.71s)
    --- PASS: TestRoleUpdateCommand/update_with_service_identity (0.64s)
    --- PASS: TestRoleUpdateCommand/update_with_service_identity_scoped_to_2_DCs (0.35s)
TestRoleUpdateCommand_noMerge - 2019/12/06 06:14:47.997286 [DEBUG] http: Request PUT /v1/acl/role (257.40997ms) from=127.0.0.1:46120
TestRoleUpdateCommand_noMerge - 2019/12/06 06:14:48.005284 [DEBUG] http: Request GET /v1/acl/role/8b800f88-3d42-8097-2ee3-b117285f7952 (1.253696ms) from=127.0.0.1:46138
TestRoleUpdateCommand_noMerge - 2019/12/06 06:14:48.155711 [DEBUG] http: Request PUT /v1/acl/role/8b800f88-3d42-8097-2ee3-b117285f7952 (147.369419ms) from=127.0.0.1:46138
TestRoleUpdateCommand_noMerge - 2019/12/06 06:14:48.159974 [DEBUG] http: Request GET /v1/acl/role/8b800f88-3d42-8097-2ee3-b117285f7952 (1.218362ms) from=127.0.0.1:46120
=== RUN   TestRoleUpdateCommand_noMerge/update_with_service_identity_scoped_to_2_DCs
TestRoleUpdateCommand_noMerge - 2019/12/06 06:14:48.363711 [DEBUG] http: Request PUT /v1/acl/role (187.054339ms) from=127.0.0.1:46120
TestRoleUpdateCommand_noMerge - 2019/12/06 06:14:48.374681 [DEBUG] http: Request GET /v1/acl/role/5a199f1a-77f1-522e-da10-304a6bf3b29f (1.563369ms) from=127.0.0.1:46140
TestRoleUpdateCommand_noMerge - 2019/12/06 06:14:48.697033 [DEBUG] http: Request PUT /v1/acl/role/5a199f1a-77f1-522e-da10-304a6bf3b29f (318.86173ms) from=127.0.0.1:46140
TestRoleUpdateCommand_noMerge - 2019/12/06 06:14:48.700504 [DEBUG] http: Request GET /v1/acl/role/5a199f1a-77f1-522e-da10-304a6bf3b29f (803.352µs) from=127.0.0.1:46120
TestRoleUpdateCommand_noMerge - 2019/12/06 06:14:48.702415 [INFO] agent: Requesting shutdown
TestRoleUpdateCommand_noMerge - 2019/12/06 06:14:48.702494 [INFO] consul: shutting down server
TestRoleUpdateCommand_noMerge - 2019/12/06 06:14:48.702544 [WARN] serf: Shutdown without a Leave
TestRoleUpdateCommand_noMerge - 2019/12/06 06:14:48.845458 [WARN] serf: Shutdown without a Leave
TestRoleUpdateCommand_noMerge - 2019/12/06 06:14:48.978975 [INFO] manager: shutting down
TestRoleUpdateCommand_noMerge - 2019/12/06 06:14:48.979560 [INFO] agent: consul server down
TestRoleUpdateCommand_noMerge - 2019/12/06 06:14:48.979617 [INFO] agent: shutdown complete
TestRoleUpdateCommand_noMerge - 2019/12/06 06:14:48.979680 [INFO] agent: Stopping DNS server 127.0.0.1:10001 (tcp)
TestRoleUpdateCommand_noMerge - 2019/12/06 06:14:48.979824 [INFO] agent: Stopping DNS server 127.0.0.1:10001 (udp)
TestRoleUpdateCommand_noMerge - 2019/12/06 06:14:48.979978 [INFO] agent: Stopping HTTP server 127.0.0.1:10002 (tcp)
TestRoleUpdateCommand_noMerge - 2019/12/06 06:14:48.981328 [INFO] agent: Waiting for endpoints to shut down
TestRoleUpdateCommand_noMerge - 2019/12/06 06:14:48.981520 [INFO] agent: Endpoints down
--- PASS: TestRoleUpdateCommand_noMerge (13.65s)
    --- PASS: TestRoleUpdateCommand_noMerge/update_a_role_that_does_not_exist (0.01s)
    --- PASS: TestRoleUpdateCommand_noMerge/update_with_policy_by_name (1.28s)
    --- PASS: TestRoleUpdateCommand_noMerge/update_with_policy_by_id (0.88s)
    --- PASS: TestRoleUpdateCommand_noMerge/update_with_service_identity (0.44s)
    --- PASS: TestRoleUpdateCommand_noMerge/update_with_service_identity_scoped_to_2_DCs (0.53s)
PASS
ok  	github.com/hashicorp/consul/command/acl/role/update	14.269s
=== RUN   TestRulesTranslateCommand_noTabs
=== PAUSE TestRulesTranslateCommand_noTabs
=== RUN   TestRulesTranslateCommand
=== PAUSE TestRulesTranslateCommand
=== CONT  TestRulesTranslateCommand_noTabs
=== CONT  TestRulesTranslateCommand
--- PASS: TestRulesTranslateCommand_noTabs (0.00s)
WARNING: bootstrap = true: do not enable unless necessary
TestRulesTranslateCommand - 2019/12/06 06:18:41.453493 [WARN] agent: Node name "Node ea611b6e-1c4b-e6a4-fc63-81971bff3c36" will not be discoverable via DNS due to invalid characters. Valid characters include all alpha-numerics and dashes.
TestRulesTranslateCommand - 2019/12/06 06:18:41.454141 [DEBUG] tlsutil: Update with version 1
TestRulesTranslateCommand - 2019/12/06 06:18:41.486679 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
2019/12/06 06:18:43 [INFO]  raft: Initial configuration (index=1): [{Suffrage:Voter ID:ea611b6e-1c4b-e6a4-fc63-81971bff3c36 Address:127.0.0.1:35506}]
2019/12/06 06:18:43 [INFO]  raft: Node at 127.0.0.1:35506 [Follower] entering Follower state (Leader: "")
TestRulesTranslateCommand - 2019/12/06 06:18:43.220609 [INFO] serf: EventMemberJoin: Node ea611b6e-1c4b-e6a4-fc63-81971bff3c36.dc1 127.0.0.1
TestRulesTranslateCommand - 2019/12/06 06:18:43.223796 [INFO] serf: EventMemberJoin: Node ea611b6e-1c4b-e6a4-fc63-81971bff3c36 127.0.0.1
TestRulesTranslateCommand - 2019/12/06 06:18:43.224849 [INFO] consul: Adding LAN server Node ea611b6e-1c4b-e6a4-fc63-81971bff3c36 (Addr: tcp/127.0.0.1:35506) (DC: dc1)
TestRulesTranslateCommand - 2019/12/06 06:18:43.225199 [INFO] consul: Handled member-join event for server "Node ea611b6e-1c4b-e6a4-fc63-81971bff3c36.dc1" in area "wan"
TestRulesTranslateCommand - 2019/12/06 06:18:43.225468 [INFO] agent: Started DNS server 127.0.0.1:35501 (udp)
TestRulesTranslateCommand - 2019/12/06 06:18:43.225683 [INFO] agent: Started DNS server 127.0.0.1:35501 (tcp)
TestRulesTranslateCommand - 2019/12/06 06:18:43.228062 [INFO] agent: Started HTTP server on 127.0.0.1:35502 (tcp)
TestRulesTranslateCommand - 2019/12/06 06:18:43.228170 [INFO] agent: started state syncer
2019/12/06 06:18:43 [WARN]  raft: Heartbeat timeout from "" reached, starting election
2019/12/06 06:18:43 [INFO]  raft: Node at 127.0.0.1:35506 [Candidate] entering Candidate state in term 2
2019/12/06 06:18:44 [INFO]  raft: Election won. Tally: 1
2019/12/06 06:18:44 [INFO]  raft: Node at 127.0.0.1:35506 [Leader] entering Leader state
TestRulesTranslateCommand - 2019/12/06 06:18:44.199822 [INFO] consul: cluster leadership acquired
TestRulesTranslateCommand - 2019/12/06 06:18:44.200365 [INFO] consul: New leader elected: Node ea611b6e-1c4b-e6a4-fc63-81971bff3c36
TestRulesTranslateCommand - 2019/12/06 06:18:44.363319 [ERR] agent: failed to sync remote state: ACL not found
TestRulesTranslateCommand - 2019/12/06 06:18:44.467118 [ERR] agent: failed to sync remote state: ACL not found
TestRulesTranslateCommand - 2019/12/06 06:18:44.776615 [INFO] acl: initializing acls
TestRulesTranslateCommand - 2019/12/06 06:18:45.494335 [INFO] consul: Created ACL 'global-management' policy
TestRulesTranslateCommand - 2019/12/06 06:18:45.494439 [WARN] consul: Configuring a non-UUID master token is deprecated
TestRulesTranslateCommand - 2019/12/06 06:18:45.495036 [INFO] acl: initializing acls
TestRulesTranslateCommand - 2019/12/06 06:18:45.495368 [WARN] consul: Configuring a non-UUID master token is deprecated
TestRulesTranslateCommand - 2019/12/06 06:18:46.133860 [INFO] consul: Bootstrapped ACL master token from configuration
TestRulesTranslateCommand - 2019/12/06 06:18:46.134073 [INFO] consul: Bootstrapped ACL master token from configuration
TestRulesTranslateCommand - 2019/12/06 06:18:46.501239 [INFO] consul: Created ACL anonymous token from configuration
TestRulesTranslateCommand - 2019/12/06 06:18:46.501895 [DEBUG] acl: transitioning out of legacy ACL mode
TestRulesTranslateCommand - 2019/12/06 06:18:46.501939 [INFO] consul: Created ACL anonymous token from configuration
TestRulesTranslateCommand - 2019/12/06 06:18:46.503659 [INFO] serf: EventMemberUpdate: Node ea611b6e-1c4b-e6a4-fc63-81971bff3c36
TestRulesTranslateCommand - 2019/12/06 06:18:46.504380 [INFO] serf: EventMemberUpdate: Node ea611b6e-1c4b-e6a4-fc63-81971bff3c36.dc1
TestRulesTranslateCommand - 2019/12/06 06:18:46.505992 [INFO] serf: EventMemberUpdate: Node ea611b6e-1c4b-e6a4-fc63-81971bff3c36
TestRulesTranslateCommand - 2019/12/06 06:18:46.506956 [INFO] serf: EventMemberUpdate: Node ea611b6e-1c4b-e6a4-fc63-81971bff3c36.dc1
TestRulesTranslateCommand - 2019/12/06 06:18:48.600073 [INFO] connect: initialized primary datacenter CA with provider "consul"
TestRulesTranslateCommand - 2019/12/06 06:18:48.601047 [DEBUG] consul: Skipping self join check for "Node ea611b6e-1c4b-e6a4-fc63-81971bff3c36" since the cluster is too small
TestRulesTranslateCommand - 2019/12/06 06:18:48.601185 [INFO] consul: member 'Node ea611b6e-1c4b-e6a4-fc63-81971bff3c36' joined, marking health alive
TestRulesTranslateCommand - 2019/12/06 06:18:49.041556 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
TestRulesTranslateCommand - 2019/12/06 06:18:49.043835 [DEBUG] consul: Skipping self join check for "Node ea611b6e-1c4b-e6a4-fc63-81971bff3c36" since the cluster is too small
TestRulesTranslateCommand - 2019/12/06 06:18:49.044472 [DEBUG] consul: Skipping self join check for "Node ea611b6e-1c4b-e6a4-fc63-81971bff3c36" since the cluster is too small
=== RUN   TestRulesTranslateCommand/file
=== RUN   TestRulesTranslateCommand/stdin
=== RUN   TestRulesTranslateCommand/arg
=== RUN   TestRulesTranslateCommand/exclusive-options
TestRulesTranslateCommand - 2019/12/06 06:18:49.060716 [INFO] agent: Requesting shutdown
TestRulesTranslateCommand - 2019/12/06 06:18:49.060790 [INFO] consul: shutting down server
TestRulesTranslateCommand - 2019/12/06 06:18:49.060832 [WARN] serf: Shutdown without a Leave
TestRulesTranslateCommand - 2019/12/06 06:18:49.382792 [WARN] serf: Shutdown without a Leave
TestRulesTranslateCommand - 2019/12/06 06:18:49.566154 [INFO] manager: shutting down
TestRulesTranslateCommand - 2019/12/06 06:18:49.566846 [INFO] agent: consul server down
TestRulesTranslateCommand - 2019/12/06 06:18:49.566904 [INFO] agent: shutdown complete
TestRulesTranslateCommand - 2019/12/06 06:18:49.566961 [INFO] agent: Stopping DNS server 127.0.0.1:35501 (tcp)
TestRulesTranslateCommand - 2019/12/06 06:18:49.567117 [INFO] agent: Stopping DNS server 127.0.0.1:35501 (udp)
TestRulesTranslateCommand - 2019/12/06 06:18:49.567290 [INFO] agent: Stopping HTTP server 127.0.0.1:35502 (tcp)
TestRulesTranslateCommand - 2019/12/06 06:18:49.567515 [INFO] agent: Waiting for endpoints to shut down
TestRulesTranslateCommand - 2019/12/06 06:18:49.567594 [INFO] agent: Endpoints down
--- PASS: TestRulesTranslateCommand (8.99s)
    --- PASS: TestRulesTranslateCommand/file (0.00s)
    --- PASS: TestRulesTranslateCommand/stdin (0.00s)
    --- PASS: TestRulesTranslateCommand/arg (0.00s)
    --- PASS: TestRulesTranslateCommand/exclusive-options (0.00s)
PASS
ok  	github.com/hashicorp/consul/command/acl/rules	9.538s
?   	github.com/hashicorp/consul/command/acl/token	[no test files]
=== RUN   TestTokenCloneCommand_noTabs
=== PAUSE TestTokenCloneCommand_noTabs
=== RUN   TestTokenCloneCommand
=== PAUSE TestTokenCloneCommand
=== CONT  TestTokenCloneCommand_noTabs
=== CONT  TestTokenCloneCommand
--- PASS: TestTokenCloneCommand_noTabs (0.00s)
WARNING: bootstrap = true: do not enable unless necessary
TestTokenCloneCommand - 2019/12/06 06:18:42.768177 [WARN] agent: Node name "Node c452786e-a334-1c7c-297a-e6745a6a2228" will not be discoverable via DNS due to invalid characters. Valid characters include all alpha-numerics and dashes.
TestTokenCloneCommand - 2019/12/06 06:18:42.768736 [DEBUG] tlsutil: Update with version 1
TestTokenCloneCommand - 2019/12/06 06:18:42.783825 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
2019/12/06 06:18:44 [INFO]  raft: Initial configuration (index=1): [{Suffrage:Voter ID:c452786e-a334-1c7c-297a-e6745a6a2228 Address:127.0.0.1:41506}]
2019/12/06 06:18:44 [INFO]  raft: Node at 127.0.0.1:41506 [Follower] entering Follower state (Leader: "")
2019/12/06 06:18:44 [WARN]  raft: Heartbeat timeout from "" reached, starting election
2019/12/06 06:18:44 [INFO]  raft: Node at 127.0.0.1:41506 [Candidate] entering Candidate state in term 2
TestTokenCloneCommand - 2019/12/06 06:18:45.007170 [INFO] serf: EventMemberJoin: Node c452786e-a334-1c7c-297a-e6745a6a2228.dc1 127.0.0.1
TestTokenCloneCommand - 2019/12/06 06:18:45.010837 [INFO] serf: EventMemberJoin: Node c452786e-a334-1c7c-297a-e6745a6a2228 127.0.0.1
TestTokenCloneCommand - 2019/12/06 06:18:45.011964 [INFO] consul: Adding LAN server Node c452786e-a334-1c7c-297a-e6745a6a2228 (Addr: tcp/127.0.0.1:41506) (DC: dc1)
TestTokenCloneCommand - 2019/12/06 06:18:45.012688 [INFO] consul: Handled member-join event for server "Node c452786e-a334-1c7c-297a-e6745a6a2228.dc1" in area "wan"
TestTokenCloneCommand - 2019/12/06 06:18:45.013858 [INFO] agent: Started DNS server 127.0.0.1:41501 (tcp)
TestTokenCloneCommand - 2019/12/06 06:18:45.014507 [INFO] agent: Started DNS server 127.0.0.1:41501 (udp)
TestTokenCloneCommand - 2019/12/06 06:18:45.017232 [INFO] agent: Started HTTP server on 127.0.0.1:41502 (tcp)
TestTokenCloneCommand - 2019/12/06 06:18:45.017416 [INFO] agent: started state syncer
2019/12/06 06:18:46 [INFO]  raft: Election won. Tally: 1
2019/12/06 06:18:46 [INFO]  raft: Node at 127.0.0.1:41506 [Leader] entering Leader state
TestTokenCloneCommand - 2019/12/06 06:18:46.031168 [INFO] consul: cluster leadership acquired
TestTokenCloneCommand - 2019/12/06 06:18:46.031719 [INFO] consul: New leader elected: Node c452786e-a334-1c7c-297a-e6745a6a2228
TestTokenCloneCommand - 2019/12/06 06:18:46.138264 [ERR] agent: failed to sync remote state: ACL not found
TestTokenCloneCommand - 2019/12/06 06:18:46.512391 [ERR] agent: failed to sync remote state: ACL not found
TestTokenCloneCommand - 2019/12/06 06:18:46.563930 [INFO] acl: initializing acls
TestTokenCloneCommand - 2019/12/06 06:18:47.374771 [INFO] acl: initializing acls
TestTokenCloneCommand - 2019/12/06 06:18:47.375339 [INFO] consul: Created ACL 'global-management' policy
TestTokenCloneCommand - 2019/12/06 06:18:47.375393 [WARN] consul: Configuring a non-UUID master token is deprecated
TestTokenCloneCommand - 2019/12/06 06:18:47.641975 [INFO] consul: Created ACL 'global-management' policy
TestTokenCloneCommand - 2019/12/06 06:18:47.642064 [WARN] consul: Configuring a non-UUID master token is deprecated
TestTokenCloneCommand - 2019/12/06 06:18:47.642816 [INFO] consul: Bootstrapped ACL master token from configuration
TestTokenCloneCommand - 2019/12/06 06:18:48.601475 [INFO] consul: Bootstrapped ACL master token from configuration
TestTokenCloneCommand - 2019/12/06 06:18:48.602100 [INFO] consul: Created ACL anonymous token from configuration
TestTokenCloneCommand - 2019/12/06 06:18:48.602209 [DEBUG] acl: transitioning out of legacy ACL mode
TestTokenCloneCommand - 2019/12/06 06:18:48.603251 [INFO] serf: EventMemberUpdate: Node c452786e-a334-1c7c-297a-e6745a6a2228
TestTokenCloneCommand - 2019/12/06 06:18:48.603952 [INFO] serf: EventMemberUpdate: Node c452786e-a334-1c7c-297a-e6745a6a2228.dc1
TestTokenCloneCommand - 2019/12/06 06:18:49.049571 [INFO] consul: Created ACL anonymous token from configuration
TestTokenCloneCommand - 2019/12/06 06:18:49.050835 [INFO] serf: EventMemberUpdate: Node c452786e-a334-1c7c-297a-e6745a6a2228
TestTokenCloneCommand - 2019/12/06 06:18:49.051804 [INFO] serf: EventMemberUpdate: Node c452786e-a334-1c7c-297a-e6745a6a2228.dc1
TestTokenCloneCommand - 2019/12/06 06:18:51.725205 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
TestTokenCloneCommand - 2019/12/06 06:18:51.725469 [INFO] connect: initialized primary datacenter CA with provider "consul"
TestTokenCloneCommand - 2019/12/06 06:18:51.725987 [DEBUG] consul: Skipping self join check for "Node c452786e-a334-1c7c-297a-e6745a6a2228" since the cluster is too small
TestTokenCloneCommand - 2019/12/06 06:18:51.726095 [INFO] consul: member 'Node c452786e-a334-1c7c-297a-e6745a6a2228' joined, marking health alive
TestTokenCloneCommand - 2019/12/06 06:18:52.519168 [DEBUG] consul: Skipping self join check for "Node c452786e-a334-1c7c-297a-e6745a6a2228" since the cluster is too small
TestTokenCloneCommand - 2019/12/06 06:18:52.519879 [DEBUG] consul: Skipping self join check for "Node c452786e-a334-1c7c-297a-e6745a6a2228" since the cluster is too small
TestTokenCloneCommand - 2019/12/06 06:18:54.060068 [DEBUG] http: Request PUT /v1/acl/policy (1.520121943s) from=127.0.0.1:36292
TestTokenCloneCommand - 2019/12/06 06:18:54.602276 [DEBUG] http: Request PUT /v1/acl/token (519.660393ms) from=127.0.0.1:36292
=== RUN   TestTokenCloneCommand/Description
TestTokenCloneCommand - 2019/12/06 06:18:55.018997 [DEBUG] http: Request PUT /v1/acl/token/0c6416b7-0ce6-935c-5dbb-d88d90b03333/clone (409.388834ms) from=127.0.0.1:36294
TestTokenCloneCommand - 2019/12/06 06:18:55.027835 [DEBUG] http: Request GET /v1/acl/token/d8bd065b-4200-21d9-697e-dca9b3546ea9 (1.168027ms) from=127.0.0.1:36292
=== RUN   TestTokenCloneCommand/Without_Description
TestTokenCloneCommand - 2019/12/06 06:18:55.584111 [DEBUG] http: Request PUT /v1/acl/token/0c6416b7-0ce6-935c-5dbb-d88d90b03333/clone (549.585754ms) from=127.0.0.1:36296
TestTokenCloneCommand - 2019/12/06 06:18:55.595969 [DEBUG] http: Request GET /v1/acl/token/b97ab730-b176-6268-7cb6-0c9840f5981e (2.192718ms) from=127.0.0.1:36292
TestTokenCloneCommand - 2019/12/06 06:18:55.598822 [INFO] agent: Requesting shutdown
TestTokenCloneCommand - 2019/12/06 06:18:55.598952 [INFO] consul: shutting down server
TestTokenCloneCommand - 2019/12/06 06:18:55.599043 [WARN] serf: Shutdown without a Leave
TestTokenCloneCommand - 2019/12/06 06:18:55.757791 [WARN] serf: Shutdown without a Leave
TestTokenCloneCommand - 2019/12/06 06:18:55.866164 [INFO] manager: shutting down
TestTokenCloneCommand - 2019/12/06 06:18:55.866821 [INFO] agent: consul server down
TestTokenCloneCommand - 2019/12/06 06:18:55.866877 [INFO] agent: shutdown complete
TestTokenCloneCommand - 2019/12/06 06:18:55.866929 [INFO] agent: Stopping DNS server 127.0.0.1:41501 (tcp)
TestTokenCloneCommand - 2019/12/06 06:18:55.867059 [INFO] agent: Stopping DNS server 127.0.0.1:41501 (udp)
TestTokenCloneCommand - 2019/12/06 06:18:55.867196 [INFO] agent: Stopping HTTP server 127.0.0.1:41502 (tcp)
TestTokenCloneCommand - 2019/12/06 06:18:55.867935 [INFO] agent: Waiting for endpoints to shut down
TestTokenCloneCommand - 2019/12/06 06:18:55.868081 [INFO] agent: Endpoints down
--- PASS: TestTokenCloneCommand (13.18s)
    --- PASS: TestTokenCloneCommand/Description (0.42s)
    --- PASS: TestTokenCloneCommand/Without_Description (0.57s)
PASS
ok  	github.com/hashicorp/consul/command/acl/token/clone	13.504s
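
The TestTokenCloneCommand run above drives the ACL token clone flow end to end: PUT /v1/acl/policy, PUT /v1/acl/token, PUT /v1/acl/token/<accessor>/clone, then a GET on the resulting token. A minimal sketch of the same flow through Consul's Go API client (github.com/hashicorp/consul/api), assuming a local agent and placeholder token/accessor values, might look like:

    package main

    import (
        "fmt"
        "log"

        "github.com/hashicorp/consul/api"
    )

    func main() {
        cfg := api.DefaultConfig()       // defaults to 127.0.0.1:8500
        cfg.Token = "<management-token>" // placeholder ACL token
        client, err := api.NewClient(cfg)
        if err != nil {
            log.Fatal(err)
        }

        // Clone an existing token, optionally overriding its description
        // (the "Description" / "Without_Description" subtests above).
        clone, _, err := client.ACL().TokenClone("<source-accessor-id>", "cloned for CI", nil)
        if err != nil {
            log.Fatal(err)
        }

        // Read the clone back, as the test does with its follow-up GET.
        got, _, err := client.ACL().TokenRead(clone.AccessorID, nil)
        if err != nil {
            log.Fatal(err)
        }
        fmt.Println(got.AccessorID, got.Description)
    }
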
=== RUN   TestTokenCreateCommand_noTabs
=== PAUSE TestTokenCreateCommand_noTabs
=== RUN   TestTokenCreateCommand
=== PAUSE TestTokenCreateCommand
=== CONT  TestTokenCreateCommand_noTabs
=== CONT  TestTokenCreateCommand
--- PASS: TestTokenCreateCommand_noTabs (0.00s)
WARNING: bootstrap = true: do not enable unless necessary
TestTokenCreateCommand - 2019/12/06 06:18:41.454619 [WARN] agent: Node name "Node d53b5f14-60c1-8838-5738-7e055464f43c" will not be discoverable via DNS due to invalid characters. Valid characters include all alpha-numerics and dashes.
TestTokenCreateCommand - 2019/12/06 06:18:41.458498 [DEBUG] tlsutil: Update with version 1
TestTokenCreateCommand - 2019/12/06 06:18:41.482706 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
2019/12/06 06:18:43 [INFO]  raft: Initial configuration (index=1): [{Suffrage:Voter ID:d53b5f14-60c1-8838-5738-7e055464f43c Address:127.0.0.1:37006}]
2019/12/06 06:18:43 [INFO]  raft: Node at 127.0.0.1:37006 [Follower] entering Follower state (Leader: "")
TestTokenCreateCommand - 2019/12/06 06:18:43.356036 [INFO] serf: EventMemberJoin: Node d53b5f14-60c1-8838-5738-7e055464f43c.dc1 127.0.0.1
TestTokenCreateCommand - 2019/12/06 06:18:43.361388 [INFO] serf: EventMemberJoin: Node d53b5f14-60c1-8838-5738-7e055464f43c 127.0.0.1
TestTokenCreateCommand - 2019/12/06 06:18:43.362656 [INFO] consul: Adding LAN server Node d53b5f14-60c1-8838-5738-7e055464f43c (Addr: tcp/127.0.0.1:37006) (DC: dc1)
TestTokenCreateCommand - 2019/12/06 06:18:43.363147 [INFO] consul: Handled member-join event for server "Node d53b5f14-60c1-8838-5738-7e055464f43c.dc1" in area "wan"
TestTokenCreateCommand - 2019/12/06 06:18:43.363802 [INFO] agent: Started DNS server 127.0.0.1:37001 (tcp)
TestTokenCreateCommand - 2019/12/06 06:18:43.363905 [INFO] agent: Started DNS server 127.0.0.1:37001 (udp)
TestTokenCreateCommand - 2019/12/06 06:18:43.366402 [INFO] agent: Started HTTP server on 127.0.0.1:37002 (tcp)
TestTokenCreateCommand - 2019/12/06 06:18:43.366539 [INFO] agent: started state syncer
2019/12/06 06:18:43 [WARN]  raft: Heartbeat timeout from "" reached, starting election
2019/12/06 06:18:43 [INFO]  raft: Node at 127.0.0.1:37006 [Candidate] entering Candidate state in term 2
2019/12/06 06:18:44 [INFO]  raft: Election won. Tally: 1
2019/12/06 06:18:44 [INFO]  raft: Node at 127.0.0.1:37006 [Leader] entering Leader state
TestTokenCreateCommand - 2019/12/06 06:18:44.433266 [INFO] consul: cluster leadership acquired
TestTokenCreateCommand - 2019/12/06 06:18:44.433813 [INFO] consul: New leader elected: Node d53b5f14-60c1-8838-5738-7e055464f43c
TestTokenCreateCommand - 2019/12/06 06:18:44.696934 [ERR] agent: failed to sync remote state: ACL not found
TestTokenCreateCommand - 2019/12/06 06:18:44.913975 [INFO] acl: initializing acls
TestTokenCreateCommand - 2019/12/06 06:18:45.758362 [INFO] acl: initializing acls
TestTokenCreateCommand - 2019/12/06 06:18:45.759411 [INFO] consul: Created ACL 'global-management' policy
TestTokenCreateCommand - 2019/12/06 06:18:45.759684 [WARN] consul: Configuring a non-UUID master token is deprecated
TestTokenCreateCommand - 2019/12/06 06:18:46.027599 [INFO] consul: Created ACL 'global-management' policy
TestTokenCreateCommand - 2019/12/06 06:18:46.029583 [WARN] consul: Configuring a non-UUID master token is deprecated
TestTokenCreateCommand - 2019/12/06 06:18:46.028523 [INFO] consul: Bootstrapped ACL master token from configuration
TestTokenCreateCommand - 2019/12/06 06:18:46.655458 [ERR] agent: failed to sync remote state: ACL not found
TestTokenCreateCommand - 2019/12/06 06:18:46.659046 [INFO] consul: Created ACL anonymous token from configuration
TestTokenCreateCommand - 2019/12/06 06:18:46.660250 [INFO] serf: EventMemberUpdate: Node d53b5f14-60c1-8838-5738-7e055464f43c
TestTokenCreateCommand - 2019/12/06 06:18:46.661019 [INFO] serf: EventMemberUpdate: Node d53b5f14-60c1-8838-5738-7e055464f43c.dc1
TestTokenCreateCommand - 2019/12/06 06:18:47.159116 [INFO] consul: Created ACL anonymous token from configuration
TestTokenCreateCommand - 2019/12/06 06:18:47.159257 [DEBUG] acl: transitioning out of legacy ACL mode
TestTokenCreateCommand - 2019/12/06 06:18:47.160179 [INFO] serf: EventMemberUpdate: Node d53b5f14-60c1-8838-5738-7e055464f43c
TestTokenCreateCommand - 2019/12/06 06:18:47.160937 [INFO] serf: EventMemberUpdate: Node d53b5f14-60c1-8838-5738-7e055464f43c.dc1
TestTokenCreateCommand - 2019/12/06 06:18:49.191686 [INFO] connect: initialized primary datacenter CA with provider "consul"
TestTokenCreateCommand - 2019/12/06 06:18:49.192261 [DEBUG] consul: Skipping self join check for "Node d53b5f14-60c1-8838-5738-7e055464f43c" since the cluster is too small
TestTokenCreateCommand - 2019/12/06 06:18:49.192427 [INFO] consul: member 'Node d53b5f14-60c1-8838-5738-7e055464f43c' joined, marking health alive
TestTokenCreateCommand - 2019/12/06 06:18:49.570040 [DEBUG] consul: Skipping self join check for "Node d53b5f14-60c1-8838-5738-7e055464f43c" since the cluster is too small
TestTokenCreateCommand - 2019/12/06 06:18:49.570658 [DEBUG] consul: Skipping self join check for "Node d53b5f14-60c1-8838-5738-7e055464f43c" since the cluster is too small
TestTokenCreateCommand - 2019/12/06 06:18:50.311657 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
TestTokenCreateCommand - 2019/12/06 06:18:50.313300 [DEBUG] http: Request PUT /v1/acl/policy (732.267993ms) from=127.0.0.1:33458
TestTokenCreateCommand - 2019/12/06 06:18:50.852557 [DEBUG] http: Request PUT /v1/acl/token (524.747177ms) from=127.0.0.1:33460
TestTokenCreateCommand - 2019/12/06 06:18:51.502036 [DEBUG] http: Request PUT /v1/acl/token (644.099947ms) from=127.0.0.1:33462
TestTokenCreateCommand - 2019/12/06 06:18:52.385885 [DEBUG] http: Request PUT /v1/acl/token (875.859992ms) from=127.0.0.1:33464
TestTokenCreateCommand - 2019/12/06 06:18:52.391678 [DEBUG] http: Request GET /v1/acl/token/3d852bb8-5153-4388-a3ca-8ca78661889f (1.777375ms) from=127.0.0.1:33466
TestTokenCreateCommand - 2019/12/06 06:18:52.393562 [INFO] agent: Requesting shutdown
TestTokenCreateCommand - 2019/12/06 06:18:52.393649 [INFO] consul: shutting down server
TestTokenCreateCommand - 2019/12/06 06:18:52.393702 [WARN] serf: Shutdown without a Leave
TestTokenCreateCommand - 2019/12/06 06:18:52.649390 [WARN] serf: Shutdown without a Leave
TestTokenCreateCommand - 2019/12/06 06:18:53.726640 [INFO] manager: shutting down
TestTokenCreateCommand - 2019/12/06 06:18:53.727836 [INFO] agent: consul server down
TestTokenCreateCommand - 2019/12/06 06:18:53.727902 [INFO] agent: shutdown complete
TestTokenCreateCommand - 2019/12/06 06:18:53.727966 [INFO] agent: Stopping DNS server 127.0.0.1:37001 (tcp)
TestTokenCreateCommand - 2019/12/06 06:18:53.728109 [INFO] agent: Stopping DNS server 127.0.0.1:37001 (udp)
TestTokenCreateCommand - 2019/12/06 06:18:53.728286 [INFO] agent: Stopping HTTP server 127.0.0.1:37002 (tcp)
TestTokenCreateCommand - 2019/12/06 06:18:53.729727 [INFO] agent: Waiting for endpoints to shut down
TestTokenCreateCommand - 2019/12/06 06:18:53.729981 [INFO] agent: Endpoints down
--- PASS: TestTokenCreateCommand (13.50s)
PASS
ok  	github.com/hashicorp/consul/command/acl/token/create	13.873s
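
TestTokenCreateCommand exercises token creation against a freshly bootstrapped ACL system: one PUT /v1/acl/policy followed by several PUT /v1/acl/token requests and a final read-back. A comparable sequence through the Go API client, with the policy name, rules and management token as placeholder assumptions, could be sketched as:

    package main

    import (
        "fmt"
        "log"

        "github.com/hashicorp/consul/api"
    )

    func main() {
        cfg := api.DefaultConfig()
        cfg.Token = "<management-token>" // placeholder
        client, err := api.NewClient(cfg)
        if err != nil {
            log.Fatal(err)
        }

        // Create a policy first (the PUT /v1/acl/policy seen above)...
        policy, _, err := client.ACL().PolicyCreate(&api.ACLPolicy{
            Name:  "example-policy", // placeholder name
            Rules: `node_prefix "" { policy = "read" }`,
        }, nil)
        if err != nil {
            log.Fatal(err)
        }

        // ...then create a token that references it (PUT /v1/acl/token).
        token, _, err := client.ACL().TokenCreate(&api.ACLToken{
            Description: "example token",
            Policies:    []*api.ACLTokenPolicyLink{{ID: policy.ID}},
        }, nil)
        if err != nil {
            log.Fatal(err)
        }
        fmt.Println(token.AccessorID, token.SecretID)
    }
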
=== RUN   TestTokenDeleteCommand_noTabs
=== PAUSE TestTokenDeleteCommand_noTabs
=== RUN   TestTokenDeleteCommand
=== PAUSE TestTokenDeleteCommand
=== CONT  TestTokenDeleteCommand_noTabs
=== CONT  TestTokenDeleteCommand
--- PASS: TestTokenDeleteCommand_noTabs (0.01s)
WARNING: bootstrap = true: do not enable unless necessary
TestTokenDeleteCommand - 2019/12/06 06:18:38.822732 [WARN] agent: Node name "Node 4e753dd4-8e88-9733-9454-c5ef2aaceff2" will not be discoverable via DNS due to invalid characters. Valid characters include all alpha-numerics and dashes.
TestTokenDeleteCommand - 2019/12/06 06:18:38.905476 [DEBUG] tlsutil: Update with version 1
TestTokenDeleteCommand - 2019/12/06 06:18:38.911292 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
2019/12/06 06:18:39 [INFO]  raft: Initial configuration (index=1): [{Suffrage:Voter ID:4e753dd4-8e88-9733-9454-c5ef2aaceff2 Address:127.0.0.1:34006}]
2019/12/06 06:18:39 [INFO]  raft: Node at 127.0.0.1:34006 [Follower] entering Follower state (Leader: "")
TestTokenDeleteCommand - 2019/12/06 06:18:39.996599 [INFO] serf: EventMemberJoin: Node 4e753dd4-8e88-9733-9454-c5ef2aaceff2.dc1 127.0.0.1
TestTokenDeleteCommand - 2019/12/06 06:18:40.000723 [INFO] serf: EventMemberJoin: Node 4e753dd4-8e88-9733-9454-c5ef2aaceff2 127.0.0.1
TestTokenDeleteCommand - 2019/12/06 06:18:40.001631 [INFO] consul: Adding LAN server Node 4e753dd4-8e88-9733-9454-c5ef2aaceff2 (Addr: tcp/127.0.0.1:34006) (DC: dc1)
TestTokenDeleteCommand - 2019/12/06 06:18:40.001829 [INFO] consul: Handled member-join event for server "Node 4e753dd4-8e88-9733-9454-c5ef2aaceff2.dc1" in area "wan"
TestTokenDeleteCommand - 2019/12/06 06:18:40.002933 [INFO] agent: Started DNS server 127.0.0.1:34001 (udp)
TestTokenDeleteCommand - 2019/12/06 06:18:40.003275 [INFO] agent: Started DNS server 127.0.0.1:34001 (tcp)
TestTokenDeleteCommand - 2019/12/06 06:18:40.005881 [INFO] agent: Started HTTP server on 127.0.0.1:34002 (tcp)
TestTokenDeleteCommand - 2019/12/06 06:18:40.006010 [INFO] agent: started state syncer
2019/12/06 06:18:40 [WARN]  raft: Heartbeat timeout from "" reached, starting election
2019/12/06 06:18:40 [INFO]  raft: Node at 127.0.0.1:34006 [Candidate] entering Candidate state in term 2
2019/12/06 06:18:40 [INFO]  raft: Election won. Tally: 1
2019/12/06 06:18:40 [INFO]  raft: Node at 127.0.0.1:34006 [Leader] entering Leader state
TestTokenDeleteCommand - 2019/12/06 06:18:40.916401 [INFO] consul: cluster leadership acquired
TestTokenDeleteCommand - 2019/12/06 06:18:40.916883 [INFO] consul: New leader elected: Node 4e753dd4-8e88-9733-9454-c5ef2aaceff2
TestTokenDeleteCommand - 2019/12/06 06:18:41.076283 [ERR] agent: failed to sync remote state: ACL not found
TestTokenDeleteCommand - 2019/12/06 06:18:41.341508 [INFO] acl: initializing acls
TestTokenDeleteCommand - 2019/12/06 06:18:41.555360 [INFO] acl: initializing acls
TestTokenDeleteCommand - 2019/12/06 06:18:41.925740 [INFO] consul: Created ACL 'global-management' policy
TestTokenDeleteCommand - 2019/12/06 06:18:41.925815 [WARN] consul: Configuring a non-UUID master token is deprecated
TestTokenDeleteCommand - 2019/12/06 06:18:41.926648 [INFO] consul: Created ACL 'global-management' policy
TestTokenDeleteCommand - 2019/12/06 06:18:41.926706 [WARN] consul: Configuring a non-UUID master token is deprecated
TestTokenDeleteCommand - 2019/12/06 06:18:42.685068 [INFO] consul: Bootstrapped ACL master token from configuration
TestTokenDeleteCommand - 2019/12/06 06:18:42.685140 [INFO] consul: Bootstrapped ACL master token from configuration
TestTokenDeleteCommand - 2019/12/06 06:18:42.771190 [ERR] agent: failed to sync remote state: ACL not found
TestTokenDeleteCommand - 2019/12/06 06:18:43.043068 [INFO] consul: Created ACL anonymous token from configuration
TestTokenDeleteCommand - 2019/12/06 06:18:43.044051 [INFO] serf: EventMemberUpdate: Node 4e753dd4-8e88-9733-9454-c5ef2aaceff2
TestTokenDeleteCommand - 2019/12/06 06:18:43.044754 [INFO] serf: EventMemberUpdate: Node 4e753dd4-8e88-9733-9454-c5ef2aaceff2.dc1
TestTokenDeleteCommand - 2019/12/06 06:18:43.353518 [INFO] consul: Created ACL anonymous token from configuration
TestTokenDeleteCommand - 2019/12/06 06:18:43.353682 [DEBUG] acl: transitioning out of legacy ACL mode
TestTokenDeleteCommand - 2019/12/06 06:18:43.355180 [INFO] serf: EventMemberUpdate: Node 4e753dd4-8e88-9733-9454-c5ef2aaceff2
TestTokenDeleteCommand - 2019/12/06 06:18:43.356408 [INFO] serf: EventMemberUpdate: Node 4e753dd4-8e88-9733-9454-c5ef2aaceff2.dc1
TestTokenDeleteCommand - 2019/12/06 06:18:45.491914 [INFO] connect: initialized primary datacenter CA with provider "consul"
TestTokenDeleteCommand - 2019/12/06 06:18:45.492526 [DEBUG] consul: Skipping self join check for "Node 4e753dd4-8e88-9733-9454-c5ef2aaceff2" since the cluster is too small
TestTokenDeleteCommand - 2019/12/06 06:18:45.492630 [INFO] consul: member 'Node 4e753dd4-8e88-9733-9454-c5ef2aaceff2' joined, marking health alive
TestTokenDeleteCommand - 2019/12/06 06:18:45.758205 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
TestTokenDeleteCommand - 2019/12/06 06:18:45.761862 [DEBUG] consul: Skipping self join check for "Node 4e753dd4-8e88-9733-9454-c5ef2aaceff2" since the cluster is too small
TestTokenDeleteCommand - 2019/12/06 06:18:45.762431 [DEBUG] consul: Skipping self join check for "Node 4e753dd4-8e88-9733-9454-c5ef2aaceff2" since the cluster is too small
TestTokenDeleteCommand - 2019/12/06 06:18:46.033681 [DEBUG] http: Request PUT /v1/acl/token (261.039057ms) from=127.0.0.1:55674
TestTokenDeleteCommand - 2019/12/06 06:18:46.504956 [DEBUG] http: Request DELETE /v1/acl/token/59052582-1776-3542-16c1-3337dd3d15d5 (459.266324ms) from=127.0.0.1:55676
TestTokenDeleteCommand - 2019/12/06 06:18:46.510619 [ERR] http: Request GET /v1/acl/token/59052582-1776-3542-16c1-3337dd3d15d5, error: ACL not found from=127.0.0.1:55674
TestTokenDeleteCommand - 2019/12/06 06:18:46.511730 [DEBUG] http: Request GET /v1/acl/token/59052582-1776-3542-16c1-3337dd3d15d5 (1.612037ms) from=127.0.0.1:55674
TestTokenDeleteCommand - 2019/12/06 06:18:46.513448 [INFO] agent: Requesting shutdown
TestTokenDeleteCommand - 2019/12/06 06:18:46.513518 [INFO] consul: shutting down server
TestTokenDeleteCommand - 2019/12/06 06:18:46.513566 [WARN] serf: Shutdown without a Leave
TestTokenDeleteCommand - 2019/12/06 06:18:46.657744 [WARN] serf: Shutdown without a Leave
TestTokenDeleteCommand - 2019/12/06 06:18:46.791275 [INFO] manager: shutting down
TestTokenDeleteCommand - 2019/12/06 06:18:46.791795 [INFO] agent: consul server down
TestTokenDeleteCommand - 2019/12/06 06:18:46.791852 [INFO] agent: shutdown complete
TestTokenDeleteCommand - 2019/12/06 06:18:46.791907 [INFO] agent: Stopping DNS server 127.0.0.1:34001 (tcp)
TestTokenDeleteCommand - 2019/12/06 06:18:46.792039 [INFO] agent: Stopping DNS server 127.0.0.1:34001 (udp)
TestTokenDeleteCommand - 2019/12/06 06:18:46.792187 [INFO] agent: Stopping HTTP server 127.0.0.1:34002 (tcp)
TestTokenDeleteCommand - 2019/12/06 06:18:46.792668 [INFO] agent: Waiting for endpoints to shut down
TestTokenDeleteCommand - 2019/12/06 06:18:46.792776 [INFO] agent: Endpoints down
--- PASS: TestTokenDeleteCommand (8.20s)
PASS
ok  	github.com/hashicorp/consul/command/acl/token/delete	9.847s
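
TestTokenDeleteCommand deletes a token and then confirms it is gone; the [ERR] line above is the expected "ACL not found" on the follow-up read. Roughly the same check via the Go API client, with a placeholder accessor ID, might be:

    package main

    import (
        "fmt"
        "log"

        "github.com/hashicorp/consul/api"
    )

    func main() {
        cfg := api.DefaultConfig()
        cfg.Token = "<management-token>" // placeholder
        client, err := api.NewClient(cfg)
        if err != nil {
            log.Fatal(err)
        }

        accessor := "<accessor-id-to-delete>" // placeholder

        // DELETE /v1/acl/token/<accessor>
        if _, err := client.ACL().TokenDelete(accessor, nil); err != nil {
            log.Fatal(err)
        }

        // A follow-up read is expected to fail with "ACL not found",
        // which is what the [ERR] http line above records.
        if _, _, err := client.ACL().TokenRead(accessor, nil); err != nil {
            fmt.Println("read after delete failed as expected:", err)
        }
    }
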
=== RUN   TestTokenListCommand_noTabs
=== PAUSE TestTokenListCommand_noTabs
=== RUN   TestTokenListCommand
=== PAUSE TestTokenListCommand
=== CONT  TestTokenListCommand_noTabs
=== CONT  TestTokenListCommand
--- PASS: TestTokenListCommand_noTabs (0.00s)
WARNING: bootstrap = true: do not enable unless necessary
TestTokenListCommand - 2019/12/06 06:25:04.328568 [WARN] agent: Node name "Node 38c8bdaa-a15d-11bf-feaf-fc6ea25da0ac" will not be discoverable via DNS due to invalid characters. Valid characters include all alpha-numerics and dashes.
TestTokenListCommand - 2019/12/06 06:25:04.871611 [DEBUG] tlsutil: Update with version 1
TestTokenListCommand - 2019/12/06 06:25:04.914231 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
2019/12/06 06:25:07 [INFO]  raft: Initial configuration (index=1): [{Suffrage:Voter ID:38c8bdaa-a15d-11bf-feaf-fc6ea25da0ac Address:127.0.0.1:19006}]
2019/12/06 06:25:07 [INFO]  raft: Node at 127.0.0.1:19006 [Follower] entering Follower state (Leader: "")
TestTokenListCommand - 2019/12/06 06:25:07.497525 [INFO] serf: EventMemberJoin: Node 38c8bdaa-a15d-11bf-feaf-fc6ea25da0ac.dc1 127.0.0.1
TestTokenListCommand - 2019/12/06 06:25:07.502249 [INFO] serf: EventMemberJoin: Node 38c8bdaa-a15d-11bf-feaf-fc6ea25da0ac 127.0.0.1
TestTokenListCommand - 2019/12/06 06:25:07.503314 [INFO] consul: Handled member-join event for server "Node 38c8bdaa-a15d-11bf-feaf-fc6ea25da0ac.dc1" in area "wan"
TestTokenListCommand - 2019/12/06 06:25:07.504289 [INFO] agent: Started DNS server 127.0.0.1:19001 (udp)
TestTokenListCommand - 2019/12/06 06:25:07.504390 [INFO] agent: Started DNS server 127.0.0.1:19001 (tcp)
TestTokenListCommand - 2019/12/06 06:25:07.507120 [INFO] agent: Started HTTP server on 127.0.0.1:19002 (tcp)
TestTokenListCommand - 2019/12/06 06:25:07.507303 [INFO] agent: started state syncer
TestTokenListCommand - 2019/12/06 06:25:07.508530 [INFO] consul: Adding LAN server Node 38c8bdaa-a15d-11bf-feaf-fc6ea25da0ac (Addr: tcp/127.0.0.1:19006) (DC: dc1)
2019/12/06 06:25:07 [WARN]  raft: Heartbeat timeout from "" reached, starting election
2019/12/06 06:25:07 [INFO]  raft: Node at 127.0.0.1:19006 [Candidate] entering Candidate state in term 2
2019/12/06 06:25:09 [INFO]  raft: Election won. Tally: 1
2019/12/06 06:25:09 [INFO]  raft: Node at 127.0.0.1:19006 [Leader] entering Leader state
TestTokenListCommand - 2019/12/06 06:25:09.116730 [INFO] consul: cluster leadership acquired
TestTokenListCommand - 2019/12/06 06:25:09.117610 [INFO] consul: New leader elected: Node 38c8bdaa-a15d-11bf-feaf-fc6ea25da0ac
TestTokenListCommand - 2019/12/06 06:25:09.174870 [ERR] agent: failed to sync remote state: ACL not found
TestTokenListCommand - 2019/12/06 06:25:10.023059 [INFO] acl: initializing acls
TestTokenListCommand - 2019/12/06 06:25:10.465852 [INFO] consul: Created ACL 'global-management' policy
TestTokenListCommand - 2019/12/06 06:25:10.465929 [WARN] consul: Configuring a non-UUID master token is deprecated
TestTokenListCommand - 2019/12/06 06:25:10.657289 [INFO] acl: initializing acls
TestTokenListCommand - 2019/12/06 06:25:10.657419 [WARN] consul: Configuring a non-UUID master token is deprecated
TestTokenListCommand - 2019/12/06 06:25:10.834805 [ERR] agent: failed to sync remote state: ACL not found
TestTokenListCommand - 2019/12/06 06:25:10.933621 [INFO] consul: Bootstrapped ACL master token from configuration
TestTokenListCommand - 2019/12/06 06:25:11.423942 [INFO] consul: Bootstrapped ACL master token from configuration
TestTokenListCommand - 2019/12/06 06:25:11.891172 [INFO] consul: Created ACL anonymous token from configuration
TestTokenListCommand - 2019/12/06 06:25:11.892153 [INFO] serf: EventMemberUpdate: Node 38c8bdaa-a15d-11bf-feaf-fc6ea25da0ac
TestTokenListCommand - 2019/12/06 06:25:11.892821 [INFO] serf: EventMemberUpdate: Node 38c8bdaa-a15d-11bf-feaf-fc6ea25da0ac.dc1
TestTokenListCommand - 2019/12/06 06:25:12.991721 [INFO] consul: Created ACL anonymous token from configuration
TestTokenListCommand - 2019/12/06 06:25:12.991793 [DEBUG] acl: transitioning out of legacy ACL mode
TestTokenListCommand - 2019/12/06 06:25:12.992639 [INFO] serf: EventMemberUpdate: Node 38c8bdaa-a15d-11bf-feaf-fc6ea25da0ac
TestTokenListCommand - 2019/12/06 06:25:12.993243 [INFO] serf: EventMemberUpdate: Node 38c8bdaa-a15d-11bf-feaf-fc6ea25da0ac.dc1
TestTokenListCommand - 2019/12/06 06:25:15.256186 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
TestTokenListCommand - 2019/12/06 06:25:15.257322 [INFO] connect: initialized primary datacenter CA with provider "consul"
TestTokenListCommand - 2019/12/06 06:25:15.257841 [DEBUG] consul: Skipping self join check for "Node 38c8bdaa-a15d-11bf-feaf-fc6ea25da0ac" since the cluster is too small
TestTokenListCommand - 2019/12/06 06:25:15.257972 [INFO] consul: member 'Node 38c8bdaa-a15d-11bf-feaf-fc6ea25da0ac' joined, marking health alive
TestTokenListCommand - 2019/12/06 06:25:15.892630 [DEBUG] consul: Skipping self join check for "Node 38c8bdaa-a15d-11bf-feaf-fc6ea25da0ac" since the cluster is too small
TestTokenListCommand - 2019/12/06 06:25:15.894670 [DEBUG] consul: Skipping self join check for "Node 38c8bdaa-a15d-11bf-feaf-fc6ea25da0ac" since the cluster is too small
TestTokenListCommand - 2019/12/06 06:25:16.355888 [DEBUG] http: Request PUT /v1/acl/token (436.427468ms) from=127.0.0.1:52804
TestTokenListCommand - 2019/12/06 06:25:16.835085 [DEBUG] http: Request PUT /v1/acl/token (466.297162ms) from=127.0.0.1:52804
TestTokenListCommand - 2019/12/06 06:25:17.587328 [DEBUG] http: Request PUT /v1/acl/token (742.326904ms) from=127.0.0.1:52804
TestTokenListCommand - 2019/12/06 06:25:18.332472 [DEBUG] http: Request PUT /v1/acl/token (742.380572ms) from=127.0.0.1:52804
TestTokenListCommand - 2019/12/06 06:25:19.257859 [DEBUG] http: Request PUT /v1/acl/token (922.550422ms) from=127.0.0.1:52804
TestTokenListCommand - 2019/12/06 06:25:19.263542 [DEBUG] http: Request GET /v1/acl/tokens (2.60206ms) from=127.0.0.1:52812
TestTokenListCommand - 2019/12/06 06:25:19.268775 [INFO] agent: Requesting shutdown
TestTokenListCommand - 2019/12/06 06:25:19.268919 [INFO] consul: shutting down server
TestTokenListCommand - 2019/12/06 06:25:19.269297 [WARN] serf: Shutdown without a Leave
TestTokenListCommand - 2019/12/06 06:25:19.680713 [WARN] serf: Shutdown without a Leave
TestTokenListCommand - 2019/12/06 06:25:19.814367 [INFO] manager: shutting down
TestTokenListCommand - 2019/12/06 06:25:19.815877 [INFO] agent: consul server down
TestTokenListCommand - 2019/12/06 06:25:19.815936 [INFO] agent: shutdown complete
TestTokenListCommand - 2019/12/06 06:25:19.815993 [INFO] agent: Stopping DNS server 127.0.0.1:19001 (tcp)
TestTokenListCommand - 2019/12/06 06:25:19.816140 [INFO] agent: Stopping DNS server 127.0.0.1:19001 (udp)
TestTokenListCommand - 2019/12/06 06:25:19.816277 [INFO] agent: Stopping HTTP server 127.0.0.1:19002 (tcp)
TestTokenListCommand - 2019/12/06 06:25:19.816855 [INFO] agent: Waiting for endpoints to shut down
TestTokenListCommand - 2019/12/06 06:25:19.817008 [INFO] agent: Endpoints down
--- PASS: TestTokenListCommand (15.70s)
PASS
ok  	github.com/hashicorp/consul/command/acl/token/list	17.027s
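
TestTokenListCommand creates a handful of tokens and then issues a single GET /v1/acl/tokens. A hedged sketch of that listing call with the Go API client:

    package main

    import (
        "fmt"
        "log"

        "github.com/hashicorp/consul/api"
    )

    func main() {
        cfg := api.DefaultConfig()
        cfg.Token = "<management-token>" // placeholder
        client, err := api.NewClient(cfg)
        if err != nil {
            log.Fatal(err)
        }

        // GET /v1/acl/tokens, as in the request the test issues after
        // creating its fixture tokens.
        tokens, _, err := client.ACL().TokenList(nil)
        if err != nil {
            log.Fatal(err)
        }
        for _, t := range tokens {
            fmt.Println(t.AccessorID, t.Description)
        }
    }
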
=== RUN   TestTokenReadCommand_noTabs
=== PAUSE TestTokenReadCommand_noTabs
=== RUN   TestTokenReadCommand
=== PAUSE TestTokenReadCommand
=== CONT  TestTokenReadCommand_noTabs
=== CONT  TestTokenReadCommand
--- PASS: TestTokenReadCommand_noTabs (0.00s)
WARNING: bootstrap = true: do not enable unless necessary
TestTokenReadCommand - 2019/12/06 06:25:04.240287 [WARN] agent: Node name "Node 720ef0f4-7de2-a231-a046-f45e1dd5a8c6" will not be discoverable via DNS due to invalid characters. Valid characters include all alpha-numerics and dashes.
TestTokenReadCommand - 2019/12/06 06:25:04.868200 [DEBUG] tlsutil: Update with version 1
TestTokenReadCommand - 2019/12/06 06:25:04.885012 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
2019/12/06 06:25:07 [INFO]  raft: Initial configuration (index=1): [{Suffrage:Voter ID:720ef0f4-7de2-a231-a046-f45e1dd5a8c6 Address:127.0.0.1:38506}]
2019/12/06 06:25:07 [INFO]  raft: Node at 127.0.0.1:38506 [Follower] entering Follower state (Leader: "")
TestTokenReadCommand - 2019/12/06 06:25:07.500689 [INFO] serf: EventMemberJoin: Node 720ef0f4-7de2-a231-a046-f45e1dd5a8c6.dc1 127.0.0.1
TestTokenReadCommand - 2019/12/06 06:25:07.519489 [INFO] serf: EventMemberJoin: Node 720ef0f4-7de2-a231-a046-f45e1dd5a8c6 127.0.0.1
TestTokenReadCommand - 2019/12/06 06:25:07.522549 [INFO] consul: Adding LAN server Node 720ef0f4-7de2-a231-a046-f45e1dd5a8c6 (Addr: tcp/127.0.0.1:38506) (DC: dc1)
TestTokenReadCommand - 2019/12/06 06:25:07.523855 [INFO] consul: Handled member-join event for server "Node 720ef0f4-7de2-a231-a046-f45e1dd5a8c6.dc1" in area "wan"
TestTokenReadCommand - 2019/12/06 06:25:07.525586 [INFO] agent: Started DNS server 127.0.0.1:38501 (udp)
TestTokenReadCommand - 2019/12/06 06:25:07.526344 [INFO] agent: Started DNS server 127.0.0.1:38501 (tcp)
TestTokenReadCommand - 2019/12/06 06:25:07.530827 [INFO] agent: Started HTTP server on 127.0.0.1:38502 (tcp)
TestTokenReadCommand - 2019/12/06 06:25:07.530998 [INFO] agent: started state syncer
2019/12/06 06:25:07 [WARN]  raft: Heartbeat timeout from "" reached, starting election
2019/12/06 06:25:07 [INFO]  raft: Node at 127.0.0.1:38506 [Candidate] entering Candidate state in term 2
2019/12/06 06:25:09 [INFO]  raft: Election won. Tally: 1
2019/12/06 06:25:09 [INFO]  raft: Node at 127.0.0.1:38506 [Leader] entering Leader state
TestTokenReadCommand - 2019/12/06 06:25:09.114756 [INFO] consul: cluster leadership acquired
TestTokenReadCommand - 2019/12/06 06:25:09.115509 [INFO] consul: New leader elected: Node 720ef0f4-7de2-a231-a046-f45e1dd5a8c6
TestTokenReadCommand - 2019/12/06 06:25:09.363993 [ERR] agent: failed to sync remote state: ACL not found
TestTokenReadCommand - 2019/12/06 06:25:09.570663 [ERR] agent: failed to sync remote state: ACL not found
TestTokenReadCommand - 2019/12/06 06:25:10.022749 [INFO] acl: initializing acls
TestTokenReadCommand - 2019/12/06 06:25:10.469135 [INFO] consul: Created ACL 'global-management' policy
TestTokenReadCommand - 2019/12/06 06:25:10.469257 [WARN] consul: Configuring a non-UUID master token is deprecated
TestTokenReadCommand - 2019/12/06 06:25:10.677644 [INFO] acl: initializing acls
TestTokenReadCommand - 2019/12/06 06:25:10.677958 [WARN] consul: Configuring a non-UUID master token is deprecated
TestTokenReadCommand - 2019/12/06 06:25:10.933643 [INFO] consul: Bootstrapped ACL master token from configuration
TestTokenReadCommand - 2019/12/06 06:25:11.425353 [INFO] consul: Bootstrapped ACL master token from configuration
TestTokenReadCommand - 2019/12/06 06:25:11.895044 [INFO] consul: Created ACL anonymous token from configuration
TestTokenReadCommand - 2019/12/06 06:25:11.896222 [INFO] serf: EventMemberUpdate: Node 720ef0f4-7de2-a231-a046-f45e1dd5a8c6
TestTokenReadCommand - 2019/12/06 06:25:11.897099 [INFO] serf: EventMemberUpdate: Node 720ef0f4-7de2-a231-a046-f45e1dd5a8c6.dc1
TestTokenReadCommand - 2019/12/06 06:25:12.825155 [INFO] consul: Created ACL anonymous token from configuration
TestTokenReadCommand - 2019/12/06 06:25:12.825397 [DEBUG] acl: transitioning out of legacy ACL mode
TestTokenReadCommand - 2019/12/06 06:25:12.826250 [INFO] serf: EventMemberUpdate: Node 720ef0f4-7de2-a231-a046-f45e1dd5a8c6
TestTokenReadCommand - 2019/12/06 06:25:12.826936 [INFO] serf: EventMemberUpdate: Node 720ef0f4-7de2-a231-a046-f45e1dd5a8c6.dc1
TestTokenReadCommand - 2019/12/06 06:25:14.989648 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
TestTokenReadCommand - 2019/12/06 06:25:15.482259 [INFO] connect: initialized primary datacenter CA with provider "consul"
TestTokenReadCommand - 2019/12/06 06:25:15.482707 [DEBUG] consul: Skipping self join check for "Node 720ef0f4-7de2-a231-a046-f45e1dd5a8c6" since the cluster is too small
TestTokenReadCommand - 2019/12/06 06:25:15.482800 [INFO] consul: member 'Node 720ef0f4-7de2-a231-a046-f45e1dd5a8c6' joined, marking health alive
TestTokenReadCommand - 2019/12/06 06:25:15.891921 [DEBUG] consul: Skipping self join check for "Node 720ef0f4-7de2-a231-a046-f45e1dd5a8c6" since the cluster is too small
TestTokenReadCommand - 2019/12/06 06:25:15.892391 [DEBUG] consul: Skipping self join check for "Node 720ef0f4-7de2-a231-a046-f45e1dd5a8c6" since the cluster is too small
TestTokenReadCommand - 2019/12/06 06:25:16.347720 [DEBUG] http: Request PUT /v1/acl/token (437.69583ms) from=127.0.0.1:47268
TestTokenReadCommand - 2019/12/06 06:25:16.380920 [DEBUG] http: Request GET /v1/acl/token/ecdb545d-cd04-5461-553f-14640f3bc978 (1.900045ms) from=127.0.0.1:47272
TestTokenReadCommand - 2019/12/06 06:25:16.386552 [INFO] agent: Requesting shutdown
TestTokenReadCommand - 2019/12/06 06:25:16.386666 [INFO] consul: shutting down server
TestTokenReadCommand - 2019/12/06 06:25:16.386787 [WARN] serf: Shutdown without a Leave
TestTokenReadCommand - 2019/12/06 06:25:16.689345 [WARN] serf: Shutdown without a Leave
TestTokenReadCommand - 2019/12/06 06:25:16.831726 [INFO] manager: shutting down
TestTokenReadCommand - 2019/12/06 06:25:16.832761 [INFO] agent: consul server down
TestTokenReadCommand - 2019/12/06 06:25:16.833023 [INFO] agent: shutdown complete
TestTokenReadCommand - 2019/12/06 06:25:16.833259 [INFO] agent: Stopping DNS server 127.0.0.1:38501 (tcp)
TestTokenReadCommand - 2019/12/06 06:25:16.833524 [INFO] agent: Stopping DNS server 127.0.0.1:38501 (udp)
TestTokenReadCommand - 2019/12/06 06:25:16.833783 [INFO] agent: Stopping HTTP server 127.0.0.1:38502 (tcp)
TestTokenReadCommand - 2019/12/06 06:25:16.834728 [INFO] agent: Waiting for endpoints to shut down
TestTokenReadCommand - 2019/12/06 06:25:16.835063 [INFO] agent: Endpoints down
--- PASS: TestTokenReadCommand (12.72s)
PASS
ok  	github.com/hashicorp/consul/command/acl/token/read	14.062s
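
TestTokenReadCommand boils down to one PUT /v1/acl/token followed by a GET /v1/acl/token/<accessor>. The read half, sketched with the Go API client and a placeholder accessor ID:

    package main

    import (
        "fmt"
        "log"

        "github.com/hashicorp/consul/api"
    )

    func main() {
        cfg := api.DefaultConfig()
        cfg.Token = "<management-token>" // placeholder
        client, err := api.NewClient(cfg)
        if err != nil {
            log.Fatal(err)
        }

        // GET /v1/acl/token/<accessor>, as in the request above.
        token, _, err := client.ACL().TokenRead("<accessor-id>", nil)
        if err != nil {
            log.Fatal(err)
        }
        fmt.Println(token.AccessorID, token.Description)
    }
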
=== RUN   TestTokenUpdateCommand_noTabs
=== PAUSE TestTokenUpdateCommand_noTabs
=== RUN   TestTokenUpdateCommand
=== PAUSE TestTokenUpdateCommand
=== CONT  TestTokenUpdateCommand_noTabs
=== CONT  TestTokenUpdateCommand
--- PASS: TestTokenUpdateCommand_noTabs (0.01s)
WARNING: bootstrap = true: do not enable unless necessary
TestTokenUpdateCommand - 2019/12/06 06:25:04.658629 [WARN] agent: Node name "Node 1d06278b-c71c-9c68-310b-577ffeeee8b5" will not be discoverable via DNS due to invalid characters. Valid characters include all alpha-numerics and dashes.
TestTokenUpdateCommand - 2019/12/06 06:25:04.868089 [DEBUG] tlsutil: Update with version 1
TestTokenUpdateCommand - 2019/12/06 06:25:04.891310 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
2019/12/06 06:25:07 [INFO]  raft: Initial configuration (index=1): [{Suffrage:Voter ID:1d06278b-c71c-9c68-310b-577ffeeee8b5 Address:127.0.0.1:16006}]
2019/12/06 06:25:07 [INFO]  raft: Node at 127.0.0.1:16006 [Follower] entering Follower state (Leader: "")
TestTokenUpdateCommand - 2019/12/06 06:25:07.488165 [INFO] serf: EventMemberJoin: Node 1d06278b-c71c-9c68-310b-577ffeeee8b5.dc1 127.0.0.1
TestTokenUpdateCommand - 2019/12/06 06:25:07.506169 [INFO] serf: EventMemberJoin: Node 1d06278b-c71c-9c68-310b-577ffeeee8b5 127.0.0.1
2019/12/06 06:25:07 [WARN]  raft: Heartbeat timeout from "" reached, starting election
2019/12/06 06:25:07 [INFO]  raft: Node at 127.0.0.1:16006 [Candidate] entering Candidate state in term 2
TestTokenUpdateCommand - 2019/12/06 06:25:07.523765 [INFO] agent: Started DNS server 127.0.0.1:16001 (udp)
TestTokenUpdateCommand - 2019/12/06 06:25:07.540181 [INFO] agent: Started DNS server 127.0.0.1:16001 (tcp)
TestTokenUpdateCommand - 2019/12/06 06:25:07.543436 [INFO] agent: Started HTTP server on 127.0.0.1:16002 (tcp)
TestTokenUpdateCommand - 2019/12/06 06:25:07.543588 [INFO] agent: started state syncer
TestTokenUpdateCommand - 2019/12/06 06:25:07.544449 [INFO] consul: Handled member-join event for server "Node 1d06278b-c71c-9c68-310b-577ffeeee8b5.dc1" in area "wan"
TestTokenUpdateCommand - 2019/12/06 06:25:07.544928 [INFO] consul: Adding LAN server Node 1d06278b-c71c-9c68-310b-577ffeeee8b5 (Addr: tcp/127.0.0.1:16006) (DC: dc1)
2019/12/06 06:25:09 [INFO]  raft: Election won. Tally: 1
2019/12/06 06:25:09 [INFO]  raft: Node at 127.0.0.1:16006 [Leader] entering Leader state
TestTokenUpdateCommand - 2019/12/06 06:25:09.115635 [INFO] consul: cluster leadership acquired
TestTokenUpdateCommand - 2019/12/06 06:25:09.116382 [INFO] consul: New leader elected: Node 1d06278b-c71c-9c68-310b-577ffeeee8b5
TestTokenUpdateCommand - 2019/12/06 06:25:09.120096 [ERR] agent: failed to sync remote state: ACL not found
TestTokenUpdateCommand - 2019/12/06 06:25:10.022703 [INFO] acl: initializing acls
TestTokenUpdateCommand - 2019/12/06 06:25:10.470324 [INFO] consul: Created ACL 'global-management' policy
TestTokenUpdateCommand - 2019/12/06 06:25:10.470404 [WARN] consul: Configuring a non-UUID master token is deprecated
TestTokenUpdateCommand - 2019/12/06 06:25:10.599336 [ERR] agent: failed to sync remote state: ACL not found
TestTokenUpdateCommand - 2019/12/06 06:25:10.697140 [INFO] acl: initializing acls
TestTokenUpdateCommand - 2019/12/06 06:25:10.697292 [WARN] consul: Configuring a non-UUID master token is deprecated
TestTokenUpdateCommand - 2019/12/06 06:25:11.424637 [INFO] consul: Bootstrapped ACL master token from configuration
TestTokenUpdateCommand - 2019/12/06 06:25:11.424981 [INFO] consul: Bootstrapped ACL master token from configuration
TestTokenUpdateCommand - 2019/12/06 06:25:12.383777 [INFO] consul: Created ACL anonymous token from configuration
TestTokenUpdateCommand - 2019/12/06 06:25:12.384352 [INFO] consul: Created ACL anonymous token from configuration
TestTokenUpdateCommand - 2019/12/06 06:25:12.384522 [DEBUG] acl: transitioning out of legacy ACL mode
TestTokenUpdateCommand - 2019/12/06 06:25:12.390874 [INFO] serf: EventMemberUpdate: Node 1d06278b-c71c-9c68-310b-577ffeeee8b5
TestTokenUpdateCommand - 2019/12/06 06:25:12.392187 [INFO] serf: EventMemberUpdate: Node 1d06278b-c71c-9c68-310b-577ffeeee8b5
TestTokenUpdateCommand - 2019/12/06 06:25:12.392819 [INFO] serf: EventMemberUpdate: Node 1d06278b-c71c-9c68-310b-577ffeeee8b5.dc1
TestTokenUpdateCommand - 2019/12/06 06:25:12.394285 [INFO] serf: EventMemberUpdate: Node 1d06278b-c71c-9c68-310b-577ffeeee8b5.dc1
TestTokenUpdateCommand - 2019/12/06 06:25:15.256841 [INFO] connect: initialized primary datacenter CA with provider "consul"
TestTokenUpdateCommand - 2019/12/06 06:25:15.257431 [DEBUG] consul: Skipping self join check for "Node 1d06278b-c71c-9c68-310b-577ffeeee8b5" since the cluster is too small
TestTokenUpdateCommand - 2019/12/06 06:25:15.257539 [INFO] consul: member 'Node 1d06278b-c71c-9c68-310b-577ffeeee8b5' joined, marking health alive
TestTokenUpdateCommand - 2019/12/06 06:25:15.259310 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
TestTokenUpdateCommand - 2019/12/06 06:25:15.891731 [DEBUG] consul: Skipping self join check for "Node 1d06278b-c71c-9c68-310b-577ffeeee8b5" since the cluster is too small
TestTokenUpdateCommand - 2019/12/06 06:25:15.892264 [DEBUG] consul: Skipping self join check for "Node 1d06278b-c71c-9c68-310b-577ffeeee8b5" since the cluster is too small
TestTokenUpdateCommand - 2019/12/06 06:25:16.342160 [DEBUG] http: Request PUT /v1/acl/policy (438.099506ms) from=127.0.0.1:59580
TestTokenUpdateCommand - 2019/12/06 06:25:16.692757 [DEBUG] http: Request PUT /v1/acl/token (344.326996ms) from=127.0.0.1:59580
TestTokenUpdateCommand - 2019/12/06 06:25:17.407542 [DEBUG] http: Request PUT /v1/acl/create (709.048465ms) from=127.0.0.1:59580
TestTokenUpdateCommand - 2019/12/06 06:25:17.431085 [DEBUG] http: Request GET /v1/acl/token/330ae678-7d5c-e577-2e54-71dd2249aeb5 (4.582106ms) from=127.0.0.1:59588
TestTokenUpdateCommand - 2019/12/06 06:25:19.067846 [DEBUG] http: Request PUT /v1/acl/token/330ae678-7d5c-e577-2e54-71dd2249aeb5 (1.632720913s) from=127.0.0.1:59588
TestTokenUpdateCommand - 2019/12/06 06:25:19.074941 [DEBUG] http: Request GET /v1/acl/token/330ae678-7d5c-e577-2e54-71dd2249aeb5 (1.169694ms) from=127.0.0.1:59580
TestTokenUpdateCommand - 2019/12/06 06:25:19.084934 [DEBUG] http: Request GET /v1/acl/token/330ae678-7d5c-e577-2e54-71dd2249aeb5 (1.909044ms) from=127.0.0.1:59590
TestTokenUpdateCommand - 2019/12/06 06:25:19.684918 [DEBUG] http: Request PUT /v1/acl/token/330ae678-7d5c-e577-2e54-71dd2249aeb5 (597.225868ms) from=127.0.0.1:59590
TestTokenUpdateCommand - 2019/12/06 06:25:19.688543 [DEBUG] http: Request GET /v1/acl/token/330ae678-7d5c-e577-2e54-71dd2249aeb5 (831.686µs) from=127.0.0.1:59580
TestTokenUpdateCommand - 2019/12/06 06:25:19.696781 [DEBUG] http: Request GET /v1/acl/token/330ae678-7d5c-e577-2e54-71dd2249aeb5 (1.01669ms) from=127.0.0.1:59594
TestTokenUpdateCommand - 2019/12/06 06:25:21.158378 [DEBUG] http: Request PUT /v1/acl/token/330ae678-7d5c-e577-2e54-71dd2249aeb5 (1.458559202s) from=127.0.0.1:59594
TestTokenUpdateCommand - 2019/12/06 06:25:21.162375 [DEBUG] http: Request GET /v1/acl/token/330ae678-7d5c-e577-2e54-71dd2249aeb5 (920.355µs) from=127.0.0.1:59580
TestTokenUpdateCommand - 2019/12/06 06:25:21.166170 [DEBUG] http: Request GET /v1/acl/token/self (776.352µs) from=127.0.0.1:59580
TestTokenUpdateCommand - 2019/12/06 06:25:21.175239 [DEBUG] http: Request GET /v1/acl/token/86cb4c66-1a7a-0888-2f57-cf946ae6a408 (1.244028ms) from=127.0.0.1:59596
TestTokenUpdateCommand - 2019/12/06 06:25:21.676262 [DEBUG] http: Request PUT /v1/acl/token/86cb4c66-1a7a-0888-2f57-cf946ae6a408 (497.80656ms) from=127.0.0.1:59596
TestTokenUpdateCommand - 2019/12/06 06:25:21.682235 [DEBUG] http: Request GET /v1/acl/token/86cb4c66-1a7a-0888-2f57-cf946ae6a408 (1.010023ms) from=127.0.0.1:59580
TestTokenUpdateCommand - 2019/12/06 06:25:21.684325 [INFO] agent: Requesting shutdown
TestTokenUpdateCommand - 2019/12/06 06:25:21.684418 [INFO] consul: shutting down server
TestTokenUpdateCommand - 2019/12/06 06:25:21.684483 [WARN] serf: Shutdown without a Leave
TestTokenUpdateCommand - 2019/12/06 06:25:22.014282 [WARN] serf: Shutdown without a Leave
TestTokenUpdateCommand - 2019/12/06 06:25:22.172689 [INFO] manager: shutting down
TestTokenUpdateCommand - 2019/12/06 06:25:22.173703 [INFO] agent: consul server down
TestTokenUpdateCommand - 2019/12/06 06:25:22.173764 [INFO] agent: shutdown complete
TestTokenUpdateCommand - 2019/12/06 06:25:22.173822 [INFO] agent: Stopping DNS server 127.0.0.1:16001 (tcp)
TestTokenUpdateCommand - 2019/12/06 06:25:22.173974 [INFO] agent: Stopping DNS server 127.0.0.1:16001 (udp)
TestTokenUpdateCommand - 2019/12/06 06:25:22.174137 [INFO] agent: Stopping HTTP server 127.0.0.1:16002 (tcp)
TestTokenUpdateCommand - 2019/12/06 06:25:22.175320 [INFO] agent: Waiting for endpoints to shut down
TestTokenUpdateCommand - 2019/12/06 06:25:22.175478 [INFO] agent: Endpoints down
--- PASS: TestTokenUpdateCommand (17.65s)
PASS
ok  	github.com/hashicorp/consul/command/acl/token/update	18.090s
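
TestTokenUpdateCommand repeatedly PUTs /v1/acl/token/<accessor> with interleaved reads. A read-modify-write sketch with the Go API client, again with placeholder values:

    package main

    import (
        "fmt"
        "log"

        "github.com/hashicorp/consul/api"
    )

    func main() {
        cfg := api.DefaultConfig()
        cfg.Token = "<management-token>" // placeholder
        client, err := api.NewClient(cfg)
        if err != nil {
            log.Fatal(err)
        }

        // Read the token to be changed (GET /v1/acl/token/<accessor>)...
        token, _, err := client.ACL().TokenRead("<accessor-id>", nil)
        if err != nil {
            log.Fatal(err)
        }

        // ...change a field and write it back (PUT /v1/acl/token/<accessor>).
        token.Description = "updated description"
        updated, _, err := client.ACL().TokenUpdate(token, nil)
        if err != nil {
            log.Fatal(err)
        }
        fmt.Println(updated.AccessorID, updated.Description)
    }
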
=== RUN   TestConfigFail
=== PAUSE TestConfigFail
=== RUN   TestRetryJoin
--- SKIP: TestRetryJoin (0.00s)
    agent_test.go:85: DM-skipped
=== RUN   TestRetryJoinFail
=== PAUSE TestRetryJoinFail
=== RUN   TestRetryJoinWanFail
=== PAUSE TestRetryJoinWanFail
=== RUN   TestProtectDataDir
=== PAUSE TestProtectDataDir
=== RUN   TestBadDataDirPermissions
=== PAUSE TestBadDataDirPermissions
=== CONT  TestConfigFail
=== RUN   TestConfigFail/agent_-server_-bind=10.0.0.1_-datacenter=
=== CONT  TestProtectDataDir
=== CONT  TestRetryJoinWanFail
=== CONT  TestRetryJoinFail
--- PASS: TestProtectDataDir (0.16s)
=== CONT  TestBadDataDirPermissions
--- PASS: TestBadDataDirPermissions (0.07s)
--- PASS: TestRetryJoinFail (1.53s)
--- PASS: TestRetryJoinWanFail (2.87s)
=== RUN   TestConfigFail/agent_-server_-bind=10.0.0.1_-datacenter=foo_some-other-arg
=== RUN   TestConfigFail/agent_-server_-bind=10.0.0.1
=== RUN   TestConfigFail/agent_-server_-data-dir_/tmp/consul-test/TestConfigFail-consul017882933_-advertise_0.0.0.0_-bind_10.0.0.1
=== RUN   TestConfigFail/agent_-server_-data-dir_/tmp/consul-test/TestConfigFail-consul017882933_-advertise_::_-bind_10.0.0.1
=== RUN   TestConfigFail/agent_-server_-data-dir_/tmp/consul-test/TestConfigFail-consul017882933_-advertise_[::]_-bind_10.0.0.1
=== RUN   TestConfigFail/agent_-server_-data-dir_/tmp/consul-test/TestConfigFail-consul017882933_-advertise-wan_0.0.0.0_-bind_10.0.0.1
=== RUN   TestConfigFail/agent_-server_-data-dir_/tmp/consul-test/TestConfigFail-consul017882933_-advertise-wan_::_-bind_10.0.0.1
=== RUN   TestConfigFail/agent_-server_-data-dir_/tmp/consul-test/TestConfigFail-consul017882933_-advertise-wan_[::]_-bind_10.0.0.1
--- PASS: TestConfigFail (9.27s)
    --- PASS: TestConfigFail/agent_-server_-bind=10.0.0.1_-datacenter= (4.96s)
    --- PASS: TestConfigFail/agent_-server_-bind=10.0.0.1_-datacenter=foo_some-other-arg (0.45s)
    --- PASS: TestConfigFail/agent_-server_-bind=10.0.0.1 (0.47s)
    --- PASS: TestConfigFail/agent_-server_-data-dir_/tmp/consul-test/TestConfigFail-consul017882933_-advertise_0.0.0.0_-bind_10.0.0.1 (0.50s)
    --- PASS: TestConfigFail/agent_-server_-data-dir_/tmp/consul-test/TestConfigFail-consul017882933_-advertise_::_-bind_10.0.0.1 (0.50s)
    --- PASS: TestConfigFail/agent_-server_-data-dir_/tmp/consul-test/TestConfigFail-consul017882933_-advertise_[::]_-bind_10.0.0.1 (0.42s)
    --- PASS: TestConfigFail/agent_-server_-data-dir_/tmp/consul-test/TestConfigFail-consul017882933_-advertise-wan_0.0.0.0_-bind_10.0.0.1 (0.44s)
    --- PASS: TestConfigFail/agent_-server_-data-dir_/tmp/consul-test/TestConfigFail-consul017882933_-advertise-wan_::_-bind_10.0.0.1 (0.47s)
    --- PASS: TestConfigFail/agent_-server_-data-dir_/tmp/consul-test/TestConfigFail-consul017882933_-advertise-wan_[::]_-bind_10.0.0.1 (0.43s)
PASS
ok  	github.com/hashicorp/consul/command/agent	10.983s
=== RUN   TestCatalogCommand_noTabs
=== PAUSE TestCatalogCommand_noTabs
=== CONT  TestCatalogCommand_noTabs
--- PASS: TestCatalogCommand_noTabs (0.00s)
PASS
ok  	github.com/hashicorp/consul/command/catalog	0.033s
=== RUN   TestCatalogListDatacentersCommand_noTabs
=== PAUSE TestCatalogListDatacentersCommand_noTabs
=== RUN   TestCatalogListDatacentersCommand_Validation
=== PAUSE TestCatalogListDatacentersCommand_Validation
=== RUN   TestCatalogListDatacentersCommand
=== PAUSE TestCatalogListDatacentersCommand
=== CONT  TestCatalogListDatacentersCommand_noTabs
=== CONT  TestCatalogListDatacentersCommand
=== CONT  TestCatalogListDatacentersCommand_Validation
--- PASS: TestCatalogListDatacentersCommand_noTabs (0.00s)
--- PASS: TestCatalogListDatacentersCommand_Validation (0.00s)
WARNING: bootstrap = true: do not enable unless necessary
TestCatalogListDatacentersCommand - 2019/12/06 06:27:42.955361 [WARN] agent: Node name "Node 85f4b1f7-cb89-1214-f3a0-3465d7f45eed" will not be discoverable via DNS due to invalid characters. Valid characters include all alpha-numerics and dashes.
TestCatalogListDatacentersCommand - 2019/12/06 06:27:42.956248 [DEBUG] tlsutil: Update with version 1
TestCatalogListDatacentersCommand - 2019/12/06 06:27:42.965283 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
2019/12/06 06:27:44 [INFO]  raft: Initial configuration (index=1): [{Suffrage:Voter ID:85f4b1f7-cb89-1214-f3a0-3465d7f45eed Address:127.0.0.1:26506}]
2019/12/06 06:27:44 [INFO]  raft: Node at 127.0.0.1:26506 [Follower] entering Follower state (Leader: "")
TestCatalogListDatacentersCommand - 2019/12/06 06:27:44.733133 [INFO] serf: EventMemberJoin: Node 85f4b1f7-cb89-1214-f3a0-3465d7f45eed.dc1 127.0.0.1
TestCatalogListDatacentersCommand - 2019/12/06 06:27:44.738557 [INFO] serf: EventMemberJoin: Node 85f4b1f7-cb89-1214-f3a0-3465d7f45eed 127.0.0.1
TestCatalogListDatacentersCommand - 2019/12/06 06:27:44.740361 [INFO] consul: Handled member-join event for server "Node 85f4b1f7-cb89-1214-f3a0-3465d7f45eed.dc1" in area "wan"
TestCatalogListDatacentersCommand - 2019/12/06 06:27:44.740799 [INFO] consul: Adding LAN server Node 85f4b1f7-cb89-1214-f3a0-3465d7f45eed (Addr: tcp/127.0.0.1:26506) (DC: dc1)
TestCatalogListDatacentersCommand - 2019/12/06 06:27:44.742746 [INFO] agent: Started DNS server 127.0.0.1:26501 (udp)
TestCatalogListDatacentersCommand - 2019/12/06 06:27:44.742861 [INFO] agent: Started DNS server 127.0.0.1:26501 (tcp)
TestCatalogListDatacentersCommand - 2019/12/06 06:27:44.748205 [INFO] agent: Started HTTP server on 127.0.0.1:26502 (tcp)
TestCatalogListDatacentersCommand - 2019/12/06 06:27:44.748476 [INFO] agent: started state syncer
2019/12/06 06:27:44 [WARN]  raft: Heartbeat timeout from "" reached, starting election
2019/12/06 06:27:44 [INFO]  raft: Node at 127.0.0.1:26506 [Candidate] entering Candidate state in term 2
2019/12/06 06:27:46 [INFO]  raft: Election won. Tally: 1
2019/12/06 06:27:46 [INFO]  raft: Node at 127.0.0.1:26506 [Leader] entering Leader state
TestCatalogListDatacentersCommand - 2019/12/06 06:27:46.719787 [INFO] consul: cluster leadership acquired
TestCatalogListDatacentersCommand - 2019/12/06 06:27:46.720274 [INFO] consul: New leader elected: Node 85f4b1f7-cb89-1214-f3a0-3465d7f45eed
TestCatalogListDatacentersCommand - 2019/12/06 06:27:46.906201 [DEBUG] http: Request GET /v1/catalog/datacenters (5.330457ms) from=127.0.0.1:48888
TestCatalogListDatacentersCommand - 2019/12/06 06:27:46.908212 [INFO] agent: Requesting shutdown
TestCatalogListDatacentersCommand - 2019/12/06 06:27:46.908295 [INFO] consul: shutting down server
TestCatalogListDatacentersCommand - 2019/12/06 06:27:46.908339 [WARN] serf: Shutdown without a Leave
TestCatalogListDatacentersCommand - 2019/12/06 06:27:46.908740 [ERR] agent: failed to sync remote state: No cluster leader
TestCatalogListDatacentersCommand - 2019/12/06 06:27:47.312077 [WARN] serf: Shutdown without a Leave
TestCatalogListDatacentersCommand - 2019/12/06 06:27:47.561052 [INFO] manager: shutting down
TestCatalogListDatacentersCommand - 2019/12/06 06:27:47.666545 [ERR] autopilot: failed to initialize config: leadership lost while committing log
TestCatalogListDatacentersCommand - 2019/12/06 06:27:47.667068 [INFO] agent: consul server down
TestCatalogListDatacentersCommand - 2019/12/06 06:27:47.667127 [INFO] agent: shutdown complete
TestCatalogListDatacentersCommand - 2019/12/06 06:27:47.667178 [INFO] agent: Stopping DNS server 127.0.0.1:26501 (tcp)
TestCatalogListDatacentersCommand - 2019/12/06 06:27:47.667071 [ERR] consul: failed to establish leadership: raft is already shutdown
TestCatalogListDatacentersCommand - 2019/12/06 06:27:47.667362 [ERR] consul: failed to transfer leadership attempt 0/3: raft is already shutdown
TestCatalogListDatacentersCommand - 2019/12/06 06:27:47.667463 [INFO] agent: Stopping DNS server 127.0.0.1:26501 (udp)
TestCatalogListDatacentersCommand - 2019/12/06 06:27:47.667632 [INFO] agent: Stopping HTTP server 127.0.0.1:26502 (tcp)
TestCatalogListDatacentersCommand - 2019/12/06 06:27:47.668101 [INFO] agent: Waiting for endpoints to shut down
TestCatalogListDatacentersCommand - 2019/12/06 06:27:47.668180 [INFO] agent: Endpoints down
--- PASS: TestCatalogListDatacentersCommand (4.87s)
PASS
ok  	github.com/hashicorp/consul/command/catalog/list/dc	5.265s
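
TestCatalogListDatacentersCommand issues a single GET /v1/catalog/datacenters. The equivalent Go API call, assuming a local agent, is roughly:

    package main

    import (
        "fmt"
        "log"

        "github.com/hashicorp/consul/api"
    )

    func main() {
        client, err := api.NewClient(api.DefaultConfig()) // 127.0.0.1:8500 by default
        if err != nil {
            log.Fatal(err)
        }

        // GET /v1/catalog/datacenters, the single request the test issues.
        dcs, err := client.Catalog().Datacenters()
        if err != nil {
            log.Fatal(err)
        }
        fmt.Println(dcs)
    }
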
=== RUN   TestCatalogListNodesCommand_noTabs
=== PAUSE TestCatalogListNodesCommand_noTabs
=== RUN   TestCatalogListNodesCommand_Validation
=== PAUSE TestCatalogListNodesCommand_Validation
=== RUN   TestCatalogListNodesCommand
=== PAUSE TestCatalogListNodesCommand
=== CONT  TestCatalogListNodesCommand_noTabs
--- PASS: TestCatalogListNodesCommand_noTabs (0.00s)
=== CONT  TestCatalogListNodesCommand
=== CONT  TestCatalogListNodesCommand_Validation
--- PASS: TestCatalogListNodesCommand_Validation (0.00s)
WARNING: bootstrap = true: do not enable unless necessary
TestCatalogListNodesCommand - 2019/12/06 06:27:41.011969 [WARN] agent: Node name "Node 6bf44986-5c64-2e06-6120-2cb73213079b" will not be discoverable via DNS due to invalid characters. Valid characters include all alpha-numerics and dashes.
TestCatalogListNodesCommand - 2019/12/06 06:27:41.091398 [DEBUG] tlsutil: Update with version 1
TestCatalogListNodesCommand - 2019/12/06 06:27:41.103778 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
2019/12/06 06:27:42 [INFO]  raft: Initial configuration (index=1): [{Suffrage:Voter ID:6bf44986-5c64-2e06-6120-2cb73213079b Address:127.0.0.1:40006}]
2019/12/06 06:27:42 [INFO]  raft: Node at 127.0.0.1:40006 [Follower] entering Follower state (Leader: "")
2019/12/06 06:27:42 [WARN]  raft: Heartbeat timeout from "" reached, starting election
2019/12/06 06:27:42 [INFO]  raft: Node at 127.0.0.1:40006 [Candidate] entering Candidate state in term 2
TestCatalogListNodesCommand - 2019/12/06 06:27:42.242026 [WARN] raft: Unable to get address for server id 6bf44986-5c64-2e06-6120-2cb73213079b, using fallback address 127.0.0.1:40006: Could not find address for server id 6bf44986-5c64-2e06-6120-2cb73213079b
TestCatalogListNodesCommand - 2019/12/06 06:27:42.402027 [INFO] serf: EventMemberJoin: Node 6bf44986-5c64-2e06-6120-2cb73213079b.dc1 127.0.0.1
TestCatalogListNodesCommand - 2019/12/06 06:27:42.408250 [INFO] serf: EventMemberJoin: Node 6bf44986-5c64-2e06-6120-2cb73213079b 127.0.0.1
TestCatalogListNodesCommand - 2019/12/06 06:27:42.409698 [INFO] consul: Handled member-join event for server "Node 6bf44986-5c64-2e06-6120-2cb73213079b.dc1" in area "wan"
TestCatalogListNodesCommand - 2019/12/06 06:27:42.413281 [INFO] consul: Adding LAN server Node 6bf44986-5c64-2e06-6120-2cb73213079b (Addr: tcp/127.0.0.1:40006) (DC: dc1)
TestCatalogListNodesCommand - 2019/12/06 06:27:42.415004 [INFO] agent: Started DNS server 127.0.0.1:40001 (udp)
TestCatalogListNodesCommand - 2019/12/06 06:27:42.415076 [INFO] agent: Started DNS server 127.0.0.1:40001 (tcp)
TestCatalogListNodesCommand - 2019/12/06 06:27:42.417512 [INFO] agent: Started HTTP server on 127.0.0.1:40002 (tcp)
TestCatalogListNodesCommand - 2019/12/06 06:27:42.417645 [INFO] agent: started state syncer
2019/12/06 06:27:42 [INFO]  raft: Election won. Tally: 1
2019/12/06 06:27:42 [INFO]  raft: Node at 127.0.0.1:40006 [Leader] entering Leader state
TestCatalogListNodesCommand - 2019/12/06 06:27:42.867165 [INFO] consul: cluster leadership acquired
TestCatalogListNodesCommand - 2019/12/06 06:27:42.867780 [INFO] consul: New leader elected: Node 6bf44986-5c64-2e06-6120-2cb73213079b
TestCatalogListNodesCommand - 2019/12/06 06:27:43.259075 [INFO] agent: Synced node info
TestCatalogListNodesCommand - 2019/12/06 06:27:43.259244 [DEBUG] agent: Node info in sync
TestCatalogListNodesCommand - 2019/12/06 06:27:44.778179 [DEBUG] agent: Node info in sync
TestCatalogListNodesCommand - 2019/12/06 06:27:44.867265 [INFO] connect: initialized primary datacenter CA with provider "consul"
TestCatalogListNodesCommand - 2019/12/06 06:27:44.867763 [DEBUG] consul: Skipping self join check for "Node 6bf44986-5c64-2e06-6120-2cb73213079b" since the cluster is too small
TestCatalogListNodesCommand - 2019/12/06 06:27:44.867926 [INFO] consul: member 'Node 6bf44986-5c64-2e06-6120-2cb73213079b' joined, marking health alive
=== RUN   TestCatalogListNodesCommand/simple
TestCatalogListNodesCommand - 2019/12/06 06:27:45.179394 [DEBUG] http: Request GET /v1/catalog/nodes (4.856113ms) from=127.0.0.1:56864
=== RUN   TestCatalogListNodesCommand/detailed
TestCatalogListNodesCommand - 2019/12/06 06:27:45.191198 [DEBUG] http: Request GET /v1/catalog/nodes (1.423034ms) from=127.0.0.1:56866
=== RUN   TestCatalogListNodesCommand/node-meta
TestCatalogListNodesCommand - 2019/12/06 06:27:45.200736 [DEBUG] http: Request GET /v1/catalog/nodes?node-meta=foo%3Abar (1.199028ms) from=127.0.0.1:56868
=== RUN   TestCatalogListNodesCommand/filter
TestCatalogListNodesCommand - 2019/12/06 06:27:45.209558 [DEBUG] http: Request GET /v1/catalog/nodes?filter=Meta.foo+%3D%3D+bar (2.833066ms) from=127.0.0.1:56870
=== RUN   TestCatalogListNodesCommand/near
TestCatalogListNodesCommand - 2019/12/06 06:27:45.216209 [DEBUG] http: Request GET /v1/catalog/nodes?near=_agent (1.396365ms) from=127.0.0.1:56872
=== RUN   TestCatalogListNodesCommand/service_present
TestCatalogListNodesCommand - 2019/12/06 06:27:45.227211 [DEBUG] http: Request GET /v1/catalog/service/consul (3.855089ms) from=127.0.0.1:56874
=== RUN   TestCatalogListNodesCommand/service_missing
TestCatalogListNodesCommand - 2019/12/06 06:27:45.237731 [DEBUG] http: Request GET /v1/catalog/service/this-service-will-literally-never-exist (1.847043ms) from=127.0.0.1:56876
TestCatalogListNodesCommand - 2019/12/06 06:27:45.239362 [INFO] agent: Requesting shutdown
TestCatalogListNodesCommand - 2019/12/06 06:27:45.239442 [INFO] consul: shutting down server
TestCatalogListNodesCommand - 2019/12/06 06:27:45.239495 [WARN] serf: Shutdown without a Leave
TestCatalogListNodesCommand - 2019/12/06 06:27:45.358120 [WARN] serf: Shutdown without a Leave
TestCatalogListNodesCommand - 2019/12/06 06:27:45.467444 [INFO] manager: shutting down
TestCatalogListNodesCommand - 2019/12/06 06:27:45.468004 [INFO] agent: consul server down
TestCatalogListNodesCommand - 2019/12/06 06:27:45.468108 [INFO] agent: shutdown complete
TestCatalogListNodesCommand - 2019/12/06 06:27:45.468186 [INFO] agent: Stopping DNS server 127.0.0.1:40001 (tcp)
TestCatalogListNodesCommand - 2019/12/06 06:27:45.468392 [INFO] agent: Stopping DNS server 127.0.0.1:40001 (udp)
TestCatalogListNodesCommand - 2019/12/06 06:27:45.468586 [INFO] agent: Stopping HTTP server 127.0.0.1:40002 (tcp)
TestCatalogListNodesCommand - 2019/12/06 06:27:45.473661 [INFO] agent: Waiting for endpoints to shut down
TestCatalogListNodesCommand - 2019/12/06 06:27:45.473790 [INFO] agent: Endpoints down
--- PASS: TestCatalogListNodesCommand (4.57s)
    --- PASS: TestCatalogListNodesCommand/simple (0.02s)
    --- PASS: TestCatalogListNodesCommand/detailed (0.01s)
    --- PASS: TestCatalogListNodesCommand/node-meta (0.01s)
    --- PASS: TestCatalogListNodesCommand/filter (0.01s)
    --- PASS: TestCatalogListNodesCommand/near (0.01s)
    --- PASS: TestCatalogListNodesCommand/service_present (0.01s)
    --- PASS: TestCatalogListNodesCommand/service_missing (0.01s)
PASS
ok  	github.com/hashicorp/consul/command/catalog/list/nodes	5.341s
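
The TestCatalogListNodesCommand subtests hit GET /v1/catalog/nodes with node-meta, filter and near query parameters. A hedged sketch of the filter variant through the Go API client (the filter expression is illustrative):

    package main

    import (
        "fmt"
        "log"

        "github.com/hashicorp/consul/api"
    )

    func main() {
        client, err := api.NewClient(api.DefaultConfig())
        if err != nil {
            log.Fatal(err)
        }

        // GET /v1/catalog/nodes?filter=..., the same shape as the
        // ?filter=Meta.foo == bar request in the subtest above.
        nodes, _, err := client.Catalog().Nodes(&api.QueryOptions{
            Filter: `Meta.foo == "bar"`, // illustrative filter expression
        })
        if err != nil {
            log.Fatal(err)
        }
        for _, n := range nodes {
            fmt.Println(n.Node, n.Address)
        }
    }
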
=== RUN   TestCatalogListServicesCommand_noTabs
=== PAUSE TestCatalogListServicesCommand_noTabs
=== RUN   TestCatalogListServicesCommand_Validation
=== PAUSE TestCatalogListServicesCommand_Validation
=== RUN   TestCatalogListServicesCommand
=== PAUSE TestCatalogListServicesCommand
=== CONT  TestCatalogListServicesCommand_noTabs
--- PASS: TestCatalogListServicesCommand_noTabs (0.00s)
=== CONT  TestCatalogListServicesCommand
=== CONT  TestCatalogListServicesCommand_Validation
--- PASS: TestCatalogListServicesCommand_Validation (0.00s)
WARNING: bootstrap = true: do not enable unless necessary
TestCatalogListServicesCommand - 2019/12/06 06:27:43.152414 [WARN] agent: Node name "Node 0b44f072-bef7-9a0d-eb20-eac50920102b" will not be discoverable via DNS due to invalid characters. Valid characters include all alpha-numerics and dashes.
TestCatalogListServicesCommand - 2019/12/06 06:27:43.153025 [DEBUG] tlsutil: Update with version 1
TestCatalogListServicesCommand - 2019/12/06 06:27:43.170841 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
2019/12/06 06:27:44 [INFO]  raft: Initial configuration (index=1): [{Suffrage:Voter ID:0b44f072-bef7-9a0d-eb20-eac50920102b Address:127.0.0.1:20506}]
2019/12/06 06:27:44 [INFO]  raft: Node at 127.0.0.1:20506 [Follower] entering Follower state (Leader: "")
TestCatalogListServicesCommand - 2019/12/06 06:27:44.733632 [INFO] serf: EventMemberJoin: Node 0b44f072-bef7-9a0d-eb20-eac50920102b.dc1 127.0.0.1
TestCatalogListServicesCommand - 2019/12/06 06:27:44.742221 [INFO] serf: EventMemberJoin: Node 0b44f072-bef7-9a0d-eb20-eac50920102b 127.0.0.1
TestCatalogListServicesCommand - 2019/12/06 06:27:44.743761 [INFO] consul: Adding LAN server Node 0b44f072-bef7-9a0d-eb20-eac50920102b (Addr: tcp/127.0.0.1:20506) (DC: dc1)
TestCatalogListServicesCommand - 2019/12/06 06:27:44.744046 [INFO] consul: Handled member-join event for server "Node 0b44f072-bef7-9a0d-eb20-eac50920102b.dc1" in area "wan"
TestCatalogListServicesCommand - 2019/12/06 06:27:44.745748 [INFO] agent: Started DNS server 127.0.0.1:20501 (tcp)
TestCatalogListServicesCommand - 2019/12/06 06:27:44.745859 [INFO] agent: Started DNS server 127.0.0.1:20501 (udp)
TestCatalogListServicesCommand - 2019/12/06 06:27:44.748645 [INFO] agent: Started HTTP server on 127.0.0.1:20502 (tcp)
TestCatalogListServicesCommand - 2019/12/06 06:27:44.748774 [INFO] agent: started state syncer
2019/12/06 06:27:44 [WARN]  raft: Heartbeat timeout from "" reached, starting election
2019/12/06 06:27:44 [INFO]  raft: Node at 127.0.0.1:20506 [Candidate] entering Candidate state in term 2
2019/12/06 06:27:46 [INFO]  raft: Election won. Tally: 1
2019/12/06 06:27:46 [INFO]  raft: Node at 127.0.0.1:20506 [Leader] entering Leader state
TestCatalogListServicesCommand - 2019/12/06 06:27:46.717056 [INFO] consul: cluster leadership acquired
TestCatalogListServicesCommand - 2019/12/06 06:27:46.717654 [INFO] consul: New leader elected: Node 0b44f072-bef7-9a0d-eb20-eac50920102b
TestCatalogListServicesCommand - 2019/12/06 06:27:47.309398 [INFO] agent: Synced node info
TestCatalogListServicesCommand - 2019/12/06 06:27:49.400594 [INFO] connect: initialized primary datacenter CA with provider "consul"
TestCatalogListServicesCommand - 2019/12/06 06:27:49.401037 [DEBUG] consul: Skipping self join check for "Node 0b44f072-bef7-9a0d-eb20-eac50920102b" since the cluster is too small
TestCatalogListServicesCommand - 2019/12/06 06:27:49.401169 [INFO] consul: member 'Node 0b44f072-bef7-9a0d-eb20-eac50920102b' joined, marking health alive
TestCatalogListServicesCommand - 2019/12/06 06:27:49.653468 [DEBUG] agent: Node info in sync
TestCatalogListServicesCommand - 2019/12/06 06:27:49.653571 [DEBUG] agent: Node info in sync
TestCatalogListServicesCommand - 2019/12/06 06:27:49.741882 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
TestCatalogListServicesCommand - 2019/12/06 06:27:50.217897 [INFO] agent: Synced service "testing"
TestCatalogListServicesCommand - 2019/12/06 06:27:50.217965 [DEBUG] agent: Node info in sync
TestCatalogListServicesCommand - 2019/12/06 06:27:50.218073 [DEBUG] http: Request PUT /v1/agent/service/register (457.26862ms) from=127.0.0.1:48276
TestCatalogListServicesCommand - 2019/12/06 06:27:50.218132 [DEBUG] agent: Service "testing" in sync
TestCatalogListServicesCommand - 2019/12/06 06:27:50.218396 [DEBUG] agent: Node info in sync
=== RUN   TestCatalogListServicesCommand/simple
TestCatalogListServicesCommand - 2019/12/06 06:27:50.225600 [DEBUG] http: Request GET /v1/catalog/services (1.750374ms) from=127.0.0.1:48278
=== RUN   TestCatalogListServicesCommand/tags
TestCatalogListServicesCommand - 2019/12/06 06:27:50.233287 [DEBUG] http: Request GET /v1/catalog/services (1.730373ms) from=127.0.0.1:48280
=== RUN   TestCatalogListServicesCommand/node_missing
TestCatalogListServicesCommand - 2019/12/06 06:27:50.246636 [DEBUG] http: Request GET /v1/catalog/node/not-a-real-node (6.199811ms) from=127.0.0.1:48282
=== RUN   TestCatalogListServicesCommand/node_present
TestCatalogListServicesCommand - 2019/12/06 06:27:50.259029 [DEBUG] http: Request GET /v1/catalog/node/Node%200b44f072-bef7-9a0d-eb20-eac50920102b (2.797398ms) from=127.0.0.1:48284
=== RUN   TestCatalogListServicesCommand/node-meta
TestCatalogListServicesCommand - 2019/12/06 06:27:50.268835 [DEBUG] http: Request GET /v1/catalog/services?node-meta=foo%3Abar (1.88071ms) from=127.0.0.1:48286
TestCatalogListServicesCommand - 2019/12/06 06:27:50.270583 [INFO] agent: Requesting shutdown
TestCatalogListServicesCommand - 2019/12/06 06:27:50.270757 [INFO] consul: shutting down server
TestCatalogListServicesCommand - 2019/12/06 06:27:50.270892 [WARN] serf: Shutdown without a Leave
TestCatalogListServicesCommand - 2019/12/06 06:27:50.333206 [WARN] serf: Shutdown without a Leave
TestCatalogListServicesCommand - 2019/12/06 06:27:50.441819 [INFO] manager: shutting down
TestCatalogListServicesCommand - 2019/12/06 06:27:50.444083 [INFO] agent: consul server down
TestCatalogListServicesCommand - 2019/12/06 06:27:50.444676 [INFO] agent: shutdown complete
TestCatalogListServicesCommand - 2019/12/06 06:27:50.444793 [INFO] agent: Stopping DNS server 127.0.0.1:20501 (tcp)
TestCatalogListServicesCommand - 2019/12/06 06:27:50.445052 [INFO] agent: Stopping DNS server 127.0.0.1:20501 (udp)
TestCatalogListServicesCommand - 2019/12/06 06:27:50.445258 [INFO] agent: Stopping HTTP server 127.0.0.1:20502 (tcp)
TestCatalogListServicesCommand - 2019/12/06 06:27:50.446455 [INFO] agent: Waiting for endpoints to shut down
TestCatalogListServicesCommand - 2019/12/06 06:27:50.446602 [INFO] agent: Endpoints down
--- PASS: TestCatalogListServicesCommand (7.44s)
    --- PASS: TestCatalogListServicesCommand/simple (0.01s)
    --- PASS: TestCatalogListServicesCommand/tags (0.01s)
    --- PASS: TestCatalogListServicesCommand/node_missing (0.01s)
    --- PASS: TestCatalogListServicesCommand/node_present (0.01s)
    --- PASS: TestCatalogListServicesCommand/node-meta (0.01s)
PASS
ok  	github.com/hashicorp/consul/command/catalog/list/services	7.885s
?   	github.com/hashicorp/consul/command/config	[no test files]
=== RUN   TestConfigDelete_noTabs
=== PAUSE TestConfigDelete_noTabs
=== RUN   TestConfigDelete
=== PAUSE TestConfigDelete
=== RUN   TestConfigDelete_InvalidArgs
=== PAUSE TestConfigDelete_InvalidArgs
=== CONT  TestConfigDelete_noTabs
=== CONT  TestConfigDelete_InvalidArgs
=== CONT  TestConfigDelete
--- PASS: TestConfigDelete_noTabs (0.00s)
=== RUN   TestConfigDelete_InvalidArgs/no_kind
=== RUN   TestConfigDelete_InvalidArgs/no_name
--- PASS: TestConfigDelete_InvalidArgs (0.01s)
    --- PASS: TestConfigDelete_InvalidArgs/no_kind (0.00s)
    --- PASS: TestConfigDelete_InvalidArgs/no_name (0.00s)
WARNING: bootstrap = true: do not enable unless necessary
TestConfigDelete - 2019/12/06 06:27:43.157110 [WARN] agent: Node name "Node 4d0bfae0-f580-0dbb-1365-74a1cea477cc" will not be discoverable via DNS due to invalid characters. Valid characters include all alpha-numerics and dashes.
TestConfigDelete - 2019/12/06 06:27:43.159835 [DEBUG] tlsutil: Update with version 1
TestConfigDelete - 2019/12/06 06:27:43.167050 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
2019/12/06 06:27:44 [INFO]  raft: Initial configuration (index=1): [{Suffrage:Voter ID:4d0bfae0-f580-0dbb-1365-74a1cea477cc Address:127.0.0.1:28006}]
2019/12/06 06:27:44 [INFO]  raft: Node at 127.0.0.1:28006 [Follower] entering Follower state (Leader: "")
2019/12/06 06:27:44 [WARN]  raft: Heartbeat timeout from "" reached, starting election
2019/12/06 06:27:44 [INFO]  raft: Node at 127.0.0.1:28006 [Candidate] entering Candidate state in term 2
TestConfigDelete - 2019/12/06 06:27:44.733972 [INFO] serf: EventMemberJoin: Node 4d0bfae0-f580-0dbb-1365-74a1cea477cc.dc1 127.0.0.1
TestConfigDelete - 2019/12/06 06:27:44.738700 [INFO] serf: EventMemberJoin: Node 4d0bfae0-f580-0dbb-1365-74a1cea477cc 127.0.0.1
TestConfigDelete - 2019/12/06 06:27:44.741186 [INFO] consul: Handled member-join event for server "Node 4d0bfae0-f580-0dbb-1365-74a1cea477cc.dc1" in area "wan"
TestConfigDelete - 2019/12/06 06:27:44.741661 [INFO] consul: Adding LAN server Node 4d0bfae0-f580-0dbb-1365-74a1cea477cc (Addr: tcp/127.0.0.1:28006) (DC: dc1)
TestConfigDelete - 2019/12/06 06:27:44.742081 [INFO] agent: Started DNS server 127.0.0.1:28001 (udp)
TestConfigDelete - 2019/12/06 06:27:44.742508 [INFO] agent: Started DNS server 127.0.0.1:28001 (tcp)
TestConfigDelete - 2019/12/06 06:27:44.746263 [INFO] agent: Started HTTP server on 127.0.0.1:28002 (tcp)
TestConfigDelete - 2019/12/06 06:27:44.746431 [INFO] agent: started state syncer
2019/12/06 06:27:45 [INFO]  raft: Election won. Tally: 1
2019/12/06 06:27:45 [INFO]  raft: Node at 127.0.0.1:28006 [Leader] entering Leader state
TestConfigDelete - 2019/12/06 06:27:45.694506 [INFO] consul: cluster leadership acquired
TestConfigDelete - 2019/12/06 06:27:45.695219 [INFO] consul: New leader elected: Node 4d0bfae0-f580-0dbb-1365-74a1cea477cc
TestConfigDelete - 2019/12/06 06:27:47.187775 [DEBUG] http: Request PUT /v1/config (1.442733176s) from=127.0.0.1:57860
TestConfigDelete - 2019/12/06 06:27:47.189548 [INFO] agent: Synced node info
TestConfigDelete - 2019/12/06 06:27:47.190667 [DEBUG] agent: Node info in sync
TestConfigDelete - 2019/12/06 06:27:48.186554 [DEBUG] http: Request DELETE /v1/config/service-defaults/web (983.475842ms) from=127.0.0.1:57864
TestConfigDelete - 2019/12/06 06:27:48.188846 [ERR] http: Request GET /v1/config/service-defaults/web, error: Config entry not found for "service-defaults" / "web" from=127.0.0.1:57860
TestConfigDelete - 2019/12/06 06:27:48.190662 [DEBUG] http: Request GET /v1/config/service-defaults/web (2.106382ms) from=127.0.0.1:57860
TestConfigDelete - 2019/12/06 06:27:48.192020 [INFO] agent: Requesting shutdown
TestConfigDelete - 2019/12/06 06:27:48.192128 [INFO] consul: shutting down server
TestConfigDelete - 2019/12/06 06:27:48.192177 [WARN] serf: Shutdown without a Leave
TestConfigDelete - 2019/12/06 06:27:48.399820 [WARN] serf: Shutdown without a Leave
TestConfigDelete - 2019/12/06 06:27:48.566645 [INFO] manager: shutting down
TestConfigDelete - 2019/12/06 06:27:48.567194 [ERR] consul: failed to establish leadership: leadership lost while committing log
TestConfigDelete - 2019/12/06 06:27:48.567714 [INFO] agent: consul server down
TestConfigDelete - 2019/12/06 06:27:48.567775 [INFO] agent: shutdown complete
TestConfigDelete - 2019/12/06 06:27:48.567837 [INFO] agent: Stopping DNS server 127.0.0.1:28001 (tcp)
TestConfigDelete - 2019/12/06 06:27:48.568225 [INFO] agent: Stopping DNS server 127.0.0.1:28001 (udp)
TestConfigDelete - 2019/12/06 06:27:48.568547 [INFO] agent: Stopping HTTP server 127.0.0.1:28002 (tcp)
TestConfigDelete - 2019/12/06 06:27:48.569308 [INFO] agent: Waiting for endpoints to shut down
TestConfigDelete - 2019/12/06 06:27:48.569395 [INFO] agent: Endpoints down
--- PASS: TestConfigDelete (5.66s)
PASS
ok  	github.com/hashicorp/consul/command/config/delete	6.138s
=== RUN   TestConfigList_noTabs
=== PAUSE TestConfigList_noTabs
=== RUN   TestConfigList
WARNING: bootstrap = true: do not enable unless necessary
TestConfigList - 2019/12/06 06:31:44.456395 [WARN] agent: Node name "Node d02662b1-3ed0-90bb-fbb8-ff5f87a38bde" will not be discoverable via DNS due to invalid characters. Valid characters include all alpha-numerics and dashes.
TestConfigList - 2019/12/06 06:31:44.596865 [DEBUG] tlsutil: Update with version 1
TestConfigList - 2019/12/06 06:31:44.611874 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
2019/12/06 06:31:47 [INFO]  raft: Initial configuration (index=1): [{Suffrage:Voter ID:d02662b1-3ed0-90bb-fbb8-ff5f87a38bde Address:127.0.0.1:13006}]
2019/12/06 06:31:47 [INFO]  raft: Node at 127.0.0.1:13006 [Follower] entering Follower state (Leader: "")
TestConfigList - 2019/12/06 06:31:47.322905 [INFO] serf: EventMemberJoin: Node d02662b1-3ed0-90bb-fbb8-ff5f87a38bde.dc1 127.0.0.1
TestConfigList - 2019/12/06 06:31:47.326084 [INFO] serf: EventMemberJoin: Node d02662b1-3ed0-90bb-fbb8-ff5f87a38bde 127.0.0.1
TestConfigList - 2019/12/06 06:31:47.326888 [INFO] consul: Adding LAN server Node d02662b1-3ed0-90bb-fbb8-ff5f87a38bde (Addr: tcp/127.0.0.1:13006) (DC: dc1)
TestConfigList - 2019/12/06 06:31:47.327035 [INFO] consul: Handled member-join event for server "Node d02662b1-3ed0-90bb-fbb8-ff5f87a38bde.dc1" in area "wan"
TestConfigList - 2019/12/06 06:31:47.327475 [INFO] agent: Started DNS server 127.0.0.1:13001 (tcp)
TestConfigList - 2019/12/06 06:31:47.327542 [INFO] agent: Started DNS server 127.0.0.1:13001 (udp)
TestConfigList - 2019/12/06 06:31:47.330061 [INFO] agent: Started HTTP server on 127.0.0.1:13002 (tcp)
TestConfigList - 2019/12/06 06:31:47.330218 [INFO] agent: started state syncer
2019/12/06 06:31:47 [WARN]  raft: Heartbeat timeout from "" reached, starting election
2019/12/06 06:31:47 [INFO]  raft: Node at 127.0.0.1:13006 [Candidate] entering Candidate state in term 2
2019/12/06 06:31:50 [INFO]  raft: Election won. Tally: 1
2019/12/06 06:31:50 [INFO]  raft: Node at 127.0.0.1:13006 [Leader] entering Leader state
TestConfigList - 2019/12/06 06:31:50.363034 [INFO] consul: cluster leadership acquired
TestConfigList - 2019/12/06 06:31:50.363638 [INFO] consul: New leader elected: Node d02662b1-3ed0-90bb-fbb8-ff5f87a38bde
TestConfigList - 2019/12/06 06:31:51.105998 [INFO] agent: Synced node info
TestConfigList - 2019/12/06 06:31:51.110321 [DEBUG] http: Request PUT /v1/config (714.814608ms) from=127.0.0.1:58718
TestConfigList - 2019/12/06 06:31:52.534242 [DEBUG] http: Request PUT /v1/config (1.150690402s) from=127.0.0.1:58718
TestConfigList - 2019/12/06 06:31:53.240385 [DEBUG] http: Request PUT /v1/config (698.668233ms) from=127.0.0.1:58718
TestConfigList - 2019/12/06 06:31:53.246093 [DEBUG] http: Request GET /v1/config/service-defaults (2.342055ms) from=127.0.0.1:58730
TestConfigList - 2019/12/06 06:31:53.248713 [INFO] agent: Requesting shutdown
TestConfigList - 2019/12/06 06:31:53.248801 [INFO] consul: shutting down server
TestConfigList - 2019/12/06 06:31:53.248846 [WARN] serf: Shutdown without a Leave
TestConfigList - 2019/12/06 06:31:53.381523 [DEBUG] agent: Node info in sync
TestConfigList - 2019/12/06 06:31:53.381619 [DEBUG] agent: Node info in sync
TestConfigList - 2019/12/06 06:31:53.487198 [WARN] serf: Shutdown without a Leave
TestConfigList - 2019/12/06 06:31:53.695606 [INFO] manager: shutting down
TestConfigList - 2019/12/06 06:31:53.855276 [INFO] agent: consul server down
TestConfigList - 2019/12/06 06:31:53.855405 [INFO] agent: shutdown complete
TestConfigList - 2019/12/06 06:31:53.855511 [INFO] agent: Stopping DNS server 127.0.0.1:13001 (tcp)
TestConfigList - 2019/12/06 06:31:53.855740 [INFO] agent: Stopping DNS server 127.0.0.1:13001 (udp)
TestConfigList - 2019/12/06 06:31:53.855890 [ERR] consul: failed to establish leadership: error configuring provider: leadership lost while committing log
TestConfigList - 2019/12/06 06:31:53.856201 [INFO] agent: Stopping HTTP server 127.0.0.1:13002 (tcp)
TestConfigList - 2019/12/06 06:31:53.857274 [INFO] agent: Waiting for endpoints to shut down
TestConfigList - 2019/12/06 06:31:53.857338 [INFO] agent: Endpoints down
--- PASS: TestConfigList (9.58s)
=== RUN   TestConfigList_InvalidArgs
=== PAUSE TestConfigList_InvalidArgs
=== CONT  TestConfigList_noTabs
=== CONT  TestConfigList_InvalidArgs
=== RUN   TestConfigList_InvalidArgs/no_kind
--- PASS: TestConfigList_noTabs (0.00s)
--- PASS: TestConfigList_InvalidArgs (0.00s)
    --- PASS: TestConfigList_InvalidArgs/no_kind (0.00s)
PASS
ok  	github.com/hashicorp/consul/command/config/list	11.030s
=== RUN   TestConfigRead_noTabs
=== PAUSE TestConfigRead_noTabs
=== RUN   TestConfigRead
=== PAUSE TestConfigRead
=== RUN   TestConfigRead_InvalidArgs
=== PAUSE TestConfigRead_InvalidArgs
=== CONT  TestConfigRead_noTabs
=== CONT  TestConfigRead
=== CONT  TestConfigRead_InvalidArgs
=== RUN   TestConfigRead_InvalidArgs/no_kind
--- PASS: TestConfigRead_noTabs (0.00s)
=== RUN   TestConfigRead_InvalidArgs/no_name
--- PASS: TestConfigRead_InvalidArgs (0.01s)
    --- PASS: TestConfigRead_InvalidArgs/no_kind (0.00s)
    --- PASS: TestConfigRead_InvalidArgs/no_name (0.00s)
WARNING: bootstrap = true: do not enable unless necessary
TestConfigRead - 2019/12/06 06:31:44.531496 [WARN] agent: Node name "Node b221c612-15e7-5b39-b0ac-b8ca082db6a2" will not be discoverable via DNS due to invalid characters. Valid characters include all alpha-numerics and dashes.
TestConfigRead - 2019/12/06 06:31:44.600207 [DEBUG] tlsutil: Update with version 1
TestConfigRead - 2019/12/06 06:31:44.611807 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
2019/12/06 06:31:48 [INFO]  raft: Initial configuration (index=1): [{Suffrage:Voter ID:b221c612-15e7-5b39-b0ac-b8ca082db6a2 Address:127.0.0.1:31006}]
2019/12/06 06:31:48 [INFO]  raft: Node at 127.0.0.1:31006 [Follower] entering Follower state (Leader: "")
TestConfigRead - 2019/12/06 06:31:48.793615 [INFO] serf: EventMemberJoin: Node b221c612-15e7-5b39-b0ac-b8ca082db6a2.dc1 127.0.0.1
TestConfigRead - 2019/12/06 06:31:48.797560 [INFO] serf: EventMemberJoin: Node b221c612-15e7-5b39-b0ac-b8ca082db6a2 127.0.0.1
TestConfigRead - 2019/12/06 06:31:48.798840 [INFO] consul: Adding LAN server Node b221c612-15e7-5b39-b0ac-b8ca082db6a2 (Addr: tcp/127.0.0.1:31006) (DC: dc1)
TestConfigRead - 2019/12/06 06:31:48.799546 [INFO] consul: Handled member-join event for server "Node b221c612-15e7-5b39-b0ac-b8ca082db6a2.dc1" in area "wan"
TestConfigRead - 2019/12/06 06:31:48.799627 [INFO] agent: Started DNS server 127.0.0.1:31001 (udp)
TestConfigRead - 2019/12/06 06:31:48.800035 [INFO] agent: Started DNS server 127.0.0.1:31001 (tcp)
TestConfigRead - 2019/12/06 06:31:48.802712 [INFO] agent: Started HTTP server on 127.0.0.1:31002 (tcp)
TestConfigRead - 2019/12/06 06:31:48.802798 [INFO] agent: started state syncer
2019/12/06 06:31:48 [WARN]  raft: Heartbeat timeout from "" reached, starting election
2019/12/06 06:31:48 [INFO]  raft: Node at 127.0.0.1:31006 [Candidate] entering Candidate state in term 2
2019/12/06 06:31:50 [INFO]  raft: Election won. Tally: 1
2019/12/06 06:31:50 [INFO]  raft: Node at 127.0.0.1:31006 [Leader] entering Leader state
TestConfigRead - 2019/12/06 06:31:50.900359 [INFO] consul: cluster leadership acquired
TestConfigRead - 2019/12/06 06:31:50.900873 [INFO] consul: New leader elected: Node b221c612-15e7-5b39-b0ac-b8ca082db6a2
TestConfigRead - 2019/12/06 06:31:52.063821 [INFO] agent: Synced node info
TestConfigRead - 2019/12/06 06:31:52.070490 [DEBUG] http: Request PUT /v1/config (774.465661ms) from=127.0.0.1:60428
TestConfigRead - 2019/12/06 06:31:52.087295 [DEBUG] http: Request GET /v1/config/service-defaults/web (1.453367ms) from=127.0.0.1:60432
TestConfigRead - 2019/12/06 06:31:52.090019 [INFO] agent: Requesting shutdown
TestConfigRead - 2019/12/06 06:31:52.090266 [INFO] consul: shutting down server
TestConfigRead - 2019/12/06 06:31:52.090403 [WARN] serf: Shutdown without a Leave
TestConfigRead - 2019/12/06 06:31:52.437175 [WARN] serf: Shutdown without a Leave
TestConfigRead - 2019/12/06 06:31:52.637274 [INFO] manager: shutting down
TestConfigRead - 2019/12/06 06:31:52.654113 [ERR] agent: failed to sync remote state: No cluster leader
TestConfigRead - 2019/12/06 06:31:53.237477 [ERR] consul: failed to establish leadership: leadership lost while committing log
TestConfigRead - 2019/12/06 06:31:53.237830 [ERR] consul: failed to transfer leadership attempt 0/3: raft is already shutdown
TestConfigRead - 2019/12/06 06:31:53.238554 [INFO] agent: consul server down
TestConfigRead - 2019/12/06 06:31:53.238620 [INFO] agent: shutdown complete
TestConfigRead - 2019/12/06 06:31:53.238681 [INFO] agent: Stopping DNS server 127.0.0.1:31001 (tcp)
TestConfigRead - 2019/12/06 06:31:53.238837 [INFO] agent: Stopping DNS server 127.0.0.1:31001 (udp)
TestConfigRead - 2019/12/06 06:31:53.239008 [INFO] agent: Stopping HTTP server 127.0.0.1:31002 (tcp)
TestConfigRead - 2019/12/06 06:31:53.240170 [INFO] agent: Waiting for endpoints to shut down
TestConfigRead - 2019/12/06 06:31:53.240287 [INFO] agent: Endpoints down
--- PASS: TestConfigRead (8.87s)
PASS
ok  	github.com/hashicorp/consul/command/config/read	9.100s
=== RUN   TestConfigWrite_noTabs
=== PAUSE TestConfigWrite_noTabs
=== RUN   TestConfigWrite
=== PAUSE TestConfigWrite
=== CONT  TestConfigWrite
=== CONT  TestConfigWrite_noTabs
--- PASS: TestConfigWrite_noTabs (0.00s)
WARNING: bootstrap = true: do not enable unless necessary
TestConfigWrite - 2019/12/06 06:31:45.113754 [WARN] agent: Node name "Node 76953c09-f7f9-5bd6-dcf6-f2270acec230" will not be discoverable via DNS due to invalid characters. Valid characters include all alpha-numerics and dashes.
TestConfigWrite - 2019/12/06 06:31:45.114390 [DEBUG] tlsutil: Update with version 1
TestConfigWrite - 2019/12/06 06:31:45.120237 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
2019/12/06 06:31:48 [INFO]  raft: Initial configuration (index=1): [{Suffrage:Voter ID:76953c09-f7f9-5bd6-dcf6-f2270acec230 Address:127.0.0.1:34006}]
2019/12/06 06:31:48 [INFO]  raft: Node at 127.0.0.1:34006 [Follower] entering Follower state (Leader: "")
TestConfigWrite - 2019/12/06 06:31:48.796027 [INFO] serf: EventMemberJoin: Node 76953c09-f7f9-5bd6-dcf6-f2270acec230.dc1 127.0.0.1
TestConfigWrite - 2019/12/06 06:31:48.801504 [INFO] serf: EventMemberJoin: Node 76953c09-f7f9-5bd6-dcf6-f2270acec230 127.0.0.1
TestConfigWrite - 2019/12/06 06:31:48.803153 [INFO] consul: Adding LAN server Node 76953c09-f7f9-5bd6-dcf6-f2270acec230 (Addr: tcp/127.0.0.1:34006) (DC: dc1)
TestConfigWrite - 2019/12/06 06:31:48.804033 [INFO] consul: Handled member-join event for server "Node 76953c09-f7f9-5bd6-dcf6-f2270acec230.dc1" in area "wan"
TestConfigWrite - 2019/12/06 06:31:48.805771 [INFO] agent: Started DNS server 127.0.0.1:34001 (tcp)
TestConfigWrite - 2019/12/06 06:31:48.806359 [INFO] agent: Started DNS server 127.0.0.1:34001 (udp)
TestConfigWrite - 2019/12/06 06:31:48.808705 [INFO] agent: Started HTTP server on 127.0.0.1:34002 (tcp)
TestConfigWrite - 2019/12/06 06:31:48.808811 [INFO] agent: started state syncer
2019/12/06 06:31:48 [WARN]  raft: Heartbeat timeout from "" reached, starting election
2019/12/06 06:31:48 [INFO]  raft: Node at 127.0.0.1:34006 [Candidate] entering Candidate state in term 2
2019/12/06 06:31:50 [INFO]  raft: Election won. Tally: 1
2019/12/06 06:31:50 [INFO]  raft: Node at 127.0.0.1:34006 [Leader] entering Leader state
TestConfigWrite - 2019/12/06 06:31:50.899248 [INFO] consul: cluster leadership acquired
TestConfigWrite - 2019/12/06 06:31:50.899889 [INFO] consul: New leader elected: Node 76953c09-f7f9-5bd6-dcf6-f2270acec230
=== RUN   TestConfigWrite/File
TestConfigWrite - 2019/12/06 06:31:51.772514 [INFO] agent: Synced node info
TestConfigWrite - 2019/12/06 06:31:51.818749 [DEBUG] agent: Node info in sync
TestConfigWrite - 2019/12/06 06:31:51.818855 [DEBUG] agent: Node info in sync
TestConfigWrite - 2019/12/06 06:31:52.535964 [DEBUG] http: Request PUT /v1/config (1.15331413s) from=127.0.0.1:55792
TestConfigWrite - 2019/12/06 06:31:52.548971 [DEBUG] http: Request GET /v1/config/service-defaults/web (1.681706ms) from=127.0.0.1:55796
=== RUN   TestConfigWrite/Stdin
TestConfigWrite - 2019/12/06 06:31:54.264391 [DEBUG] http: Request PUT /v1/config (1.707373004s) from=127.0.0.1:55798
TestConfigWrite - 2019/12/06 06:31:54.268118 [DEBUG] http: Request GET /v1/config/proxy-defaults/global (1.776708ms) from=127.0.0.1:55796
=== RUN   TestConfigWrite/No_config
TestConfigWrite - 2019/12/06 06:31:54.272539 [INFO] agent: Requesting shutdown
TestConfigWrite - 2019/12/06 06:31:54.272604 [INFO] consul: shutting down server
TestConfigWrite - 2019/12/06 06:31:54.272642 [WARN] serf: Shutdown without a Leave
TestConfigWrite - 2019/12/06 06:31:54.405890 [WARN] serf: Shutdown without a Leave
TestConfigWrite - 2019/12/06 06:31:54.604288 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
TestConfigWrite - 2019/12/06 06:31:54.611326 [WARN] consul: error getting server health from "Node 76953c09-f7f9-5bd6-dcf6-f2270acec230": rpc error making call: EOF
TestConfigWrite - 2019/12/06 06:31:54.720640 [INFO] manager: shutting down
TestConfigWrite - 2019/12/06 06:31:55.604419 [WARN] consul: error getting server health from "Node 76953c09-f7f9-5bd6-dcf6-f2270acec230": context deadline exceeded
TestConfigWrite - 2019/12/06 06:31:56.188109 [INFO] agent: consul server down
TestConfigWrite - 2019/12/06 06:31:56.188187 [INFO] agent: shutdown complete
TestConfigWrite - 2019/12/06 06:31:56.188254 [INFO] agent: Stopping DNS server 127.0.0.1:34001 (tcp)
TestConfigWrite - 2019/12/06 06:31:56.188158 [ERR] connect: Apply failed leadership lost while committing log
TestConfigWrite - 2019/12/06 06:31:56.188362 [ERR] consul: failed to establish leadership: leadership lost while committing log
TestConfigWrite - 2019/12/06 06:31:56.188394 [INFO] agent: Stopping DNS server 127.0.0.1:34001 (udp)
TestConfigWrite - 2019/12/06 06:31:56.188713 [INFO] agent: Stopping HTTP server 127.0.0.1:34002 (tcp)
TestConfigWrite - 2019/12/06 06:31:56.189745 [INFO] agent: Waiting for endpoints to shut down
TestConfigWrite - 2019/12/06 06:31:56.189843 [INFO] agent: Endpoints down
--- PASS: TestConfigWrite (11.14s)
    --- PASS: TestConfigWrite/File (1.57s)
    --- PASS: TestConfigWrite/Stdin (1.72s)
    --- PASS: TestConfigWrite/No_config (0.00s)
PASS
ok  	github.com/hashicorp/consul/command/config/write	12.054s
=== RUN   TestConnectCommand_noTabs
=== PAUSE TestConnectCommand_noTabs
=== CONT  TestConnectCommand_noTabs
--- PASS: TestConnectCommand_noTabs (0.00s)
PASS
ok  	github.com/hashicorp/consul/command/connect	0.099s
=== RUN   TestCatalogCommand_noTabs
=== PAUSE TestCatalogCommand_noTabs
=== CONT  TestCatalogCommand_noTabs
--- PASS: TestCatalogCommand_noTabs (0.00s)
PASS
ok  	github.com/hashicorp/consul/command/connect/ca	0.071s
=== RUN   TestConnectCAGetConfigCommand_noTabs
=== PAUSE TestConnectCAGetConfigCommand_noTabs
=== RUN   TestConnectCAGetConfigCommand
=== PAUSE TestConnectCAGetConfigCommand
=== CONT  TestConnectCAGetConfigCommand_noTabs
=== CONT  TestConnectCAGetConfigCommand
--- PASS: TestConnectCAGetConfigCommand_noTabs (0.00s)
WARNING: bootstrap = true: do not enable unless necessary
TestConnectCAGetConfigCommand - 2019/12/06 06:32:09.317289 [WARN] agent: Node name "Node eb80f7b2-2c21-b38b-7d2f-8351e098a5d8" will not be discoverable via DNS due to invalid characters. Valid characters include all alpha-numerics and dashes.
TestConnectCAGetConfigCommand - 2019/12/06 06:32:09.320214 [DEBUG] tlsutil: Update with version 1
TestConnectCAGetConfigCommand - 2019/12/06 06:32:09.332102 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
2019/12/06 06:32:10 [INFO]  raft: Initial configuration (index=1): [{Suffrage:Voter ID:eb80f7b2-2c21-b38b-7d2f-8351e098a5d8 Address:127.0.0.1:14506}]
2019/12/06 06:32:10 [INFO]  raft: Node at 127.0.0.1:14506 [Follower] entering Follower state (Leader: "")
TestConnectCAGetConfigCommand - 2019/12/06 06:32:10.427760 [INFO] serf: EventMemberJoin: Node eb80f7b2-2c21-b38b-7d2f-8351e098a5d8.dc1 127.0.0.1
TestConnectCAGetConfigCommand - 2019/12/06 06:32:10.431224 [INFO] serf: EventMemberJoin: Node eb80f7b2-2c21-b38b-7d2f-8351e098a5d8 127.0.0.1
TestConnectCAGetConfigCommand - 2019/12/06 06:32:10.432417 [INFO] consul: Adding LAN server Node eb80f7b2-2c21-b38b-7d2f-8351e098a5d8 (Addr: tcp/127.0.0.1:14506) (DC: dc1)
TestConnectCAGetConfigCommand - 2019/12/06 06:32:10.432867 [INFO] consul: Handled member-join event for server "Node eb80f7b2-2c21-b38b-7d2f-8351e098a5d8.dc1" in area "wan"
TestConnectCAGetConfigCommand - 2019/12/06 06:32:10.433427 [INFO] agent: Started DNS server 127.0.0.1:14501 (tcp)
TestConnectCAGetConfigCommand - 2019/12/06 06:32:10.434097 [INFO] agent: Started DNS server 127.0.0.1:14501 (udp)
TestConnectCAGetConfigCommand - 2019/12/06 06:32:10.437063 [INFO] agent: Started HTTP server on 127.0.0.1:14502 (tcp)
TestConnectCAGetConfigCommand - 2019/12/06 06:32:10.437250 [INFO] agent: started state syncer
2019/12/06 06:32:10 [WARN]  raft: Heartbeat timeout from "" reached, starting election
2019/12/06 06:32:10 [INFO]  raft: Node at 127.0.0.1:14506 [Candidate] entering Candidate state in term 2
2019/12/06 06:32:11 [INFO]  raft: Election won. Tally: 1
2019/12/06 06:32:11 [INFO]  raft: Node at 127.0.0.1:14506 [Leader] entering Leader state
TestConnectCAGetConfigCommand - 2019/12/06 06:32:11.314006 [INFO] consul: cluster leadership acquired
TestConnectCAGetConfigCommand - 2019/12/06 06:32:11.314626 [INFO] consul: New leader elected: Node eb80f7b2-2c21-b38b-7d2f-8351e098a5d8
TestConnectCAGetConfigCommand - 2019/12/06 06:32:11.918077 [INFO] agent: Synced node info
TestConnectCAGetConfigCommand - 2019/12/06 06:32:11.918208 [DEBUG] agent: Node info in sync
TestConnectCAGetConfigCommand - 2019/12/06 06:32:13.421621 [INFO] connect: initialized primary datacenter CA with provider "consul"
TestConnectCAGetConfigCommand - 2019/12/06 06:32:13.432300 [DEBUG] consul: Skipping self join check for "Node eb80f7b2-2c21-b38b-7d2f-8351e098a5d8" since the cluster is too small
TestConnectCAGetConfigCommand - 2019/12/06 06:32:13.432545 [INFO] consul: member 'Node eb80f7b2-2c21-b38b-7d2f-8351e098a5d8' joined, marking health alive
TestConnectCAGetConfigCommand - 2019/12/06 06:32:13.731677 [DEBUG] http: Request GET /v1/connect/ca/configuration (1.943379ms) from=127.0.0.1:40658
TestConnectCAGetConfigCommand - 2019/12/06 06:32:13.734385 [INFO] agent: Requesting shutdown
TestConnectCAGetConfigCommand - 2019/12/06 06:32:13.734475 [INFO] consul: shutting down server
TestConnectCAGetConfigCommand - 2019/12/06 06:32:13.734518 [WARN] serf: Shutdown without a Leave
TestConnectCAGetConfigCommand - 2019/12/06 06:32:13.837675 [WARN] serf: Shutdown without a Leave
TestConnectCAGetConfigCommand - 2019/12/06 06:32:13.987732 [INFO] manager: shutting down
TestConnectCAGetConfigCommand - 2019/12/06 06:32:13.988073 [INFO] agent: consul server down
TestConnectCAGetConfigCommand - 2019/12/06 06:32:13.988125 [INFO] agent: shutdown complete
TestConnectCAGetConfigCommand - 2019/12/06 06:32:13.988196 [INFO] agent: Stopping DNS server 127.0.0.1:14501 (tcp)
TestConnectCAGetConfigCommand - 2019/12/06 06:32:13.988331 [INFO] agent: Stopping DNS server 127.0.0.1:14501 (udp)
TestConnectCAGetConfigCommand - 2019/12/06 06:32:13.988497 [INFO] agent: Stopping HTTP server 127.0.0.1:14502 (tcp)
TestConnectCAGetConfigCommand - 2019/12/06 06:32:13.989036 [INFO] agent: Waiting for endpoints to shut down
TestConnectCAGetConfigCommand - 2019/12/06 06:32:13.989120 [INFO] agent: Endpoints down
--- PASS: TestConnectCAGetConfigCommand (4.78s)
PASS
ok  	github.com/hashicorp/consul/command/connect/ca/get	5.015s
=== RUN   TestConnectCASetConfigCommand_noTabs
=== PAUSE TestConnectCASetConfigCommand_noTabs
=== RUN   TestConnectCASetConfigCommand
=== PAUSE TestConnectCASetConfigCommand
=== CONT  TestConnectCASetConfigCommand_noTabs
=== CONT  TestConnectCASetConfigCommand
--- PASS: TestConnectCASetConfigCommand_noTabs (0.00s)
WARNING: bootstrap = true: do not enable unless necessary
TestConnectCASetConfigCommand - 2019/12/06 06:33:01.600911 [WARN] agent: Node name "Node 0479a36f-4eba-ce37-640d-911d9bb9ab44" will not be discoverable via DNS due to invalid characters. Valid characters include all alpha-numerics and dashes.
TestConnectCASetConfigCommand - 2019/12/06 06:33:01.602020 [DEBUG] tlsutil: Update with version 1
TestConnectCASetConfigCommand - 2019/12/06 06:33:01.630382 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
2019/12/06 06:33:02 [INFO]  raft: Initial configuration (index=1): [{Suffrage:Voter ID:0479a36f-4eba-ce37-640d-911d9bb9ab44 Address:127.0.0.1:14506}]
2019/12/06 06:33:02 [INFO]  raft: Node at 127.0.0.1:14506 [Follower] entering Follower state (Leader: "")
TestConnectCASetConfigCommand - 2019/12/06 06:33:02.993894 [INFO] serf: EventMemberJoin: Node 0479a36f-4eba-ce37-640d-911d9bb9ab44.dc1 127.0.0.1
TestConnectCASetConfigCommand - 2019/12/06 06:33:03.003494 [INFO] serf: EventMemberJoin: Node 0479a36f-4eba-ce37-640d-911d9bb9ab44 127.0.0.1
TestConnectCASetConfigCommand - 2019/12/06 06:33:03.006314 [INFO] consul: Adding LAN server Node 0479a36f-4eba-ce37-640d-911d9bb9ab44 (Addr: tcp/127.0.0.1:14506) (DC: dc1)
TestConnectCASetConfigCommand - 2019/12/06 06:33:03.008800 [INFO] consul: Handled member-join event for server "Node 0479a36f-4eba-ce37-640d-911d9bb9ab44.dc1" in area "wan"
TestConnectCASetConfigCommand - 2019/12/06 06:33:03.009883 [INFO] agent: Started DNS server 127.0.0.1:14501 (tcp)
TestConnectCASetConfigCommand - 2019/12/06 06:33:03.010458 [INFO] agent: Started DNS server 127.0.0.1:14501 (udp)
TestConnectCASetConfigCommand - 2019/12/06 06:33:03.013439 [INFO] agent: Started HTTP server on 127.0.0.1:14502 (tcp)
TestConnectCASetConfigCommand - 2019/12/06 06:33:03.013628 [INFO] agent: started state syncer
2019/12/06 06:33:03 [WARN]  raft: Heartbeat timeout from "" reached, starting election
2019/12/06 06:33:03 [INFO]  raft: Node at 127.0.0.1:14506 [Candidate] entering Candidate state in term 2
2019/12/06 06:33:03 [INFO]  raft: Election won. Tally: 1
2019/12/06 06:33:03 [INFO]  raft: Node at 127.0.0.1:14506 [Leader] entering Leader state
TestConnectCASetConfigCommand - 2019/12/06 06:33:03.530432 [INFO] consul: cluster leadership acquired
TestConnectCASetConfigCommand - 2019/12/06 06:33:03.530971 [INFO] consul: New leader elected: Node 0479a36f-4eba-ce37-640d-911d9bb9ab44
TestConnectCASetConfigCommand - 2019/12/06 06:33:03.895373 [INFO] agent: Synced node info
TestConnectCASetConfigCommand - 2019/12/06 06:33:03.895497 [DEBUG] agent: Node info in sync
TestConnectCASetConfigCommand - 2019/12/06 06:33:04.625171 [DEBUG] agent: Node info in sync
TestConnectCASetConfigCommand - 2019/12/06 06:33:04.981005 [INFO] connect: initialized primary datacenter CA with provider "consul"
TestConnectCASetConfigCommand - 2019/12/06 06:33:04.981882 [DEBUG] consul: Skipping self join check for "Node 0479a36f-4eba-ce37-640d-911d9bb9ab44" since the cluster is too small
TestConnectCASetConfigCommand - 2019/12/06 06:33:04.982052 [INFO] consul: member 'Node 0479a36f-4eba-ce37-640d-911d9bb9ab44' joined, marking health alive
TestConnectCASetConfigCommand - 2019/12/06 06:33:05.497282 [INFO] connect: CA provider config updated
TestConnectCASetConfigCommand - 2019/12/06 06:33:05.497451 [DEBUG] http: Request PUT /v1/connect/ca/configuration (202.942715ms) from=127.0.0.1:40690
TestConnectCASetConfigCommand - 2019/12/06 06:33:05.498868 [INFO] agent: Requesting shutdown
TestConnectCASetConfigCommand - 2019/12/06 06:33:05.498944 [INFO] consul: shutting down server
TestConnectCASetConfigCommand - 2019/12/06 06:33:05.499002 [WARN] serf: Shutdown without a Leave
TestConnectCASetConfigCommand - 2019/12/06 06:33:05.555051 [WARN] serf: Shutdown without a Leave
TestConnectCASetConfigCommand - 2019/12/06 06:33:05.613457 [INFO] manager: shutting down
TestConnectCASetConfigCommand - 2019/12/06 06:33:05.613921 [INFO] agent: consul server down
TestConnectCASetConfigCommand - 2019/12/06 06:33:05.613971 [INFO] agent: shutdown complete
TestConnectCASetConfigCommand - 2019/12/06 06:33:05.614040 [INFO] agent: Stopping DNS server 127.0.0.1:14501 (tcp)
TestConnectCASetConfigCommand - 2019/12/06 06:33:05.614229 [INFO] agent: Stopping DNS server 127.0.0.1:14501 (udp)
TestConnectCASetConfigCommand - 2019/12/06 06:33:05.614412 [INFO] agent: Stopping HTTP server 127.0.0.1:14502 (tcp)
TestConnectCASetConfigCommand - 2019/12/06 06:33:05.614977 [INFO] agent: Waiting for endpoints to shut down
TestConnectCASetConfigCommand - 2019/12/06 06:33:05.615067 [INFO] agent: Endpoints down
--- PASS: TestConnectCASetConfigCommand (4.08s)
PASS
ok  	github.com/hashicorp/consul/command/connect/ca/set	4.314s
=== RUN   TestBootstrapConfig_ConfigureArgs
=== RUN   TestBootstrapConfig_ConfigureArgs/defaults
=== RUN   TestBootstrapConfig_ConfigureArgs/extra-stats-sinks
=== RUN   TestBootstrapConfig_ConfigureArgs/simple-statsd-sink
=== RUN   TestBootstrapConfig_ConfigureArgs/simple-statsd-sink-plus-extra
=== RUN   TestBootstrapConfig_ConfigureArgs/simple-statsd-sink-env
=== RUN   TestBootstrapConfig_ConfigureArgs/simple-dogstatsd-sink
=== RUN   TestBootstrapConfig_ConfigureArgs/simple-dogstatsd-unix-sink
=== RUN   TestBootstrapConfig_ConfigureArgs/simple-dogstatsd-sink-env
=== RUN   TestBootstrapConfig_ConfigureArgs/stats-config-override
=== RUN   TestBootstrapConfig_ConfigureArgs/simple-tags
=== RUN   TestBootstrapConfig_ConfigureArgs/prometheus-bind-addr
=== RUN   TestBootstrapConfig_ConfigureArgs/prometheus-bind-addr-with-overrides
=== RUN   TestBootstrapConfig_ConfigureArgs/stats-flush-interval
=== RUN   TestBootstrapConfig_ConfigureArgs/override-tracing
=== RUN   TestBootstrapConfig_ConfigureArgs/err-bad-prometheus-addr
=== RUN   TestBootstrapConfig_ConfigureArgs/err-bad-statsd-addr
=== RUN   TestBootstrapConfig_ConfigureArgs/err-bad-dogstatsd-addr
--- PASS: TestBootstrapConfig_ConfigureArgs (0.06s)
    --- PASS: TestBootstrapConfig_ConfigureArgs/defaults (0.00s)
    --- PASS: TestBootstrapConfig_ConfigureArgs/extra-stats-sinks (0.00s)
    --- PASS: TestBootstrapConfig_ConfigureArgs/simple-statsd-sink (0.00s)
    --- PASS: TestBootstrapConfig_ConfigureArgs/simple-statsd-sink-plus-extra (0.00s)
    --- PASS: TestBootstrapConfig_ConfigureArgs/simple-statsd-sink-env (0.00s)
    --- PASS: TestBootstrapConfig_ConfigureArgs/simple-dogstatsd-sink (0.00s)
    --- PASS: TestBootstrapConfig_ConfigureArgs/simple-dogstatsd-unix-sink (0.00s)
    --- PASS: TestBootstrapConfig_ConfigureArgs/simple-dogstatsd-sink-env (0.00s)
    --- PASS: TestBootstrapConfig_ConfigureArgs/stats-config-override (0.00s)
    --- PASS: TestBootstrapConfig_ConfigureArgs/simple-tags (0.01s)
    --- PASS: TestBootstrapConfig_ConfigureArgs/prometheus-bind-addr (0.00s)
    --- PASS: TestBootstrapConfig_ConfigureArgs/prometheus-bind-addr-with-overrides (0.00s)
    --- PASS: TestBootstrapConfig_ConfigureArgs/stats-flush-interval (0.00s)
    --- PASS: TestBootstrapConfig_ConfigureArgs/override-tracing (0.00s)
    --- PASS: TestBootstrapConfig_ConfigureArgs/err-bad-prometheus-addr (0.00s)
    --- PASS: TestBootstrapConfig_ConfigureArgs/err-bad-statsd-addr (0.00s)
    --- PASS: TestBootstrapConfig_ConfigureArgs/err-bad-dogstatsd-addr (0.00s)
=== RUN   TestCatalogCommand_noTabs
=== PAUSE TestCatalogCommand_noTabs
=== RUN   TestGenerateConfig
=== RUN   TestGenerateConfig/no-args
=== RUN   TestGenerateConfig/defaults
{
  "admin": {
    "access_log_path": "/dev/null",
    "address": {
      "socket_address": {
        "address": "127.0.0.1",
        "port_value": 19000
      }
    }
  },
  "node": {
    "cluster": "test-proxy",
    "id": "test-proxy"
  },
  "static_resources": {
    "clusters": [
      {
        "name": "local_agent",
        "connect_timeout": "1s",
        "type": "STATIC",
        "http2_protocol_options": {},
        "hosts": [
          {
            "socket_address": {
              "address": "127.0.0.1",
              "port_value": 8502
            }
          }
        ]
      }
    ]
  },
  "stats_config": {
			"stats_tags": [
				{
			"tag_name": "local_cluster",
			"fixed_value": "test-proxy"
		}
			],
			"use_all_default_tags": true
		},
  "dynamic_resources": {
    "lds_config": { "ads": {} },
    "cds_config": { "ads": {} },
    "ads_config": {
      "api_type": "GRPC",
      "grpc_services": {
        "initial_metadata": [
          {
            "key": "x-consul-token",
            "value": ""
          }
        ],
        "envoy_grpc": {
          "cluster_name": "local_agent"
        }
      }
    }
  }
}
=== RUN   TestGenerateConfig/token-arg
{
  "admin": {
    "access_log_path": "/dev/null",
    "address": {
      "socket_address": {
        "address": "127.0.0.1",
        "port_value": 19000
      }
    }
  },
  "node": {
    "cluster": "test-proxy",
    "id": "test-proxy"
  },
  "static_resources": {
    "clusters": [
      {
        "name": "local_agent",
        "connect_timeout": "1s",
        "type": "STATIC",
        "http2_protocol_options": {},
        "hosts": [
          {
            "socket_address": {
              "address": "127.0.0.1",
              "port_value": 8502
            }
          }
        ]
      }
    ]
  },
  "stats_config": {
			"stats_tags": [
				{
			"tag_name": "local_cluster",
			"fixed_value": "test-proxy"
		}
			],
			"use_all_default_tags": true
		},
  "dynamic_resources": {
    "lds_config": { "ads": {} },
    "cds_config": { "ads": {} },
    "ads_config": {
      "api_type": "GRPC",
      "grpc_services": {
        "initial_metadata": [
          {
            "key": "x-consul-token",
            "value": "c9a52720-bf6c-4aa6-b8bc-66881a5ade95"
          }
        ],
        "envoy_grpc": {
          "cluster_name": "local_agent"
        }
      }
    }
  }
}
=== RUN   TestGenerateConfig/token-env
{
  "admin": {
    "access_log_path": "/dev/null",
    "address": {
      "socket_address": {
        "address": "127.0.0.1",
        "port_value": 19000
      }
    }
  },
  "node": {
    "cluster": "test-proxy",
    "id": "test-proxy"
  },
  "static_resources": {
    "clusters": [
      {
        "name": "local_agent",
        "connect_timeout": "1s",
        "type": "STATIC",
        "http2_protocol_options": {},
        "hosts": [
          {
            "socket_address": {
              "address": "127.0.0.1",
              "port_value": 8502
            }
          }
        ]
      }
    ]
  },
  "stats_config": {
			"stats_tags": [
				{
			"tag_name": "local_cluster",
			"fixed_value": "test-proxy"
		}
			],
			"use_all_default_tags": true
		},
  "dynamic_resources": {
    "lds_config": { "ads": {} },
    "cds_config": { "ads": {} },
    "ads_config": {
      "api_type": "GRPC",
      "grpc_services": {
        "initial_metadata": [
          {
            "key": "x-consul-token",
            "value": "c9a52720-bf6c-4aa6-b8bc-66881a5ade95"
          }
        ],
        "envoy_grpc": {
          "cluster_name": "local_agent"
        }
      }
    }
  }
}
=== RUN   TestGenerateConfig/token-file-arg
{
  "admin": {
    "access_log_path": "/dev/null",
    "address": {
      "socket_address": {
        "address": "127.0.0.1",
        "port_value": 19000
      }
    }
  },
  "node": {
    "cluster": "test-proxy",
    "id": "test-proxy"
  },
  "static_resources": {
    "clusters": [
      {
        "name": "local_agent",
        "connect_timeout": "1s",
        "type": "STATIC",
        "http2_protocol_options": {},
        "hosts": [
          {
            "socket_address": {
              "address": "127.0.0.1",
              "port_value": 8502
            }
          }
        ]
      }
    ]
  },
  "stats_config": {
			"stats_tags": [
				{
			"tag_name": "local_cluster",
			"fixed_value": "test-proxy"
		}
			],
			"use_all_default_tags": true
		},
  "dynamic_resources": {
    "lds_config": { "ads": {} },
    "cds_config": { "ads": {} },
    "ads_config": {
      "api_type": "GRPC",
      "grpc_services": {
        "initial_metadata": [
          {
            "key": "x-consul-token",
            "value": "c9a52720-bf6c-4aa6-b8bc-66881a5ade95"
          }
        ],
        "envoy_grpc": {
          "cluster_name": "local_agent"
        }
      }
    }
  }
}
=== RUN   TestGenerateConfig/token-file-env
{
  "admin": {
    "access_log_path": "/dev/null",
    "address": {
      "socket_address": {
        "address": "127.0.0.1",
        "port_value": 19000
      }
    }
  },
  "node": {
    "cluster": "test-proxy",
    "id": "test-proxy"
  },
  "static_resources": {
    "clusters": [
      {
        "name": "local_agent",
        "connect_timeout": "1s",
        "type": "STATIC",
        "http2_protocol_options": {},
        "hosts": [
          {
            "socket_address": {
              "address": "127.0.0.1",
              "port_value": 8502
            }
          }
        ]
      }
    ]
  },
  "stats_config": {
			"stats_tags": [
				{
			"tag_name": "local_cluster",
			"fixed_value": "test-proxy"
		}
			],
			"use_all_default_tags": true
		},
  "dynamic_resources": {
    "lds_config": { "ads": {} },
    "cds_config": { "ads": {} },
    "ads_config": {
      "api_type": "GRPC",
      "grpc_services": {
        "initial_metadata": [
          {
            "key": "x-consul-token",
            "value": "c9a52720-bf6c-4aa6-b8bc-66881a5ade95"
          }
        ],
        "envoy_grpc": {
          "cluster_name": "local_agent"
        }
      }
    }
  }
}
=== RUN   TestGenerateConfig/grpc-addr-flag
{
  "admin": {
    "access_log_path": "/dev/null",
    "address": {
      "socket_address": {
        "address": "127.0.0.1",
        "port_value": 19000
      }
    }
  },
  "node": {
    "cluster": "test-proxy",
    "id": "test-proxy"
  },
  "static_resources": {
    "clusters": [
      {
        "name": "local_agent",
        "connect_timeout": "1s",
        "type": "STATIC",
        "http2_protocol_options": {},
        "hosts": [
          {
            "socket_address": {
              "address": "127.0.0.1",
              "port_value": 9999
            }
          }
        ]
      }
    ]
  },
  "stats_config": {
			"stats_tags": [
				{
			"tag_name": "local_cluster",
			"fixed_value": "test-proxy"
		}
			],
			"use_all_default_tags": true
		},
  "dynamic_resources": {
    "lds_config": { "ads": {} },
    "cds_config": { "ads": {} },
    "ads_config": {
      "api_type": "GRPC",
      "grpc_services": {
        "initial_metadata": [
          {
            "key": "x-consul-token",
            "value": ""
          }
        ],
        "envoy_grpc": {
          "cluster_name": "local_agent"
        }
      }
    }
  }
}
=== RUN   TestGenerateConfig/grpc-addr-env
{
  "admin": {
    "access_log_path": "/dev/null",
    "address": {
      "socket_address": {
        "address": "127.0.0.1",
        "port_value": 19000
      }
    }
  },
  "node": {
    "cluster": "test-proxy",
    "id": "test-proxy"
  },
  "static_resources": {
    "clusters": [
      {
        "name": "local_agent",
        "connect_timeout": "1s",
        "type": "STATIC",
        "http2_protocol_options": {},
        "hosts": [
          {
            "socket_address": {
              "address": "127.0.0.1",
              "port_value": 9999
            }
          }
        ]
      }
    ]
  },
  "stats_config": {
			"stats_tags": [
				{
			"tag_name": "local_cluster",
			"fixed_value": "test-proxy"
		}
			],
			"use_all_default_tags": true
		},
  "dynamic_resources": {
    "lds_config": { "ads": {} },
    "cds_config": { "ads": {} },
    "ads_config": {
      "api_type": "GRPC",
      "grpc_services": {
        "initial_metadata": [
          {
            "key": "x-consul-token",
            "value": ""
          }
        ],
        "envoy_grpc": {
          "cluster_name": "local_agent"
        }
      }
    }
  }
}
=== RUN   TestGenerateConfig/access-log-path
{
  "admin": {
    "access_log_path": "/some/path/access.log",
    "address": {
      "socket_address": {
        "address": "127.0.0.1",
        "port_value": 19000
      }
    }
  },
  "node": {
    "cluster": "test-proxy",
    "id": "test-proxy"
  },
  "static_resources": {
    "clusters": [
      {
        "name": "local_agent",
        "connect_timeout": "1s",
        "type": "STATIC",
        "http2_protocol_options": {},
        "hosts": [
          {
            "socket_address": {
              "address": "127.0.0.1",
              "port_value": 8502
            }
          }
        ]
      }
    ]
  },
  "stats_config": {
			"stats_tags": [
				{
			"tag_name": "local_cluster",
			"fixed_value": "test-proxy"
		}
			],
			"use_all_default_tags": true
		},
  "dynamic_resources": {
    "lds_config": { "ads": {} },
    "cds_config": { "ads": {} },
    "ads_config": {
      "api_type": "GRPC",
      "grpc_services": {
        "initial_metadata": [
          {
            "key": "x-consul-token",
            "value": ""
          }
        ],
        "envoy_grpc": {
          "cluster_name": "local_agent"
        }
      }
    }
  }
}
=== RUN   TestGenerateConfig/custom-bootstrap

				{
					"admin": {
						"access_log_path": "/dev/null",
						"address": {
							"socket_address": {
								"address": "127.0.0.1",
								"port_value": 19000
							}
						}
					},
					"node": {
						"cluster": "test-proxy",
						"id": "test-proxy"
					},
					custom_field = "foo"
				}
=== RUN   TestGenerateConfig/extra_-single
{
  "admin": {
    "access_log_path": "/dev/null",
    "address": {
      "socket_address": {
        "address": "127.0.0.1",
        "port_value": 19000
      }
    }
  },
  "node": {
    "cluster": "test-proxy",
    "id": "test-proxy"
  },
  "static_resources": {
    "clusters": [
      {
        "name": "local_agent",
        "connect_timeout": "1s",
        "type": "STATIC",
        "http2_protocol_options": {},
        "hosts": [
          {
            "socket_address": {
              "address": "127.0.0.1",
              "port_value": 8502
            }
          }
        ]
      },
      
				{
					"name": "fake_cluster_1"
				}
    ],
    "listeners": [
      
				{
					"name": "fake_listener_1"
				}
    ]
  },
  "stats_sinks": [

				{
					"name": "fake_sink_1"
				}
],
  "stats_config": {
			"stats_tags": [
				{
			"tag_name": "local_cluster",
			"fixed_value": "test-proxy"
		}
			],
			"use_all_default_tags": true
		},
  "dynamic_resources": {
    "lds_config": { "ads": {} },
    "cds_config": { "ads": {} },
    "ads_config": {
      "api_type": "GRPC",
      "grpc_services": {
        "initial_metadata": [
          {
            "key": "x-consul-token",
            "value": ""
          }
        ],
        "envoy_grpc": {
          "cluster_name": "local_agent"
        }
      }
    }
  }
}
=== RUN   TestGenerateConfig/extra_-multiple
{
  "admin": {
    "access_log_path": "/dev/null",
    "address": {
      "socket_address": {
        "address": "127.0.0.1",
        "port_value": 19000
      }
    }
  },
  "node": {
    "cluster": "test-proxy",
    "id": "test-proxy"
  },
  "static_resources": {
    "clusters": [
      {
        "name": "local_agent",
        "connect_timeout": "1s",
        "type": "STATIC",
        "http2_protocol_options": {},
        "hosts": [
          {
            "socket_address": {
              "address": "127.0.0.1",
              "port_value": 8502
            }
          }
        ]
      },
      
				{
					"name": "fake_cluster_1"
				},
				{
					"name": "fake_cluster_2"
				}
    ],
    "listeners": [
      
				{
					"name": "fake_listener_1"
				},{
					"name": "fake_listener_2"
				}
    ]
  },
  "stats_sinks": [

				{
					"name": "fake_sink_1"
				} , { "name": "fake_sink_2" }
],
  "stats_config": {
			"stats_tags": [
				{
			"tag_name": "local_cluster",
			"fixed_value": "test-proxy"
		}
			],
			"use_all_default_tags": true
		},
  "dynamic_resources": {
    "lds_config": { "ads": {} },
    "cds_config": { "ads": {} },
    "ads_config": {
      "api_type": "GRPC",
      "grpc_services": {
        "initial_metadata": [
          {
            "key": "x-consul-token",
            "value": ""
          }
        ],
        "envoy_grpc": {
          "cluster_name": "local_agent"
        }
      }
    }
  }
}
=== RUN   TestGenerateConfig/stats-config-override
{
  "admin": {
    "access_log_path": "/dev/null",
    "address": {
      "socket_address": {
        "address": "127.0.0.1",
        "port_value": 19000
      }
    }
  },
  "node": {
    "cluster": "test-proxy",
    "id": "test-proxy"
  },
  "static_resources": {
    "clusters": [
      {
        "name": "local_agent",
        "connect_timeout": "1s",
        "type": "STATIC",
        "http2_protocol_options": {},
        "hosts": [
          {
            "socket_address": {
              "address": "127.0.0.1",
              "port_value": 8502
            }
          }
        ]
      }
    ]
  },
  "stats_config": 
				{
					"name": "fake_config"
				},
  "dynamic_resources": {
    "lds_config": { "ads": {} },
    "cds_config": { "ads": {} },
    "ads_config": {
      "api_type": "GRPC",
      "grpc_services": {
        "initial_metadata": [
          {
            "key": "x-consul-token",
            "value": ""
          }
        ],
        "envoy_grpc": {
          "cluster_name": "local_agent"
        }
      }
    }
  }
}
=== RUN   TestGenerateConfig/zipkin-tracing-config
{
  "admin": {
    "access_log_path": "/dev/null",
    "address": {
      "socket_address": {
        "address": "127.0.0.1",
        "port_value": 19000
      }
    }
  },
  "node": {
    "cluster": "test-proxy",
    "id": "test-proxy"
  },
  "static_resources": {
    "clusters": [
      {
        "name": "local_agent",
        "connect_timeout": "1s",
        "type": "STATIC",
        "http2_protocol_options": {},
        "hosts": [
          {
            "socket_address": {
              "address": "127.0.0.1",
              "port_value": 8502
            }
          }
        ]
      },
      {
					"name": "zipkin",
					"type": "STRICT_DNS",
					"connect_timeout": "5s",
					"load_assignment": {
						"cluster_name": "zipkin",
						"endpoints": [
							{
								"lb_endpoints": [
									{
										"endpoint": {
											"address": {
												"socket_address": {
													"address": "zipkin.service.consul",
													"port_value": 9411
												}
											}
										}
									}
								]
							}
						]
					}
				}
    ]
  },
  "stats_config": {
			"stats_tags": [
				{
			"tag_name": "local_cluster",
			"fixed_value": "test-proxy"
		}
			],
			"use_all_default_tags": true
		},
  "tracing": {
					"http": {
						"name": "envoy.zipkin",
						"config": {
							"collector_cluster": "zipkin",
							"collector_endpoint": "/api/v1/spans"
						}
					}
				},
  "dynamic_resources": {
    "lds_config": { "ads": {} },
    "cds_config": { "ads": {} },
    "ads_config": {
      "api_type": "GRPC",
      "grpc_services": {
        "initial_metadata": [
          {
            "key": "x-consul-token",
            "value": ""
          }
        ],
        "envoy_grpc": {
          "cluster_name": "local_agent"
        }
      }
    }
  }
}
--- PASS: TestGenerateConfig (0.84s)
    --- PASS: TestGenerateConfig/no-args (0.02s)
    --- PASS: TestGenerateConfig/defaults (0.27s)
    --- PASS: TestGenerateConfig/token-arg (0.08s)
    --- PASS: TestGenerateConfig/token-env (0.03s)
    --- PASS: TestGenerateConfig/token-file-arg (0.05s)
    --- PASS: TestGenerateConfig/token-file-env (0.03s)
    --- PASS: TestGenerateConfig/grpc-addr-flag (0.04s)
    --- PASS: TestGenerateConfig/grpc-addr-env (0.04s)
    --- PASS: TestGenerateConfig/access-log-path (0.04s)
    --- PASS: TestGenerateConfig/custom-bootstrap (0.05s)
    --- PASS: TestGenerateConfig/extra_-single (0.03s)
    --- PASS: TestGenerateConfig/extra_-multiple (0.06s)
    --- PASS: TestGenerateConfig/stats-config-override (0.04s)
    --- PASS: TestGenerateConfig/zipkin-tracing-config (0.04s)
=== RUN   TestExecEnvoy
=== RUN   TestExecEnvoy/default
=== RUN   TestExecEnvoy/hot-restart-epoch
=== RUN   TestExecEnvoy/hot-restart-version
=== RUN   TestExecEnvoy/hot-restart-version#01
=== RUN   TestExecEnvoy/hot-restart-version#02
--- PASS: TestExecEnvoy (6.00s)
    --- PASS: TestExecEnvoy/default (3.10s)
    --- PASS: TestExecEnvoy/hot-restart-epoch (0.68s)
    --- PASS: TestExecEnvoy/hot-restart-version (0.77s)
    --- PASS: TestExecEnvoy/hot-restart-version#01 (0.69s)
    --- PASS: TestExecEnvoy/hot-restart-version#02 (0.76s)
=== RUN   TestHelperProcess
--- PASS: TestHelperProcess (0.00s)
=== CONT  TestCatalogCommand_noTabs
--- PASS: TestCatalogCommand_noTabs (0.01s)
PASS
ok  	github.com/hashicorp/consul/command/connect/envoy	7.361s
=== RUN   TestConnectEnvoyPipeBootstrapCommand_noTabs
=== PAUSE TestConnectEnvoyPipeBootstrapCommand_noTabs
=== CONT  TestConnectEnvoyPipeBootstrapCommand_noTabs
--- PASS: TestConnectEnvoyPipeBootstrapCommand_noTabs (0.00s)
PASS
ok  	github.com/hashicorp/consul/command/connect/envoy/pipe-bootstrap	0.045s
=== RUN   TestFlagUpstreams_impl
--- PASS: TestFlagUpstreams_impl (0.00s)
=== RUN   TestFlagUpstreams
=== RUN   TestFlagUpstreams/bad_format
=== RUN   TestFlagUpstreams/port_not_int
=== RUN   TestFlagUpstreams/4_parts
=== RUN   TestFlagUpstreams/single_value
=== RUN   TestFlagUpstreams/single_value_prepared_query
=== RUN   TestFlagUpstreams/invalid_type
=== RUN   TestFlagUpstreams/address_specified
=== RUN   TestFlagUpstreams/repeat_value,_overwrite
--- PASS: TestFlagUpstreams (0.00s)
    --- PASS: TestFlagUpstreams/bad_format (0.00s)
    --- PASS: TestFlagUpstreams/port_not_int (0.00s)
    --- PASS: TestFlagUpstreams/4_parts (0.00s)
    --- PASS: TestFlagUpstreams/single_value (0.00s)
    --- PASS: TestFlagUpstreams/single_value_prepared_query (0.00s)
    --- PASS: TestFlagUpstreams/invalid_type (0.00s)
    --- PASS: TestFlagUpstreams/address_specified (0.00s)
    --- PASS: TestFlagUpstreams/repeat_value,_overwrite (0.00s)
=== RUN   TestCommandConfigWatcher
=== PAUSE TestCommandConfigWatcher
=== RUN   TestCatalogCommand_noTabs
=== PAUSE TestCatalogCommand_noTabs
=== RUN   TestRegisterMonitor_good
=== PAUSE TestRegisterMonitor_good
=== RUN   TestRegisterMonitor_heartbeat
=== PAUSE TestRegisterMonitor_heartbeat
=== CONT  TestCommandConfigWatcher
=== CONT  TestRegisterMonitor_good
=== CONT  TestCatalogCommand_noTabs
=== RUN   TestCommandConfigWatcher/-service_flag_only
--- PASS: TestCatalogCommand_noTabs (0.00s)
=== CONT  TestRegisterMonitor_heartbeat
WARNING: bootstrap = true: do not enable unless necessary
TestCommandConfigWatcher/-service_flag_only - 2019/12/06 06:33:09.479249 [WARN] agent: Node name "Node 8ff8de57-ee24-19b7-b176-c73506c07bc4" will not be discoverable via DNS due to invalid characters. Valid characters include all alpha-numerics and dashes.
TestCommandConfigWatcher/-service_flag_only - 2019/12/06 06:33:09.480332 [DEBUG] tlsutil: Update with version 1
WARNING: bootstrap = true: do not enable unless necessary
WARNING: bootstrap = true: do not enable unless necessary
TestRegisterMonitor_good - 2019/12/06 06:33:09.485775 [WARN] agent: Node name "Node 560320fc-06e2-8913-b5ee-a028351d5117" will not be discoverable via DNS due to invalid characters. Valid characters include all alpha-numerics and dashes.
TestRegisterMonitor_good - 2019/12/06 06:33:09.486575 [DEBUG] tlsutil: Update with version 1
TestRegisterMonitor_heartbeat - 2019/12/06 06:33:09.488944 [WARN] agent: Node name "Node bacf3c14-56d7-063a-afe3-38eeb3983c62" will not be discoverable via DNS due to invalid characters. Valid characters include all alpha-numerics and dashes.
TestCommandConfigWatcher/-service_flag_only - 2019/12/06 06:33:09.492961 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
TestRegisterMonitor_good - 2019/12/06 06:33:09.493196 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
TestRegisterMonitor_heartbeat - 2019/12/06 06:33:09.505019 [DEBUG] tlsutil: Update with version 1
TestRegisterMonitor_heartbeat - 2019/12/06 06:33:09.509040 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
2019/12/06 06:33:11 [INFO]  raft: Initial configuration (index=1): [{Suffrage:Voter ID:bacf3c14-56d7-063a-afe3-38eeb3983c62 Address:127.0.0.1:10018}]
2019/12/06 06:33:11 [INFO]  raft: Node at 127.0.0.1:10018 [Follower] entering Follower state (Leader: "")
2019/12/06 06:33:11 [INFO]  raft: Initial configuration (index=1): [{Suffrage:Voter ID:560320fc-06e2-8913-b5ee-a028351d5117 Address:127.0.0.1:10006}]
2019/12/06 06:33:11 [INFO]  raft: Node at 127.0.0.1:10006 [Follower] entering Follower state (Leader: "")
2019/12/06 06:33:11 [INFO]  raft: Initial configuration (index=1): [{Suffrage:Voter ID:8ff8de57-ee24-19b7-b176-c73506c07bc4 Address:127.0.0.1:10012}]
2019/12/06 06:33:11 [INFO]  raft: Node at 127.0.0.1:10012 [Follower] entering Follower state (Leader: "")
TestRegisterMonitor_good - 2019/12/06 06:33:11.396328 [INFO] serf: EventMemberJoin: Node 560320fc-06e2-8913-b5ee-a028351d5117.dc1 127.0.0.1
TestRegisterMonitor_heartbeat - 2019/12/06 06:33:11.398131 [INFO] serf: EventMemberJoin: Node bacf3c14-56d7-063a-afe3-38eeb3983c62.dc1 127.0.0.1
TestRegisterMonitor_good - 2019/12/06 06:33:11.399995 [INFO] serf: EventMemberJoin: Node 560320fc-06e2-8913-b5ee-a028351d5117 127.0.0.1
TestRegisterMonitor_good - 2019/12/06 06:33:11.401229 [INFO] consul: Handled member-join event for server "Node 560320fc-06e2-8913-b5ee-a028351d5117.dc1" in area "wan"
TestRegisterMonitor_good - 2019/12/06 06:33:11.401593 [INFO] consul: Adding LAN server Node 560320fc-06e2-8913-b5ee-a028351d5117 (Addr: tcp/127.0.0.1:10006) (DC: dc1)
TestRegisterMonitor_good - 2019/12/06 06:33:11.401771 [INFO] agent: Started DNS server 127.0.0.1:10001 (udp)
TestRegisterMonitor_good - 2019/12/06 06:33:11.402172 [INFO] agent: Started DNS server 127.0.0.1:10001 (tcp)
TestCommandConfigWatcher/-service_flag_only - 2019/12/06 06:33:11.403193 [INFO] serf: EventMemberJoin: Node 8ff8de57-ee24-19b7-b176-c73506c07bc4.dc1 127.0.0.1
TestRegisterMonitor_good - 2019/12/06 06:33:11.405416 [INFO] agent: Started HTTP server on 127.0.0.1:10002 (tcp)
TestRegisterMonitor_good - 2019/12/06 06:33:11.405596 [INFO] agent: started state syncer
TestCommandConfigWatcher/-service_flag_only - 2019/12/06 06:33:11.409154 [INFO] serf: EventMemberJoin: Node 8ff8de57-ee24-19b7-b176-c73506c07bc4 127.0.0.1
TestCommandConfigWatcher/-service_flag_only - 2019/12/06 06:33:11.410189 [INFO] consul: Adding LAN server Node 8ff8de57-ee24-19b7-b176-c73506c07bc4 (Addr: tcp/127.0.0.1:10012) (DC: dc1)
TestRegisterMonitor_heartbeat - 2019/12/06 06:33:11.410489 [INFO] serf: EventMemberJoin: Node bacf3c14-56d7-063a-afe3-38eeb3983c62 127.0.0.1
TestCommandConfigWatcher/-service_flag_only - 2019/12/06 06:33:11.410492 [INFO] consul: Handled member-join event for server "Node 8ff8de57-ee24-19b7-b176-c73506c07bc4.dc1" in area "wan"
TestRegisterMonitor_heartbeat - 2019/12/06 06:33:11.411743 [INFO] consul: Adding LAN server Node bacf3c14-56d7-063a-afe3-38eeb3983c62 (Addr: tcp/127.0.0.1:10018) (DC: dc1)
TestRegisterMonitor_heartbeat - 2019/12/06 06:33:11.411759 [INFO] consul: Handled member-join event for server "Node bacf3c14-56d7-063a-afe3-38eeb3983c62.dc1" in area "wan"
TestCommandConfigWatcher/-service_flag_only - 2019/12/06 06:33:11.425240 [INFO] agent: Started DNS server 127.0.0.1:10007 (tcp)
TestCommandConfigWatcher/-service_flag_only - 2019/12/06 06:33:11.425348 [INFO] agent: Started DNS server 127.0.0.1:10007 (udp)
TestRegisterMonitor_heartbeat - 2019/12/06 06:33:11.428910 [INFO] agent: Started DNS server 127.0.0.1:10013 (udp)
TestRegisterMonitor_heartbeat - 2019/12/06 06:33:11.438483 [INFO] agent: Started DNS server 127.0.0.1:10013 (tcp)
2019/12/06 06:33:11 [WARN]  raft: Heartbeat timeout from "" reached, starting election
2019/12/06 06:33:11 [INFO]  raft: Node at 127.0.0.1:10006 [Candidate] entering Candidate state in term 2
TestCommandConfigWatcher/-service_flag_only - 2019/12/06 06:33:11.440367 [INFO] agent: Started HTTP server on 127.0.0.1:10008 (tcp)
2019/12/06 06:33:11 [WARN]  raft: Heartbeat timeout from "" reached, starting election
2019/12/06 06:33:11 [INFO]  raft: Node at 127.0.0.1:10012 [Candidate] entering Candidate state in term 2
TestRegisterMonitor_heartbeat - 2019/12/06 06:33:11.445266 [INFO] agent: Started HTTP server on 127.0.0.1:10014 (tcp)
TestRegisterMonitor_heartbeat - 2019/12/06 06:33:11.445389 [INFO] agent: started state syncer
TestCommandConfigWatcher/-service_flag_only - 2019/12/06 06:33:11.448387 [INFO] agent: started state syncer
2019/12/06 06:33:11 [WARN]  raft: Heartbeat timeout from "" reached, starting election
2019/12/06 06:33:11 [INFO]  raft: Node at 127.0.0.1:10018 [Candidate] entering Candidate state in term 2
2019/12/06 06:33:12 [INFO]  raft: Election won. Tally: 1
2019/12/06 06:33:12 [INFO]  raft: Node at 127.0.0.1:10018 [Leader] entering Leader state
TestRegisterMonitor_heartbeat - 2019/12/06 06:33:12.348462 [INFO] consul: cluster leadership acquired
2019/12/06 06:33:12 [INFO]  raft: Election won. Tally: 1
TestRegisterMonitor_heartbeat - 2019/12/06 06:33:12.349110 [INFO] consul: New leader elected: Node bacf3c14-56d7-063a-afe3-38eeb3983c62
2019/12/06 06:33:12 [INFO]  raft: Node at 127.0.0.1:10006 [Leader] entering Leader state
2019/12/06 06:33:12 [INFO]  raft: Election won. Tally: 1
2019/12/06 06:33:12 [INFO]  raft: Node at 127.0.0.1:10012 [Leader] entering Leader state
TestRegisterMonitor_good - 2019/12/06 06:33:12.349723 [INFO] consul: cluster leadership acquired
TestCommandConfigWatcher/-service_flag_only - 2019/12/06 06:33:12.349852 [INFO] consul: cluster leadership acquired
TestRegisterMonitor_good - 2019/12/06 06:33:12.350136 [INFO] consul: New leader elected: Node 560320fc-06e2-8913-b5ee-a028351d5117
TestCommandConfigWatcher/-service_flag_only - 2019/12/06 06:33:12.350161 [INFO] consul: New leader elected: Node 8ff8de57-ee24-19b7-b176-c73506c07bc4
TestCommandConfigWatcher/-service_flag_only - 2019/12/06 06:33:12.354838 [INFO] agent: Requesting shutdown
TestCommandConfigWatcher/-service_flag_only - 2019/12/06 06:33:12.355047 [INFO] consul: shutting down server
TestCommandConfigWatcher/-service_flag_only - 2019/12/06 06:33:12.355110 [WARN] serf: Shutdown without a Leave
TestCommandConfigWatcher/-service_flag_only - 2019/12/06 06:33:12.355841 [ERR] agent: failed to sync remote state: No cluster leader
TestCommandConfigWatcher/-service_flag_only - 2019/12/06 06:33:12.525868 [WARN] serf: Shutdown without a Leave
TestRegisterMonitor_good - 2019/12/06 06:33:12.591925 [DEBUG] http: Request GET /v1/catalog/service/foo-proxy?stale= (4.46877ms) from=127.0.0.1:46314
TestCommandConfigWatcher/-service_flag_only - 2019/12/06 06:33:12.663734 [INFO] manager: shutting down
TestCommandConfigWatcher/-service_flag_only - 2019/12/06 06:33:12.663859 [ERR] consul: failed to wait for barrier: raft is already shutdown
TestCommandConfigWatcher/-service_flag_only - 2019/12/06 06:33:12.663880 [WARN] agent: Syncing service "one-sidecar" failed. raft is already shutdown
TestCommandConfigWatcher/-service_flag_only - 2019/12/06 06:33:12.663970 [ERR] agent: failed to sync remote state: raft is already shutdown
TestCommandConfigWatcher/-service_flag_only - 2019/12/06 06:33:12.664377 [INFO] agent: consul server down
TestCommandConfigWatcher/-service_flag_only - 2019/12/06 06:33:12.664444 [INFO] agent: shutdown complete
TestCommandConfigWatcher/-service_flag_only - 2019/12/06 06:33:12.664503 [INFO] agent: Stopping DNS server 127.0.0.1:10007 (tcp)
TestCommandConfigWatcher/-service_flag_only - 2019/12/06 06:33:12.664659 [INFO] agent: Stopping DNS server 127.0.0.1:10007 (udp)
TestCommandConfigWatcher/-service_flag_only - 2019/12/06 06:33:12.664829 [INFO] agent: Stopping HTTP server 127.0.0.1:10008 (tcp)
TestCommandConfigWatcher/-service_flag_only - 2019/12/06 06:33:12.665058 [INFO] agent: Waiting for endpoints to shut down
TestCommandConfigWatcher/-service_flag_only - 2019/12/06 06:33:12.665142 [INFO] agent: Endpoints down
=== RUN   TestCommandConfigWatcher/-service_flag_with_upstreams
WARNING: bootstrap = true: do not enable unless necessary
TestCommandConfigWatcher/-service_flag_with_upstreams - 2019/12/06 06:33:12.782754 [WARN] agent: Node name "Node a3c03942-7860-c16d-2f68-4d01f6108505" will not be discoverable via DNS due to invalid characters. Valid characters include all alpha-numerics and dashes.
TestCommandConfigWatcher/-service_flag_with_upstreams - 2019/12/06 06:33:12.783295 [DEBUG] tlsutil: Update with version 1
TestCommandConfigWatcher/-service_flag_with_upstreams - 2019/12/06 06:33:12.790573 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
TestRegisterMonitor_good - 2019/12/06 06:33:12.931706 [INFO] agent: Synced node info
TestRegisterMonitor_heartbeat - 2019/12/06 06:33:12.934707 [INFO] agent: Synced node info
TestRegisterMonitor_good - 2019/12/06 06:33:12.951608 [DEBUG] http: Request GET /v1/agent/services (364.173129ms) from=127.0.0.1:46312
TestRegisterMonitor_good - 2019/12/06 06:33:12.955088 [ERR] leaf watch error: invalid type for leaf response: <nil>
TestRegisterMonitor_good - 2019/12/06 06:33:12.997228 [DEBUG] http: Request GET /v1/agent/services (1.215695ms) from=127.0.0.1:46312
TestRegisterMonitor_good - 2019/12/06 06:33:13.868193 [INFO] agent: Synced service "foo-proxy"
TestRegisterMonitor_good - 2019/12/06 06:33:13.868339 [DEBUG] agent: Check "foo-proxy-ttl" in sync
TestRegisterMonitor_good - 2019/12/06 06:33:13.868381 [DEBUG] agent: Node info in sync
TestRegisterMonitor_good - 2019/12/06 06:33:13.868452 [DEBUG] http: Request PUT /v1/agent/service/register (1.272051892s) from=127.0.0.1:46314
2019/12/06 06:33:13 [INFO] proxy: registered Consul service: foo-proxy
2019/12/06 06:33:13 [INFO] proxy: stop request received, deregistering
TestRegisterMonitor_good - 2019/12/06 06:33:13.906426 [DEBUG] agent: removed check "foo-proxy-ttl"
TestRegisterMonitor_good - 2019/12/06 06:33:13.906648 [DEBUG] agent: removed service "foo-proxy"
2019/12/06 06:33:14 [INFO]  raft: Initial configuration (index=1): [{Suffrage:Voter ID:a3c03942-7860-c16d-2f68-4d01f6108505 Address:127.0.0.1:10024}]
2019/12/06 06:33:14 [INFO]  raft: Node at 127.0.0.1:10024 [Follower] entering Follower state (Leader: "")
TestCommandConfigWatcher/-service_flag_with_upstreams - 2019/12/06 06:33:14.008606 [INFO] serf: EventMemberJoin: Node a3c03942-7860-c16d-2f68-4d01f6108505.dc1 127.0.0.1
TestCommandConfigWatcher/-service_flag_with_upstreams - 2019/12/06 06:33:14.017029 [INFO] serf: EventMemberJoin: Node a3c03942-7860-c16d-2f68-4d01f6108505 127.0.0.1
TestCommandConfigWatcher/-service_flag_with_upstreams - 2019/12/06 06:33:14.017921 [INFO] consul: Adding LAN server Node a3c03942-7860-c16d-2f68-4d01f6108505 (Addr: tcp/127.0.0.1:10024) (DC: dc1)
TestCommandConfigWatcher/-service_flag_with_upstreams - 2019/12/06 06:33:14.018198 [INFO] consul: Handled member-join event for server "Node a3c03942-7860-c16d-2f68-4d01f6108505.dc1" in area "wan"
TestCommandConfigWatcher/-service_flag_with_upstreams - 2019/12/06 06:33:14.020637 [INFO] agent: Started DNS server 127.0.0.1:10019 (udp)
TestCommandConfigWatcher/-service_flag_with_upstreams - 2019/12/06 06:33:14.021198 [INFO] agent: Started DNS server 127.0.0.1:10019 (tcp)
TestCommandConfigWatcher/-service_flag_with_upstreams - 2019/12/06 06:33:14.023473 [INFO] agent: Started HTTP server on 127.0.0.1:10020 (tcp)
TestCommandConfigWatcher/-service_flag_with_upstreams - 2019/12/06 06:33:14.023569 [INFO] agent: started state syncer
2019/12/06 06:33:14 [WARN]  raft: Heartbeat timeout from "" reached, starting election
2019/12/06 06:33:14 [INFO]  raft: Node at 127.0.0.1:10024 [Candidate] entering Candidate state in term 2
TestRegisterMonitor_heartbeat - 2019/12/06 06:33:14.304309 [DEBUG] agent: Node info in sync
TestRegisterMonitor_heartbeat - 2019/12/06 06:33:14.304427 [DEBUG] agent: Node info in sync
TestRegisterMonitor_good - 2019/12/06 06:33:14.616798 [INFO] agent: Deregistered service "foo-proxy"
TestRegisterMonitor_heartbeat - 2019/12/06 06:33:14.790742 [INFO] connect: initialized primary datacenter CA with provider "consul"
TestRegisterMonitor_heartbeat - 2019/12/06 06:33:14.791300 [DEBUG] consul: Skipping self join check for "Node bacf3c14-56d7-063a-afe3-38eeb3983c62" since the cluster is too small
TestRegisterMonitor_heartbeat - 2019/12/06 06:33:14.791471 [INFO] consul: member 'Node bacf3c14-56d7-063a-afe3-38eeb3983c62' joined, marking health alive
TestRegisterMonitor_heartbeat - 2019/12/06 06:33:15.097719 [DEBUG] http: Request GET /v1/agent/services (1.116026ms) from=127.0.0.1:51582
TestRegisterMonitor_heartbeat - 2019/12/06 06:33:15.100319 [DEBUG] http: Request GET /v1/catalog/service/foo-proxy?stale= (1.263029ms) from=127.0.0.1:51584
TestRegisterMonitor_heartbeat - 2019/12/06 06:33:15.132080 [DEBUG] http: Request GET /v1/agent/services (2.01638ms) from=127.0.0.1:51582
TestRegisterMonitor_heartbeat - 2019/12/06 06:33:15.137067 [DEBUG] http: Request GET /v1/agent/checks (722.683µs) from=127.0.0.1:51582
TestRegisterMonitor_good - 2019/12/06 06:33:15.270460 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
TestRegisterMonitor_good - 2019/12/06 06:33:15.271069 [INFO] agent: Deregistered check "foo-proxy-ttl"
TestRegisterMonitor_good - 2019/12/06 06:33:15.271133 [DEBUG] agent: Node info in sync
TestRegisterMonitor_good - 2019/12/06 06:33:15.271220 [DEBUG] http: Request PUT /v1/agent/service/deregister/foo-proxy (1.374618609s) from=127.0.0.1:46314
TestRegisterMonitor_good - 2019/12/06 06:33:15.271597 [DEBUG] agent: Node info in sync
TestRegisterMonitor_good - 2019/12/06 06:33:15.271687 [DEBUG] agent: Node info in sync
2019/12/06 06:33:15 [INFO]  raft: Election won. Tally: 1
2019/12/06 06:33:15 [INFO]  raft: Node at 127.0.0.1:10024 [Leader] entering Leader state
TestCommandConfigWatcher/-service_flag_with_upstreams - 2019/12/06 06:33:15.272118 [INFO] consul: cluster leadership acquired
TestCommandConfigWatcher/-service_flag_with_upstreams - 2019/12/06 06:33:15.272584 [INFO] consul: New leader elected: Node a3c03942-7860-c16d-2f68-4d01f6108505
TestRegisterMonitor_good - 2019/12/06 06:33:15.289511 [DEBUG] http: Request GET /v1/agent/services (631.015µs) from=127.0.0.1:46314
TestRegisterMonitor_good - 2019/12/06 06:33:15.290886 [INFO] agent: Requesting shutdown
TestRegisterMonitor_good - 2019/12/06 06:33:15.290978 [INFO] consul: shutting down server
TestRegisterMonitor_good - 2019/12/06 06:33:15.291027 [WARN] serf: Shutdown without a Leave
TestCommandConfigWatcher/-service_flag_with_upstreams - 2019/12/06 06:33:15.325742 [INFO] agent: Requesting shutdown
TestCommandConfigWatcher/-service_flag_with_upstreams - 2019/12/06 06:33:15.326018 [INFO] consul: shutting down server
TestCommandConfigWatcher/-service_flag_with_upstreams - 2019/12/06 06:33:15.326086 [WARN] serf: Shutdown without a Leave
TestCommandConfigWatcher/-service_flag_with_upstreams - 2019/12/06 06:33:15.326892 [ERR] agent: failed to sync remote state: No cluster leader
TestRegisterMonitor_good - 2019/12/06 06:33:15.471958 [WARN] serf: Shutdown without a Leave
TestRegisterMonitor_heartbeat - 2019/12/06 06:33:15.477352 [DEBUG] agent: Check "foo-proxy-ttl" status is now critical
TestCommandConfigWatcher/-service_flag_with_upstreams - 2019/12/06 06:33:15.630330 [WARN] serf: Shutdown without a Leave
TestRegisterMonitor_good - 2019/12/06 06:33:15.631328 [INFO] connect: initialized primary datacenter CA with provider "consul"
TestRegisterMonitor_good - 2019/12/06 06:33:15.631470 [INFO] manager: shutting down
TestRegisterMonitor_good - 2019/12/06 06:33:15.631828 [ERR] consul: failed to get raft configuration: raft is already shutdown
TestRegisterMonitor_good - 2019/12/06 06:33:15.632116 [ERR] consul: failed to reconcile member: {Node 560320fc-06e2-8913-b5ee-a028351d5117 127.0.0.1 10004 map[acls:0 bootstrap:1 build:1.5.2: dc:dc1 id:560320fc-06e2-8913-b5ee-a028351d5117 port:10006 raft_vsn:3 role:consul segment: vsn:2 vsn_max:3 vsn_min:2 wan_join_port:10005] alive 1 5 2 2 5 4}: raft is already shutdown
TestRegisterMonitor_good - 2019/12/06 06:33:15.632220 [INFO] agent: consul server down
TestRegisterMonitor_good - 2019/12/06 06:33:15.632270 [INFO] agent: shutdown complete
TestRegisterMonitor_good - 2019/12/06 06:33:15.632330 [INFO] agent: Stopping DNS server 127.0.0.1:10001 (tcp)
TestRegisterMonitor_good - 2019/12/06 06:33:15.632485 [INFO] agent: Stopping DNS server 127.0.0.1:10001 (udp)
TestRegisterMonitor_good - 2019/12/06 06:33:15.634133 [INFO] agent: Stopping HTTP server 127.0.0.1:10002 (tcp)
TestRegisterMonitor_heartbeat - 2019/12/06 06:33:15.634341 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
TestRegisterMonitor_good - 2019/12/06 06:33:15.633851 [WARN] consul: error getting server health from "Node 560320fc-06e2-8913-b5ee-a028351d5117": rpc error making call: EOF
2019/12/06 06:33:15 [ERR] yamux: Failed to write body: write tcp 127.0.0.1:10006->127.0.0.1:53447: write: broken pipe
TestRegisterMonitor_good - 2019/12/06 06:33:15.635351 [INFO] agent: Waiting for endpoints to shut down
TestRegisterMonitor_good - 2019/12/06 06:33:15.635448 [INFO] agent: Endpoints down
--- PASS: TestRegisterMonitor_good (6.33s)
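
The TestRegisterMonitor_good trace above exercises the agent HTTP API end to end: the proxy registers a service with PUT /v1/agent/service/register, the test reads it back via GET /v1/agent/services and GET /v1/catalog/service/foo-proxy?stale=, and shutdown removes it with PUT /v1/agent/service/deregister/foo-proxy. As a rough illustration only, the sketch below replays that request sequence against a local agent using Go's standard net/http; the agent address (127.0.0.1:8500), the register payload fields, and the TTL check ID are assumptions taken from this log, not something produced by the build.

    // regmonitor_sketch.go -- replays the register/query/deregister calls seen
    // in the TestRegisterMonitor_good log above against a local Consul agent.
    // Agent address and payload shape are assumptions for illustration only.
    package main

    import (
    	"bytes"
    	"fmt"
    	"io/ioutil"
    	"net/http"
    )

    const agent = "http://127.0.0.1:8500"

    // do issues one HTTP request to the agent and prints the status and body.
    func do(method, path, body string) {
    	req, err := http.NewRequest(method, agent+path, bytes.NewBufferString(body))
    	if err != nil {
    		fmt.Println(err)
    		return
    	}
    	resp, err := http.DefaultClient.Do(req)
    	if err != nil {
    		fmt.Println(err)
    		return
    	}
    	defer resp.Body.Close()
    	out, _ := ioutil.ReadAll(resp.Body)
    	fmt.Printf("%s %s -> %s %s\n", method, path, resp.Status, out)
    }

    func main() {
    	// Register the proxy service with a TTL check, mirroring the
    	// PUT /v1/agent/service/register request recorded in the log.
    	reg := `{"Name":"foo-proxy","Check":{"CheckID":"foo-proxy-ttl","TTL":"30s"}}`
    	do("PUT", "/v1/agent/service/register", reg)

    	// The test then reads the service back from the agent and the catalog.
    	do("GET", "/v1/agent/services", "")
    	do("GET", "/v1/catalog/service/foo-proxy?stale=", "")

    	// On stop the proxy deregisters, mirroring the final PUT in the log.
    	do("PUT", "/v1/agent/service/deregister/foo-proxy", "")
    }
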
TestCommandConfigWatcher/-service_flag_with_upstreams - 2019/12/06 06:33:15.785218 [INFO] manager: shutting down
TestCommandConfigWatcher/-service_flag_with_upstreams - 2019/12/06 06:33:15.785472 [WARN] agent: Syncing service "one-sidecar-sidecar-proxy" failed. raft is already shutdown
TestCommandConfigWatcher/-service_flag_with_upstreams - 2019/12/06 06:33:15.785540 [ERR] agent: failed to sync remote state: raft is already shutdown
TestCommandConfigWatcher/-service_flag_with_upstreams - 2019/12/06 06:33:15.913900 [INFO] agent: consul server down
TestCommandConfigWatcher/-service_flag_with_upstreams - 2019/12/06 06:33:15.913977 [INFO] agent: shutdown complete
TestCommandConfigWatcher/-service_flag_with_upstreams - 2019/12/06 06:33:15.914041 [INFO] agent: Stopping DNS server 127.0.0.1:10019 (tcp)
TestCommandConfigWatcher/-service_flag_with_upstreams - 2019/12/06 06:33:15.914376 [ERR] consul: failed to wait for barrier: leadership lost while committing log
TestCommandConfigWatcher/-service_flag_with_upstreams - 2019/12/06 06:33:15.914612 [INFO] agent: Stopping DNS server 127.0.0.1:10019 (udp)
TestCommandConfigWatcher/-service_flag_with_upstreams - 2019/12/06 06:33:15.914804 [INFO] agent: Stopping HTTP server 127.0.0.1:10020 (tcp)
TestCommandConfigWatcher/-service_flag_with_upstreams - 2019/12/06 06:33:15.915022 [INFO] agent: Waiting for endpoints to shut down
TestCommandConfigWatcher/-service_flag_with_upstreams - 2019/12/06 06:33:15.915267 [INFO] agent: Endpoints down
=== RUN   TestCommandConfigWatcher/-service_flag_with_-service-addr
TestRegisterMonitor_heartbeat - 2019/12/06 06:33:15.992638 [INFO] agent: Synced service "foo-proxy"
TestRegisterMonitor_heartbeat - 2019/12/06 06:33:15.992748 [DEBUG] agent: Check "foo-proxy-ttl" in sync
TestRegisterMonitor_heartbeat - 2019/12/06 06:33:15.992786 [DEBUG] agent: Node info in sync
TestRegisterMonitor_heartbeat - 2019/12/06 06:33:15.992882 [DEBUG] http: Request PUT /v1/agent/service/register (888.886989ms) from=127.0.0.1:51584
2019/12/06 06:33:15 [INFO] proxy: registered Consul service: foo-proxy
TestRegisterMonitor_heartbeat - 2019/12/06 06:33:16.001937 [DEBUG] agent: Service "foo-proxy" in sync
WARNING: bootstrap = true: do not enable unless necessary
TestCommandConfigWatcher/-service_flag_with_-service-addr - 2019/12/06 06:33:16.013994 [WARN] agent: Node name "Node 2d5086e9-0bde-f896-8aae-23a5c1954884" will not be discoverable via DNS due to invalid characters. Valid characters include all alpha-numerics and dashes.
TestCommandConfigWatcher/-service_flag_with_-service-addr - 2019/12/06 06:33:16.014691 [DEBUG] tlsutil: Update with version 1
TestCommandConfigWatcher/-service_flag_with_-service-addr - 2019/12/06 06:33:16.017894 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
TestRegisterMonitor_heartbeat - 2019/12/06 06:33:16.062546 [DEBUG] agent: Check "foo-proxy-ttl" status is now passing
TestRegisterMonitor_good - 2019/12/06 06:33:16.271590 [WARN] consul: error getting server health from "Node 560320fc-06e2-8913-b5ee-a028351d5117": context deadline exceeded
TestRegisterMonitor_heartbeat - 2019/12/06 06:33:16.499003 [INFO] agent: Synced check "foo-proxy-ttl"
TestRegisterMonitor_heartbeat - 2019/12/06 06:33:16.499083 [DEBUG] agent: Node info in sync
TestRegisterMonitor_heartbeat - 2019/12/06 06:33:16.499253 [DEBUG] agent: Service "foo-proxy" in sync
TestRegisterMonitor_heartbeat - 2019/12/06 06:33:16.499315 [DEBUG] agent: Check "foo-proxy-ttl" in sync
TestRegisterMonitor_heartbeat - 2019/12/06 06:33:16.499367 [DEBUG] agent: Node info in sync
TestRegisterMonitor_heartbeat - 2019/12/06 06:33:16.499497 [DEBUG] agent: Service "foo-proxy" in sync
TestRegisterMonitor_heartbeat - 2019/12/06 06:33:16.499560 [DEBUG] agent: Check "foo-proxy-ttl" in sync
TestRegisterMonitor_heartbeat - 2019/12/06 06:33:16.499598 [DEBUG] agent: Node info in sync
TestRegisterMonitor_heartbeat - 2019/12/06 06:33:16.499695 [DEBUG] http: Request PUT /v1/agent/check/fail/foo-proxy-ttl?note= (1.357671215s) from=127.0.0.1:51582
TestRegisterMonitor_heartbeat - 2019/12/06 06:33:16.502018 [DEBUG] agent: Service "foo-proxy" in sync
TestRegisterMonitor_heartbeat - 2019/12/06 06:33:16.503296 [DEBUG] http: Request GET /v1/agent/checks (1.759041ms) from=127.0.0.1:51582
TestRegisterMonitor_heartbeat - 2019/12/06 06:33:16.707815 [INFO] agent: Synced check "foo-proxy-ttl"
TestRegisterMonitor_heartbeat - 2019/12/06 06:33:16.707895 [DEBUG] agent: Node info in sync
TestRegisterMonitor_heartbeat - 2019/12/06 06:33:16.707987 [DEBUG] http: Request PUT /v1/agent/check/pass/foo-proxy-ttl?note= (645.480999ms) from=127.0.0.1:51584
TestRegisterMonitor_heartbeat - 2019/12/06 06:33:16.708092 [DEBUG] agent: Service "foo-proxy" in sync
TestRegisterMonitor_heartbeat - 2019/12/06 06:33:16.708154 [DEBUG] agent: Check "foo-proxy-ttl" in sync
TestRegisterMonitor_heartbeat - 2019/12/06 06:33:16.708193 [DEBUG] agent: Node info in sync
2019/12/06 06:33:16 [INFO] proxy: stop request received, deregistering
TestRegisterMonitor_heartbeat - 2019/12/06 06:33:16.891764 [INFO] agent: Deregistered service "foo-proxy"
TestRegisterMonitor_heartbeat - 2019/12/06 06:33:17.172345 [INFO] agent: Deregistered check "foo-proxy-ttl"
TestRegisterMonitor_heartbeat - 2019/12/06 06:33:17.172460 [DEBUG] agent: Node info in sync
TestRegisterMonitor_heartbeat - 2019/12/06 06:33:17.172591 [DEBUG] agent: Node info in sync
TestRegisterMonitor_heartbeat - 2019/12/06 06:33:17.174087 [DEBUG] agent: removed check "foo-proxy-ttl"
TestRegisterMonitor_heartbeat - 2019/12/06 06:33:17.174161 [DEBUG] agent: removed service "foo-proxy"
TestRegisterMonitor_heartbeat - 2019/12/06 06:33:17.175511 [DEBUG] agent: Node info in sync
TestRegisterMonitor_heartbeat - 2019/12/06 06:33:17.175582 [DEBUG] http: Request PUT /v1/agent/service/deregister/foo-proxy (465.452482ms) from=127.0.0.1:51584
TestRegisterMonitor_heartbeat - 2019/12/06 06:33:17.176230 [INFO] agent: Requesting shutdown
TestRegisterMonitor_heartbeat - 2019/12/06 06:33:17.176313 [INFO] consul: shutting down server
TestRegisterMonitor_heartbeat - 2019/12/06 06:33:17.176362 [WARN] serf: Shutdown without a Leave
TestRegisterMonitor_heartbeat - 2019/12/06 06:33:17.280287 [WARN] serf: Shutdown without a Leave
2019/12/06 06:33:17 [INFO]  raft: Initial configuration (index=1): [{Suffrage:Voter ID:2d5086e9-0bde-f896-8aae-23a5c1954884 Address:127.0.0.1:10030}]
2019/12/06 06:33:17 [INFO]  raft: Node at 127.0.0.1:10030 [Follower] entering Follower state (Leader: "")
TestCommandConfigWatcher/-service_flag_with_-service-addr - 2019/12/06 06:33:17.285739 [INFO] serf: EventMemberJoin: Node 2d5086e9-0bde-f896-8aae-23a5c1954884.dc1 127.0.0.1
TestCommandConfigWatcher/-service_flag_with_-service-addr - 2019/12/06 06:33:17.290664 [INFO] serf: EventMemberJoin: Node 2d5086e9-0bde-f896-8aae-23a5c1954884 127.0.0.1
TestCommandConfigWatcher/-service_flag_with_-service-addr - 2019/12/06 06:33:17.291817 [INFO] consul: Handled member-join event for server "Node 2d5086e9-0bde-f896-8aae-23a5c1954884.dc1" in area "wan"
TestCommandConfigWatcher/-service_flag_with_-service-addr - 2019/12/06 06:33:17.292173 [INFO] consul: Adding LAN server Node 2d5086e9-0bde-f896-8aae-23a5c1954884 (Addr: tcp/127.0.0.1:10030) (DC: dc1)
TestCommandConfigWatcher/-service_flag_with_-service-addr - 2019/12/06 06:33:17.294479 [INFO] agent: Started DNS server 127.0.0.1:10025 (udp)
TestCommandConfigWatcher/-service_flag_with_-service-addr - 2019/12/06 06:33:17.295110 [INFO] agent: Started DNS server 127.0.0.1:10025 (tcp)
TestCommandConfigWatcher/-service_flag_with_-service-addr - 2019/12/06 06:33:17.297574 [INFO] agent: Started HTTP server on 127.0.0.1:10026 (tcp)
TestCommandConfigWatcher/-service_flag_with_-service-addr - 2019/12/06 06:33:17.297796 [INFO] agent: started state syncer
2019/12/06 06:33:17 [WARN]  raft: Heartbeat timeout from "" reached, starting election
2019/12/06 06:33:17 [INFO]  raft: Node at 127.0.0.1:10030 [Candidate] entering Candidate state in term 2
TestRegisterMonitor_heartbeat - 2019/12/06 06:33:17.405835 [INFO] manager: shutting down
TestRegisterMonitor_heartbeat - 2019/12/06 06:33:17.406873 [INFO] agent: consul server down
TestRegisterMonitor_heartbeat - 2019/12/06 06:33:17.406965 [INFO] agent: shutdown complete
TestRegisterMonitor_heartbeat - 2019/12/06 06:33:17.407057 [INFO] agent: Stopping DNS server 127.0.0.1:10013 (tcp)
TestRegisterMonitor_heartbeat - 2019/12/06 06:33:17.407264 [INFO] agent: Stopping DNS server 127.0.0.1:10013 (udp)
TestRegisterMonitor_heartbeat - 2019/12/06 06:33:17.407487 [INFO] agent: Stopping HTTP server 127.0.0.1:10014 (tcp)
TestRegisterMonitor_heartbeat - 2019/12/06 06:33:17.408223 [INFO] agent: Waiting for endpoints to shut down
TestRegisterMonitor_heartbeat - 2019/12/06 06:33:17.408320 [INFO] agent: Endpoints down
--- PASS: TestRegisterMonitor_heartbeat (8.10s)
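
The TestRegisterMonitor_heartbeat trace adds the TTL heartbeat path on top of the same lifecycle: the log records the check going critical via PUT /v1/agent/check/fail/foo-proxy-ttl?note= and back to passing via PUT /v1/agent/check/pass/foo-proxy-ttl?note=, with GET /v1/agent/checks used to observe the transitions. A minimal, self-contained sketch of that loop follows; again, the local agent address is an assumption and the check ID is simply the one that appears in this log.

    // heartbeat_sketch.go -- mirrors the TTL check fail/pass calls from the
    // TestRegisterMonitor_heartbeat log above; agent address is an assumption.
    package main

    import (
    	"fmt"
    	"net/http"
    	"strings"
    )

    // put sends an empty-body PUT to the agent and prints the resulting status.
    func put(path string) {
    	req, err := http.NewRequest("PUT", "http://127.0.0.1:8500"+path, strings.NewReader(""))
    	if err != nil {
    		fmt.Println(err)
    		return
    	}
    	resp, err := http.DefaultClient.Do(req)
    	if err != nil {
    		fmt.Println(err)
    		return
    	}
    	resp.Body.Close()
    	fmt.Println("PUT", path, "->", resp.Status)
    }

    func main() {
    	// Drive the TTL check through the transitions recorded in the log:
    	// critical via .../check/fail/..., then passing via .../check/pass/...
    	put("/v1/agent/check/fail/foo-proxy-ttl?note=")
    	put("/v1/agent/check/pass/foo-proxy-ttl?note=")
    }
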
2019/12/06 06:33:18 [INFO]  raft: Election won. Tally: 1
2019/12/06 06:33:18 [INFO]  raft: Node at 127.0.0.1:10030 [Leader] entering Leader state
TestCommandConfigWatcher/-service_flag_with_-service-addr - 2019/12/06 06:33:18.007264 [INFO] consul: cluster leadership acquired
TestCommandConfigWatcher/-service_flag_with_-service-addr - 2019/12/06 06:33:18.007721 [INFO] consul: New leader elected: Node 2d5086e9-0bde-f896-8aae-23a5c1954884
TestCommandConfigWatcher/-service_flag_with_-service-addr - 2019/12/06 06:33:18.087651 [INFO] agent: Requesting shutdown
TestCommandConfigWatcher/-service_flag_with_-service-addr - 2019/12/06 06:33:18.088094 [INFO] consul: shutting down server
TestCommandConfigWatcher/-service_flag_with_-service-addr - 2019/12/06 06:33:18.088169 [WARN] serf: Shutdown without a Leave
TestCommandConfigWatcher/-service_flag_with_-service-addr - 2019/12/06 06:33:18.088231 [ERR] agent: failed to sync remote state: No cluster leader
TestCommandConfigWatcher/-service_flag_with_-service-addr - 2019/12/06 06:33:18.155343 [WARN] serf: Shutdown without a Leave
TestCommandConfigWatcher/-service_flag_with_-service-addr - 2019/12/06 06:33:18.247072 [INFO] manager: shutting down
TestCommandConfigWatcher/-service_flag_with_-service-addr - 2019/12/06 06:33:18.414131 [INFO] agent: consul server down
TestCommandConfigWatcher/-service_flag_with_-service-addr - 2019/12/06 06:33:18.414301 [INFO] agent: shutdown complete
TestCommandConfigWatcher/-service_flag_with_-service-addr - 2019/12/06 06:33:18.414381 [INFO] agent: Stopping DNS server 127.0.0.1:10025 (tcp)
TestCommandConfigWatcher/-service_flag_with_-service-addr - 2019/12/06 06:33:18.414600 [INFO] agent: Stopping DNS server 127.0.0.1:10025 (udp)
TestCommandConfigWatcher/-service_flag_with_-service-addr - 2019/12/06 06:33:18.414818 [INFO] agent: Stopping HTTP server 127.0.0.1:10026 (tcp)
TestCommandConfigWatcher/-service_flag_with_-service-addr - 2019/12/06 06:33:18.415081 [INFO] agent: Waiting for endpoints to shut down
TestCommandConfigWatcher/-service_flag_with_-service-addr - 2019/12/06 06:33:18.415171 [INFO] agent: Endpoints down
=== RUN   TestCommandConfigWatcher/-service,_-service-addr,_-listen
TestCommandConfigWatcher/-service_flag_with_-service-addr - 2019/12/06 06:33:18.417903 [ERR] consul: failed to wait for barrier: leadership lost while committing log
WARNING: bootstrap = true: do not enable unless necessary
TestCommandConfigWatcher/-service,_-service-addr,_-listen - 2019/12/06 06:33:18.483855 [WARN] agent: Node name "Node c7f83723-812b-5a39-92b1-4e7e264588f1" will not be discoverable via DNS due to invalid characters. Valid characters include all alpha-numerics and dashes.
TestCommandConfigWatcher/-service,_-service-addr,_-listen - 2019/12/06 06:33:18.484388 [DEBUG] tlsutil: Update with version 1
TestCommandConfigWatcher/-service,_-service-addr,_-listen - 2019/12/06 06:33:18.486700 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
2019/12/06 06:33:19 [INFO]  raft: Initial configuration (index=1): [{Suffrage:Voter ID:c7f83723-812b-5a39-92b1-4e7e264588f1 Address:127.0.0.1:10036}]
2019/12/06 06:33:19 [INFO]  raft: Node at 127.0.0.1:10036 [Follower] entering Follower state (Leader: "")
TestCommandConfigWatcher/-service,_-service-addr,_-listen - 2019/12/06 06:33:19.601614 [INFO] serf: EventMemberJoin: Node c7f83723-812b-5a39-92b1-4e7e264588f1.dc1 127.0.0.1
TestCommandConfigWatcher/-service,_-service-addr,_-listen - 2019/12/06 06:33:19.606867 [INFO] serf: EventMemberJoin: Node c7f83723-812b-5a39-92b1-4e7e264588f1 127.0.0.1
TestCommandConfigWatcher/-service,_-service-addr,_-listen - 2019/12/06 06:33:19.607771 [INFO] consul: Handled member-join event for server "Node c7f83723-812b-5a39-92b1-4e7e264588f1.dc1" in area "wan"
TestCommandConfigWatcher/-service,_-service-addr,_-listen - 2019/12/06 06:33:19.608089 [INFO] consul: Adding LAN server Node c7f83723-812b-5a39-92b1-4e7e264588f1 (Addr: tcp/127.0.0.1:10036) (DC: dc1)
TestCommandConfigWatcher/-service,_-service-addr,_-listen - 2019/12/06 06:33:19.610293 [INFO] agent: Started DNS server 127.0.0.1:10031 (udp)
TestCommandConfigWatcher/-service,_-service-addr,_-listen - 2019/12/06 06:33:19.610693 [INFO] agent: Started DNS server 127.0.0.1:10031 (tcp)
TestCommandConfigWatcher/-service,_-service-addr,_-listen - 2019/12/06 06:33:19.615074 [INFO] agent: Started HTTP server on 127.0.0.1:10032 (tcp)
TestCommandConfigWatcher/-service,_-service-addr,_-listen - 2019/12/06 06:33:19.615186 [INFO] agent: started state syncer
2019/12/06 06:33:19 [WARN]  raft: Heartbeat timeout from "" reached, starting election
2019/12/06 06:33:19 [INFO]  raft: Node at 127.0.0.1:10036 [Candidate] entering Candidate state in term 2
2019/12/06 06:33:20 [INFO]  raft: Election won. Tally: 1
2019/12/06 06:33:20 [INFO]  raft: Node at 127.0.0.1:10036 [Leader] entering Leader state
TestCommandConfigWatcher/-service,_-service-addr,_-listen - 2019/12/06 06:33:20.265307 [INFO] consul: cluster leadership acquired
TestCommandConfigWatcher/-service,_-service-addr,_-listen - 2019/12/06 06:33:20.265775 [INFO] consul: New leader elected: Node c7f83723-812b-5a39-92b1-4e7e264588f1
TestCommandConfigWatcher/-service,_-service-addr,_-listen - 2019/12/06 06:33:20.431936 [ERR] leaf watch error: invalid type for leaf response: <nil>
TestCommandConfigWatcher/-service,_-service-addr,_-listen - 2019/12/06 06:33:20.431941 [ERR] leaf watch error: invalid type for leaf response: <nil>
TestCommandConfigWatcher/-service,_-service-addr,_-listen - 2019/12/06 06:33:20.432249 [ERR] leaf watch error: invalid type for leaf response: <nil>
TestCommandConfigWatcher/-service,_-service-addr,_-listen - 2019/12/06 06:33:20.475844 [INFO] agent: Requesting shutdown
TestCommandConfigWatcher/-service,_-service-addr,_-listen - 2019/12/06 06:33:20.476100 [INFO] consul: shutting down server
TestCommandConfigWatcher/-service,_-service-addr,_-listen - 2019/12/06 06:33:20.476164 [WARN] serf: Shutdown without a Leave
TestCommandConfigWatcher/-service,_-service-addr,_-listen - 2019/12/06 06:33:20.476264 [ERR] agent: failed to sync remote state: No cluster leader
TestCommandConfigWatcher/-service,_-service-addr,_-listen - 2019/12/06 06:33:20.597063 [WARN] serf: Shutdown without a Leave
TestCommandConfigWatcher/-service,_-service-addr,_-listen - 2019/12/06 06:33:20.697091 [INFO] manager: shutting down
TestCommandConfigWatcher/-service,_-service-addr,_-listen - 2019/12/06 06:33:20.788312 [ERR] agent: failed to sync remote state: No cluster leader
TestCommandConfigWatcher/-service,_-service-addr,_-listen - 2019/12/06 06:33:20.839703 [INFO] agent: consul server down
TestCommandConfigWatcher/-service,_-service-addr,_-listen - 2019/12/06 06:33:20.839791 [INFO] agent: shutdown complete
TestCommandConfigWatcher/-service,_-service-addr,_-listen - 2019/12/06 06:33:20.839828 [ERR] consul: failed to wait for barrier: leadership lost while committing log
TestCommandConfigWatcher/-service,_-service-addr,_-listen - 2019/12/06 06:33:20.839849 [INFO] agent: Stopping DNS server 127.0.0.1:10031 (tcp)
TestCommandConfigWatcher/-service,_-service-addr,_-listen - 2019/12/06 06:33:20.840052 [INFO] agent: Stopping DNS server 127.0.0.1:10031 (udp)
TestCommandConfigWatcher/-service,_-service-addr,_-listen - 2019/12/06 06:33:20.840223 [INFO] agent: Stopping HTTP server 127.0.0.1:10032 (tcp)
TestCommandConfigWatcher/-service,_-service-addr,_-listen - 2019/12/06 06:33:20.840439 [INFO] agent: Waiting for endpoints to shut down
TestCommandConfigWatcher/-service,_-service-addr,_-listen - 2019/12/06 06:33:20.840516 [INFO] agent: Endpoints down
=== RUN   TestCommandConfigWatcher/-sidecar-for,_no_sidecar
WARNING: bootstrap = true: do not enable unless necessary
TestCommandConfigWatcher/-sidecar-for,_no_sidecar - 2019/12/06 06:33:20.909683 [WARN] agent: Node name "Node d7c367de-bb3f-4c75-a8d2-5f508a05b5b7" will not be discoverable via DNS due to invalid characters. Valid characters include all alpha-numerics and dashes.
TestCommandConfigWatcher/-sidecar-for,_no_sidecar - 2019/12/06 06:33:20.910292 [DEBUG] tlsutil: Update with version 1
TestCommandConfigWatcher/-sidecar-for,_no_sidecar - 2019/12/06 06:33:20.912671 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
2019/12/06 06:33:21 [INFO]  raft: Initial configuration (index=1): [{Suffrage:Voter ID:d7c367de-bb3f-4c75-a8d2-5f508a05b5b7 Address:127.0.0.1:10042}]
2019/12/06 06:33:21 [INFO]  raft: Node at 127.0.0.1:10042 [Follower] entering Follower state (Leader: "")
TestCommandConfigWatcher/-sidecar-for,_no_sidecar - 2019/12/06 06:33:21.834538 [INFO] serf: EventMemberJoin: Node d7c367de-bb3f-4c75-a8d2-5f508a05b5b7.dc1 127.0.0.1
TestCommandConfigWatcher/-sidecar-for,_no_sidecar - 2019/12/06 06:33:21.837731 [INFO] serf: EventMemberJoin: Node d7c367de-bb3f-4c75-a8d2-5f508a05b5b7 127.0.0.1
TestCommandConfigWatcher/-sidecar-for,_no_sidecar - 2019/12/06 06:33:21.838838 [INFO] consul: Adding LAN server Node d7c367de-bb3f-4c75-a8d2-5f508a05b5b7 (Addr: tcp/127.0.0.1:10042) (DC: dc1)
TestCommandConfigWatcher/-sidecar-for,_no_sidecar - 2019/12/06 06:33:21.839146 [INFO] consul: Handled member-join event for server "Node d7c367de-bb3f-4c75-a8d2-5f508a05b5b7.dc1" in area "wan"
TestCommandConfigWatcher/-sidecar-for,_no_sidecar - 2019/12/06 06:33:21.840129 [INFO] agent: Started DNS server 127.0.0.1:10037 (udp)
TestCommandConfigWatcher/-sidecar-for,_no_sidecar - 2019/12/06 06:33:21.840498 [INFO] agent: Started DNS server 127.0.0.1:10037 (tcp)
TestCommandConfigWatcher/-sidecar-for,_no_sidecar - 2019/12/06 06:33:21.842731 [INFO] agent: Started HTTP server on 127.0.0.1:10038 (tcp)
TestCommandConfigWatcher/-sidecar-for,_no_sidecar - 2019/12/06 06:33:21.842821 [INFO] agent: started state syncer
2019/12/06 06:33:21 [WARN]  raft: Heartbeat timeout from "" reached, starting election
2019/12/06 06:33:21 [INFO]  raft: Node at 127.0.0.1:10042 [Candidate] entering Candidate state in term 2
2019/12/06 06:33:22 [INFO]  raft: Election won. Tally: 1
2019/12/06 06:33:22 [INFO]  raft: Node at 127.0.0.1:10042 [Leader] entering Leader state
TestCommandConfigWatcher/-sidecar-for,_no_sidecar - 2019/12/06 06:33:22.384022 [INFO] consul: cluster leadership acquired
TestCommandConfigWatcher/-sidecar-for,_no_sidecar - 2019/12/06 06:33:22.384608 [INFO] consul: New leader elected: Node d7c367de-bb3f-4c75-a8d2-5f508a05b5b7
TestCommandConfigWatcher/-sidecar-for,_no_sidecar - 2019/12/06 06:33:22.415020 [ERR] leaf watch error: invalid type for leaf response: <nil>
TestCommandConfigWatcher/-sidecar-for,_no_sidecar - 2019/12/06 06:33:22.415206 [ERR] leaf watch error: invalid type for leaf response: <nil>
TestCommandConfigWatcher/-sidecar-for,_no_sidecar - 2019/12/06 06:33:22.415463 [ERR] leaf watch error: invalid type for leaf response: <nil>
TestCommandConfigWatcher/-sidecar-for,_no_sidecar - 2019/12/06 06:33:22.419973 [DEBUG] http: Request GET /v1/agent/services (1.179694ms) from=127.0.0.1:60852
TestCommandConfigWatcher/-sidecar-for,_no_sidecar - 2019/12/06 06:33:22.424455 [INFO] agent: Requesting shutdown
TestCommandConfigWatcher/-sidecar-for,_no_sidecar - 2019/12/06 06:33:22.425043 [INFO] consul: shutting down server
TestCommandConfigWatcher/-sidecar-for,_no_sidecar - 2019/12/06 06:33:22.425494 [WARN] serf: Shutdown without a Leave
TestCommandConfigWatcher/-sidecar-for,_no_sidecar - 2019/12/06 06:33:22.425813 [ERR] agent: failed to sync remote state: No cluster leader
TestCommandConfigWatcher/-sidecar-for,_no_sidecar - 2019/12/06 06:33:22.547195 [WARN] serf: Shutdown without a Leave
TestCommandConfigWatcher/-sidecar-for,_no_sidecar - 2019/12/06 06:33:24.192610 [INFO] manager: shutting down
TestCommandConfigWatcher/-sidecar-for,_no_sidecar - 2019/12/06 06:33:24.192769 [ERR] consul: failed to wait for barrier: raft is already shutdown
TestCommandConfigWatcher/-sidecar-for,_no_sidecar - 2019/12/06 06:33:24.193253 [INFO] agent: consul server down
TestCommandConfigWatcher/-sidecar-for,_no_sidecar - 2019/12/06 06:33:24.193303 [INFO] agent: shutdown complete
TestCommandConfigWatcher/-sidecar-for,_no_sidecar - 2019/12/06 06:33:24.193365 [INFO] agent: Stopping DNS server 127.0.0.1:10037 (tcp)
TestCommandConfigWatcher/-sidecar-for,_no_sidecar - 2019/12/06 06:33:24.193495 [INFO] agent: Stopping DNS server 127.0.0.1:10037 (udp)
TestCommandConfigWatcher/-sidecar-for,_no_sidecar - 2019/12/06 06:33:24.193638 [INFO] agent: Stopping HTTP server 127.0.0.1:10038 (tcp)
TestCommandConfigWatcher/-sidecar-for,_no_sidecar - 2019/12/06 06:33:24.194076 [INFO] agent: Waiting for endpoints to shut down
TestCommandConfigWatcher/-sidecar-for,_no_sidecar - 2019/12/06 06:33:24.194233 [INFO] agent: Endpoints down
=== RUN   TestCommandConfigWatcher/-sidecar-for,_multiple_sidecars
WARNING: bootstrap = true: do not enable unless necessary
TestCommandConfigWatcher/-sidecar-for,_multiple_sidecars - 2019/12/06 06:33:24.268112 [WARN] agent: Node name "Node 1bc31b5d-65df-d70d-b86d-90dca81f81b6" will not be discoverable via DNS due to invalid characters. Valid characters include all alpha-numerics and dashes.
TestCommandConfigWatcher/-sidecar-for,_multiple_sidecars - 2019/12/06 06:33:24.268656 [DEBUG] tlsutil: Update with version 1
TestCommandConfigWatcher/-sidecar-for,_multiple_sidecars - 2019/12/06 06:33:24.271966 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
2019/12/06 06:33:26 [INFO]  raft: Initial configuration (index=1): [{Suffrage:Voter ID:1bc31b5d-65df-d70d-b86d-90dca81f81b6 Address:127.0.0.1:10048}]
2019/12/06 06:33:26 [INFO]  raft: Node at 127.0.0.1:10048 [Follower] entering Follower state (Leader: "")
TestCommandConfigWatcher/-sidecar-for,_multiple_sidecars - 2019/12/06 06:33:26.452533 [INFO] serf: EventMemberJoin: Node 1bc31b5d-65df-d70d-b86d-90dca81f81b6.dc1 127.0.0.1
TestCommandConfigWatcher/-sidecar-for,_multiple_sidecars - 2019/12/06 06:33:26.457119 [INFO] serf: EventMemberJoin: Node 1bc31b5d-65df-d70d-b86d-90dca81f81b6 127.0.0.1
TestCommandConfigWatcher/-sidecar-for,_multiple_sidecars - 2019/12/06 06:33:26.458211 [INFO] consul: Handled member-join event for server "Node 1bc31b5d-65df-d70d-b86d-90dca81f81b6.dc1" in area "wan"
TestCommandConfigWatcher/-sidecar-for,_multiple_sidecars - 2019/12/06 06:33:26.458565 [INFO] consul: Adding LAN server Node 1bc31b5d-65df-d70d-b86d-90dca81f81b6 (Addr: tcp/127.0.0.1:10048) (DC: dc1)
TestCommandConfigWatcher/-sidecar-for,_multiple_sidecars - 2019/12/06 06:33:26.460344 [INFO] agent: Started DNS server 127.0.0.1:10043 (udp)
TestCommandConfigWatcher/-sidecar-for,_multiple_sidecars - 2019/12/06 06:33:26.461947 [INFO] agent: Started DNS server 127.0.0.1:10043 (tcp)
TestCommandConfigWatcher/-sidecar-for,_multiple_sidecars - 2019/12/06 06:33:26.469044 [INFO] agent: Started HTTP server on 127.0.0.1:10044 (tcp)
TestCommandConfigWatcher/-sidecar-for,_multiple_sidecars - 2019/12/06 06:33:26.469265 [INFO] agent: started state syncer
2019/12/06 06:33:26 [WARN]  raft: Heartbeat timeout from "" reached, starting election
2019/12/06 06:33:26 [INFO]  raft: Node at 127.0.0.1:10048 [Candidate] entering Candidate state in term 2
2019/12/06 06:33:27 [INFO]  raft: Election won. Tally: 1
2019/12/06 06:33:27 [INFO]  raft: Node at 127.0.0.1:10048 [Leader] entering Leader state
TestCommandConfigWatcher/-sidecar-for,_multiple_sidecars - 2019/12/06 06:33:27.573414 [INFO] consul: cluster leadership acquired
TestCommandConfigWatcher/-sidecar-for,_multiple_sidecars - 2019/12/06 06:33:27.573879 [INFO] consul: New leader elected: Node 1bc31b5d-65df-d70d-b86d-90dca81f81b6
TestCommandConfigWatcher/-sidecar-for,_multiple_sidecars - 2019/12/06 06:33:27.714864 [ERR] leaf watch error: invalid type for leaf response: <nil>
TestCommandConfigWatcher/-sidecar-for,_multiple_sidecars - 2019/12/06 06:33:27.714966 [ERR] leaf watch error: invalid type for leaf response: <nil>
TestCommandConfigWatcher/-sidecar-for,_multiple_sidecars - 2019/12/06 06:33:27.715085 [ERR] leaf watch error: invalid type for leaf response: <nil>
TestCommandConfigWatcher/-sidecar-for,_multiple_sidecars - 2019/12/06 06:33:27.890370 [INFO] agent: Synced service "no-sidecar"
TestCommandConfigWatcher/-sidecar-for,_multiple_sidecars - 2019/12/06 06:33:28.460628 [INFO] agent: Synced service "one-sidecar"
TestCommandConfigWatcher/-sidecar-for,_multiple_sidecars - 2019/12/06 06:33:29.842250 [INFO] agent: Synced service "one-sidecar-sidecar-proxy"
TestCommandConfigWatcher/-sidecar-for,_multiple_sidecars - 2019/12/06 06:33:30.015075 [INFO] agent: Synced service "two-sidecars"
TestCommandConfigWatcher/-sidecar-for,_multiple_sidecars - 2019/12/06 06:33:30.392594 [INFO] agent: Synced service "two-sidecars-sidecar-proxy"
TestCommandConfigWatcher/-sidecar-for,_multiple_sidecars - 2019/12/06 06:33:30.394711 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
TestCommandConfigWatcher/-sidecar-for,_multiple_sidecars - 2019/12/06 06:33:30.395534 [INFO] connect: initialized primary datacenter CA with provider "consul"
TestCommandConfigWatcher/-sidecar-for,_multiple_sidecars - 2019/12/06 06:33:30.615011 [DEBUG] consul: Skipping self join check for "Node 1bc31b5d-65df-d70d-b86d-90dca81f81b6" since the cluster is too small
TestCommandConfigWatcher/-sidecar-for,_multiple_sidecars - 2019/12/06 06:33:30.615242 [INFO] consul: member 'Node 1bc31b5d-65df-d70d-b86d-90dca81f81b6' joined, marking health alive
TestCommandConfigWatcher/-sidecar-for,_multiple_sidecars - 2019/12/06 06:33:30.615528 [INFO] agent: Synced service "other-sidecar-for-two-sidecars"
TestCommandConfigWatcher/-sidecar-for,_multiple_sidecars - 2019/12/06 06:33:30.615611 [DEBUG] agent: Check "service:one-sidecar-sidecar-proxy:1" in sync
TestCommandConfigWatcher/-sidecar-for,_multiple_sidecars - 2019/12/06 06:33:30.615669 [DEBUG] agent: Check "service:one-sidecar-sidecar-proxy:2" in sync
TestCommandConfigWatcher/-sidecar-for,_multiple_sidecars - 2019/12/06 06:33:30.615721 [DEBUG] agent: Check "service:two-sidecars-sidecar-proxy:1" in sync
TestCommandConfigWatcher/-sidecar-for,_multiple_sidecars - 2019/12/06 06:33:30.615766 [DEBUG] agent: Check "service:two-sidecars-sidecar-proxy:2" in sync
TestCommandConfigWatcher/-sidecar-for,_multiple_sidecars - 2019/12/06 06:33:30.615799 [DEBUG] agent: Node info in sync
TestCommandConfigWatcher/-sidecar-for,_multiple_sidecars - 2019/12/06 06:33:30.635029 [DEBUG] http: Request GET /v1/agent/services (3.010472291s) from=127.0.0.1:54864
TestCommandConfigWatcher/-sidecar-for,_multiple_sidecars - 2019/12/06 06:33:30.643068 [INFO] agent: Requesting shutdown
TestCommandConfigWatcher/-sidecar-for,_multiple_sidecars - 2019/12/06 06:33:30.643505 [INFO] consul: shutting down server
TestCommandConfigWatcher/-sidecar-for,_multiple_sidecars - 2019/12/06 06:33:30.643677 [WARN] serf: Shutdown without a Leave
TestCommandConfigWatcher/-sidecar-for,_multiple_sidecars - 2019/12/06 06:33:30.797300 [WARN] serf: Shutdown without a Leave
TestCommandConfigWatcher/-sidecar-for,_multiple_sidecars - 2019/12/06 06:33:30.818120 [DEBUG] agent: Skipping remote check "serfHealth" since it is managed automatically
TestCommandConfigWatcher/-sidecar-for,_multiple_sidecars - 2019/12/06 06:33:30.818631 [DEBUG] agent: Service "other-sidecar-for-two-sidecars" in sync
TestCommandConfigWatcher/-sidecar-for,_multiple_sidecars - 2019/12/06 06:33:30.818854 [DEBUG] agent: Service "no-sidecar" in sync
TestCommandConfigWatcher/-sidecar-for,_multiple_sidecars - 2019/12/06 06:33:30.819020 [DEBUG] agent: Service "one-sidecar" in sync
TestCommandConfigWatcher/-sidecar-for,_multiple_sidecars - 2019/12/06 06:33:30.819299 [DEBUG] agent: Service "one-sidecar-sidecar-proxy" in sync
TestCommandConfigWatcher/-sidecar-for,_multiple_sidecars - 2019/12/06 06:33:30.819513 [DEBUG] agent: Service "two-sidecars" in sync
TestCommandConfigWatcher/-sidecar-for,_multiple_sidecars - 2019/12/06 06:33:30.819670 [DEBUG] agent: Service "two-sidecars-sidecar-proxy" in sync
TestCommandConfigWatcher/-sidecar-for,_multiple_sidecars - 2019/12/06 06:33:30.819859 [DEBUG] agent: Check "service:one-sidecar-sidecar-proxy:1" in sync
TestCommandConfigWatcher/-sidecar-for,_multiple_sidecars - 2019/12/06 06:33:30.820062 [DEBUG] agent: Check "service:one-sidecar-sidecar-proxy:2" in sync
TestCommandConfigWatcher/-sidecar-for,_multiple_sidecars - 2019/12/06 06:33:30.820231 [DEBUG] agent: Check "service:two-sidecars-sidecar-proxy:1" in sync
TestCommandConfigWatcher/-sidecar-for,_multiple_sidecars - 2019/12/06 06:33:30.820393 [DEBUG] agent: Check "service:two-sidecars-sidecar-proxy:2" in sync
TestCommandConfigWatcher/-sidecar-for,_multiple_sidecars - 2019/12/06 06:33:30.820573 [DEBUG] agent: Node info in sync
TestCommandConfigWatcher/-sidecar-for,_multiple_sidecars - 2019/12/06 06:33:30.820826 [DEBUG] agent: Service "one-sidecar-sidecar-proxy" in sync
TestCommandConfigWatcher/-sidecar-for,_multiple_sidecars - 2019/12/06 06:33:30.821016 [DEBUG] agent: Service "two-sidecars" in sync
TestCommandConfigWatcher/-sidecar-for,_multiple_sidecars - 2019/12/06 06:33:30.821188 [DEBUG] agent: Service "two-sidecars-sidecar-proxy" in sync
TestCommandConfigWatcher/-sidecar-for,_multiple_sidecars - 2019/12/06 06:33:30.821360 [DEBUG] agent: Service "other-sidecar-for-two-sidecars" in sync
TestCommandConfigWatcher/-sidecar-for,_multiple_sidecars - 2019/12/06 06:33:30.821521 [DEBUG] agent: Service "no-sidecar" in sync
TestCommandConfigWatcher/-sidecar-for,_multiple_sidecars - 2019/12/06 06:33:30.821693 [DEBUG] agent: Service "one-sidecar" in sync
TestCommandConfigWatcher/-sidecar-for,_multiple_sidecars - 2019/12/06 06:33:30.821999 [DEBUG] agent: Check "service:two-sidecars-sidecar-proxy:2" in sync
TestCommandConfigWatcher/-sidecar-for,_multiple_sidecars - 2019/12/06 06:33:30.822215 [DEBUG] agent: Check "service:one-sidecar-sidecar-proxy:1" in sync
TestCommandConfigWatcher/-sidecar-for,_multiple_sidecars - 2019/12/06 06:33:30.822399 [DEBUG] agent: Check "service:one-sidecar-sidecar-proxy:2" in sync
TestCommandConfigWatcher/-sidecar-for,_multiple_sidecars - 2019/12/06 06:33:30.822582 [DEBUG] agent: Check "service:two-sidecars-sidecar-proxy:1" in sync
TestCommandConfigWatcher/-sidecar-for,_multiple_sidecars - 2019/12/06 06:33:30.822748 [DEBUG] agent: Node info in sync
TestCommandConfigWatcher/-sidecar-for,_multiple_sidecars - 2019/12/06 06:33:30.855744 [INFO] manager: shutting down
TestCommandConfigWatcher/-sidecar-for,_multiple_sidecars - 2019/12/06 06:33:30.857010 [INFO] agent: consul server down
TestCommandConfigWatcher/-sidecar-for,_multiple_sidecars - 2019/12/06 06:33:30.857289 [INFO] agent: shutdown complete
TestCommandConfigWatcher/-sidecar-for,_multiple_sidecars - 2019/12/06 06:33:30.857464 [INFO] agent: Stopping DNS server 127.0.0.1:10043 (tcp)
TestCommandConfigWatcher/-sidecar-for,_multiple_sidecars - 2019/12/06 06:33:30.857758 [INFO] agent: Stopping DNS server 127.0.0.1:10043 (udp)
TestCommandConfigWatcher/-sidecar-for,_multiple_sidecars - 2019/12/06 06:33:30.858187 [INFO] agent: Stopping HTTP server 127.0.0.1:10044 (tcp)
TestCommandConfigWatcher/-sidecar-for,_multiple_sidecars - 2019/12/06 06:33:30.859284 [INFO] agent: Waiting for endpoints to shut down
TestCommandConfigWatcher/-sidecar-for,_multiple_sidecars - 2019/12/06 06:33:30.859493 [INFO] agent: Endpoints down
=== RUN   TestCommandConfigWatcher/-sidecar-for,_non-existent
WARNING: bootstrap = true: do not enable unless necessary
TestCommandConfigWatcher/-sidecar-for,_non-existent - 2019/12/06 06:33:30.924461 [WARN] agent: Node name "Node a312c059-aa3c-86bf-753d-8a3da429e200" will not be discoverable via DNS due to invalid characters. Valid characters include all alpha-numerics and dashes.
TestCommandConfigWatcher/-sidecar-for,_non-existent - 2019/12/06 06:33:30.925142 [DEBUG] tlsutil: Update with version 1
TestCommandConfigWatcher/-sidecar-for,_non-existent - 2019/12/06 06:33:30.927639 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
2019/12/06 06:33:31 [INFO]  raft: Initial configuration (index=1): [{Suffrage:Voter ID:a312c059-aa3c-86bf-753d-8a3da429e200 Address:127.0.0.1:10054}]
2019/12/06 06:33:31 [INFO]  raft: Node at 127.0.0.1:10054 [Follower] entering Follower state (Leader: "")
TestCommandConfigWatcher/-sidecar-for,_non-existent - 2019/12/06 06:33:31.693036 [INFO] serf: EventMemberJoin: Node a312c059-aa3c-86bf-753d-8a3da429e200.dc1 127.0.0.1
TestCommandConfigWatcher/-sidecar-for,_non-existent - 2019/12/06 06:33:31.696364 [INFO] serf: EventMemberJoin: Node a312c059-aa3c-86bf-753d-8a3da429e200 127.0.0.1
TestCommandConfigWatcher/-sidecar-for,_non-existent - 2019/12/06 06:33:31.698440 [INFO] agent: Started DNS server 127.0.0.1:10049 (udp)
TestCommandConfigWatcher/-sidecar-for,_non-existent - 2019/12/06 06:33:31.701813 [INFO] consul: Adding LAN server Node a312c059-aa3c-86bf-753d-8a3da429e200 (Addr: tcp/127.0.0.1:10054) (DC: dc1)
TestCommandConfigWatcher/-sidecar-for,_non-existent - 2019/12/06 06:33:31.703190 [INFO] agent: Started DNS server 127.0.0.1:10049 (tcp)
TestCommandConfigWatcher/-sidecar-for,_non-existent - 2019/12/06 06:33:31.704109 [INFO] consul: Handled member-join event for server "Node a312c059-aa3c-86bf-753d-8a3da429e200.dc1" in area "wan"
TestCommandConfigWatcher/-sidecar-for,_non-existent - 2019/12/06 06:33:31.705823 [INFO] agent: Started HTTP server on 127.0.0.1:10050 (tcp)
TestCommandConfigWatcher/-sidecar-for,_non-existent - 2019/12/06 06:33:31.709341 [INFO] agent: started state syncer
2019/12/06 06:33:31 [WARN]  raft: Heartbeat timeout from "" reached, starting election
2019/12/06 06:33:31 [INFO]  raft: Node at 127.0.0.1:10054 [Candidate] entering Candidate state in term 2
2019/12/06 06:33:32 [INFO]  raft: Election won. Tally: 1
2019/12/06 06:33:32 [INFO]  raft: Node at 127.0.0.1:10054 [Leader] entering Leader state
TestCommandConfigWatcher/-sidecar-for,_non-existent - 2019/12/06 06:33:32.181188 [INFO] consul: cluster leadership acquired
TestCommandConfigWatcher/-sidecar-for,_non-existent - 2019/12/06 06:33:32.181577 [INFO] consul: New leader elected: Node a312c059-aa3c-86bf-753d-8a3da429e200
TestCommandConfigWatcher/-sidecar-for,_non-existent - 2019/12/06 06:33:32.341625 [DEBUG] http: Request GET /v1/agent/services (1.081692ms) from=127.0.0.1:55454
TestCommandConfigWatcher/-sidecar-for,_non-existent - 2019/12/06 06:33:32.345999 [INFO] agent: Requesting shutdown
TestCommandConfigWatcher/-sidecar-for,_non-existent - 2019/12/06 06:33:32.346205 [INFO] consul: shutting down server
TestCommandConfigWatcher/-sidecar-for,_non-existent - 2019/12/06 06:33:32.346266 [WARN] serf: Shutdown without a Leave
TestCommandConfigWatcher/-sidecar-for,_non-existent - 2019/12/06 06:33:32.346824 [ERR] agent: failed to sync remote state: No cluster leader
TestCommandConfigWatcher/-sidecar-for,_non-existent - 2019/12/06 06:33:32.505476 [WARN] serf: Shutdown without a Leave
TestCommandConfigWatcher/-sidecar-for,_non-existent - 2019/12/06 06:33:32.638867 [INFO] manager: shutting down
TestCommandConfigWatcher/-sidecar-for,_non-existent - 2019/12/06 06:33:32.780780 [INFO] agent: consul server down
TestCommandConfigWatcher/-sidecar-for,_non-existent - 2019/12/06 06:33:32.780858 [INFO] agent: shutdown complete
TestCommandConfigWatcher/-sidecar-for,_non-existent - 2019/12/06 06:33:32.780920 [INFO] agent: Stopping DNS server 127.0.0.1:10049 (tcp)
TestCommandConfigWatcher/-sidecar-for,_non-existent - 2019/12/06 06:33:32.781052 [INFO] agent: Stopping DNS server 127.0.0.1:10049 (udp)
TestCommandConfigWatcher/-sidecar-for,_non-existent - 2019/12/06 06:33:32.781218 [INFO] agent: Stopping HTTP server 127.0.0.1:10050 (tcp)
TestCommandConfigWatcher/-sidecar-for,_non-existent - 2019/12/06 06:33:32.781719 [INFO] agent: Waiting for endpoints to shut down
TestCommandConfigWatcher/-sidecar-for,_non-existent - 2019/12/06 06:33:32.781823 [ERR] autopilot: failed to initialize config: leadership lost while committing log
TestCommandConfigWatcher/-sidecar-for,_non-existent - 2019/12/06 06:33:32.782053 [INFO] agent: Endpoints down
TestCommandConfigWatcher/-sidecar-for,_non-existent - 2019/12/06 06:33:32.782126 [ERR] consul: failed to establish leadership: raft is already shutdown
=== RUN   TestCommandConfigWatcher/-sidecar-for,_one_sidecar
WARNING: bootstrap = true: do not enable unless necessary
TestCommandConfigWatcher/-sidecar-for,_one_sidecar - 2019/12/06 06:33:32.911882 [WARN] agent: Node name "Node c414ca31-4868-8aa9-4ea0-1145b7a36ec0" will not be discoverable via DNS due to invalid characters. Valid characters include all alpha-numerics and dashes.
TestCommandConfigWatcher/-sidecar-for,_one_sidecar - 2019/12/06 06:33:32.912369 [DEBUG] tlsutil: Update with version 1
TestCommandConfigWatcher/-sidecar-for,_one_sidecar - 2019/12/06 06:33:32.929919 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
2019/12/06 06:33:33 [INFO]  raft: Initial configuration (index=1): [{Suffrage:Voter ID:c414ca31-4868-8aa9-4ea0-1145b7a36ec0 Address:127.0.0.1:10060}]
2019/12/06 06:33:33 [INFO]  raft: Node at 127.0.0.1:10060 [Follower] entering Follower state (Leader: "")
TestCommandConfigWatcher/-sidecar-for,_one_sidecar - 2019/12/06 06:33:33.742564 [INFO] serf: EventMemberJoin: Node c414ca31-4868-8aa9-4ea0-1145b7a36ec0.dc1 127.0.0.1
TestCommandConfigWatcher/-sidecar-for,_one_sidecar - 2019/12/06 06:33:33.746097 [INFO] serf: EventMemberJoin: Node c414ca31-4868-8aa9-4ea0-1145b7a36ec0 127.0.0.1
TestCommandConfigWatcher/-sidecar-for,_one_sidecar - 2019/12/06 06:33:33.747347 [INFO] consul: Adding LAN server Node c414ca31-4868-8aa9-4ea0-1145b7a36ec0 (Addr: tcp/127.0.0.1:10060) (DC: dc1)
TestCommandConfigWatcher/-sidecar-for,_one_sidecar - 2019/12/06 06:33:33.747912 [INFO] consul: Handled member-join event for server "Node c414ca31-4868-8aa9-4ea0-1145b7a36ec0.dc1" in area "wan"
TestCommandConfigWatcher/-sidecar-for,_one_sidecar - 2019/12/06 06:33:33.754997 [INFO] agent: Started DNS server 127.0.0.1:10055 (udp)
TestCommandConfigWatcher/-sidecar-for,_one_sidecar - 2019/12/06 06:33:33.755339 [INFO] agent: Started DNS server 127.0.0.1:10055 (tcp)
TestCommandConfigWatcher/-sidecar-for,_one_sidecar - 2019/12/06 06:33:33.757601 [INFO] agent: Started HTTP server on 127.0.0.1:10056 (tcp)
TestCommandConfigWatcher/-sidecar-for,_one_sidecar - 2019/12/06 06:33:33.757689 [INFO] agent: started state syncer
2019/12/06 06:33:33 [WARN]  raft: Heartbeat timeout from "" reached, starting election
2019/12/06 06:33:33 [INFO]  raft: Node at 127.0.0.1:10060 [Candidate] entering Candidate state in term 2
2019/12/06 06:33:34 [INFO]  raft: Election won. Tally: 1
2019/12/06 06:33:34 [INFO]  raft: Node at 127.0.0.1:10060 [Leader] entering Leader state
TestCommandConfigWatcher/-sidecar-for,_one_sidecar - 2019/12/06 06:33:34.456037 [INFO] consul: cluster leadership acquired
TestCommandConfigWatcher/-sidecar-for,_one_sidecar - 2019/12/06 06:33:34.456500 [INFO] consul: New leader elected: Node c414ca31-4868-8aa9-4ea0-1145b7a36ec0
TestCommandConfigWatcher/-sidecar-for,_one_sidecar - 2019/12/06 06:33:34.588020 [ERR] leaf watch error: invalid type for leaf response: <nil>
TestCommandConfigWatcher/-sidecar-for,_one_sidecar - 2019/12/06 06:33:34.588313 [ERR] leaf watch error: invalid type for leaf response: <nil>
TestCommandConfigWatcher/-sidecar-for,_one_sidecar - 2019/12/06 06:33:34.588567 [ERR] leaf watch error: invalid type for leaf response: <nil>
TestCommandConfigWatcher/-sidecar-for,_one_sidecar - 2019/12/06 06:33:34.987166 [INFO] agent: Synced service "two-sidecars-sidecar-proxy"
TestCommandConfigWatcher/-sidecar-for,_one_sidecar - 2019/12/06 06:33:35.160947 [WARN] agent: Check "service:two-sidecars-sidecar-proxy:1" socket connection failed: dial tcp 127.0.0.1:21000: connect: connection refused
TestCommandConfigWatcher/-sidecar-for,_one_sidecar - 2019/12/06 06:33:36.024073 [INFO] agent: Synced service "other-sidecar-for-two-sidecars"
TestCommandConfigWatcher/-sidecar-for,_one_sidecar - 2019/12/06 06:33:36.348764 [INFO] agent: Synced service "no-sidecar"
TestCommandConfigWatcher/-sidecar-for,_one_sidecar - 2019/12/06 06:33:37.131813 [INFO] agent: Synced service "one-sidecar"
TestCommandConfigWatcher/-sidecar-for,_one_sidecar - 2019/12/06 06:33:37.958765 [INFO] agent: Synced service "one-sidecar-sidecar-proxy"
TestCommandConfigWatcher/-sidecar-for,_one_sidecar - 2019/12/06 06:33:38.625162 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
TestCommandConfigWatcher/-sidecar-for,_one_sidecar - 2019/12/06 06:33:38.625316 [INFO] agent: Synced service "two-sidecars"
TestCommandConfigWatcher/-sidecar-for,_one_sidecar - 2019/12/06 06:33:38.625399 [DEBUG] agent: Check "service:two-sidecars-sidecar-proxy:1" in sync
TestCommandConfigWatcher/-sidecar-for,_one_sidecar - 2019/12/06 06:33:38.625454 [DEBUG] agent: Check "service:two-sidecars-sidecar-proxy:2" in sync
TestCommandConfigWatcher/-sidecar-for,_one_sidecar - 2019/12/06 06:33:38.625498 [DEBUG] agent: Check "service:one-sidecar-sidecar-proxy:1" in sync
TestCommandConfigWatcher/-sidecar-for,_one_sidecar - 2019/12/06 06:33:38.625538 [DEBUG] agent: Check "service:one-sidecar-sidecar-proxy:2" in sync
TestCommandConfigWatcher/-sidecar-for,_one_sidecar - 2019/12/06 06:33:38.625565 [DEBUG] agent: Node info in sync
TestCommandConfigWatcher/-sidecar-for,_one_sidecar - 2019/12/06 06:33:38.625692 [DEBUG] agent: Service "no-sidecar" in sync
TestCommandConfigWatcher/-sidecar-for,_one_sidecar - 2019/12/06 06:33:38.625739 [DEBUG] agent: Service "one-sidecar" in sync
TestCommandConfigWatcher/-sidecar-for,_one_sidecar - 2019/12/06 06:33:38.625777 [DEBUG] agent: Service "one-sidecar-sidecar-proxy" in sync
TestCommandConfigWatcher/-sidecar-for,_one_sidecar - 2019/12/06 06:33:38.625810 [DEBUG] agent: Service "two-sidecars" in sync
TestCommandConfigWatcher/-sidecar-for,_one_sidecar - 2019/12/06 06:33:38.625843 [DEBUG] agent: Service "two-sidecars-sidecar-proxy" in sync
TestCommandConfigWatcher/-sidecar-for,_one_sidecar - 2019/12/06 06:33:38.625889 [DEBUG] agent: Service "other-sidecar-for-two-sidecars" in sync
TestCommandConfigWatcher/-sidecar-for,_one_sidecar - 2019/12/06 06:33:38.625929 [DEBUG] agent: Check "service:one-sidecar-sidecar-proxy:1" in sync
TestCommandConfigWatcher/-sidecar-for,_one_sidecar - 2019/12/06 06:33:38.625971 [DEBUG] agent: Check "service:one-sidecar-sidecar-proxy:2" in sync
TestCommandConfigWatcher/-sidecar-for,_one_sidecar - 2019/12/06 06:33:38.626010 [DEBUG] agent: Check "service:two-sidecars-sidecar-proxy:1" in sync
TestCommandConfigWatcher/-sidecar-for,_one_sidecar - 2019/12/06 06:33:38.626048 [DEBUG] agent: Check "service:two-sidecars-sidecar-proxy:2" in sync
TestCommandConfigWatcher/-sidecar-for,_one_sidecar - 2019/12/06 06:33:38.626078 [DEBUG] agent: Node info in sync
TestCommandConfigWatcher/-sidecar-for,_one_sidecar - 2019/12/06 06:33:38.627233 [DEBUG] http: Request GET /v1/agent/services (3.837484509s) from=127.0.0.1:58552
TestCommandConfigWatcher/-sidecar-for,_one_sidecar - 2019/12/06 06:33:38.633310 [DEBUG] http: Request GET /v1/agent/service/one-sidecar-sidecar-proxy (1.483368ms) from=127.0.0.1:58552
TestCommandConfigWatcher/-sidecar-for,_one_sidecar - 2019/12/06 06:33:38.635161 [INFO] connect: initialized primary datacenter CA with provider "consul"
TestCommandConfigWatcher/-sidecar-for,_one_sidecar - 2019/12/06 06:33:38.635641 [DEBUG] consul: Skipping self join check for "Node c414ca31-4868-8aa9-4ea0-1145b7a36ec0" since the cluster is too small
TestCommandConfigWatcher/-sidecar-for,_one_sidecar - 2019/12/06 06:33:38.635850 [INFO] consul: member 'Node c414ca31-4868-8aa9-4ea0-1145b7a36ec0' joined, marking health alive
TestCommandConfigWatcher/-sidecar-for,_one_sidecar - 2019/12/06 06:33:38.640944 [DEBUG] http: Request GET /v1/agent/service/one-sidecar-sidecar-proxy (6.792825ms) from=127.0.0.1:58556
TestCommandConfigWatcher/-sidecar-for,_one_sidecar - 2019/12/06 06:33:38.644596 [INFO] agent: Requesting shutdown
TestCommandConfigWatcher/-sidecar-for,_one_sidecar - 2019/12/06 06:33:38.645189 [INFO] consul: shutting down server
TestCommandConfigWatcher/-sidecar-for,_one_sidecar - 2019/12/06 06:33:38.645931 [WARN] serf: Shutdown without a Leave
TestCommandConfigWatcher/-sidecar-for,_one_sidecar - 2019/12/06 06:33:38.822218 [WARN] serf: Shutdown without a Leave
TestCommandConfigWatcher/-sidecar-for,_one_sidecar - 2019/12/06 06:33:38.914030 [INFO] manager: shutting down
TestCommandConfigWatcher/-sidecar-for,_one_sidecar - 2019/12/06 06:33:38.915159 [INFO] agent: consul server down
TestCommandConfigWatcher/-sidecar-for,_one_sidecar - 2019/12/06 06:33:38.915231 [INFO] agent: shutdown complete
TestCommandConfigWatcher/-sidecar-for,_one_sidecar - 2019/12/06 06:33:38.915292 [INFO] agent: Stopping DNS server 127.0.0.1:10055 (tcp)
TestCommandConfigWatcher/-sidecar-for,_one_sidecar - 2019/12/06 06:33:38.915427 [INFO] agent: Stopping DNS server 127.0.0.1:10055 (udp)
TestCommandConfigWatcher/-sidecar-for,_one_sidecar - 2019/12/06 06:33:38.915580 [INFO] agent: Stopping HTTP server 127.0.0.1:10056 (tcp)
TestCommandConfigWatcher/-sidecar-for,_one_sidecar - 2019/12/06 06:33:38.915771 [ERR] consul: failed to reconcile member: {Node c414ca31-4868-8aa9-4ea0-1145b7a36ec0 127.0.0.1 10058 map[acls:0 bootstrap:1 build:1.5.2: dc:dc1 id:c414ca31-4868-8aa9-4ea0-1145b7a36ec0 port:10060 raft_vsn:3 role:consul segment: vsn:2 vsn_max:3 vsn_min:2 wan_join_port:10059] alive 1 5 2 2 5 4}: leadership lost while committing log
TestCommandConfigWatcher/-sidecar-for,_one_sidecar - 2019/12/06 06:33:39.915922 [WARN] agent: Timeout stopping HTTP server 127.0.0.1:10056 (tcp)
TestCommandConfigWatcher/-sidecar-for,_one_sidecar - 2019/12/06 06:33:39.915996 [INFO] agent: Waiting for endpoints to shut down
TestCommandConfigWatcher/-sidecar-for,_one_sidecar - 2019/12/06 06:33:39.916030 [INFO] agent: Endpoints down
--- PASS: TestCommandConfigWatcher (30.61s)
    --- PASS: TestCommandConfigWatcher/-service_flag_only (3.36s)
    --- PASS: TestCommandConfigWatcher/-service_flag_with_upstreams (3.25s)
    --- PASS: TestCommandConfigWatcher/-service_flag_with_-service-addr (2.50s)
    --- PASS: TestCommandConfigWatcher/-service,_-service-addr,_-listen (2.43s)
    --- PASS: TestCommandConfigWatcher/-sidecar-for,_no_sidecar (3.35s)
    --- PASS: TestCommandConfigWatcher/-sidecar-for,_multiple_sidecars (6.66s)
    --- PASS: TestCommandConfigWatcher/-sidecar-for,_non-existent (1.92s)
    --- PASS: TestCommandConfigWatcher/-sidecar-for,_one_sidecar (7.13s)
PASS
ok  	github.com/hashicorp/consul/command/connect/proxy	30.956s
=== RUN   TestDebugCommand_noTabs
=== PAUSE TestDebugCommand_noTabs
=== RUN   TestDebugCommand
--- SKIP: TestDebugCommand (0.00s)
    debug_test.go:29: DM-skipped
=== RUN   TestDebugCommand_Archive
=== PAUSE TestDebugCommand_Archive
=== RUN   TestDebugCommand_ArgsBad
=== PAUSE TestDebugCommand_ArgsBad
=== RUN   TestDebugCommand_OutputPathBad
=== PAUSE TestDebugCommand_OutputPathBad
=== RUN   TestDebugCommand_OutputPathExists
=== PAUSE TestDebugCommand_OutputPathExists
=== RUN   TestDebugCommand_CaptureTargets
=== PAUSE TestDebugCommand_CaptureTargets
=== RUN   TestDebugCommand_ProfilesExist
=== PAUSE TestDebugCommand_ProfilesExist
=== RUN   TestDebugCommand_ValidateTiming
=== PAUSE TestDebugCommand_ValidateTiming
=== RUN   TestDebugCommand_DebugDisabled
=== PAUSE TestDebugCommand_DebugDisabled
=== CONT  TestDebugCommand_noTabs
=== CONT  TestDebugCommand_DebugDisabled
=== CONT  TestDebugCommand_CaptureTargets
=== CONT  TestDebugCommand_OutputPathBad
--- PASS: TestDebugCommand_noTabs (0.00s)
=== CONT  TestDebugCommand_OutputPathExists
WARNING: bootstrap = true: do not enable unless necessary
WARNING: bootstrap = true: do not enable unless necessary
TestDebugCommand_CaptureTargets - 2019/12/06 06:33:09.919892 [WARN] agent: Node name "Node cea91546-f916-c2af-9ae8-7fecc2894263" will not be discoverable via DNS due to invalid characters. Valid characters include all alpha-numerics and dashes.
TestDebugCommand_CaptureTargets - 2019/12/06 06:33:09.920728 [DEBUG] tlsutil: Update with version 1
TestDebugCommand_DebugDisabled - 2019/12/06 06:33:09.922470 [WARN] agent: Node name "Node 1d94d65f-8e5c-a8a5-1c09-fc50128f16fb" will not be discoverable via DNS due to invalid characters. Valid characters include all alpha-numerics and dashes.
TestDebugCommand_DebugDisabled - 2019/12/06 06:33:09.923021 [DEBUG] tlsutil: Update with version 1
TestDebugCommand_CaptureTargets - 2019/12/06 06:33:09.929513 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
TestDebugCommand_DebugDisabled - 2019/12/06 06:33:09.931630 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
WARNING: bootstrap = true: do not enable unless necessary
TestDebugCommand_OutputPathExists - 2019/12/06 06:33:09.959534 [WARN] agent: Node name "Node 44d00ca9-b4cb-21ea-4a1c-196a3a27af1c" will not be discoverable via DNS due to invalid characters. Valid characters include all alpha-numerics and dashes.
TestDebugCommand_OutputPathExists - 2019/12/06 06:33:09.960124 [DEBUG] tlsutil: Update with version 1
TestDebugCommand_OutputPathExists - 2019/12/06 06:33:09.963493 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
WARNING: bootstrap = true: do not enable unless necessary
TestDebugCommand_OutputPathBad - 2019/12/06 06:33:09.973381 [WARN] agent: Node name "Node af5ce46b-3bbd-6457-acfa-7b00c03910d4" will not be discoverable via DNS due to invalid characters. Valid characters include all alpha-numerics and dashes.
TestDebugCommand_OutputPathBad - 2019/12/06 06:33:09.975869 [DEBUG] tlsutil: Update with version 1
TestDebugCommand_OutputPathBad - 2019/12/06 06:33:09.983931 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
2019/12/06 06:33:11 [INFO]  raft: Initial configuration (index=1): [{Suffrage:Voter ID:cea91546-f916-c2af-9ae8-7fecc2894263 Address:127.0.0.1:13006}]
2019/12/06 06:33:11 [INFO]  raft: Node at 127.0.0.1:13006 [Follower] entering Follower state (Leader: "")
2019/12/06 06:33:11 [INFO]  raft: Initial configuration (index=1): [{Suffrage:Voter ID:1d94d65f-8e5c-a8a5-1c09-fc50128f16fb Address:127.0.0.1:13024}]
2019/12/06 06:33:11 [INFO]  raft: Node at 127.0.0.1:13024 [Follower] entering Follower state (Leader: "")
TestDebugCommand_DebugDisabled - 2019/12/06 06:33:11.949967 [INFO] serf: EventMemberJoin: Node 1d94d65f-8e5c-a8a5-1c09-fc50128f16fb.dc1 127.0.0.1
TestDebugCommand_DebugDisabled - 2019/12/06 06:33:11.953783 [INFO] serf: EventMemberJoin: Node 1d94d65f-8e5c-a8a5-1c09-fc50128f16fb 127.0.0.1
TestDebugCommand_DebugDisabled - 2019/12/06 06:33:11.954994 [INFO] consul: Adding LAN server Node 1d94d65f-8e5c-a8a5-1c09-fc50128f16fb (Addr: tcp/127.0.0.1:13024) (DC: dc1)
TestDebugCommand_DebugDisabled - 2019/12/06 06:33:11.955417 [INFO] consul: Handled member-join event for server "Node 1d94d65f-8e5c-a8a5-1c09-fc50128f16fb.dc1" in area "wan"
TestDebugCommand_DebugDisabled - 2019/12/06 06:33:11.956396 [INFO] agent: Started DNS server 127.0.0.1:13019 (tcp)
TestDebugCommand_DebugDisabled - 2019/12/06 06:33:11.956817 [INFO] agent: Started DNS server 127.0.0.1:13019 (udp)
TestDebugCommand_DebugDisabled - 2019/12/06 06:33:11.960182 [INFO] agent: Started HTTP server on 127.0.0.1:13020 (tcp)
TestDebugCommand_DebugDisabled - 2019/12/06 06:33:11.960380 [INFO] agent: started state syncer
TestDebugCommand_CaptureTargets - 2019/12/06 06:33:11.962664 [INFO] serf: EventMemberJoin: Node cea91546-f916-c2af-9ae8-7fecc2894263.dc1 127.0.0.1
TestDebugCommand_CaptureTargets - 2019/12/06 06:33:11.969067 [INFO] serf: EventMemberJoin: Node cea91546-f916-c2af-9ae8-7fecc2894263 127.0.0.1
TestDebugCommand_CaptureTargets - 2019/12/06 06:33:11.970696 [INFO] consul: Adding LAN server Node cea91546-f916-c2af-9ae8-7fecc2894263 (Addr: tcp/127.0.0.1:13006) (DC: dc1)
TestDebugCommand_CaptureTargets - 2019/12/06 06:33:11.971057 [INFO] consul: Handled member-join event for server "Node cea91546-f916-c2af-9ae8-7fecc2894263.dc1" in area "wan"
TestDebugCommand_CaptureTargets - 2019/12/06 06:33:11.971817 [INFO] agent: Started DNS server 127.0.0.1:13001 (tcp)
TestDebugCommand_CaptureTargets - 2019/12/06 06:33:11.971900 [INFO] agent: Started DNS server 127.0.0.1:13001 (udp)
TestDebugCommand_CaptureTargets - 2019/12/06 06:33:11.975006 [INFO] agent: Started HTTP server on 127.0.0.1:13002 (tcp)
TestDebugCommand_CaptureTargets - 2019/12/06 06:33:11.975200 [INFO] agent: started state syncer
2019/12/06 06:33:12 [WARN]  raft: Heartbeat timeout from "" reached, starting election
2019/12/06 06:33:12 [INFO]  raft: Node at 127.0.0.1:13006 [Candidate] entering Candidate state in term 2
2019/12/06 06:33:12 [WARN]  raft: Heartbeat timeout from "" reached, starting election
2019/12/06 06:33:12 [INFO]  raft: Node at 127.0.0.1:13024 [Candidate] entering Candidate state in term 2
2019/12/06 06:33:12 [INFO]  raft: Initial configuration (index=1): [{Suffrage:Voter ID:44d00ca9-b4cb-21ea-4a1c-196a3a27af1c Address:127.0.0.1:13018}]
2019/12/06 06:33:12 [INFO]  raft: Initial configuration (index=1): [{Suffrage:Voter ID:af5ce46b-3bbd-6457-acfa-7b00c03910d4 Address:127.0.0.1:13012}]
2019/12/06 06:33:12 [INFO]  raft: Node at 127.0.0.1:13012 [Follower] entering Follower state (Leader: "")
2019/12/06 06:33:12 [INFO]  raft: Node at 127.0.0.1:13018 [Follower] entering Follower state (Leader: "")
TestDebugCommand_OutputPathBad - 2019/12/06 06:33:12.386527 [INFO] serf: EventMemberJoin: Node af5ce46b-3bbd-6457-acfa-7b00c03910d4.dc1 127.0.0.1
TestDebugCommand_OutputPathExists - 2019/12/06 06:33:12.386711 [INFO] serf: EventMemberJoin: Node 44d00ca9-b4cb-21ea-4a1c-196a3a27af1c.dc1 127.0.0.1
TestDebugCommand_OutputPathExists - 2019/12/06 06:33:12.389815 [INFO] serf: EventMemberJoin: Node 44d00ca9-b4cb-21ea-4a1c-196a3a27af1c 127.0.0.1
TestDebugCommand_OutputPathExists - 2019/12/06 06:33:12.399759 [INFO] agent: Started DNS server 127.0.0.1:13013 (udp)
2019/12/06 06:33:12 [WARN]  raft: Heartbeat timeout from "" reached, starting election
2019/12/06 06:33:12 [INFO]  raft: Node at 127.0.0.1:13012 [Candidate] entering Candidate state in term 2
TestDebugCommand_OutputPathBad - 2019/12/06 06:33:12.402702 [INFO] serf: EventMemberJoin: Node af5ce46b-3bbd-6457-acfa-7b00c03910d4 127.0.0.1
TestDebugCommand_OutputPathBad - 2019/12/06 06:33:12.405227 [INFO] agent: Started DNS server 127.0.0.1:13007 (udp)
TestDebugCommand_OutputPathExists - 2019/12/06 06:33:12.409137 [INFO] agent: Started DNS server 127.0.0.1:13013 (tcp)
TestDebugCommand_OutputPathExists - 2019/12/06 06:33:12.415274 [INFO] consul: Adding LAN server Node 44d00ca9-b4cb-21ea-4a1c-196a3a27af1c (Addr: tcp/127.0.0.1:13018) (DC: dc1)
TestDebugCommand_OutputPathExists - 2019/12/06 06:33:12.415592 [INFO] agent: Started HTTP server on 127.0.0.1:13014 (tcp)
TestDebugCommand_OutputPathExists - 2019/12/06 06:33:12.415693 [INFO] agent: started state syncer
TestDebugCommand_OutputPathBad - 2019/12/06 06:33:12.423615 [INFO] consul: Handled member-join event for server "Node af5ce46b-3bbd-6457-acfa-7b00c03910d4.dc1" in area "wan"
2019/12/06 06:33:12 [WARN]  raft: Heartbeat timeout from "" reached, starting election
2019/12/06 06:33:12 [INFO]  raft: Node at 127.0.0.1:13018 [Candidate] entering Candidate state in term 2
TestDebugCommand_OutputPathExists - 2019/12/06 06:33:12.429388 [INFO] consul: Handled member-join event for server "Node 44d00ca9-b4cb-21ea-4a1c-196a3a27af1c.dc1" in area "wan"
TestDebugCommand_OutputPathBad - 2019/12/06 06:33:12.433038 [INFO] consul: Adding LAN server Node af5ce46b-3bbd-6457-acfa-7b00c03910d4 (Addr: tcp/127.0.0.1:13012) (DC: dc1)
TestDebugCommand_OutputPathBad - 2019/12/06 06:33:12.433213 [INFO] agent: Started DNS server 127.0.0.1:13007 (tcp)
TestDebugCommand_OutputPathBad - 2019/12/06 06:33:12.437429 [INFO] agent: Started HTTP server on 127.0.0.1:13008 (tcp)
TestDebugCommand_OutputPathBad - 2019/12/06 06:33:12.437567 [INFO] agent: started state syncer
2019/12/06 06:33:12 [INFO]  raft: Election won. Tally: 1
2019/12/06 06:33:12 [INFO]  raft: Node at 127.0.0.1:13006 [Leader] entering Leader state
2019/12/06 06:33:12 [INFO]  raft: Election won. Tally: 1
2019/12/06 06:33:12 [INFO]  raft: Node at 127.0.0.1:13024 [Leader] entering Leader state
TestDebugCommand_DebugDisabled - 2019/12/06 06:33:12.935256 [INFO] consul: cluster leadership acquired
TestDebugCommand_DebugDisabled - 2019/12/06 06:33:12.935807 [INFO] consul: New leader elected: Node 1d94d65f-8e5c-a8a5-1c09-fc50128f16fb
TestDebugCommand_CaptureTargets - 2019/12/06 06:33:12.936047 [INFO] consul: cluster leadership acquired
TestDebugCommand_CaptureTargets - 2019/12/06 06:33:12.936422 [INFO] consul: New leader elected: Node cea91546-f916-c2af-9ae8-7fecc2894263
2019/12/06 06:33:13 [INFO]  raft: Election won. Tally: 1
2019/12/06 06:33:13 [INFO]  raft: Node at 127.0.0.1:13018 [Leader] entering Leader state
2019/12/06 06:33:13 [INFO]  raft: Election won. Tally: 1
2019/12/06 06:33:13 [INFO]  raft: Node at 127.0.0.1:13012 [Leader] entering Leader state
TestDebugCommand_OutputPathExists - 2019/12/06 06:33:13.376004 [INFO] consul: cluster leadership acquired
TestDebugCommand_OutputPathExists - 2019/12/06 06:33:13.376495 [INFO] consul: New leader elected: Node 44d00ca9-b4cb-21ea-4a1c-196a3a27af1c
TestDebugCommand_OutputPathBad - 2019/12/06 06:33:13.376816 [INFO] consul: cluster leadership acquired
TestDebugCommand_OutputPathBad - 2019/12/06 06:33:13.377169 [INFO] consul: New leader elected: Node af5ce46b-3bbd-6457-acfa-7b00c03910d4
TestDebugCommand_DebugDisabled - 2019/12/06 06:33:13.497892 [INFO] agent: Synced node info
TestDebugCommand_CaptureTargets - 2019/12/06 06:33:13.501552 [INFO] agent: Synced node info
TestDebugCommand_CaptureTargets - 2019/12/06 06:33:13.501661 [DEBUG] agent: Node info in sync
TestDebugCommand_CaptureTargets - 2019/12/06 06:33:13.779666 [DEBUG] http: Request GET /v1/agent/self (253.761897ms) from=127.0.0.1:58772
TestDebugCommand_DebugDisabled - 2019/12/06 06:33:13.789121 [DEBUG] http: Request GET /v1/agent/self (274.659716ms) from=127.0.0.1:52582
TestDebugCommand_OutputPathExists - 2019/12/06 06:33:13.864679 [INFO] agent: Synced node info
TestDebugCommand_OutputPathBad - 2019/12/06 06:33:13.870257 [INFO] agent: Synced node info
TestDebugCommand_OutputPathExists - 2019/12/06 06:33:14.171902 [DEBUG] http: Request GET /v1/agent/self (258.426338ms) from=127.0.0.1:58508
TestDebugCommand_OutputPathBad - 2019/12/06 06:33:14.175663 [DEBUG] http: Request GET /v1/agent/self (282.99791ms) from=127.0.0.1:43664
TestDebugCommand_OutputPathBad - 2019/12/06 06:33:14.188673 [INFO] agent: Requesting shutdown
TestDebugCommand_OutputPathBad - 2019/12/06 06:33:14.188800 [INFO] consul: shutting down server
TestDebugCommand_OutputPathBad - 2019/12/06 06:33:14.198339 [WARN] serf: Shutdown without a Leave
TestDebugCommand_OutputPathExists - 2019/12/06 06:33:14.196551 [INFO] agent: Requesting shutdown
TestDebugCommand_OutputPathExists - 2019/12/06 06:33:14.200883 [INFO] consul: shutting down server
TestDebugCommand_OutputPathExists - 2019/12/06 06:33:14.201186 [WARN] serf: Shutdown without a Leave
TestDebugCommand_OutputPathExists - 2019/12/06 06:33:14.618924 [WARN] serf: Shutdown without a Leave
TestDebugCommand_OutputPathBad - 2019/12/06 06:33:14.622238 [WARN] serf: Shutdown without a Leave
TestDebugCommand_OutputPathExists - 2019/12/06 06:33:14.789890 [INFO] manager: shutting down
TestDebugCommand_OutputPathBad - 2019/12/06 06:33:14.793054 [INFO] manager: shutting down
TestDebugCommand_OutputPathExists - 2019/12/06 06:33:14.793693 [INFO] agent: consul server down
TestDebugCommand_OutputPathExists - 2019/12/06 06:33:14.793753 [INFO] agent: shutdown complete
TestDebugCommand_OutputPathExists - 2019/12/06 06:33:14.793825 [INFO] agent: Stopping DNS server 127.0.0.1:13013 (tcp)
TestDebugCommand_OutputPathExists - 2019/12/06 06:33:14.794018 [INFO] agent: Stopping DNS server 127.0.0.1:13013 (udp)
TestDebugCommand_OutputPathExists - 2019/12/06 06:33:14.794244 [INFO] agent: Stopping HTTP server 127.0.0.1:13014 (tcp)
TestDebugCommand_OutputPathExists - 2019/12/06 06:33:14.794957 [INFO] agent: Waiting for endpoints to shut down
TestDebugCommand_OutputPathExists - 2019/12/06 06:33:14.795103 [ERR] consul: failed to establish leadership: leadership lost while committing log
TestDebugCommand_OutputPathExists - 2019/12/06 06:33:14.795316 [ERR] consul: failed to transfer leadership attempt 0/3: raft is already shutdown
TestDebugCommand_OutputPathExists - 2019/12/06 06:33:14.795410 [ERR] consul: failed to transfer leadership attempt 1/3: raft is already shutdown
TestDebugCommand_OutputPathExists - 2019/12/06 06:33:14.795471 [ERR] consul: failed to transfer leadership attempt 2/3: raft is already shutdown
TestDebugCommand_OutputPathExists - 2019/12/06 06:33:14.795533 [ERR] consul: failed to transfer leadership in 3 attempts
TestDebugCommand_OutputPathExists - 2019/12/06 06:33:14.795346 [INFO] agent: Endpoints down
=== CONT  TestDebugCommand_ValidateTiming
--- PASS: TestDebugCommand_OutputPathExists (5.04s)
WARNING: bootstrap = true: do not enable unless necessary
TestDebugCommand_ValidateTiming - 2019/12/06 06:33:14.918074 [WARN] agent: Node name "Node e252b810-32b6-d8a5-ba9c-7add09cf237d" will not be discoverable via DNS due to invalid characters. Valid characters include all alpha-numerics and dashes.
TestDebugCommand_ValidateTiming - 2019/12/06 06:33:14.918656 [DEBUG] tlsutil: Update with version 1
TestDebugCommand_ValidateTiming - 2019/12/06 06:33:14.921684 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
TestDebugCommand_DebugDisabled - 2019/12/06 06:33:15.050192 [DEBUG] agent: Node info in sync
TestDebugCommand_DebugDisabled - 2019/12/06 06:33:15.050343 [DEBUG] agent: Node info in sync
TestDebugCommand_OutputPathBad - 2019/12/06 06:33:15.268609 [INFO] agent: consul server down
TestDebugCommand_OutputPathBad - 2019/12/06 06:33:15.268683 [INFO] agent: shutdown complete
TestDebugCommand_OutputPathBad - 2019/12/06 06:33:15.268750 [INFO] agent: Stopping DNS server 127.0.0.1:13007 (tcp)
TestDebugCommand_OutputPathBad - 2019/12/06 06:33:15.268991 [INFO] agent: Stopping DNS server 127.0.0.1:13007 (udp)
TestDebugCommand_OutputPathBad - 2019/12/06 06:33:15.269242 [INFO] agent: Stopping HTTP server 127.0.0.1:13008 (tcp)
TestDebugCommand_OutputPathBad - 2019/12/06 06:33:15.269568 [ERR] consul: failed to establish leadership: error configuring provider: leadership lost while committing log
TestDebugCommand_OutputPathBad - 2019/12/06 06:33:15.269855 [INFO] agent: Waiting for endpoints to shut down
TestDebugCommand_OutputPathBad - 2019/12/06 06:33:15.269906 [INFO] agent: Endpoints down
--- PASS: TestDebugCommand_OutputPathBad (5.52s)
=== CONT  TestDebugCommand_ArgsBad
--- PASS: TestDebugCommand_ArgsBad (0.01s)
=== CONT  TestDebugCommand_ProfilesExist
WARNING: bootstrap = true: do not enable unless necessary
TestDebugCommand_ProfilesExist - 2019/12/06 06:33:15.383123 [WARN] agent: Node name "Node bfa67fcf-1f04-227e-aa1d-117c15edea03" will not be discoverable via DNS due to invalid characters. Valid characters include all alpha-numerics and dashes.
TestDebugCommand_ProfilesExist - 2019/12/06 06:33:15.383683 [DEBUG] tlsutil: Update with version 1
TestDebugCommand_ProfilesExist - 2019/12/06 06:33:15.386189 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
TestDebugCommand_CaptureTargets - 2019/12/06 06:33:15.474364 [INFO] connect: initialized primary datacenter CA with provider "consul"
TestDebugCommand_CaptureTargets - 2019/12/06 06:33:15.474977 [DEBUG] consul: Skipping self join check for "Node cea91546-f916-c2af-9ae8-7fecc2894263" since the cluster is too small
TestDebugCommand_CaptureTargets - 2019/12/06 06:33:15.475165 [INFO] consul: member 'Node cea91546-f916-c2af-9ae8-7fecc2894263' joined, marking health alive
TestDebugCommand_DebugDisabled - 2019/12/06 06:33:15.489506 [INFO] connect: initialized primary datacenter CA with provider "consul"
TestDebugCommand_DebugDisabled - 2019/12/06 06:33:15.490929 [DEBUG] consul: Skipping self join check for "Node 1d94d65f-8e5c-a8a5-1c09-fc50128f16fb" since the cluster is too small
TestDebugCommand_DebugDisabled - 2019/12/06 06:33:15.492877 [INFO] consul: member 'Node 1d94d65f-8e5c-a8a5-1c09-fc50128f16fb' joined, marking health alive
TestDebugCommand_CaptureTargets - 2019/12/06 06:33:15.588180 [DEBUG] agent: Node info in sync
TestDebugCommand_CaptureTargets - 2019/12/06 06:33:15.780684 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
TestDebugCommand_DebugDisabled - 2019/12/06 06:33:15.783469 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
TestDebugCommand_CaptureTargets - 2019/12/06 06:33:16.009875 [DEBUG] http: Request GET /v1/agent/host (2.204873902s) from=127.0.0.1:58772
TestDebugCommand_DebugDisabled - 2019/12/06 06:33:16.017207 [DEBUG] http: Request GET /v1/agent/host (2.204380558s) from=127.0.0.1:52582
TestDebugCommand_CaptureTargets - 2019/12/06 06:33:16.029399 [DEBUG] http: Request GET /v1/agent/self (7.005497ms) from=127.0.0.1:58772
TestDebugCommand_DebugDisabled - 2019/12/06 06:33:16.030691 [DEBUG] http: Request GET /v1/agent/self (8.354528ms) from=127.0.0.1:52582
TestDebugCommand_DebugDisabled - 2019/12/06 06:33:16.041936 [DEBUG] http: Request GET /v1/agent/members?wan=1 (903.021µs) from=127.0.0.1:52582
TestDebugCommand_CaptureTargets - 2019/12/06 06:33:16.044716 [DEBUG] http: Request GET /v1/agent/members?wan=1 (1.936712ms) from=127.0.0.1:58772
TestDebugCommand_DebugDisabled - 2019/12/06 06:33:16.060008 [DEBUG] http: Request GET /v1/agent/metrics (2.434724ms) from=127.0.0.1:52596
TestDebugCommand_CaptureTargets - 2019/12/06 06:33:16.063599 [DEBUG] http: Request GET /v1/agent/metrics (1.15036ms) from=127.0.0.1:58786
2019/12/06 06:33:16 [INFO]  raft: Initial configuration (index=1): [{Suffrage:Voter ID:e252b810-32b6-d8a5-ba9c-7add09cf237d Address:127.0.0.1:13030}]
2019/12/06 06:33:16 [INFO]  raft: Node at 127.0.0.1:13030 [Follower] entering Follower state (Leader: "")
TestDebugCommand_ValidateTiming - 2019/12/06 06:33:16.513826 [INFO] serf: EventMemberJoin: Node e252b810-32b6-d8a5-ba9c-7add09cf237d.dc1 127.0.0.1
TestDebugCommand_ValidateTiming - 2019/12/06 06:33:16.522588 [INFO] serf: EventMemberJoin: Node e252b810-32b6-d8a5-ba9c-7add09cf237d 127.0.0.1
TestDebugCommand_ValidateTiming - 2019/12/06 06:33:16.524518 [INFO] consul: Adding LAN server Node e252b810-32b6-d8a5-ba9c-7add09cf237d (Addr: tcp/127.0.0.1:13030) (DC: dc1)
TestDebugCommand_ValidateTiming - 2019/12/06 06:33:16.524825 [INFO] consul: Handled member-join event for server "Node e252b810-32b6-d8a5-ba9c-7add09cf237d.dc1" in area "wan"
TestDebugCommand_ValidateTiming - 2019/12/06 06:33:16.527563 [INFO] agent: Started DNS server 127.0.0.1:13025 (tcp)
TestDebugCommand_ValidateTiming - 2019/12/06 06:33:16.527663 [INFO] agent: Started DNS server 127.0.0.1:13025 (udp)
TestDebugCommand_ValidateTiming - 2019/12/06 06:33:16.530257 [INFO] agent: Started HTTP server on 127.0.0.1:13026 (tcp)
TestDebugCommand_ValidateTiming - 2019/12/06 06:33:16.530381 [INFO] agent: started state syncer
2019/12/06 06:33:16 [WARN]  raft: Heartbeat timeout from "" reached, starting election
2019/12/06 06:33:16 [INFO]  raft: Node at 127.0.0.1:13030 [Candidate] entering Candidate state in term 2
2019/12/06 06:33:16 [INFO]  raft: Initial configuration (index=1): [{Suffrage:Voter ID:bfa67fcf-1f04-227e-aa1d-117c15edea03 Address:127.0.0.1:13036}]
2019/12/06 06:33:16 [INFO]  raft: Node at 127.0.0.1:13036 [Follower] entering Follower state (Leader: "")
TestDebugCommand_ProfilesExist - 2019/12/06 06:33:16.794845 [INFO] serf: EventMemberJoin: Node bfa67fcf-1f04-227e-aa1d-117c15edea03.dc1 127.0.0.1
TestDebugCommand_ProfilesExist - 2019/12/06 06:33:16.803032 [INFO] serf: EventMemberJoin: Node bfa67fcf-1f04-227e-aa1d-117c15edea03 127.0.0.1
TestDebugCommand_ProfilesExist - 2019/12/06 06:33:16.808507 [INFO] consul: Adding LAN server Node bfa67fcf-1f04-227e-aa1d-117c15edea03 (Addr: tcp/127.0.0.1:13036) (DC: dc1)
TestDebugCommand_ProfilesExist - 2019/12/06 06:33:16.808734 [INFO] consul: Handled member-join event for server "Node bfa67fcf-1f04-227e-aa1d-117c15edea03.dc1" in area "wan"
TestDebugCommand_ProfilesExist - 2019/12/06 06:33:16.811604 [INFO] agent: Started DNS server 127.0.0.1:13031 (tcp)
TestDebugCommand_ProfilesExist - 2019/12/06 06:33:16.811861 [INFO] agent: Started DNS server 127.0.0.1:13031 (udp)
TestDebugCommand_ProfilesExist - 2019/12/06 06:33:16.821038 [INFO] agent: Started HTTP server on 127.0.0.1:13032 (tcp)
TestDebugCommand_ProfilesExist - 2019/12/06 06:33:16.821964 [INFO] agent: started state syncer
2019/12/06 06:33:16 [WARN]  raft: Heartbeat timeout from "" reached, starting election
2019/12/06 06:33:16 [INFO]  raft: Node at 127.0.0.1:13036 [Candidate] entering Candidate state in term 2
2019/12/06 06:33:17 [INFO]  raft: Election won. Tally: 1
2019/12/06 06:33:17 [INFO]  raft: Node at 127.0.0.1:13030 [Leader] entering Leader state
TestDebugCommand_ValidateTiming - 2019/12/06 06:33:17.282454 [INFO] consul: cluster leadership acquired
TestDebugCommand_ValidateTiming - 2019/12/06 06:33:17.282991 [INFO] consul: New leader elected: Node e252b810-32b6-d8a5-ba9c-7add09cf237d
2019/12/06 06:33:17 [INFO]  raft: Election won. Tally: 1
2019/12/06 06:33:17 [INFO]  raft: Node at 127.0.0.1:13036 [Leader] entering Leader state
TestDebugCommand_ProfilesExist - 2019/12/06 06:33:17.582379 [INFO] consul: cluster leadership acquired
TestDebugCommand_ProfilesExist - 2019/12/06 06:33:17.582874 [INFO] consul: New leader elected: Node bfa67fcf-1f04-227e-aa1d-117c15edea03
TestDebugCommand_ValidateTiming - 2019/12/06 06:33:17.791875 [INFO] agent: Synced node info
WARNING: bootstrap = true: do not enable unless necessary
TestDebugCommand_ValidateTiming - 2019/12/06 06:33:17.908178 [WARN] agent: Node name "Node 05deefe5-d626-c561-165f-0dbe41cfb99d" will not be discoverable via DNS due to invalid characters. Valid characters include all alpha-numerics and dashes.
TestDebugCommand_ValidateTiming - 2019/12/06 06:33:17.908758 [DEBUG] tlsutil: Update with version 1
TestDebugCommand_ValidateTiming - 2019/12/06 06:33:17.911881 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
TestDebugCommand_ProfilesExist - 2019/12/06 06:33:18.006277 [INFO] agent: Synced node info
/tmp/consul-test/TestDebugCommand_ProfilesExist-debug597544479/debug
TestDebugCommand_CaptureTargets - 2019/12/06 06:33:18.161040 [INFO] agent: Requesting shutdown
TestDebugCommand_CaptureTargets - 2019/12/06 06:33:18.161145 [INFO] consul: shutting down server
TestDebugCommand_CaptureTargets - 2019/12/06 06:33:18.161543 [WARN] serf: Shutdown without a Leave
TestDebugCommand_CaptureTargets - 2019/12/06 06:33:18.252556 [WARN] serf: Shutdown without a Leave
TestDebugCommand_ProfilesExist - 2019/12/06 06:33:18.260370 [DEBUG] http: Request GET /v1/agent/self (232.845744ms) from=127.0.0.1:33302
TestDebugCommand_CaptureTargets - 2019/12/06 06:33:18.414137 [INFO] manager: shutting down
TestDebugCommand_CaptureTargets - 2019/12/06 06:33:18.415995 [INFO] agent: consul server down
TestDebugCommand_CaptureTargets - 2019/12/06 06:33:18.424587 [INFO] agent: shutdown complete
TestDebugCommand_CaptureTargets - 2019/12/06 06:33:18.425626 [INFO] agent: Stopping DNS server 127.0.0.1:13001 (tcp)
TestDebugCommand_CaptureTargets - 2019/12/06 06:33:18.426706 [INFO] agent: Stopping DNS server 127.0.0.1:13001 (udp)
TestDebugCommand_CaptureTargets - 2019/12/06 06:33:18.427652 [INFO] agent: Stopping HTTP server 127.0.0.1:13002 (tcp)
TestDebugCommand_DebugDisabled - 2019/12/06 06:33:19.079566 [INFO] agent: Requesting shutdown
TestDebugCommand_DebugDisabled - 2019/12/06 06:33:19.079788 [INFO] consul: shutting down server
TestDebugCommand_DebugDisabled - 2019/12/06 06:33:19.082790 [WARN] serf: Shutdown without a Leave
TestDebugCommand_DebugDisabled - 2019/12/06 06:33:19.213770 [WARN] serf: Shutdown without a Leave
TestDebugCommand_ValidateTiming - 2019/12/06 06:33:19.225736 [DEBUG] agent: Node info in sync
TestDebugCommand_ValidateTiming - 2019/12/06 06:33:19.228021 [DEBUG] agent: Node info in sync
TestDebugCommand_ProfilesExist - 2019/12/06 06:33:19.256293 [DEBUG] agent: Node info in sync
TestDebugCommand_ProfilesExist - 2019/12/06 06:33:19.256455 [DEBUG] agent: Node info in sync
2019/12/06 06:33:19 [INFO]  raft: Initial configuration (index=1): [{Suffrage:Voter ID:05deefe5-d626-c561-165f-0dbe41cfb99d Address:127.0.0.1:13042}]
TestDebugCommand_ValidateTiming - 2019/12/06 06:33:19.281172 [INFO] connect: initialized primary datacenter CA with provider "consul"
TestDebugCommand_ValidateTiming - 2019/12/06 06:33:19.281899 [DEBUG] consul: Skipping self join check for "Node e252b810-32b6-d8a5-ba9c-7add09cf237d" since the cluster is too small
TestDebugCommand_ValidateTiming - 2019/12/06 06:33:19.282099 [INFO] consul: member 'Node e252b810-32b6-d8a5-ba9c-7add09cf237d' joined, marking health alive
2019/12/06 06:33:19 [INFO]  raft: Node at 127.0.0.1:13042 [Follower] entering Follower state (Leader: "")
TestDebugCommand_ValidateTiming - 2019/12/06 06:33:19.286151 [INFO] serf: EventMemberJoin: Node 05deefe5-d626-c561-165f-0dbe41cfb99d.dc1 127.0.0.1
TestDebugCommand_DebugDisabled - 2019/12/06 06:33:19.287461 [INFO] manager: shutting down
TestDebugCommand_DebugDisabled - 2019/12/06 06:33:19.288647 [INFO] agent: consul server down
TestDebugCommand_DebugDisabled - 2019/12/06 06:33:19.288731 [INFO] agent: shutdown complete
TestDebugCommand_DebugDisabled - 2019/12/06 06:33:19.288870 [INFO] agent: Stopping DNS server 127.0.0.1:13019 (tcp)
TestDebugCommand_DebugDisabled - 2019/12/06 06:33:19.289164 [INFO] agent: Stopping DNS server 127.0.0.1:13019 (udp)
TestDebugCommand_DebugDisabled - 2019/12/06 06:33:19.289558 [INFO] agent: Stopping HTTP server 127.0.0.1:13020 (tcp)
TestDebugCommand_ValidateTiming - 2019/12/06 06:33:19.293459 [INFO] serf: EventMemberJoin: Node 05deefe5-d626-c561-165f-0dbe41cfb99d 127.0.0.1
TestDebugCommand_ValidateTiming - 2019/12/06 06:33:19.296707 [INFO] consul: Adding LAN server Node 05deefe5-d626-c561-165f-0dbe41cfb99d (Addr: tcp/127.0.0.1:13042) (DC: dc1)
TestDebugCommand_ValidateTiming - 2019/12/06 06:33:19.297561 [INFO] consul: Handled member-join event for server "Node 05deefe5-d626-c561-165f-0dbe41cfb99d.dc1" in area "wan"
TestDebugCommand_ValidateTiming - 2019/12/06 06:33:19.301321 [INFO] agent: Started DNS server 127.0.0.1:13037 (udp)
TestDebugCommand_ValidateTiming - 2019/12/06 06:33:19.301453 [INFO] agent: Started DNS server 127.0.0.1:13037 (tcp)
TestDebugCommand_ValidateTiming - 2019/12/06 06:33:19.304581 [INFO] agent: Started HTTP server on 127.0.0.1:13038 (tcp)
TestDebugCommand_ValidateTiming - 2019/12/06 06:33:19.305008 [INFO] agent: started state syncer
2019/12/06 06:33:19 [WARN]  raft: Heartbeat timeout from "" reached, starting election
2019/12/06 06:33:19 [INFO]  raft: Node at 127.0.0.1:13042 [Candidate] entering Candidate state in term 2
TestDebugCommand_CaptureTargets - 2019/12/06 06:33:19.428233 [WARN] agent: Timeout stopping HTTP server 127.0.0.1:13002 (tcp)
TestDebugCommand_CaptureTargets - 2019/12/06 06:33:19.428317 [INFO] agent: Waiting for endpoints to shut down
TestDebugCommand_CaptureTargets - 2019/12/06 06:33:19.428355 [INFO] agent: Endpoints down
--- FAIL: TestDebugCommand_CaptureTargets (9.68s)
    debug_test.go:317: all-but-pprof: output data should exist for */consul.log
=== CONT  TestDebugCommand_Archive
WARNING: bootstrap = true: do not enable unless necessary
TestDebugCommand_Archive - 2019/12/06 06:33:19.572942 [WARN] agent: Node name "Node c761d42b-c306-381c-c0e4-1acfe06a061b" will not be discoverable via DNS due to invalid characters. Valid characters include all alpha-numerics and dashes.
TestDebugCommand_Archive - 2019/12/06 06:33:19.573497 [DEBUG] tlsutil: Update with version 1
TestDebugCommand_Archive - 2019/12/06 06:33:19.582265 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
TestDebugCommand_ProfilesExist - 2019/12/06 06:33:19.664877 [INFO] connect: initialized primary datacenter CA with provider "consul"
TestDebugCommand_ProfilesExist - 2019/12/06 06:33:19.665566 [DEBUG] consul: Skipping self join check for "Node bfa67fcf-1f04-227e-aa1d-117c15edea03" since the cluster is too small
TestDebugCommand_ProfilesExist - 2019/12/06 06:33:19.665778 [INFO] consul: member 'Node bfa67fcf-1f04-227e-aa1d-117c15edea03' joined, marking health alive
TestDebugCommand_ValidateTiming - 2019/12/06 06:33:20.010938 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
2019/12/06 06:33:20 [INFO]  raft: Election won. Tally: 1
2019/12/06 06:33:20 [INFO]  raft: Node at 127.0.0.1:13042 [Leader] entering Leader state
TestDebugCommand_ValidateTiming - 2019/12/06 06:33:20.018756 [INFO] consul: cluster leadership acquired
TestDebugCommand_ValidateTiming - 2019/12/06 06:33:20.019727 [INFO] consul: New leader elected: Node 05deefe5-d626-c561-165f-0dbe41cfb99d
TestDebugCommand_ProfilesExist - 2019/12/06 06:33:20.267315 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
TestDebugCommand_DebugDisabled - 2019/12/06 06:33:20.295876 [WARN] agent: Timeout stopping HTTP server 127.0.0.1:13020 (tcp)
TestDebugCommand_DebugDisabled - 2019/12/06 06:33:20.295996 [INFO] agent: Waiting for endpoints to shut down
TestDebugCommand_DebugDisabled - 2019/12/06 06:33:20.296068 [INFO] agent: Endpoints down
--- PASS: TestDebugCommand_DebugDisabled (10.55s)
TestDebugCommand_ValidateTiming - 2019/12/06 06:33:20.356470 [INFO] agent: Synced node info
WARNING: bootstrap = true: do not enable unless necessary
TestDebugCommand_ValidateTiming - 2019/12/06 06:33:20.439673 [WARN] agent: Node name "Node 3798f6b7-bc47-f205-cd53-2bf9938bf906" will not be discoverable via DNS due to invalid characters. Valid characters include all alpha-numerics and dashes.
TestDebugCommand_ValidateTiming - 2019/12/06 06:33:20.440314 [DEBUG] tlsutil: Update with version 1
TestDebugCommand_ValidateTiming - 2019/12/06 06:33:20.442879 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
2019/12/06 06:33:20 [INFO]  raft: Initial configuration (index=1): [{Suffrage:Voter ID:c761d42b-c306-381c-c0e4-1acfe06a061b Address:127.0.0.1:13048}]
TestDebugCommand_Archive - 2019/12/06 06:33:20.707454 [INFO] serf: EventMemberJoin: Node c761d42b-c306-381c-c0e4-1acfe06a061b.dc1 127.0.0.1
2019/12/06 06:33:20 [INFO]  raft: Node at 127.0.0.1:13048 [Follower] entering Follower state (Leader: "")
TestDebugCommand_Archive - 2019/12/06 06:33:20.711962 [INFO] serf: EventMemberJoin: Node c761d42b-c306-381c-c0e4-1acfe06a061b 127.0.0.1
TestDebugCommand_Archive - 2019/12/06 06:33:20.713420 [INFO] agent: Started DNS server 127.0.0.1:13043 (udp)
TestDebugCommand_Archive - 2019/12/06 06:33:20.713933 [INFO] consul: Adding LAN server Node c761d42b-c306-381c-c0e4-1acfe06a061b (Addr: tcp/127.0.0.1:13048) (DC: dc1)
TestDebugCommand_Archive - 2019/12/06 06:33:20.714433 [INFO] consul: Handled member-join event for server "Node c761d42b-c306-381c-c0e4-1acfe06a061b.dc1" in area "wan"
TestDebugCommand_Archive - 2019/12/06 06:33:20.714499 [INFO] agent: Started DNS server 127.0.0.1:13043 (tcp)
TestDebugCommand_Archive - 2019/12/06 06:33:20.720543 [INFO] agent: Started HTTP server on 127.0.0.1:13044 (tcp)
TestDebugCommand_Archive - 2019/12/06 06:33:20.720693 [INFO] agent: started state syncer
2019/12/06 06:33:20 [WARN]  raft: Heartbeat timeout from "" reached, starting election
2019/12/06 06:33:20 [INFO]  raft: Node at 127.0.0.1:13048 [Candidate] entering Candidate state in term 2
TestDebugCommand_ValidateTiming - 2019/12/06 06:33:20.778239 [DEBUG] agent: Node info in sync
TestDebugCommand_ValidateTiming - 2019/12/06 06:33:20.778355 [DEBUG] agent: Node info in sync
TestDebugCommand_ProfilesExist - 2019/12/06 06:33:21.293461 [INFO] agent: Requesting shutdown
TestDebugCommand_ProfilesExist - 2019/12/06 06:33:21.294471 [INFO] consul: shutting down server
TestDebugCommand_ProfilesExist - 2019/12/06 06:33:21.295859 [WARN] serf: Shutdown without a Leave
2019/12/06 06:33:21 [INFO]  raft: Election won. Tally: 1
2019/12/06 06:33:21 [INFO]  raft: Node at 127.0.0.1:13048 [Leader] entering Leader state
TestDebugCommand_Archive - 2019/12/06 06:33:21.348997 [INFO] consul: cluster leadership acquired
TestDebugCommand_Archive - 2019/12/06 06:33:21.349762 [INFO] consul: New leader elected: Node c761d42b-c306-381c-c0e4-1acfe06a061b
TestDebugCommand_ProfilesExist - 2019/12/06 06:33:21.430716 [WARN] serf: Shutdown without a Leave
2019/12/06 06:33:21 [INFO]  raft: Initial configuration (index=1): [{Suffrage:Voter ID:3798f6b7-bc47-f205-cd53-2bf9938bf906 Address:127.0.0.1:13054}]
2019/12/06 06:33:21 [INFO]  raft: Node at 127.0.0.1:13054 [Follower] entering Follower state (Leader: "")
TestDebugCommand_ProfilesExist - 2019/12/06 06:33:21.524489 [INFO] manager: shutting down
TestDebugCommand_ProfilesExist - 2019/12/06 06:33:21.525749 [INFO] agent: consul server down
TestDebugCommand_ProfilesExist - 2019/12/06 06:33:21.525843 [INFO] agent: shutdown complete
TestDebugCommand_ProfilesExist - 2019/12/06 06:33:21.525989 [INFO] agent: Stopping DNS server 127.0.0.1:13031 (tcp)
TestDebugCommand_ProfilesExist - 2019/12/06 06:33:21.526321 [INFO] agent: Stopping DNS server 127.0.0.1:13031 (udp)
TestDebugCommand_ProfilesExist - 2019/12/06 06:33:21.526625 [INFO] agent: Stopping HTTP server 127.0.0.1:13032 (tcp)
TestDebugCommand_ValidateTiming - 2019/12/06 06:33:21.527925 [INFO] serf: EventMemberJoin: Node 3798f6b7-bc47-f205-cd53-2bf9938bf906.dc1 127.0.0.1
TestDebugCommand_ValidateTiming - 2019/12/06 06:33:21.534688 [INFO] serf: EventMemberJoin: Node 3798f6b7-bc47-f205-cd53-2bf9938bf906 127.0.0.1
TestDebugCommand_ValidateTiming - 2019/12/06 06:33:21.536155 [INFO] consul: Adding LAN server Node 3798f6b7-bc47-f205-cd53-2bf9938bf906 (Addr: tcp/127.0.0.1:13054) (DC: dc1)
TestDebugCommand_ValidateTiming - 2019/12/06 06:33:21.540327 [INFO] consul: Handled member-join event for server "Node 3798f6b7-bc47-f205-cd53-2bf9938bf906.dc1" in area "wan"
TestDebugCommand_ValidateTiming - 2019/12/06 06:33:21.546062 [INFO] agent: Started DNS server 127.0.0.1:13049 (tcp)
TestDebugCommand_ValidateTiming - 2019/12/06 06:33:21.546352 [INFO] agent: Started DNS server 127.0.0.1:13049 (udp)
TestDebugCommand_ValidateTiming - 2019/12/06 06:33:21.552276 [INFO] agent: Started HTTP server on 127.0.0.1:13050 (tcp)
TestDebugCommand_ValidateTiming - 2019/12/06 06:33:21.552500 [INFO] agent: started state syncer
2019/12/06 06:33:21 [WARN]  raft: Heartbeat timeout from "" reached, starting election
2019/12/06 06:33:21 [INFO]  raft: Node at 127.0.0.1:13054 [Candidate] entering Candidate state in term 2
TestDebugCommand_ValidateTiming - 2019/12/06 06:33:21.598672 [INFO] connect: initialized primary datacenter CA with provider "consul"
TestDebugCommand_ValidateTiming - 2019/12/06 06:33:21.599782 [DEBUG] consul: Skipping self join check for "Node 05deefe5-d626-c561-165f-0dbe41cfb99d" since the cluster is too small
TestDebugCommand_ValidateTiming - 2019/12/06 06:33:21.600003 [INFO] consul: member 'Node 05deefe5-d626-c561-165f-0dbe41cfb99d' joined, marking health alive
TestDebugCommand_Archive - 2019/12/06 06:33:21.906578 [INFO] agent: Synced node info
TestDebugCommand_ProfilesExist - 2019/12/06 06:33:22.027484 [INFO] agent: Waiting for endpoints to shut down
TestDebugCommand_ProfilesExist - 2019/12/06 06:33:22.027560 [INFO] agent: Endpoints down
--- PASS: TestDebugCommand_ProfilesExist (6.76s)
TestDebugCommand_Archive - 2019/12/06 06:33:22.146750 [DEBUG] http: Request GET /v1/agent/self (215.398339ms) from=127.0.0.1:51958
2019/12/06 06:33:22 [INFO]  raft: Election won. Tally: 1
2019/12/06 06:33:22 [INFO]  raft: Node at 127.0.0.1:13054 [Leader] entering Leader state
TestDebugCommand_ValidateTiming - 2019/12/06 06:33:22.206640 [INFO] consul: New leader elected: Node 3798f6b7-bc47-f205-cd53-2bf9938bf906
TestDebugCommand_ValidateTiming - 2019/12/06 06:33:22.207400 [INFO] consul: cluster leadership acquired
TestDebugCommand_Archive - 2019/12/06 06:33:22.298692 [DEBUG] http: Request GET /v1/agent/self (135.340812ms) from=127.0.0.1:51958
TestDebugCommand_ValidateTiming - 2019/12/06 06:33:22.551732 [INFO] agent: Synced node info
TestDebugCommand_ValidateTiming - 2019/12/06 06:33:22.551867 [DEBUG] agent: Node info in sync
TestDebugCommand_ValidateTiming - 2019/12/06 06:33:22.569559 [INFO] agent: Requesting shutdown
TestDebugCommand_ValidateTiming - 2019/12/06 06:33:22.570468 [INFO] consul: shutting down server
TestDebugCommand_ValidateTiming - 2019/12/06 06:33:22.570909 [WARN] serf: Shutdown without a Leave
TestDebugCommand_ValidateTiming - 2019/12/06 06:33:22.621048 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
TestDebugCommand_ValidateTiming - 2019/12/06 06:33:22.821624 [DEBUG] agent: Node info in sync
TestDebugCommand_ValidateTiming - 2019/12/06 06:33:24.190058 [WARN] serf: Shutdown without a Leave
TestDebugCommand_Archive - 2019/12/06 06:33:24.190861 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
TestDebugCommand_Archive - 2019/12/06 06:33:24.196016 [INFO] agent: Requesting shutdown
TestDebugCommand_Archive - 2019/12/06 06:33:24.196291 [INFO] consul: shutting down server
TestDebugCommand_Archive - 2019/12/06 06:33:24.196540 [WARN] serf: Shutdown without a Leave
TestDebugCommand_Archive - 2019/12/06 06:33:24.200519 [WARN] consul: error getting server health from "Node c761d42b-c306-381c-c0e4-1acfe06a061b": rpc error making call: EOF
TestDebugCommand_ValidateTiming - 2019/12/06 06:33:24.347086 [INFO] manager: shutting down
TestDebugCommand_Archive - 2019/12/06 06:33:24.390806 [DEBUG] agent: Node info in sync
TestDebugCommand_Archive - 2019/12/06 06:33:24.390922 [DEBUG] agent: Node info in sync
TestDebugCommand_Archive - 2019/12/06 06:33:24.463656 [WARN] serf: Shutdown without a Leave
TestDebugCommand_ValidateTiming - 2019/12/06 06:33:24.465032 [ERR] autopilot: failed to initialize config: leadership lost while committing log
TestDebugCommand_ValidateTiming - 2019/12/06 06:33:24.465340 [ERR] consul: failed to establish leadership: raft is already shutdown
TestDebugCommand_ValidateTiming - 2019/12/06 06:33:24.465492 [ERR] consul: failed to transfer leadership attempt 0/3: raft is already shutdown
TestDebugCommand_ValidateTiming - 2019/12/06 06:33:24.465546 [ERR] consul: failed to transfer leadership attempt 1/3: raft is already shutdown
TestDebugCommand_ValidateTiming - 2019/12/06 06:33:24.465594 [ERR] consul: failed to transfer leadership attempt 2/3: raft is already shutdown
TestDebugCommand_ValidateTiming - 2019/12/06 06:33:24.465636 [ERR] consul: failed to transfer leadership in 3 attempts
TestDebugCommand_ValidateTiming - 2019/12/06 06:33:24.466765 [INFO] agent: consul server down
TestDebugCommand_ValidateTiming - 2019/12/06 06:33:24.467014 [INFO] agent: shutdown complete
TestDebugCommand_ValidateTiming - 2019/12/06 06:33:24.467414 [INFO] agent: Stopping DNS server 127.0.0.1:13049 (tcp)
TestDebugCommand_ValidateTiming - 2019/12/06 06:33:24.468026 [INFO] agent: Stopping DNS server 127.0.0.1:13049 (udp)
TestDebugCommand_ValidateTiming - 2019/12/06 06:33:24.468477 [INFO] agent: Stopping HTTP server 127.0.0.1:13050 (tcp)
TestDebugCommand_ValidateTiming - 2019/12/06 06:33:24.468685 [INFO] agent: Waiting for endpoints to shut down
TestDebugCommand_ValidateTiming - 2019/12/06 06:33:24.468754 [INFO] agent: Endpoints down
TestDebugCommand_ValidateTiming - 2019/12/06 06:33:24.499102 [INFO] agent: Requesting shutdown
TestDebugCommand_ValidateTiming - 2019/12/06 06:33:24.499257 [INFO] consul: shutting down server
TestDebugCommand_ValidateTiming - 2019/12/06 06:33:24.499312 [WARN] serf: Shutdown without a Leave
TestDebugCommand_Archive - 2019/12/06 06:33:24.972055 [INFO] manager: shutting down
TestDebugCommand_ValidateTiming - 2019/12/06 06:33:24.972348 [WARN] serf: Shutdown without a Leave
TestDebugCommand_Archive - 2019/12/06 06:33:24.973124 [INFO] agent: consul server down
TestDebugCommand_Archive - 2019/12/06 06:33:24.973190 [INFO] agent: shutdown complete
TestDebugCommand_Archive - 2019/12/06 06:33:24.973317 [INFO] agent: Stopping DNS server 127.0.0.1:13043 (tcp)
TestDebugCommand_Archive - 2019/12/06 06:33:24.973529 [INFO] agent: Stopping DNS server 127.0.0.1:13043 (udp)
TestDebugCommand_Archive - 2019/12/06 06:33:24.973720 [INFO] agent: Stopping HTTP server 127.0.0.1:13044 (tcp)
TestDebugCommand_Archive - 2019/12/06 06:33:24.974374 [INFO] agent: Waiting for endpoints to shut down
TestDebugCommand_Archive - 2019/12/06 06:33:24.974472 [INFO] agent: Endpoints down
TestDebugCommand_Archive - 2019/12/06 06:33:24.975825 [ERR] connect: Apply failed raft is already shutdown
TestDebugCommand_Archive - 2019/12/06 06:33:24.975897 [ERR] consul: failed to establish leadership: raft is already shutdown
--- PASS: TestDebugCommand_Archive (5.54s)
TestDebugCommand_ValidateTiming - 2019/12/06 06:33:25.113793 [INFO] manager: shutting down
TestDebugCommand_ValidateTiming - 2019/12/06 06:33:25.114793 [INFO] agent: consul server down
TestDebugCommand_ValidateTiming - 2019/12/06 06:33:25.114863 [INFO] agent: shutdown complete
TestDebugCommand_ValidateTiming - 2019/12/06 06:33:25.114923 [INFO] agent: Stopping DNS server 127.0.0.1:13037 (tcp)
TestDebugCommand_ValidateTiming - 2019/12/06 06:33:25.115094 [INFO] agent: Stopping DNS server 127.0.0.1:13037 (udp)
TestDebugCommand_ValidateTiming - 2019/12/06 06:33:25.115270 [INFO] agent: Stopping HTTP server 127.0.0.1:13038 (tcp)
TestDebugCommand_ValidateTiming - 2019/12/06 06:33:25.115501 [INFO] agent: Waiting for endpoints to shut down
TestDebugCommand_ValidateTiming - 2019/12/06 06:33:25.115580 [INFO] agent: Endpoints down
TestDebugCommand_ValidateTiming - 2019/12/06 06:33:25.116214 [INFO] agent: Requesting shutdown
TestDebugCommand_ValidateTiming - 2019/12/06 06:33:25.116301 [INFO] consul: shutting down server
TestDebugCommand_ValidateTiming - 2019/12/06 06:33:25.116354 [WARN] serf: Shutdown without a Leave
TestDebugCommand_Archive - 2019/12/06 06:33:25.190968 [WARN] consul: error getting server health from "Node c761d42b-c306-381c-c0e4-1acfe06a061b": context deadline exceeded
TestDebugCommand_ValidateTiming - 2019/12/06 06:33:25.247317 [WARN] serf: Shutdown without a Leave
TestDebugCommand_ValidateTiming - 2019/12/06 06:33:25.388904 [INFO] manager: shutting down
TestDebugCommand_ValidateTiming - 2019/12/06 06:33:25.389917 [INFO] agent: consul server down
TestDebugCommand_ValidateTiming - 2019/12/06 06:33:25.389987 [INFO] agent: shutdown complete
TestDebugCommand_ValidateTiming - 2019/12/06 06:33:25.390049 [INFO] agent: Stopping DNS server 127.0.0.1:13025 (tcp)
TestDebugCommand_ValidateTiming - 2019/12/06 06:33:25.390209 [INFO] agent: Stopping DNS server 127.0.0.1:13025 (udp)
TestDebugCommand_ValidateTiming - 2019/12/06 06:33:25.390366 [INFO] agent: Stopping HTTP server 127.0.0.1:13026 (tcp)
TestDebugCommand_ValidateTiming - 2019/12/06 06:33:25.390555 [INFO] agent: Waiting for endpoints to shut down
TestDebugCommand_ValidateTiming - 2019/12/06 06:33:25.390622 [INFO] agent: Endpoints down
--- PASS: TestDebugCommand_ValidateTiming (10.59s)
FAIL
FAIL	github.com/hashicorp/consul/command/debug	15.999s
=== RUN   TestEventCommand_noTabs
=== PAUSE TestEventCommand_noTabs
=== RUN   TestEventCommand
=== PAUSE TestEventCommand
=== CONT  TestEventCommand_noTabs
=== CONT  TestEventCommand
--- PASS: TestEventCommand_noTabs (0.00s)
WARNING: bootstrap = true: do not enable unless necessary
TestEventCommand - 2019/12/06 06:33:36.629944 [WARN] agent: Node name "Node ee6540c8-346a-b5dc-0283-d5ad76c39633" will not be discoverable via DNS due to invalid characters. Valid characters include all alpha-numerics and dashes.
TestEventCommand - 2019/12/06 06:33:36.630886 [DEBUG] tlsutil: Update with version 1
TestEventCommand - 2019/12/06 06:33:36.639294 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
2019/12/06 06:33:38 [INFO]  raft: Initial configuration (index=1): [{Suffrage:Voter ID:ee6540c8-346a-b5dc-0283-d5ad76c39633 Address:127.0.0.1:47506}]
2019/12/06 06:33:38 [INFO]  raft: Node at 127.0.0.1:47506 [Follower] entering Follower state (Leader: "")
TestEventCommand - 2019/12/06 06:33:38.828051 [INFO] serf: EventMemberJoin: Node ee6540c8-346a-b5dc-0283-d5ad76c39633.dc1 127.0.0.1
TestEventCommand - 2019/12/06 06:33:38.832392 [INFO] serf: EventMemberJoin: Node ee6540c8-346a-b5dc-0283-d5ad76c39633 127.0.0.1
TestEventCommand - 2019/12/06 06:33:38.833307 [INFO] consul: Adding LAN server Node ee6540c8-346a-b5dc-0283-d5ad76c39633 (Addr: tcp/127.0.0.1:47506) (DC: dc1)
TestEventCommand - 2019/12/06 06:33:38.833355 [INFO] consul: Handled member-join event for server "Node ee6540c8-346a-b5dc-0283-d5ad76c39633.dc1" in area "wan"
TestEventCommand - 2019/12/06 06:33:38.834402 [INFO] agent: Started DNS server 127.0.0.1:47501 (tcp)
TestEventCommand - 2019/12/06 06:33:38.834943 [INFO] agent: Started DNS server 127.0.0.1:47501 (udp)
TestEventCommand - 2019/12/06 06:33:38.837598 [INFO] agent: Started HTTP server on 127.0.0.1:47502 (tcp)
TestEventCommand - 2019/12/06 06:33:38.837773 [INFO] agent: started state syncer
2019/12/06 06:33:38 [WARN]  raft: Heartbeat timeout from "" reached, starting election
2019/12/06 06:33:38 [INFO]  raft: Node at 127.0.0.1:47506 [Candidate] entering Candidate state in term 2
2019/12/06 06:33:40 [INFO]  raft: Election won. Tally: 1
2019/12/06 06:33:40 [INFO]  raft: Node at 127.0.0.1:47506 [Leader] entering Leader state
TestEventCommand - 2019/12/06 06:33:40.113512 [INFO] consul: cluster leadership acquired
TestEventCommand - 2019/12/06 06:33:40.114231 [INFO] consul: New leader elected: Node ee6540c8-346a-b5dc-0283-d5ad76c39633
TestEventCommand - 2019/12/06 06:33:40.582083 [INFO] agent: Synced node info
TestEventCommand - 2019/12/06 06:33:40.582261 [DEBUG] agent: Node info in sync
TestEventCommand - 2019/12/06 06:33:40.593990 [DEBUG] http: Request GET /v1/agent/self (427.58827ms) from=127.0.0.1:55280
TestEventCommand - 2019/12/06 06:33:40.621140 [DEBUG] http: Request PUT /v1/event/fire/cmd (2.42739ms) from=127.0.0.1:55280
TestEventCommand - 2019/12/06 06:33:40.623371 [DEBUG] consul: User event: cmd
TestEventCommand - 2019/12/06 06:33:40.623613 [DEBUG] agent: new event: cmd (3694f377-1693-0db4-5fd3-378790ff3f77)
TestEventCommand - 2019/12/06 06:33:40.625181 [INFO] agent: Requesting shutdown
TestEventCommand - 2019/12/06 06:33:40.625277 [INFO] consul: shutting down server
TestEventCommand - 2019/12/06 06:33:40.625335 [WARN] serf: Shutdown without a Leave
TestEventCommand - 2019/12/06 06:33:40.780700 [WARN] serf: Shutdown without a Leave
TestEventCommand - 2019/12/06 06:33:40.922465 [INFO] manager: shutting down
TestEventCommand - 2019/12/06 06:33:41.055723 [ERR] consul: failed to establish leadership: leadership lost while committing log
TestEventCommand - 2019/12/06 06:33:41.056038 [INFO] agent: consul server down
TestEventCommand - 2019/12/06 06:33:41.056093 [INFO] agent: shutdown complete
TestEventCommand - 2019/12/06 06:33:41.056150 [INFO] agent: Stopping DNS server 127.0.0.1:47501 (tcp)
TestEventCommand - 2019/12/06 06:33:41.056284 [INFO] agent: Stopping DNS server 127.0.0.1:47501 (udp)
TestEventCommand - 2019/12/06 06:33:41.056437 [INFO] agent: Stopping HTTP server 127.0.0.1:47502 (tcp)
TestEventCommand - 2019/12/06 06:33:41.056971 [INFO] agent: Waiting for endpoints to shut down
TestEventCommand - 2019/12/06 06:33:41.057052 [INFO] agent: Endpoints down
--- PASS: TestEventCommand (4.51s)
PASS
ok  	github.com/hashicorp/consul/command/event	4.753s
=== RUN   TestExecCommand_noTabs
=== PAUSE TestExecCommand_noTabs
=== RUN   TestExecCommand
=== PAUSE TestExecCommand
=== RUN   TestExecCommand_NoShell
=== PAUSE TestExecCommand_NoShell
=== RUN   TestExecCommand_CrossDC
--- SKIP: TestExecCommand_CrossDC (0.00s)
    exec_test.go:70: DM-skipped
=== RUN   TestExecCommand_Validate
=== PAUSE TestExecCommand_Validate
=== RUN   TestExecCommand_Sessions
=== PAUSE TestExecCommand_Sessions
=== RUN   TestExecCommand_Sessions_Foreign
=== PAUSE TestExecCommand_Sessions_Foreign
=== RUN   TestExecCommand_UploadDestroy
=== PAUSE TestExecCommand_UploadDestroy
=== RUN   TestExecCommand_StreamResults
=== PAUSE TestExecCommand_StreamResults
=== CONT  TestExecCommand_noTabs
=== CONT  TestExecCommand_Sessions_Foreign
=== CONT  TestExecCommand_StreamResults
=== CONT  TestExecCommand_UploadDestroy
--- PASS: TestExecCommand_noTabs (0.01s)
=== CONT  TestExecCommand_Validate
--- PASS: TestExecCommand_Validate (0.00s)
=== CONT  TestExecCommand_Sessions
WARNING: bootstrap = true: do not enable unless necessary
TestExecCommand_Sessions_Foreign - 2019/12/06 06:33:48.190219 [WARN] agent: Node name "Node 7b22f801-9476-6472-3908-7299855647a5" will not be discoverable via DNS due to invalid characters. Valid characters include all alpha-numerics and dashes.
TestExecCommand_Sessions_Foreign - 2019/12/06 06:33:48.191224 [DEBUG] tlsutil: Update with version 1
TestExecCommand_Sessions_Foreign - 2019/12/06 06:33:48.211681 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
WARNING: bootstrap = true: do not enable unless necessary
TestExecCommand_UploadDestroy - 2019/12/06 06:33:48.287816 [WARN] agent: Node name "Node 59410d96-7667-36fe-cc9b-dccbdf00705c" will not be discoverable via DNS due to invalid characters. Valid characters include all alpha-numerics and dashes.
TestExecCommand_UploadDestroy - 2019/12/06 06:33:48.288299 [DEBUG] tlsutil: Update with version 1
TestExecCommand_UploadDestroy - 2019/12/06 06:33:48.290463 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
WARNING: bootstrap = true: do not enable unless necessary
TestExecCommand_Sessions - 2019/12/06 06:33:48.320612 [WARN] agent: Node name "Node fc3e8e4b-a7bb-d3e5-4122-0aecb5414d60" will not be discoverable via DNS due to invalid characters. Valid characters include all alpha-numerics and dashes.
TestExecCommand_Sessions - 2019/12/06 06:33:48.321004 [DEBUG] tlsutil: Update with version 1
TestExecCommand_Sessions - 2019/12/06 06:33:48.323122 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
WARNING: bootstrap = true: do not enable unless necessary
TestExecCommand_StreamResults - 2019/12/06 06:33:48.366084 [WARN] agent: Node name "Node 35d99c4e-ae6b-92bb-0f88-f05101dcc1ea" will not be discoverable via DNS due to invalid characters. Valid characters include all alpha-numerics and dashes.
TestExecCommand_StreamResults - 2019/12/06 06:33:48.366648 [DEBUG] tlsutil: Update with version 1
TestExecCommand_StreamResults - 2019/12/06 06:33:48.370062 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
2019/12/06 06:33:49 [INFO]  raft: Initial configuration (index=1): [{Suffrage:Voter ID:7b22f801-9476-6472-3908-7299855647a5 Address:127.0.0.1:44506}]
2019/12/06 06:33:49 [INFO]  raft: Node at 127.0.0.1:44506 [Follower] entering Follower state (Leader: "")
TestExecCommand_Sessions_Foreign - 2019/12/06 06:33:49.424699 [INFO] serf: EventMemberJoin: Node 7b22f801-9476-6472-3908-7299855647a5.dc1 127.0.0.1
TestExecCommand_Sessions_Foreign - 2019/12/06 06:33:49.430999 [INFO] serf: EventMemberJoin: Node 7b22f801-9476-6472-3908-7299855647a5 127.0.0.1
TestExecCommand_Sessions_Foreign - 2019/12/06 06:33:49.433269 [INFO] consul: Adding LAN server Node 7b22f801-9476-6472-3908-7299855647a5 (Addr: tcp/127.0.0.1:44506) (DC: dc1)
TestExecCommand_Sessions_Foreign - 2019/12/06 06:33:49.434155 [INFO] consul: Handled member-join event for server "Node 7b22f801-9476-6472-3908-7299855647a5.dc1" in area "wan"
TestExecCommand_Sessions_Foreign - 2019/12/06 06:33:49.438259 [INFO] agent: Started DNS server 127.0.0.1:44501 (tcp)
TestExecCommand_Sessions_Foreign - 2019/12/06 06:33:49.438418 [INFO] agent: Started DNS server 127.0.0.1:44501 (udp)
TestExecCommand_Sessions_Foreign - 2019/12/06 06:33:49.442163 [INFO] agent: Started HTTP server on 127.0.0.1:44502 (tcp)
TestExecCommand_Sessions_Foreign - 2019/12/06 06:33:49.442341 [INFO] agent: started state syncer
2019/12/06 06:33:49 [WARN]  raft: Heartbeat timeout from "" reached, starting election
2019/12/06 06:33:49 [INFO]  raft: Node at 127.0.0.1:44506 [Candidate] entering Candidate state in term 2
2019/12/06 06:33:49 [INFO]  raft: Initial configuration (index=1): [{Suffrage:Voter ID:59410d96-7667-36fe-cc9b-dccbdf00705c Address:127.0.0.1:44518}]
2019/12/06 06:33:49 [INFO]  raft: Node at 127.0.0.1:44518 [Follower] entering Follower state (Leader: "")
TestExecCommand_UploadDestroy - 2019/12/06 06:33:49.571587 [INFO] serf: EventMemberJoin: Node 59410d96-7667-36fe-cc9b-dccbdf00705c.dc1 127.0.0.1
TestExecCommand_UploadDestroy - 2019/12/06 06:33:49.577500 [INFO] serf: EventMemberJoin: Node 59410d96-7667-36fe-cc9b-dccbdf00705c 127.0.0.1
TestExecCommand_UploadDestroy - 2019/12/06 06:33:49.580117 [INFO] agent: Started DNS server 127.0.0.1:44513 (udp)
TestExecCommand_UploadDestroy - 2019/12/06 06:33:49.580553 [INFO] consul: Handled member-join event for server "Node 59410d96-7667-36fe-cc9b-dccbdf00705c.dc1" in area "wan"
TestExecCommand_UploadDestroy - 2019/12/06 06:33:49.580748 [INFO] agent: Started DNS server 127.0.0.1:44513 (tcp)
TestExecCommand_UploadDestroy - 2019/12/06 06:33:49.580895 [INFO] consul: Adding LAN server Node 59410d96-7667-36fe-cc9b-dccbdf00705c (Addr: tcp/127.0.0.1:44518) (DC: dc1)
TestExecCommand_UploadDestroy - 2019/12/06 06:33:49.583216 [INFO] agent: Started HTTP server on 127.0.0.1:44514 (tcp)
TestExecCommand_UploadDestroy - 2019/12/06 06:33:49.583418 [INFO] agent: started state syncer
2019/12/06 06:33:49 [WARN]  raft: Heartbeat timeout from "" reached, starting election
2019/12/06 06:33:49 [INFO]  raft: Node at 127.0.0.1:44518 [Candidate] entering Candidate state in term 2
2019/12/06 06:33:49 [INFO]  raft: Initial configuration (index=1): [{Suffrage:Voter ID:35d99c4e-ae6b-92bb-0f88-f05101dcc1ea Address:127.0.0.1:44512}]
2019/12/06 06:33:49 [INFO]  raft: Node at 127.0.0.1:44512 [Follower] entering Follower state (Leader: "")
2019/12/06 06:33:49 [INFO]  raft: Initial configuration (index=1): [{Suffrage:Voter ID:fc3e8e4b-a7bb-d3e5-4122-0aecb5414d60 Address:127.0.0.1:44524}]
2019/12/06 06:33:49 [INFO]  raft: Node at 127.0.0.1:44524 [Follower] entering Follower state (Leader: "")
TestExecCommand_Sessions - 2019/12/06 06:33:49.848206 [INFO] serf: EventMemberJoin: Node fc3e8e4b-a7bb-d3e5-4122-0aecb5414d60.dc1 127.0.0.1
TestExecCommand_Sessions - 2019/12/06 06:33:49.851634 [INFO] serf: EventMemberJoin: Node fc3e8e4b-a7bb-d3e5-4122-0aecb5414d60 127.0.0.1
TestExecCommand_Sessions - 2019/12/06 06:33:49.853393 [INFO] consul: Adding LAN server Node fc3e8e4b-a7bb-d3e5-4122-0aecb5414d60 (Addr: tcp/127.0.0.1:44524) (DC: dc1)
TestExecCommand_StreamResults - 2019/12/06 06:33:49.851652 [INFO] serf: EventMemberJoin: Node 35d99c4e-ae6b-92bb-0f88-f05101dcc1ea.dc1 127.0.0.1
TestExecCommand_StreamResults - 2019/12/06 06:33:49.866503 [INFO] serf: EventMemberJoin: Node 35d99c4e-ae6b-92bb-0f88-f05101dcc1ea 127.0.0.1
TestExecCommand_Sessions - 2019/12/06 06:33:49.872744 [INFO] agent: Started DNS server 127.0.0.1:44519 (tcp)
TestExecCommand_Sessions - 2019/12/06 06:33:49.873349 [INFO] consul: Handled member-join event for server "Node fc3e8e4b-a7bb-d3e5-4122-0aecb5414d60.dc1" in area "wan"
TestExecCommand_StreamResults - 2019/12/06 06:33:49.890482 [INFO] consul: Adding LAN server Node 35d99c4e-ae6b-92bb-0f88-f05101dcc1ea (Addr: tcp/127.0.0.1:44512) (DC: dc1)
TestExecCommand_Sessions - 2019/12/06 06:33:49.885342 [INFO] agent: Started DNS server 127.0.0.1:44519 (udp)
TestExecCommand_StreamResults - 2019/12/06 06:33:49.890725 [INFO] agent: Started DNS server 127.0.0.1:44507 (udp)
TestExecCommand_Sessions - 2019/12/06 06:33:49.895087 [INFO] agent: Started HTTP server on 127.0.0.1:44520 (tcp)
TestExecCommand_Sessions - 2019/12/06 06:33:49.895433 [INFO] agent: started state syncer
2019/12/06 06:33:49 [WARN]  raft: Heartbeat timeout from "" reached, starting election
2019/12/06 06:33:49 [INFO]  raft: Node at 127.0.0.1:44512 [Candidate] entering Candidate state in term 2
TestExecCommand_StreamResults - 2019/12/06 06:33:49.896978 [INFO] consul: Handled member-join event for server "Node 35d99c4e-ae6b-92bb-0f88-f05101dcc1ea.dc1" in area "wan"
TestExecCommand_StreamResults - 2019/12/06 06:33:49.897784 [INFO] agent: Started DNS server 127.0.0.1:44507 (tcp)
TestExecCommand_StreamResults - 2019/12/06 06:33:49.900210 [INFO] agent: Started HTTP server on 127.0.0.1:44508 (tcp)
TestExecCommand_StreamResults - 2019/12/06 06:33:49.900333 [INFO] agent: started state syncer
2019/12/06 06:33:49 [WARN]  raft: Heartbeat timeout from "" reached, starting election
2019/12/06 06:33:49 [INFO]  raft: Node at 127.0.0.1:44524 [Candidate] entering Candidate state in term 2
2019/12/06 06:33:50 [INFO]  raft: Election won. Tally: 1
2019/12/06 06:33:50 [INFO]  raft: Node at 127.0.0.1:44506 [Leader] entering Leader state
TestExecCommand_Sessions_Foreign - 2019/12/06 06:33:50.299217 [INFO] consul: cluster leadership acquired
TestExecCommand_Sessions_Foreign - 2019/12/06 06:33:50.299843 [INFO] consul: New leader elected: Node 7b22f801-9476-6472-3908-7299855647a5
TestExecCommand_Sessions_Foreign - 2019/12/06 06:33:50.371255 [DEBUG] http: Request GET /v1/health/service/consul?passing=1 (10.380908ms) from=127.0.0.1:37574
2019/12/06 06:33:50 [INFO]  raft: Election won. Tally: 1
2019/12/06 06:33:50 [INFO]  raft: Node at 127.0.0.1:44518 [Leader] entering Leader state
TestExecCommand_UploadDestroy - 2019/12/06 06:33:50.376320 [INFO] consul: cluster leadership acquired
TestExecCommand_UploadDestroy - 2019/12/06 06:33:50.376720 [INFO] consul: New leader elected: Node 59410d96-7667-36fe-cc9b-dccbdf00705c
TestExecCommand_Sessions_Foreign - 2019/12/06 06:33:50.403832 [DEBUG] http: Request GET /v1/health/service/consul?passing=1 (1.201694ms) from=127.0.0.1:37574
TestExecCommand_Sessions_Foreign - 2019/12/06 06:33:50.433064 [DEBUG] http: Request GET /v1/health/service/consul?passing=1 (1.684373ms) from=127.0.0.1:37574
TestExecCommand_Sessions_Foreign - 2019/12/06 06:33:50.462815 [DEBUG] http: Request GET /v1/health/service/consul?passing=1 (1.031691ms) from=127.0.0.1:37574
TestExecCommand_Sessions_Foreign - 2019/12/06 06:33:50.505230 [DEBUG] http: Request GET /v1/health/service/consul?passing=1 (14.669008ms) from=127.0.0.1:37574
2019/12/06 06:33:50 [INFO]  raft: Election won. Tally: 1
2019/12/06 06:33:50 [INFO]  raft: Node at 127.0.0.1:44524 [Leader] entering Leader state
2019/12/06 06:33:50 [INFO]  raft: Election won. Tally: 1
2019/12/06 06:33:50 [INFO]  raft: Node at 127.0.0.1:44512 [Leader] entering Leader state
TestExecCommand_StreamResults - 2019/12/06 06:33:50.535637 [INFO] consul: cluster leadership acquired
TestExecCommand_StreamResults - 2019/12/06 06:33:50.536632 [INFO] consul: New leader elected: Node 35d99c4e-ae6b-92bb-0f88-f05101dcc1ea
TestExecCommand_Sessions - 2019/12/06 06:33:50.537045 [INFO] consul: cluster leadership acquired
TestExecCommand_Sessions - 2019/12/06 06:33:50.537618 [INFO] consul: New leader elected: Node fc3e8e4b-a7bb-d3e5-4122-0aecb5414d60
TestExecCommand_Sessions_Foreign - 2019/12/06 06:33:50.540867 [DEBUG] http: Request GET /v1/health/service/consul?passing=1 (1.825042ms) from=127.0.0.1:37574
TestExecCommand_Sessions_Foreign - 2019/12/06 06:33:50.570839 [DEBUG] http: Request GET /v1/health/service/consul?passing=1 (1.86471ms) from=127.0.0.1:37574
TestExecCommand_Sessions_Foreign - 2019/12/06 06:33:50.599433 [DEBUG] http: Request GET /v1/health/service/consul?passing=1 (908.021µs) from=127.0.0.1:37574
TestExecCommand_Sessions_Foreign - 2019/12/06 06:33:50.627787 [DEBUG] http: Request GET /v1/health/service/consul?passing=1 (755.351µs) from=127.0.0.1:37574
TestExecCommand_Sessions_Foreign - 2019/12/06 06:33:50.656329 [DEBUG] http: Request GET /v1/health/service/consul?passing=1 (891.354µs) from=127.0.0.1:37574
TestExecCommand_Sessions_Foreign - 2019/12/06 06:33:50.685133 [DEBUG] http: Request GET /v1/health/service/consul?passing=1 (911.688µs) from=127.0.0.1:37574
TestExecCommand_Sessions_Foreign - 2019/12/06 06:33:50.714936 [DEBUG] http: Request GET /v1/health/service/consul?passing=1 (1.894377ms) from=127.0.0.1:37574
TestExecCommand_Sessions_Foreign - 2019/12/06 06:33:50.744466 [DEBUG] http: Request GET /v1/health/service/consul?passing=1 (1.075691ms) from=127.0.0.1:37574
TestExecCommand_Sessions_Foreign - 2019/12/06 06:33:50.772958 [DEBUG] http: Request GET /v1/health/service/consul?passing=1 (815.019µs) from=127.0.0.1:37574
TestExecCommand_Sessions_Foreign - 2019/12/06 06:33:50.801558 [DEBUG] http: Request GET /v1/health/service/consul?passing=1 (792.352µs) from=127.0.0.1:37574
TestExecCommand_Sessions_Foreign - 2019/12/06 06:33:50.830330 [DEBUG] http: Request GET /v1/health/service/consul?passing=1 (915.021µs) from=127.0.0.1:37574
TestExecCommand_Sessions_Foreign - 2019/12/06 06:33:50.858711 [DEBUG] http: Request GET /v1/health/service/consul?passing=1 (811.352µs) from=127.0.0.1:37574
TestExecCommand_Sessions_Foreign - 2019/12/06 06:33:50.882030 [INFO] agent: Synced node info
TestExecCommand_UploadDestroy - 2019/12/06 06:33:50.883498 [INFO] agent: Synced node info
TestExecCommand_Sessions_Foreign - 2019/12/06 06:33:50.887355 [DEBUG] http: Request GET /v1/health/service/consul?passing=1 (922.688µs) from=127.0.0.1:37574
TestExecCommand_Sessions_Foreign - 2019/12/06 06:33:50.915866 [DEBUG] http: Request GET /v1/health/service/consul?passing=1 (892.021µs) from=127.0.0.1:37574
TestExecCommand_Sessions_Foreign - 2019/12/06 06:33:50.945040 [DEBUG] http: Request GET /v1/health/service/consul?passing=1 (964.023µs) from=127.0.0.1:37574
TestExecCommand_Sessions_Foreign - 2019/12/06 06:33:50.974137 [DEBUG] http: Request GET /v1/health/service/consul?passing=1 (906.688µs) from=127.0.0.1:37574
TestExecCommand_StreamResults - 2019/12/06 06:33:50.981774 [INFO] agent: Synced node info
TestExecCommand_Sessions_Foreign - 2019/12/06 06:33:51.002866 [DEBUG] http: Request GET /v1/health/service/consul?passing=1 (957.688µs) from=127.0.0.1:37574
TestExecCommand_Sessions_Foreign - 2019/12/06 06:33:51.032602 [DEBUG] http: Request GET /v1/health/service/consul?passing=1 (1.299363ms) from=127.0.0.1:37574
TestExecCommand_Sessions_Foreign - 2019/12/06 06:33:51.061635 [DEBUG] http: Request GET /v1/health/service/consul?passing=1 (850.353µs) from=127.0.0.1:37574
TestExecCommand_Sessions_Foreign - 2019/12/06 06:33:51.090570 [DEBUG] http: Request GET /v1/health/service/consul?passing=1 (852.687µs) from=127.0.0.1:37574
TestExecCommand_Sessions_Foreign - 2019/12/06 06:33:51.119954 [DEBUG] http: Request GET /v1/health/service/consul?passing=1 (1.107359ms) from=127.0.0.1:37574
TestExecCommand_Sessions_Foreign - 2019/12/06 06:33:51.148969 [DEBUG] http: Request GET /v1/health/service/consul?passing=1 (1.026024ms) from=127.0.0.1:37574
TestExecCommand_Sessions_Foreign - 2019/12/06 06:33:51.177520 [DEBUG] http: Request GET /v1/health/service/consul?passing=1 (849.02µs) from=127.0.0.1:37574
TestExecCommand_Sessions - 2019/12/06 06:33:51.190127 [INFO] agent: Synced node info
TestExecCommand_Sessions - 2019/12/06 06:33:51.190263 [DEBUG] agent: Node info in sync
TestExecCommand_Sessions_Foreign - 2019/12/06 06:33:51.206587 [DEBUG] http: Request GET /v1/health/service/consul?passing=1 (995.357µs) from=127.0.0.1:37574
TestExecCommand_Sessions_Foreign - 2019/12/06 06:33:51.235341 [DEBUG] http: Request GET /v1/health/service/consul?passing=1 (789.685µs) from=127.0.0.1:37574
TestExecCommand_Sessions_Foreign - 2019/12/06 06:33:51.264168 [DEBUG] http: Request GET /v1/health/service/consul?passing=1 (1.074358ms) from=127.0.0.1:37574
TestExecCommand_Sessions_Foreign - 2019/12/06 06:33:51.293273 [DEBUG] http: Request GET /v1/health/service/consul?passing=1 (1.022357ms) from=127.0.0.1:37574
TestExecCommand_Sessions_Foreign - 2019/12/06 06:33:51.322359 [DEBUG] http: Request GET /v1/health/service/consul?passing=1 (888.354µs) from=127.0.0.1:37574
TestExecCommand_Sessions_Foreign - 2019/12/06 06:33:51.352889 [DEBUG] http: Request GET /v1/health/service/consul?passing=1 (962.689µs) from=127.0.0.1:37574
TestExecCommand_Sessions_Foreign - 2019/12/06 06:33:51.382589 [DEBUG] http: Request GET /v1/health/service/consul?passing=1 (903.354µs) from=127.0.0.1:37574
TestExecCommand_Sessions_Foreign - 2019/12/06 06:33:51.411361 [DEBUG] http: Request GET /v1/health/service/consul?passing=1 (846.687µs) from=127.0.0.1:37574
TestExecCommand_Sessions_Foreign - 2019/12/06 06:33:51.440257 [DEBUG] http: Request GET /v1/health/service/consul?passing=1 (1.070692ms) from=127.0.0.1:37574
TestExecCommand_Sessions_Foreign - 2019/12/06 06:33:51.469139 [DEBUG] http: Request GET /v1/health/service/consul?passing=1 (900.688µs) from=127.0.0.1:37574
TestExecCommand_Sessions_Foreign - 2019/12/06 06:33:51.498266 [DEBUG] http: Request GET /v1/health/service/consul?passing=1 (988.69µs) from=127.0.0.1:37574
TestExecCommand_Sessions_Foreign - 2019/12/06 06:33:51.528232 [DEBUG] http: Request GET /v1/health/service/consul?passing=1 (905.687µs) from=127.0.0.1:37574
TestExecCommand_Sessions_Foreign - 2019/12/06 06:33:51.557904 [DEBUG] http: Request GET /v1/health/service/consul?passing=1 (1.017024ms) from=127.0.0.1:37574
TestExecCommand_Sessions_Foreign - 2019/12/06 06:33:51.586714 [DEBUG] http: Request GET /v1/health/service/consul?passing=1 (954.022µs) from=127.0.0.1:37574
TestExecCommand_Sessions_Foreign - 2019/12/06 06:33:51.615711 [DEBUG] http: Request GET /v1/health/service/consul?passing=1 (985.023µs) from=127.0.0.1:37574
TestExecCommand_Sessions_Foreign - 2019/12/06 06:33:51.644500 [DEBUG] http: Request GET /v1/health/service/consul?passing=1 (831.02µs) from=127.0.0.1:37574
TestExecCommand_Sessions_Foreign - 2019/12/06 06:33:51.673478 [DEBUG] http: Request GET /v1/health/service/consul?passing=1 (949.689µs) from=127.0.0.1:37574
TestExecCommand_Sessions_Foreign - 2019/12/06 06:33:51.702422 [DEBUG] http: Request GET /v1/health/service/consul?passing=1 (874.02µs) from=127.0.0.1:37574
TestExecCommand_Sessions_Foreign - 2019/12/06 06:33:51.731727 [DEBUG] http: Request GET /v1/health/service/consul?passing=1 (1.074025ms) from=127.0.0.1:37574
TestExecCommand_Sessions_Foreign - 2019/12/06 06:33:51.760526 [DEBUG] http: Request GET /v1/health/service/consul?passing=1 (814.352µs) from=127.0.0.1:37574
TestExecCommand_Sessions_Foreign - 2019/12/06 06:33:51.789332 [DEBUG] http: Request GET /v1/health/service/consul?passing=1 (893.354µs) from=127.0.0.1:37574
TestExecCommand_Sessions_Foreign - 2019/12/06 06:33:51.826034 [DEBUG] http: Request GET /v1/health/service/consul?passing=1 (8.637534ms) from=127.0.0.1:37574
TestExecCommand_Sessions_Foreign - 2019/12/06 06:33:51.854684 [DEBUG] http: Request GET /v1/health/service/consul?passing=1 (1.00269ms) from=127.0.0.1:37574
TestExecCommand_Sessions_Foreign - 2019/12/06 06:33:51.883351 [DEBUG] http: Request GET /v1/health/service/consul?passing=1 (866.686µs) from=127.0.0.1:37574
TestExecCommand_Sessions_Foreign - 2019/12/06 06:33:51.912046 [DEBUG] http: Request GET /v1/health/service/consul?passing=1 (944.688µs) from=127.0.0.1:37574
TestExecCommand_Sessions_Foreign - 2019/12/06 06:33:51.940563 [DEBUG] http: Request GET /v1/health/service/consul?passing=1 (775.684µs) from=127.0.0.1:37574
TestExecCommand_Sessions_Foreign - 2019/12/06 06:33:51.969340 [DEBUG] http: Request GET /v1/health/service/consul?passing=1 (1.054358ms) from=127.0.0.1:37574
TestExecCommand_Sessions_Foreign - 2019/12/06 06:33:51.998318 [DEBUG] http: Request GET /v1/health/service/consul?passing=1 (846.686µs) from=127.0.0.1:37574
TestExecCommand_Sessions_Foreign - 2019/12/06 06:33:52.026859 [DEBUG] http: Request GET /v1/health/service/consul?passing=1 (796.352µs) from=127.0.0.1:37574
TestExecCommand_Sessions_Foreign - 2019/12/06 06:33:52.055555 [DEBUG] http: Request GET /v1/health/service/consul?passing=1 (907.354µs) from=127.0.0.1:37574
TestExecCommand_Sessions_Foreign - 2019/12/06 06:33:52.084702 [DEBUG] http: Request GET /v1/health/service/consul?passing=1 (1.010023ms) from=127.0.0.1:37574
TestExecCommand_Sessions_Foreign - 2019/12/06 06:33:52.113940 [DEBUG] http: Request GET /v1/health/service/consul?passing=1 (887.687µs) from=127.0.0.1:37574
TestExecCommand_UploadDestroy - 2019/12/06 06:33:52.123511 [INFO] connect: initialized primary datacenter CA with provider "consul"
TestExecCommand_UploadDestroy - 2019/12/06 06:33:52.124009 [DEBUG] consul: Skipping self join check for "Node 59410d96-7667-36fe-cc9b-dccbdf00705c" since the cluster is too small
TestExecCommand_UploadDestroy - 2019/12/06 06:33:52.124219 [INFO] consul: member 'Node 59410d96-7667-36fe-cc9b-dccbdf00705c' joined, marking health alive
TestExecCommand_Sessions_Foreign - 2019/12/06 06:33:52.149435 [DEBUG] http: Request GET /v1/health/service/consul?passing=1 (884.687µs) from=127.0.0.1:37574
TestExecCommand_Sessions_Foreign - 2019/12/06 06:33:52.177901 [DEBUG] http: Request GET /v1/health/service/consul?passing=1 (868.021µs) from=127.0.0.1:37574
TestExecCommand_Sessions_Foreign - 2019/12/06 06:33:52.206704 [DEBUG] http: Request GET /v1/health/service/consul?passing=1 (845.353µs) from=127.0.0.1:37574
TestExecCommand_Sessions_Foreign - 2019/12/06 06:33:52.235057 [DEBUG] http: Request GET /v1/health/service/consul?passing=1 (687.349µs) from=127.0.0.1:37574
TestExecCommand_Sessions_Foreign - 2019/12/06 06:33:52.263668 [DEBUG] http: Request GET /v1/health/service/consul?passing=1 (859.019µs) from=127.0.0.1:37574
TestExecCommand_Sessions_Foreign - 2019/12/06 06:33:52.290444 [INFO] connect: initialized primary datacenter CA with provider "consul"
TestExecCommand_Sessions_Foreign - 2019/12/06 06:33:52.290875 [DEBUG] consul: Skipping self join check for "Node 7b22f801-9476-6472-3908-7299855647a5" since the cluster is too small
TestExecCommand_Sessions_Foreign - 2019/12/06 06:33:52.291022 [INFO] consul: member 'Node 7b22f801-9476-6472-3908-7299855647a5' joined, marking health alive
TestExecCommand_Sessions_Foreign - 2019/12/06 06:33:52.292480 [DEBUG] http: Request GET /v1/health/service/consul?passing=1 (1.125026ms) from=127.0.0.1:37574
TestExecCommand_Sessions_Foreign - 2019/12/06 06:33:52.322220 [DEBUG] http: Request GET /v1/health/service/consul?passing=1 (718.35µs) from=127.0.0.1:37574
TestExecCommand_Sessions_Foreign - 2019/12/06 06:33:52.350932 [DEBUG] http: Request GET /v1/health/service/consul?passing=1 (900.354µs) from=127.0.0.1:37574
TestExecCommand_Sessions_Foreign - 2019/12/06 06:33:52.379560 [DEBUG] http: Request GET /v1/health/service/consul?passing=1 (960.023µs) from=127.0.0.1:37574
TestExecCommand_Sessions_Foreign - 2019/12/06 06:33:52.409117 [DEBUG] http: Request GET /v1/health/service/consul?passing=1 (677.349µs) from=127.0.0.1:37574
TestExecCommand_Sessions_Foreign - 2019/12/06 06:33:52.438140 [DEBUG] http: Request GET /v1/health/service/consul?passing=1 (1.029357ms) from=127.0.0.1:37574
TestExecCommand_Sessions_Foreign - 2019/12/06 06:33:52.467428 [DEBUG] http: Request GET /v1/health/service/consul?passing=1 (925.355µs) from=127.0.0.1:37574
TestExecCommand_Sessions_Foreign - 2019/12/06 06:33:52.496118 [DEBUG] http: Request GET /v1/health/service/consul?passing=1 (846.02µs) from=127.0.0.1:37574
TestExecCommand_Sessions_Foreign - 2019/12/06 06:33:52.524774 [DEBUG] http: Request GET /v1/health/service/consul?passing=1 (997.689µs) from=127.0.0.1:37574
TestExecCommand_Sessions_Foreign - 2019/12/06 06:33:52.554840 [DEBUG] http: Request GET /v1/health/service/consul?passing=1 (885.354µs) from=127.0.0.1:37574
TestExecCommand_Sessions_Foreign - 2019/12/06 06:33:52.584483 [DEBUG] http: Request GET /v1/health/service/consul?passing=1 (899.354µs) from=127.0.0.1:37574
TestExecCommand_Sessions_Foreign - 2019/12/06 06:33:52.618301 [DEBUG] http: Request GET /v1/health/service/consul?passing=1 (907.688µs) from=127.0.0.1:37574
TestExecCommand_Sessions_Foreign - 2019/12/06 06:33:52.646915 [DEBUG] http: Request GET /v1/health/service/consul?passing=1 (945.356µs) from=127.0.0.1:37574
TestExecCommand_Sessions_Foreign - 2019/12/06 06:33:52.676502 [DEBUG] http: Request GET /v1/health/service/consul?passing=1 (1.721707ms) from=127.0.0.1:37574
TestExecCommand_StreamResults - 2019/12/06 06:33:52.767304 [INFO] connect: initialized primary datacenter CA with provider "consul"
TestExecCommand_StreamResults - 2019/12/06 06:33:52.767737 [DEBUG] consul: Skipping self join check for "Node 35d99c4e-ae6b-92bb-0f88-f05101dcc1ea" since the cluster is too small
TestExecCommand_StreamResults - 2019/12/06 06:33:52.767888 [INFO] consul: member 'Node 35d99c4e-ae6b-92bb-0f88-f05101dcc1ea' joined, marking health alive
TestExecCommand_Sessions - 2019/12/06 06:33:52.882678 [INFO] connect: initialized primary datacenter CA with provider "consul"
TestExecCommand_UploadDestroy - 2019/12/06 06:33:52.883162 [DEBUG] http: Request PUT /v1/session/create (316.196348ms) from=127.0.0.1:57338
TestExecCommand_Sessions - 2019/12/06 06:33:52.883477 [DEBUG] consul: Skipping self join check for "Node fc3e8e4b-a7bb-d3e5-4122-0aecb5414d60" since the cluster is too small
TestExecCommand_Sessions - 2019/12/06 06:33:52.883676 [INFO] consul: member 'Node fc3e8e4b-a7bb-d3e5-4122-0aecb5414d60' joined, marking health alive
TestExecCommand_Sessions_Foreign - 2019/12/06 06:33:53.032759 [DEBUG] http: Request PUT /v1/session/create (347.101066ms) from=127.0.0.1:37574
TestExecCommand_Sessions_Foreign - 2019/12/06 06:33:53.038606 [DEBUG] http: Request GET /v1/session/info/61121986-2d5d-4209-e16f-19f531695d44 (1.55037ms) from=127.0.0.1:37578
TestExecCommand_StreamResults - 2019/12/06 06:33:53.044148 [DEBUG] agent: Skipping remote check "serfHealth" since it is managed automatically
TestExecCommand_StreamResults - 2019/12/06 06:33:53.044293 [DEBUG] agent: Node info in sync
TestExecCommand_StreamResults - 2019/12/06 06:33:53.044401 [DEBUG] agent: Node info in sync
TestExecCommand_UploadDestroy - 2019/12/06 06:33:53.147940 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
TestExecCommand_UploadDestroy - 2019/12/06 06:33:53.148965 [DEBUG] http: Request PUT /v1/kv/_rexec/eced64c0-0924-eade-72c7-789fee0c3563/job?acquire=eced64c0-0924-eade-72c7-789fee0c3563 (262.391764ms) from=127.0.0.1:57338
TestExecCommand_UploadDestroy - 2019/12/06 06:33:53.153639 [DEBUG] http: Request GET /v1/kv/_rexec/eced64c0-0924-eade-72c7-789fee0c3563/job (1.28903ms) from=127.0.0.1:57344
TestExecCommand_StreamResults - 2019/12/06 06:33:53.339692 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
TestExecCommand_UploadDestroy - 2019/12/06 06:33:53.341629 [DEBUG] http: Request DELETE /v1/kv/_rexec/eced64c0-0924-eade-72c7-789fee0c3563?recurse= (185.275972ms) from=127.0.0.1:57338
TestExecCommand_Sessions_Foreign - 2019/12/06 06:33:53.341629 [DEBUG] http: Request PUT /v1/session/destroy/61121986-2d5d-4209-e16f-19f531695d44 (286.668995ms) from=127.0.0.1:37574
TestExecCommand_Sessions_Foreign - 2019/12/06 06:33:53.342043 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
TestExecCommand_StreamResults - 2019/12/06 06:33:53.343704 [DEBUG] http: Request PUT /v1/session/create (302.698035ms) from=127.0.0.1:49952
TestExecCommand_UploadDestroy - 2019/12/06 06:33:53.347838 [DEBUG] http: Request GET /v1/kv/_rexec/eced64c0-0924-eade-72c7-789fee0c3563/job (472.678µs) from=127.0.0.1:57348
TestExecCommand_Sessions_Foreign - 2019/12/06 06:33:53.349275 [DEBUG] http: Request GET /v1/session/info/61121986-2d5d-4209-e16f-19f531695d44 (1.210694ms) from=127.0.0.1:37588
TestExecCommand_UploadDestroy - 2019/12/06 06:33:53.349685 [INFO] agent: Requesting shutdown
TestExecCommand_UploadDestroy - 2019/12/06 06:33:53.349756 [INFO] consul: shutting down server
TestExecCommand_UploadDestroy - 2019/12/06 06:33:53.349803 [WARN] serf: Shutdown without a Leave
TestExecCommand_Sessions_Foreign - 2019/12/06 06:33:53.351107 [INFO] agent: Requesting shutdown
TestExecCommand_Sessions_Foreign - 2019/12/06 06:33:53.351193 [INFO] consul: shutting down server
TestExecCommand_Sessions_Foreign - 2019/12/06 06:33:53.351245 [WARN] serf: Shutdown without a Leave
TestExecCommand_StreamResults - 2019/12/06 06:33:53.351738 [DEBUG] http: Request GET /v1/kv/_rexec/4acd5308-aa77-5a36-5f13-2944022284dc/?keys=&wait=2000ms (376.009µs) from=127.0.0.1:49952
TestExecCommand_UploadDestroy - 2019/12/06 06:33:53.366448 [DEBUG] agent: Skipping remote check "serfHealth" since it is managed automatically
TestExecCommand_UploadDestroy - 2019/12/06 06:33:53.366545 [DEBUG] agent: Node info in sync
TestExecCommand_UploadDestroy - 2019/12/06 06:33:53.366625 [DEBUG] agent: Node info in sync
TestExecCommand_Sessions - 2019/12/06 06:33:53.375520 [DEBUG] agent: Skipping remote check "serfHealth" since it is managed automatically
TestExecCommand_Sessions - 2019/12/06 06:33:53.375631 [DEBUG] agent: Node info in sync
TestExecCommand_UploadDestroy - 2019/12/06 06:33:53.430848 [WARN] serf: Shutdown without a Leave
TestExecCommand_Sessions_Foreign - 2019/12/06 06:33:53.505952 [WARN] serf: Shutdown without a Leave
TestExecCommand_UploadDestroy - 2019/12/06 06:33:53.509737 [INFO] manager: shutting down
TestExecCommand_UploadDestroy - 2019/12/06 06:33:53.511355 [INFO] agent: consul server down
TestExecCommand_UploadDestroy - 2019/12/06 06:33:53.511751 [INFO] agent: shutdown complete
TestExecCommand_UploadDestroy - 2019/12/06 06:33:53.512134 [INFO] agent: Stopping DNS server 127.0.0.1:44513 (tcp)
TestExecCommand_UploadDestroy - 2019/12/06 06:33:53.512602 [INFO] agent: Stopping DNS server 127.0.0.1:44513 (udp)
TestExecCommand_UploadDestroy - 2019/12/06 06:33:53.513229 [INFO] agent: Stopping HTTP server 127.0.0.1:44514 (tcp)
TestExecCommand_UploadDestroy - 2019/12/06 06:33:53.515042 [INFO] agent: Waiting for endpoints to shut down
TestExecCommand_UploadDestroy - 2019/12/06 06:33:53.515157 [INFO] agent: Endpoints down
--- PASS: TestExecCommand_UploadDestroy (5.45s)
=== CONT  TestExecCommand_NoShell
TestExecCommand_Sessions - 2019/12/06 06:33:53.531227 [DEBUG] http: Request PUT /v1/session/create (359.038677ms) from=127.0.0.1:51682
TestExecCommand_Sessions - 2019/12/06 06:33:53.532231 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
TestExecCommand_Sessions - 2019/12/06 06:33:53.542081 [DEBUG] http: Request GET /v1/session/info/e16f1d39-99cd-8edc-0728-58e4581c7060 (1.066359ms) from=127.0.0.1:51690
TestExecCommand_Sessions_Foreign - 2019/12/06 06:33:53.589348 [INFO] manager: shutting down
TestExecCommand_Sessions_Foreign - 2019/12/06 06:33:53.590121 [INFO] agent: consul server down
TestExecCommand_Sessions_Foreign - 2019/12/06 06:33:53.590177 [INFO] agent: shutdown complete
TestExecCommand_Sessions_Foreign - 2019/12/06 06:33:53.590234 [INFO] agent: Stopping DNS server 127.0.0.1:44501 (tcp)
TestExecCommand_Sessions_Foreign - 2019/12/06 06:33:53.590372 [INFO] agent: Stopping DNS server 127.0.0.1:44501 (udp)
TestExecCommand_Sessions_Foreign - 2019/12/06 06:33:53.590541 [INFO] agent: Stopping HTTP server 127.0.0.1:44502 (tcp)
TestExecCommand_Sessions_Foreign - 2019/12/06 06:33:53.591331 [INFO] agent: Waiting for endpoints to shut down
TestExecCommand_Sessions_Foreign - 2019/12/06 06:33:53.591452 [INFO] agent: Endpoints down
--- PASS: TestExecCommand_Sessions_Foreign (5.53s)
=== CONT  TestExecCommand
TestExecCommand_StreamResults - 2019/12/06 06:33:53.657253 [DEBUG] http: Request PUT /v1/kv/_rexec/4acd5308-aa77-5a36-5f13-2944022284dc/foo/ack?acquire=4acd5308-aa77-5a36-5f13-2944022284dc (300.81599ms) from=127.0.0.1:49962
TestExecCommand_StreamResults - 2019/12/06 06:33:53.659751 [DEBUG] http: Request GET /v1/kv/_rexec/4acd5308-aa77-5a36-5f13-2944022284dc/?index=1&keys=&wait=2000ms (306.090447ms) from=127.0.0.1:49952
WARNING: bootstrap = true: do not enable unless necessary
TestExecCommand_NoShell - 2019/12/06 06:33:53.666725 [WARN] agent: Node name "Node 7f32a16c-0c8d-86a1-e208-2f57b5331a76" will not be discoverable via DNS due to invalid characters. Valid characters include all alpha-numerics and dashes.
TestExecCommand_NoShell - 2019/12/06 06:33:53.667699 [DEBUG] tlsutil: Update with version 1
TestExecCommand_NoShell - 2019/12/06 06:33:53.670425 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
WARNING: bootstrap = true: do not enable unless necessary
TestExecCommand - 2019/12/06 06:33:53.720044 [WARN] agent: Node name "Node c271a25d-d9ba-eea8-09b6-66284ff562fd" will not be discoverable via DNS due to invalid characters. Valid characters include all alpha-numerics and dashes.
TestExecCommand - 2019/12/06 06:33:53.720624 [DEBUG] tlsutil: Update with version 1
TestExecCommand - 2019/12/06 06:33:53.723254 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
TestExecCommand_Sessions - 2019/12/06 06:33:53.772038 [DEBUG] http: Request PUT /v1/session/destroy/e16f1d39-99cd-8edc-0728-58e4581c7060 (227.072277ms) from=127.0.0.1:51682
TestExecCommand_Sessions - 2019/12/06 06:33:53.776298 [DEBUG] http: Request GET /v1/session/info/e16f1d39-99cd-8edc-0728-58e4581c7060 (1.034024ms) from=127.0.0.1:51694
TestExecCommand_Sessions - 2019/12/06 06:33:53.777791 [INFO] agent: Requesting shutdown
TestExecCommand_Sessions - 2019/12/06 06:33:53.777874 [INFO] consul: shutting down server
TestExecCommand_Sessions - 2019/12/06 06:33:53.777923 [WARN] serf: Shutdown without a Leave
TestExecCommand_Sessions - 2019/12/06 06:33:53.841402 [WARN] serf: Shutdown without a Leave
TestExecCommand_Sessions - 2019/12/06 06:33:53.922983 [INFO] manager: shutting down
TestExecCommand_Sessions - 2019/12/06 06:33:53.924344 [INFO] agent: consul server down
TestExecCommand_Sessions - 2019/12/06 06:33:53.924461 [INFO] agent: shutdown complete
TestExecCommand_Sessions - 2019/12/06 06:33:53.924532 [INFO] agent: Stopping DNS server 127.0.0.1:44519 (tcp)
TestExecCommand_StreamResults - 2019/12/06 06:33:53.924628 [DEBUG] http: Request PUT /v1/kv/_rexec/4acd5308-aa77-5a36-5f13-2944022284dc/foo/exit?acquire=4acd5308-aa77-5a36-5f13-2944022284dc (260.502054ms) from=127.0.0.1:49966
TestExecCommand_Sessions - 2019/12/06 06:33:53.924725 [INFO] agent: Stopping DNS server 127.0.0.1:44519 (udp)
TestExecCommand_Sessions - 2019/12/06 06:33:53.924929 [INFO] agent: Stopping HTTP server 127.0.0.1:44520 (tcp)
TestExecCommand_StreamResults - 2019/12/06 06:33:53.926269 [DEBUG] http: Request GET /v1/kv/_rexec/4acd5308-aa77-5a36-5f13-2944022284dc/?index=12&keys=&wait=2000ms (259.754369ms) from=127.0.0.1:49952
TestExecCommand_Sessions - 2019/12/06 06:33:53.926438 [INFO] agent: Waiting for endpoints to shut down
TestExecCommand_Sessions - 2019/12/06 06:33:53.926678 [INFO] agent: Endpoints down
--- PASS: TestExecCommand_Sessions (5.85s)
TestExecCommand_StreamResults - 2019/12/06 06:33:53.931709 [DEBUG] http: Request GET /v1/kv/_rexec/4acd5308-aa77-5a36-5f13-2944022284dc/foo/exit (1.600371ms) from=127.0.0.1:49952
TestExecCommand_StreamResults - 2019/12/06 06:33:54.196630 [DEBUG] http: Request GET /v1/kv/_rexec/4acd5308-aa77-5a36-5f13-2944022284dc/?index=13&keys=&wait=2000ms (261.598412ms) from=127.0.0.1:49952
TestExecCommand_StreamResults - 2019/12/06 06:33:54.198854 [DEBUG] http: Request PUT /v1/kv/_rexec/4acd5308-aa77-5a36-5f13-2944022284dc/foo/random?acquire=4acd5308-aa77-5a36-5f13-2944022284dc (256.270289ms) from=127.0.0.1:49970
TestExecCommand_StreamResults - 2019/12/06 06:33:54.457456 [DEBUG] http: Request PUT /v1/kv/_rexec/4acd5308-aa77-5a36-5f13-2944022284dc/foo/out/00000?acquire=4acd5308-aa77-5a36-5f13-2944022284dc (247.644755ms) from=127.0.0.1:49972
TestExecCommand_StreamResults - 2019/12/06 06:33:54.459149 [DEBUG] http: Request GET /v1/kv/_rexec/4acd5308-aa77-5a36-5f13-2944022284dc/?index=14&keys=&wait=2000ms (256.696632ms) from=127.0.0.1:49952
TestExecCommand_StreamResults - 2019/12/06 06:33:54.464728 [DEBUG] http: Request GET /v1/kv/_rexec/4acd5308-aa77-5a36-5f13-2944022284dc/foo/out/00000 (882.02µs) from=127.0.0.1:49952
2019/12/06 06:33:54 [INFO]  raft: Initial configuration (index=1): [{Suffrage:Voter ID:7f32a16c-0c8d-86a1-e208-2f57b5331a76 Address:127.0.0.1:44530}]
TestExecCommand_StreamResults - 2019/12/06 06:33:54.724372 [DEBUG] http: Request PUT /v1/kv/_rexec/4acd5308-aa77-5a36-5f13-2944022284dc/foo/out/00001?acquire=4acd5308-aa77-5a36-5f13-2944022284dc (255.719609ms) from=127.0.0.1:49974
2019/12/06 06:33:54 [INFO]  raft: Node at 127.0.0.1:44530 [Follower] entering Follower state (Leader: "")
TestExecCommand_StreamResults - 2019/12/06 06:33:54.727636 [DEBUG] http: Request GET /v1/kv/_rexec/4acd5308-aa77-5a36-5f13-2944022284dc/?index=15&keys=&wait=2000ms (260.421386ms) from=127.0.0.1:49952
TestExecCommand_StreamResults - 2019/12/06 06:33:54.733107 [DEBUG] http: Request GET /v1/kv/_rexec/4acd5308-aa77-5a36-5f13-2944022284dc/foo/out/00001 (980.69µs) from=127.0.0.1:49952
TestExecCommand_NoShell - 2019/12/06 06:33:54.733553 [INFO] serf: EventMemberJoin: Node 7f32a16c-0c8d-86a1-e208-2f57b5331a76.dc1 127.0.0.1
TestExecCommand_StreamResults - 2019/12/06 06:33:54.736287 [INFO] agent: Requesting shutdown
TestExecCommand_StreamResults - 2019/12/06 06:33:54.736386 [INFO] consul: shutting down server
TestExecCommand_StreamResults - 2019/12/06 06:33:54.736436 [WARN] serf: Shutdown without a Leave
TestExecCommand_NoShell - 2019/12/06 06:33:54.737860 [INFO] serf: EventMemberJoin: Node 7f32a16c-0c8d-86a1-e208-2f57b5331a76 127.0.0.1
TestExecCommand_NoShell - 2019/12/06 06:33:54.739260 [INFO] consul: Adding LAN server Node 7f32a16c-0c8d-86a1-e208-2f57b5331a76 (Addr: tcp/127.0.0.1:44530) (DC: dc1)
TestExecCommand_NoShell - 2019/12/06 06:33:54.741986 [INFO] agent: Started DNS server 127.0.0.1:44525 (udp)
TestExecCommand_NoShell - 2019/12/06 06:33:54.745162 [INFO] agent: Started DNS server 127.0.0.1:44525 (tcp)
TestExecCommand_NoShell - 2019/12/06 06:33:54.745238 [INFO] consul: Handled member-join event for server "Node 7f32a16c-0c8d-86a1-e208-2f57b5331a76.dc1" in area "wan"
TestExecCommand_NoShell - 2019/12/06 06:33:54.750314 [INFO] agent: Started HTTP server on 127.0.0.1:44526 (tcp)
TestExecCommand_NoShell - 2019/12/06 06:33:54.750420 [INFO] agent: started state syncer
2019/12/06 06:33:54 [WARN]  raft: Heartbeat timeout from "" reached, starting election
2019/12/06 06:33:54 [INFO]  raft: Node at 127.0.0.1:44530 [Candidate] entering Candidate state in term 2
2019/12/06 06:33:54 [INFO]  raft: Initial configuration (index=1): [{Suffrage:Voter ID:c271a25d-d9ba-eea8-09b6-66284ff562fd Address:127.0.0.1:44536}]
2019/12/06 06:33:54 [INFO]  raft: Node at 127.0.0.1:44536 [Follower] entering Follower state (Leader: "")
TestExecCommand - 2019/12/06 06:33:54.827060 [INFO] serf: EventMemberJoin: Node c271a25d-d9ba-eea8-09b6-66284ff562fd.dc1 127.0.0.1
TestExecCommand - 2019/12/06 06:33:54.831189 [INFO] serf: EventMemberJoin: Node c271a25d-d9ba-eea8-09b6-66284ff562fd 127.0.0.1
TestExecCommand - 2019/12/06 06:33:54.832664 [INFO] consul: Adding LAN server Node c271a25d-d9ba-eea8-09b6-66284ff562fd (Addr: tcp/127.0.0.1:44536) (DC: dc1)
TestExecCommand - 2019/12/06 06:33:54.832696 [INFO] agent: Started DNS server 127.0.0.1:44531 (udp)
TestExecCommand - 2019/12/06 06:33:54.833333 [INFO] agent: Started DNS server 127.0.0.1:44531 (tcp)
TestExecCommand - 2019/12/06 06:33:54.833759 [INFO] consul: Handled member-join event for server "Node c271a25d-d9ba-eea8-09b6-66284ff562fd.dc1" in area "wan"
TestExecCommand - 2019/12/06 06:33:54.836066 [INFO] agent: Started HTTP server on 127.0.0.1:44532 (tcp)
TestExecCommand - 2019/12/06 06:33:54.836189 [INFO] agent: started state syncer
2019/12/06 06:33:54 [WARN]  raft: Heartbeat timeout from "" reached, starting election
2019/12/06 06:33:54 [INFO]  raft: Node at 127.0.0.1:44536 [Candidate] entering Candidate state in term 2
TestExecCommand_StreamResults - 2019/12/06 06:33:54.905939 [WARN] serf: Shutdown without a Leave
TestExecCommand_StreamResults - 2019/12/06 06:33:55.064618 [INFO] manager: shutting down
TestExecCommand_StreamResults - 2019/12/06 06:33:55.065463 [INFO] agent: consul server down
TestExecCommand_StreamResults - 2019/12/06 06:33:55.065526 [INFO] agent: shutdown complete
TestExecCommand_StreamResults - 2019/12/06 06:33:55.065581 [INFO] agent: Stopping DNS server 127.0.0.1:44507 (tcp)
TestExecCommand_StreamResults - 2019/12/06 06:33:55.065728 [INFO] agent: Stopping DNS server 127.0.0.1:44507 (udp)
TestExecCommand_StreamResults - 2019/12/06 06:33:55.065880 [INFO] agent: Stopping HTTP server 127.0.0.1:44508 (tcp)
2019/12/06 06:33:55 [INFO]  raft: Election won. Tally: 1
2019/12/06 06:33:55 [INFO]  raft: Node at 127.0.0.1:44530 [Leader] entering Leader state
2019/12/06 06:33:55 [INFO]  raft: Election won. Tally: 1
2019/12/06 06:33:55 [INFO]  raft: Node at 127.0.0.1:44536 [Leader] entering Leader state
TestExecCommand_NoShell - 2019/12/06 06:33:55.383829 [INFO] consul: cluster leadership acquired
TestExecCommand - 2019/12/06 06:33:55.384025 [INFO] consul: cluster leadership acquired
TestExecCommand_NoShell - 2019/12/06 06:33:55.384359 [INFO] consul: New leader elected: Node 7f32a16c-0c8d-86a1-e208-2f57b5331a76
TestExecCommand - 2019/12/06 06:33:55.384387 [INFO] consul: New leader elected: Node c271a25d-d9ba-eea8-09b6-66284ff562fd
TestExecCommand_NoShell - 2019/12/06 06:33:55.659042 [INFO] agent: Synced node info
TestExecCommand - 2019/12/06 06:33:55.748486 [INFO] agent: Synced node info
TestExecCommand_StreamResults - 2019/12/06 06:33:56.066368 [WARN] agent: Timeout stopping HTTP server 127.0.0.1:44508 (tcp)
TestExecCommand_StreamResults - 2019/12/06 06:33:56.066450 [INFO] agent: Waiting for endpoints to shut down
TestExecCommand_StreamResults - 2019/12/06 06:33:56.066487 [INFO] agent: Endpoints down
--- PASS: TestExecCommand_StreamResults (8.00s)
TestExecCommand_NoShell - 2019/12/06 06:33:56.410640 [DEBUG] agent: Node info in sync
TestExecCommand_NoShell - 2019/12/06 06:33:56.410787 [DEBUG] agent: Node info in sync
TestExecCommand_StreamResults - 2019/12/06 06:33:56.754718 [DEBUG] http: Request GET /v1/kv/_rexec/4acd5308-aa77-5a36-5f13-2944022284dc/?index=16&keys=&wait=2000ms (2.017268879s) from=127.0.0.1:49952
TestExecCommand - 2019/12/06 06:33:56.807771 [DEBUG] agent: Node info in sync
TestExecCommand - 2019/12/06 06:33:56.807918 [DEBUG] agent: Node info in sync
TestExecCommand_NoShell - 2019/12/06 06:33:57.090050 [INFO] connect: initialized primary datacenter CA with provider "consul"
TestExecCommand_NoShell - 2019/12/06 06:33:57.090564 [DEBUG] consul: Skipping self join check for "Node 7f32a16c-0c8d-86a1-e208-2f57b5331a76" since the cluster is too small
TestExecCommand_NoShell - 2019/12/06 06:33:57.090727 [INFO] consul: member 'Node 7f32a16c-0c8d-86a1-e208-2f57b5331a76' joined, marking health alive
TestExecCommand_NoShell - 2019/12/06 06:33:57.364635 [DEBUG] http: Request GET /v1/agent/self (7.943852ms) from=127.0.0.1:59246
TestExecCommand - 2019/12/06 06:33:57.390264 [INFO] connect: initialized primary datacenter CA with provider "consul"
TestExecCommand - 2019/12/06 06:33:57.390790 [DEBUG] consul: Skipping self join check for "Node c271a25d-d9ba-eea8-09b6-66284ff562fd" since the cluster is too small
TestExecCommand - 2019/12/06 06:33:57.390978 [INFO] consul: member 'Node c271a25d-d9ba-eea8-09b6-66284ff562fd' joined, marking health alive
TestExecCommand_NoShell - 2019/12/06 06:33:57.632826 [DEBUG] http: Request PUT /v1/session/create (256.7263ms) from=127.0.0.1:59246
TestExecCommand - 2019/12/06 06:33:57.714665 [DEBUG] http: Request GET /v1/agent/self (6.700489ms) from=127.0.0.1:47398
TestExecCommand_NoShell - 2019/12/06 06:33:57.924748 [DEBUG] http: Request PUT /v1/kv/_rexec/adeca0ee-27f8-9149-83ce-5c803c5d61da/job?acquire=adeca0ee-27f8-9149-83ce-5c803c5d61da (289.461394ms) from=127.0.0.1:59246
TestExecCommand - 2019/12/06 06:33:57.974031 [DEBUG] http: Request PUT /v1/session/create (248.372439ms) from=127.0.0.1:47398
TestExecCommand_NoShell - 2019/12/06 06:33:57.974675 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
TestExecCommand - 2019/12/06 06:33:58.123006 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
TestExecCommand - 2019/12/06 06:33:58.123901 [DEBUG] http: Request PUT /v1/kv/_rexec/20f345c4-3eb1-3031-52ee-558f7d109321/job?acquire=20f345c4-3eb1-3031-52ee-558f7d109321 (146.404069ms) from=127.0.0.1:47398
TestExecCommand_NoShell - 2019/12/06 06:33:58.127851 [DEBUG] consul: User event: _rexec
TestExecCommand_NoShell - 2019/12/06 06:33:58.128098 [DEBUG] agent: received remote exec event (ID: cd019872-1538-b469-cd5e-9021599b1ee0)
TestExecCommand_NoShell - 2019/12/06 06:33:58.128615 [DEBUG] http: Request PUT /v1/event/fire/_rexec (1.427699ms) from=127.0.0.1:59246
TestExecCommand_NoShell - 2019/12/06 06:33:58.133202 [DEBUG] http: Request GET /v1/kv/_rexec/adeca0ee-27f8-9149-83ce-5c803c5d61da/?keys=&wait=1000ms (908.688µs) from=127.0.0.1:59246
TestExecCommand_NoShell - 2019/12/06 06:33:58.296695 [DEBUG] http: Request GET /v1/kv/_rexec/adeca0ee-27f8-9149-83ce-5c803c5d61da/?index=12&keys=&wait=1000ms (160.712068ms) from=127.0.0.1:59246
TestExecCommand_NoShell - 2019/12/06 06:33:58.300182 [INFO] agent: remote exec ''
TestExecCommand - 2019/12/06 06:33:58.328451 [DEBUG] http: Request PUT /v1/event/fire/_rexec (933.355µs) from=127.0.0.1:47398
TestExecCommand - 2019/12/06 06:33:58.329591 [DEBUG] consul: User event: _rexec
TestExecCommand - 2019/12/06 06:33:58.329827 [DEBUG] agent: received remote exec event (ID: 749a0321-e304-953a-7391-3caf4b4239d6)
TestExecCommand - 2019/12/06 06:33:58.348465 [DEBUG] http: Request GET /v1/kv/_rexec/20f345c4-3eb1-3031-52ee-558f7d109321/?keys=&wait=1000ms (3.800088ms) from=127.0.0.1:47398
TestExecCommand - 2019/12/06 06:33:58.615061 [INFO] agent: remote exec 'uptime'
TestExecCommand_NoShell - 2019/12/06 06:33:58.629713 [DEBUG] http: Request GET /v1/kv/_rexec/adeca0ee-27f8-9149-83ce-5c803c5d61da/?index=13&keys=&wait=1000ms (330.130339ms) from=127.0.0.1:59246
TestExecCommand - 2019/12/06 06:33:58.630573 [DEBUG] http: Request GET /v1/kv/_rexec/20f345c4-3eb1-3031-52ee-558f7d109321/?index=12&keys=&wait=1000ms (277.596451ms) from=127.0.0.1:47398
TestExecCommand_NoShell - 2019/12/06 06:33:58.637818 [DEBUG] http: Request GET /v1/kv/_rexec/adeca0ee-27f8-9149-83ce-5c803c5d61da/Node%207f32a16c-0c8d-86a1-e208-2f57b5331a76/out/00000 (3.345411ms) from=127.0.0.1:59246
TestExecCommand_NoShell - 2019/12/06 06:33:58.841028 [DEBUG] http: Request GET /v1/kv/_rexec/adeca0ee-27f8-9149-83ce-5c803c5d61da/?index=14&keys=&wait=1000ms (199.517636ms) from=127.0.0.1:59246
TestExecCommand_NoShell - 2019/12/06 06:33:58.844750 [DEBUG] http: Request GET /v1/kv/_rexec/adeca0ee-27f8-9149-83ce-5c803c5d61da/Node%207f32a16c-0c8d-86a1-e208-2f57b5331a76/exit (815.685µs) from=127.0.0.1:59246
TestExecCommand - 2019/12/06 06:33:58.965827 [DEBUG] http: Request GET /v1/kv/_rexec/20f345c4-3eb1-3031-52ee-558f7d109321/?index=13&keys=&wait=1000ms (319.574427ms) from=127.0.0.1:47398
TestExecCommand - 2019/12/06 06:33:58.970266 [DEBUG] http: Request GET /v1/kv/_rexec/20f345c4-3eb1-3031-52ee-558f7d109321/Node%20c271a25d-d9ba-eea8-09b6-66284ff562fd/out/00000 (1.473368ms) from=127.0.0.1:47398
TestExecCommand - 2019/12/06 06:33:59.158750 [DEBUG] http: Request GET /v1/kv/_rexec/20f345c4-3eb1-3031-52ee-558f7d109321/?index=14&keys=&wait=1000ms (185.768317ms) from=127.0.0.1:47398
TestExecCommand - 2019/12/06 06:33:59.163178 [DEBUG] http: Request GET /v1/kv/_rexec/20f345c4-3eb1-3031-52ee-558f7d109321/Node%20c271a25d-d9ba-eea8-09b6-66284ff562fd/exit (758.018µs) from=127.0.0.1:47398
TestExecCommand_NoShell - 2019/12/06 06:33:59.900033 [DEBUG] http: Request GET /v1/kv/_rexec/adeca0ee-27f8-9149-83ce-5c803c5d61da/?index=15&keys=&wait=1000ms (1.052687129s) from=127.0.0.1:59246
TestExecCommand - 2019/12/06 06:34:00.214565 [DEBUG] http: Request GET /v1/kv/_rexec/20f345c4-3eb1-3031-52ee-558f7d109321/?index=15&keys=&wait=1000ms (1.047251003s) from=127.0.0.1:47398
TestExecCommand_NoShell - 2019/12/06 06:34:01.226041 [DEBUG] http: Request PUT /v1/session/destroy/adeca0ee-27f8-9149-83ce-5c803c5d61da (1.37703s) from=127.0.0.1:59254
TestExecCommand - 2019/12/06 06:34:01.317227 [DEBUG] http: Request PUT /v1/session/destroy/20f345c4-3eb1-3031-52ee-558f7d109321 (1.148985035s) from=127.0.0.1:47406
TestExecCommand_NoShell - 2019/12/06 06:34:01.556960 [DEBUG] http: Request DELETE /v1/kv/_rexec/adeca0ee-27f8-9149-83ce-5c803c5d61da?recurse= (328.330296ms) from=127.0.0.1:59254
TestExecCommand - 2019/12/06 06:34:01.640768 [DEBUG] http: Request DELETE /v1/kv/_rexec/20f345c4-3eb1-3031-52ee-558f7d109321?recurse= (321.114796ms) from=127.0.0.1:47406
TestExecCommand_NoShell - 2019/12/06 06:34:01.882228 [DEBUG] http: Request PUT /v1/session/destroy/adeca0ee-27f8-9149-83ce-5c803c5d61da (323.065174ms) from=127.0.0.1:59254
TestExecCommand_NoShell - 2019/12/06 06:34:01.883493 [INFO] agent: Requesting shutdown
TestExecCommand_NoShell - 2019/12/06 06:34:01.883573 [INFO] consul: shutting down server
TestExecCommand_NoShell - 2019/12/06 06:34:01.883617 [WARN] serf: Shutdown without a Leave
TestExecCommand - 2019/12/06 06:34:01.948752 [DEBUG] http: Request PUT /v1/session/destroy/20f345c4-3eb1-3031-52ee-558f7d109321 (305.747438ms) from=127.0.0.1:47406
TestExecCommand - 2019/12/06 06:34:01.950056 [INFO] agent: Requesting shutdown
TestExecCommand - 2019/12/06 06:34:01.950156 [INFO] consul: shutting down server
TestExecCommand - 2019/12/06 06:34:01.950218 [WARN] serf: Shutdown without a Leave
TestExecCommand_NoShell - 2019/12/06 06:34:02.005962 [WARN] serf: Shutdown without a Leave
TestExecCommand - 2019/12/06 06:34:02.058068 [WARN] serf: Shutdown without a Leave
TestExecCommand_NoShell - 2019/12/06 06:34:02.114476 [INFO] manager: shutting down
TestExecCommand_NoShell - 2019/12/06 06:34:02.115409 [INFO] agent: consul server down
TestExecCommand_NoShell - 2019/12/06 06:34:02.115478 [INFO] agent: shutdown complete
TestExecCommand_NoShell - 2019/12/06 06:34:02.115539 [INFO] agent: Stopping DNS server 127.0.0.1:44525 (tcp)
TestExecCommand_NoShell - 2019/12/06 06:34:02.115698 [INFO] agent: Stopping DNS server 127.0.0.1:44525 (udp)
TestExecCommand_NoShell - 2019/12/06 06:34:02.115894 [INFO] agent: Stopping HTTP server 127.0.0.1:44526 (tcp)
TestExecCommand_NoShell - 2019/12/06 06:34:02.116560 [INFO] agent: Waiting for endpoints to shut down
TestExecCommand_NoShell - 2019/12/06 06:34:02.116806 [INFO] agent: Endpoints down
--- PASS: TestExecCommand_NoShell (8.60s)
TestExecCommand - 2019/12/06 06:34:02.173214 [INFO] manager: shutting down
TestExecCommand - 2019/12/06 06:34:02.173879 [INFO] agent: consul server down
TestExecCommand - 2019/12/06 06:34:02.173940 [INFO] agent: shutdown complete
TestExecCommand - 2019/12/06 06:34:02.174001 [INFO] agent: Stopping DNS server 127.0.0.1:44531 (tcp)
TestExecCommand - 2019/12/06 06:34:02.174214 [INFO] agent: Stopping DNS server 127.0.0.1:44531 (udp)
TestExecCommand - 2019/12/06 06:34:02.174401 [INFO] agent: Stopping HTTP server 127.0.0.1:44532 (tcp)
TestExecCommand - 2019/12/06 06:34:02.174972 [INFO] agent: Waiting for endpoints to shut down
TestExecCommand - 2019/12/06 06:34:02.175065 [INFO] agent: Endpoints down
--- PASS: TestExecCommand (8.58s)
PASS
ok  	github.com/hashicorp/consul/command/exec	14.657s
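The TestExecCommand traces above spell out the remote-exec flow through the agent's HTTP API: create a session, write the job under _rexec/<session>/job with ?acquire=<session>, fire the _rexec user event, poll _rexec/<session>/ with blocking ?index=...&keys= queries for ack/out/exit keys, then destroy the session and delete the prefix. Below is a hedged sketch of that same sequence with the Go client; the key layout and endpoints are the ones logged above, while the job payload ("uptime"), TTL, and the fixed polling loop are illustrative choices, not the exact values the exec command uses.

// Sketch of the _rexec KV pattern visible in the TestExecCommand logs:
// session create -> acquire job key -> fire "_rexec" event -> blocking poll -> cleanup.
// Payload contents, TTL and loop bounds are illustrative.
package main

import (
	"fmt"
	"log"
	"time"

	"github.com/hashicorp/consul/api"
)

func main() {
	client, err := api.NewClient(api.DefaultConfig())
	if err != nil {
		log.Fatal(err)
	}

	// 1. Create a session to own the _rexec keys (PUT /v1/session/create).
	sess, _, err := client.Session().Create(&api.SessionEntry{
		Name:     "rexec-sketch",
		Behavior: api.SessionBehaviorDelete,
		TTL:      "15s",
	}, nil)
	if err != nil {
		log.Fatal(err)
	}
	prefix := "_rexec/" + sess + "/"

	// 2. Write the job spec, acquired by the session
	//    (PUT /v1/kv/_rexec/<session>/job?acquire=<session>).
	ok, _, err := client.KV().Acquire(&api.KVPair{
		Key:     prefix + "job",
		Value:   []byte("uptime"), // illustrative job payload
		Session: sess,
	}, nil)
	if err != nil || !ok {
		log.Fatalf("acquire failed: ok=%v err=%v", ok, err)
	}

	// 3. Fire the _rexec user event so agents pick up the job (PUT /v1/event/fire/_rexec).
	if _, _, err := client.Event().Fire(&api.UserEvent{Name: "_rexec"}, nil); err != nil {
		log.Fatal(err)
	}

	// 4. Blocking key polls for ack/out/exit keys
	//    (GET /v1/kv/_rexec/<session>/?index=N&keys=&wait=1000ms).
	var index uint64
	for i := 0; i < 5; i++ {
		keys, meta, err := client.KV().Keys(prefix, "", &api.QueryOptions{
			WaitIndex: index,
			WaitTime:  time.Second,
		})
		if err != nil {
			log.Fatal(err)
		}
		index = meta.LastIndex
		fmt.Println("keys so far:", keys)
	}

	// 5. Cleanup (PUT /v1/session/destroy/<session>, DELETE /v1/kv/_rexec/<session>?recurse=).
	client.Session().Destroy(sess, nil)
	client.KV().DeleteTree(prefix, nil)
}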
=== RUN   TestConfigUtil_Values
=== PAUSE TestConfigUtil_Values
=== RUN   TestConfigUtil_Visit
=== PAUSE TestConfigUtil_Visit
=== RUN   TestFlagMapValueSet
=== PAUSE TestFlagMapValueSet
=== RUN   TestAppendSliceValue_implements
=== PAUSE TestAppendSliceValue_implements
=== RUN   TestAppendSliceValueSet
=== PAUSE TestAppendSliceValueSet
=== RUN   TestHTTPFlagsSetToken
--- PASS: TestHTTPFlagsSetToken (0.00s)
=== CONT  TestAppendSliceValue_implements
=== CONT  TestConfigUtil_Values
--- PASS: TestAppendSliceValue_implements (0.00s)
=== CONT  TestAppendSliceValueSet
--- PASS: TestAppendSliceValueSet (0.00s)
=== CONT  TestFlagMapValueSet
=== RUN   TestFlagMapValueSet/missing_=
=== RUN   TestFlagMapValueSet/sets
=== RUN   TestFlagMapValueSet/sets_multiple
=== RUN   TestFlagMapValueSet/overwrites
--- PASS: TestConfigUtil_Values (0.00s)
=== CONT  TestConfigUtil_Visit
--- PASS: TestFlagMapValueSet (0.00s)
    --- PASS: TestFlagMapValueSet/missing_= (0.00s)
    --- PASS: TestFlagMapValueSet/sets (0.00s)
    --- PASS: TestFlagMapValueSet/sets_multiple (0.00s)
    --- PASS: TestFlagMapValueSet/overwrites (0.00s)
--- PASS: TestConfigUtil_Visit (0.17s)
PASS
ok  	github.com/hashicorp/consul/command/flags	0.343s
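The flags tests above (TestFlagMapValueSet with its missing_=, sets, sets_multiple and overwrites subtests, plus TestAppendSliceValue_implements and TestAppendSliceValueSet) exercise custom flag.Value implementations that collect repeatable key=value and list-style command-line options. The sketch below shows that general pattern with the standard library flag package only; the mapFlag type and the -meta flag name are illustrative and are not the types defined in command/flags.

// Sketch of the flag.Value pattern the flags tests exercise:
// a repeatable -meta key=value option collected into a map.
// Type and flag names are illustrative, not those in command/flags.
package main

import (
	"flag"
	"fmt"
	"strings"
)

type mapFlag map[string]string

// Set parses "key=value" and stores it, overwriting any earlier value for the
// same key (compare the TestFlagMapValueSet/overwrites subtest above).
func (m mapFlag) Set(s string) error {
	idx := strings.Index(s, "=")
	if idx < 0 {
		return fmt.Errorf("missing '=' in %q", s)
	}
	m[s[:idx]] = s[idx+1:]
	return nil
}

func (m mapFlag) String() string {
	return fmt.Sprintf("%v", map[string]string(m))
}

func main() {
	meta := mapFlag{}
	fs := flag.NewFlagSet("demo", flag.ExitOnError)
	fs.Var(meta, "meta", "key=value metadata, may be repeated")
	fs.Parse([]string{"-meta", "env=prod", "-meta", "env=staging", "-meta", "dc=dc1"})
	fmt.Println(meta) // env overwritten by the later value, dc added alongside it
}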
=== RUN   TestForceLeaveCommand_noTabs
=== PAUSE TestForceLeaveCommand_noTabs
=== RUN   TestForceLeaveCommand
=== PAUSE TestForceLeaveCommand
=== RUN   TestForceLeaveCommand_noAddrs
=== PAUSE TestForceLeaveCommand_noAddrs
=== CONT  TestForceLeaveCommand_noTabs
--- PASS: TestForceLeaveCommand_noTabs (0.00s)
=== CONT  TestForceLeaveCommand_noAddrs
=== CONT  TestForceLeaveCommand
--- PASS: TestForceLeaveCommand_noAddrs (0.00s)
WARNING: bootstrap = true: do not enable unless necessary
TestForceLeaveCommand - 2019/12/06 06:34:25.228603 [WARN] agent: Node name "Node 131d0dab-eee8-1878-7134-2d70657376ad" will not be discoverable via DNS due to invalid characters. Valid characters include all alpha-numerics and dashes.
TestForceLeaveCommand - 2019/12/06 06:34:25.229649 [DEBUG] tlsutil: Update with version 1
TestForceLeaveCommand - 2019/12/06 06:34:25.237123 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
2019/12/06 06:34:26 [INFO]  raft: Initial configuration (index=1): [{Suffrage:Voter ID:131d0dab-eee8-1878-7134-2d70657376ad Address:127.0.0.1:50506}]
2019/12/06 06:34:26 [INFO]  raft: Node at 127.0.0.1:50506 [Follower] entering Follower state (Leader: "")
TestForceLeaveCommand - 2019/12/06 06:34:26.022640 [INFO] serf: EventMemberJoin: Node 131d0dab-eee8-1878-7134-2d70657376ad.dc1 127.0.0.1
TestForceLeaveCommand - 2019/12/06 06:34:26.037348 [INFO] serf: EventMemberJoin: Node 131d0dab-eee8-1878-7134-2d70657376ad 127.0.0.1
TestForceLeaveCommand - 2019/12/06 06:34:26.040095 [INFO] consul: Handled member-join event for server "Node 131d0dab-eee8-1878-7134-2d70657376ad.dc1" in area "wan"
TestForceLeaveCommand - 2019/12/06 06:34:26.040431 [INFO] agent: Started DNS server 127.0.0.1:50501 (udp)
TestForceLeaveCommand - 2019/12/06 06:34:26.041595 [INFO] agent: Started DNS server 127.0.0.1:50501 (tcp)
TestForceLeaveCommand - 2019/12/06 06:34:26.041881 [INFO] consul: Adding LAN server Node 131d0dab-eee8-1878-7134-2d70657376ad (Addr: tcp/127.0.0.1:50506) (DC: dc1)
TestForceLeaveCommand - 2019/12/06 06:34:26.046702 [INFO] agent: Started HTTP server on 127.0.0.1:50502 (tcp)
TestForceLeaveCommand - 2019/12/06 06:34:26.047042 [INFO] agent: started state syncer
2019/12/06 06:34:26 [WARN]  raft: Heartbeat timeout from "" reached, starting election
2019/12/06 06:34:26 [INFO]  raft: Node at 127.0.0.1:50506 [Candidate] entering Candidate state in term 2
2019/12/06 06:34:26 [INFO]  raft: Election won. Tally: 1
2019/12/06 06:34:26 [INFO]  raft: Node at 127.0.0.1:50506 [Leader] entering Leader state
TestForceLeaveCommand - 2019/12/06 06:34:26.991288 [INFO] consul: cluster leadership acquired
TestForceLeaveCommand - 2019/12/06 06:34:26.991946 [INFO] consul: New leader elected: Node 131d0dab-eee8-1878-7134-2d70657376ad
WARNING: bootstrap = true: do not enable unless necessary
TestForceLeaveCommand - 2019/12/06 06:34:27.236560 [WARN] agent: Node name "Node a1eccfb5-58d2-ec1a-9e92-c203b67a5a6a" will not be discoverable via DNS due to invalid characters. Valid characters include all alpha-numerics and dashes.
TestForceLeaveCommand - 2019/12/06 06:34:27.238335 [DEBUG] tlsutil: Update with version 1
TestForceLeaveCommand - 2019/12/06 06:34:27.242742 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
TestForceLeaveCommand - 2019/12/06 06:34:27.357412 [INFO] agent: Synced node info
TestForceLeaveCommand - 2019/12/06 06:34:28.498507 [DEBUG] agent: Node info in sync
TestForceLeaveCommand - 2019/12/06 06:34:28.498624 [DEBUG] agent: Node info in sync
2019/12/06 06:34:28 [INFO]  raft: Initial configuration (index=1): [{Suffrage:Voter ID:a1eccfb5-58d2-ec1a-9e92-c203b67a5a6a Address:127.0.0.1:50512}]
2019/12/06 06:34:28 [INFO]  raft: Node at 127.0.0.1:50512 [Follower] entering Follower state (Leader: "")
TestForceLeaveCommand - 2019/12/06 06:34:28.861200 [INFO] serf: EventMemberJoin: Node a1eccfb5-58d2-ec1a-9e92-c203b67a5a6a.dc1 127.0.0.1
TestForceLeaveCommand - 2019/12/06 06:34:28.873939 [INFO] serf: EventMemberJoin: Node a1eccfb5-58d2-ec1a-9e92-c203b67a5a6a 127.0.0.1
TestForceLeaveCommand - 2019/12/06 06:34:28.877511 [INFO] agent: Started DNS server 127.0.0.1:50507 (udp)
TestForceLeaveCommand - 2019/12/06 06:34:28.880609 [INFO] agent: Started DNS server 127.0.0.1:50507 (tcp)
TestForceLeaveCommand - 2019/12/06 06:34:28.881565 [INFO] consul: Adding LAN server Node a1eccfb5-58d2-ec1a-9e92-c203b67a5a6a (Addr: tcp/127.0.0.1:50512) (DC: dc1)
TestForceLeaveCommand - 2019/12/06 06:34:28.883093 [INFO] consul: Handled member-join event for server "Node a1eccfb5-58d2-ec1a-9e92-c203b67a5a6a.dc1" in area "wan"
TestForceLeaveCommand - 2019/12/06 06:34:28.895742 [INFO] agent: Started HTTP server on 127.0.0.1:50508 (tcp)
TestForceLeaveCommand - 2019/12/06 06:34:28.895860 [INFO] agent: started state syncer
2019/12/06 06:34:28 [WARN]  raft: Heartbeat timeout from "" reached, starting election
2019/12/06 06:34:28 [INFO]  raft: Node at 127.0.0.1:50512 [Candidate] entering Candidate state in term 2
TestForceLeaveCommand - 2019/12/06 06:34:29.282348 [INFO] connect: initialized primary datacenter CA with provider "consul"
TestForceLeaveCommand - 2019/12/06 06:34:29.282850 [DEBUG] consul: Skipping self join check for "Node 131d0dab-eee8-1878-7134-2d70657376ad" since the cluster is too small
TestForceLeaveCommand - 2019/12/06 06:34:29.283015 [INFO] consul: member 'Node 131d0dab-eee8-1878-7134-2d70657376ad' joined, marking health alive
TestForceLeaveCommand - 2019/12/06 06:34:29.665146 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
2019/12/06 06:34:30 [INFO]  raft: Election won. Tally: 1
2019/12/06 06:34:30 [INFO]  raft: Node at 127.0.0.1:50512 [Leader] entering Leader state
TestForceLeaveCommand - 2019/12/06 06:34:30.007434 [INFO] consul: cluster leadership acquired
TestForceLeaveCommand - 2019/12/06 06:34:30.007943 [INFO] consul: New leader elected: Node a1eccfb5-58d2-ec1a-9e92-c203b67a5a6a
TestForceLeaveCommand - 2019/12/06 06:34:30.047066 [INFO] agent: (LAN) joining: [127.0.0.1:50504]
TestForceLeaveCommand - 2019/12/06 06:34:30.047990 [DEBUG] memberlist: Initiating push/pull sync with: 127.0.0.1:50504
TestForceLeaveCommand - 2019/12/06 06:34:30.048010 [DEBUG] memberlist: Stream connection from=127.0.0.1:46230
TestForceLeaveCommand - 2019/12/06 06:34:30.052811 [INFO] serf: EventMemberJoin: Node a1eccfb5-58d2-ec1a-9e92-c203b67a5a6a 127.0.0.1
TestForceLeaveCommand - 2019/12/06 06:34:30.053361 [INFO] consul: Adding LAN server Node a1eccfb5-58d2-ec1a-9e92-c203b67a5a6a (Addr: tcp/127.0.0.1:50512) (DC: dc1)
TestForceLeaveCommand - 2019/12/06 06:34:30.053468 [INFO] consul: New leader elected: Node a1eccfb5-58d2-ec1a-9e92-c203b67a5a6a
TestForceLeaveCommand - 2019/12/06 06:34:30.053810 [ERR] consul: 'Node a1eccfb5-58d2-ec1a-9e92-c203b67a5a6a' and 'Node 131d0dab-eee8-1878-7134-2d70657376ad' are both in bootstrap mode. Only one node should be in bootstrap mode, not adding Raft peer.
TestForceLeaveCommand - 2019/12/06 06:34:30.053951 [INFO] consul: member 'Node a1eccfb5-58d2-ec1a-9e92-c203b67a5a6a' joined, marking health alive
TestForceLeaveCommand - 2019/12/06 06:34:30.054653 [INFO] serf: EventMemberJoin: Node 131d0dab-eee8-1878-7134-2d70657376ad 127.0.0.1
TestForceLeaveCommand - 2019/12/06 06:34:30.054807 [DEBUG] memberlist: Initiating push/pull sync with: 127.0.0.1:50511
TestForceLeaveCommand - 2019/12/06 06:34:30.055227 [INFO] agent: (LAN) joined: 1
TestForceLeaveCommand - 2019/12/06 06:34:30.055302 [DEBUG] agent: systemd notify failed: No socket
TestForceLeaveCommand - 2019/12/06 06:34:30.055349 [INFO] agent: Requesting shutdown
TestForceLeaveCommand - 2019/12/06 06:34:30.055366 [DEBUG] memberlist: Stream connection from=127.0.0.1:49892
TestForceLeaveCommand - 2019/12/06 06:34:30.055396 [INFO] consul: shutting down server
TestForceLeaveCommand - 2019/12/06 06:34:30.055575 [WARN] serf: Shutdown without a Leave
TestForceLeaveCommand - 2019/12/06 06:34:30.055765 [INFO] consul: Adding LAN server Node 131d0dab-eee8-1878-7134-2d70657376ad (Addr: tcp/127.0.0.1:50506) (DC: dc1)
TestForceLeaveCommand - 2019/12/06 06:34:30.055932 [ERR] agent: failed to sync remote state: No cluster leader
TestForceLeaveCommand - 2019/12/06 06:34:30.060363 [INFO] serf: EventMemberJoin: Node 131d0dab-eee8-1878-7134-2d70657376ad.dc1 127.0.0.1
TestForceLeaveCommand - 2019/12/06 06:34:30.062928 [INFO] consul: Handled member-join event for server "Node 131d0dab-eee8-1878-7134-2d70657376ad.dc1" in area "wan"
TestForceLeaveCommand - 2019/12/06 06:34:30.060840 [DEBUG] memberlist: Initiating push/pull sync with: 127.0.0.1:50505
TestForceLeaveCommand - 2019/12/06 06:34:30.061077 [DEBUG] memberlist: Stream connection from=127.0.0.1:51580
TestForceLeaveCommand - 2019/12/06 06:34:30.064143 [INFO] serf: EventMemberJoin: Node a1eccfb5-58d2-ec1a-9e92-c203b67a5a6a.dc1 127.0.0.1
TestForceLeaveCommand - 2019/12/06 06:34:30.065145 [DEBUG] consul: Successfully performed flood-join for "Node a1eccfb5-58d2-ec1a-9e92-c203b67a5a6a" at 127.0.0.1:50511
TestForceLeaveCommand - 2019/12/06 06:34:30.065456 [INFO] consul: Handled member-join event for server "Node a1eccfb5-58d2-ec1a-9e92-c203b67a5a6a.dc1" in area "wan"
TestForceLeaveCommand - 2019/12/06 06:34:30.072838 [DEBUG] consul: Successfully performed flood-join for "Node 131d0dab-eee8-1878-7134-2d70657376ad" at 127.0.0.1:50505
TestForceLeaveCommand - 2019/12/06 06:34:30.181449 [WARN] serf: Shutdown without a Leave
TestForceLeaveCommand - 2019/12/06 06:34:30.281874 [INFO] manager: shutting down
TestForceLeaveCommand - 2019/12/06 06:34:30.282072 [ERR] consul: failed to wait for barrier: raft is already shutdown
TestForceLeaveCommand - 2019/12/06 06:34:30.284598 [INFO] agent: consul server down
TestForceLeaveCommand - 2019/12/06 06:34:30.284683 [INFO] agent: shutdown complete
TestForceLeaveCommand - 2019/12/06 06:34:30.284749 [INFO] agent: Stopping DNS server 127.0.0.1:50507 (tcp)
TestForceLeaveCommand - 2019/12/06 06:34:30.284936 [INFO] agent: Stopping DNS server 127.0.0.1:50507 (udp)
TestForceLeaveCommand - 2019/12/06 06:34:30.285120 [INFO] agent: Stopping HTTP server 127.0.0.1:50508 (tcp)
TestForceLeaveCommand - 2019/12/06 06:34:30.285742 [INFO] agent: Waiting for endpoints to shut down
TestForceLeaveCommand - 2019/12/06 06:34:30.285828 [INFO] agent: Endpoints down
TestForceLeaveCommand - 2019/12/06 06:34:30.291518 [INFO] agent: Force leaving node: Node a1eccfb5-58d2-ec1a-9e92-c203b67a5a6a
TestForceLeaveCommand - 2019/12/06 06:34:31.479746 [DEBUG] agent: Skipping remote check "serfHealth" since it is managed automatically
TestForceLeaveCommand - 2019/12/06 06:34:31.479837 [DEBUG] agent: Node info in sync
TestForceLeaveCommand - 2019/12/06 06:34:31.524458 [DEBUG] http: Request PUT /v1/agent/force-leave/Node%20a1eccfb5-58d2-ec1a-9e92-c203b67a5a6a (1.232872651s) from=127.0.0.1:33130
TestForceLeaveCommand - 2019/12/06 06:34:31.541300 [DEBUG] memberlist: Failed ping: Node a1eccfb5-58d2-ec1a-9e92-c203b67a5a6a (timeout reached)
TestForceLeaveCommand - 2019/12/06 06:34:31.575702 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
TestForceLeaveCommand - 2019/12/06 06:34:31.577871 [WARN] consul: error getting server health from "Node a1eccfb5-58d2-ec1a-9e92-c203b67a5a6a": rpc error getting client: failed to get conn: dial tcp 127.0.0.1:0->127.0.0.1:50512: connect: connection refused
TestForceLeaveCommand - 2019/12/06 06:34:32.038108 [INFO] memberlist: Suspect Node a1eccfb5-58d2-ec1a-9e92-c203b67a5a6a has failed, no acks received
TestForceLeaveCommand - 2019/12/06 06:34:32.575747 [WARN] consul: error getting server health from "Node a1eccfb5-58d2-ec1a-9e92-c203b67a5a6a": context deadline exceeded
TestForceLeaveCommand - 2019/12/06 06:34:33.538922 [DEBUG] memberlist: Failed ping: Node a1eccfb5-58d2-ec1a-9e92-c203b67a5a6a (timeout reached)
TestForceLeaveCommand - 2019/12/06 06:34:33.574548 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
TestForceLeaveCommand - 2019/12/06 06:34:33.575517 [WARN] consul: error getting server health from "Node a1eccfb5-58d2-ec1a-9e92-c203b67a5a6a": rpc error getting client: failed to get conn: dial tcp 127.0.0.1:0->127.0.0.1:50512: connect: connection refused
TestForceLeaveCommand - 2019/12/06 06:34:34.025986 [DEBUG] memberlist: Failed ping: Node a1eccfb5-58d2-ec1a-9e92-c203b67a5a6a.dc1 (timeout reached)
TestForceLeaveCommand - 2019/12/06 06:34:34.574714 [WARN] consul: error getting server health from "Node a1eccfb5-58d2-ec1a-9e92-c203b67a5a6a": context deadline exceeded
TestForceLeaveCommand - 2019/12/06 06:34:35.044574 [INFO] memberlist: Suspect Node a1eccfb5-58d2-ec1a-9e92-c203b67a5a6a has failed, no acks received
TestForceLeaveCommand - 2019/12/06 06:34:35.545901 [DEBUG] memberlist: Failed ping: Node a1eccfb5-58d2-ec1a-9e92-c203b67a5a6a (timeout reached)
TestForceLeaveCommand - 2019/12/06 06:34:35.575020 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
TestForceLeaveCommand - 2019/12/06 06:34:35.575969 [WARN] consul: error getting server health from "Node a1eccfb5-58d2-ec1a-9e92-c203b67a5a6a": rpc error getting client: failed to get conn: dial tcp 127.0.0.1:0->127.0.0.1:50512: connect: connection refused
TestForceLeaveCommand - 2019/12/06 06:34:36.024294 [INFO] memberlist: Suspect Node a1eccfb5-58d2-ec1a-9e92-c203b67a5a6a.dc1 has failed, no acks received
TestForceLeaveCommand - 2019/12/06 06:34:36.038844 [INFO] memberlist: Marking Node a1eccfb5-58d2-ec1a-9e92-c203b67a5a6a as failed, suspect timeout reached (0 peer confirmations)
TestForceLeaveCommand - 2019/12/06 06:34:36.039316 [INFO] serf: EventMemberLeave: Node a1eccfb5-58d2-ec1a-9e92-c203b67a5a6a 127.0.0.1
TestForceLeaveCommand - 2019/12/06 06:34:36.039711 [INFO] consul: Removing LAN server Node a1eccfb5-58d2-ec1a-9e92-c203b67a5a6a (Addr: tcp/127.0.0.1:50512) (DC: dc1)
TestForceLeaveCommand - 2019/12/06 06:34:36.040233 [INFO] consul: member 'Node a1eccfb5-58d2-ec1a-9e92-c203b67a5a6a' left, deregistering
TestForceLeaveCommand - 2019/12/06 06:34:36.049787 [INFO] agent: Requesting shutdown
TestForceLeaveCommand - 2019/12/06 06:34:36.049882 [INFO] consul: shutting down server
TestForceLeaveCommand - 2019/12/06 06:34:36.049922 [WARN] serf: Shutdown without a Leave
TestForceLeaveCommand - 2019/12/06 06:34:36.206489 [WARN] serf: Shutdown without a Leave
TestForceLeaveCommand - 2019/12/06 06:34:36.306547 [INFO] manager: shutting down
TestForceLeaveCommand - 2019/12/06 06:34:36.307322 [INFO] agent: consul server down
TestForceLeaveCommand - 2019/12/06 06:34:36.307386 [INFO] agent: shutdown complete
TestForceLeaveCommand - 2019/12/06 06:34:36.307448 [INFO] agent: Stopping DNS server 127.0.0.1:50501 (tcp)
TestForceLeaveCommand - 2019/12/06 06:34:36.307611 [INFO] agent: Stopping DNS server 127.0.0.1:50501 (udp)
TestForceLeaveCommand - 2019/12/06 06:34:36.307766 [INFO] agent: Stopping HTTP server 127.0.0.1:50502 (tcp)
TestForceLeaveCommand - 2019/12/06 06:34:36.308232 [INFO] agent: Waiting for endpoints to shut down
TestForceLeaveCommand - 2019/12/06 06:34:36.308431 [INFO] agent: Endpoints down
--- PASS: TestForceLeaveCommand (11.15s)
PASS
ok  	github.com/hashicorp/consul/command/forceleave	11.595s
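For reference, the forceleave tests above drive the agent's HTTP API directly (the log shows PUT /v1/agent/force-leave/Node%20a1eccfb5-58d2-ec1a-9e92-c203b67a5a6a). A minimal Go sketch of the same call, assuming an agent reachable on 127.0.0.1:8500 rather than the ephemeral test ports, might look like this:

// forceleave_sketch.go - hypothetical sketch of the call exercised by the tests above
// (PUT /v1/agent/force-leave/<node>). The agent address is an assumption.
package main

import (
	"fmt"
	"net/http"
	"net/url"
)

func main() {
	// Node name copied from the log above, escaped for use in the URL path.
	node := "Node a1eccfb5-58d2-ec1a-9e92-c203b67a5a6a"
	endpoint := "http://127.0.0.1:8500/v1/agent/force-leave/" + url.PathEscape(node)

	req, err := http.NewRequest(http.MethodPut, endpoint, nil)
	if err != nil {
		panic(err)
	}
	resp, err := http.DefaultClient.Do(req)
	if err != nil {
		panic(err)
	}
	defer resp.Body.Close()
	fmt.Println("force-leave status:", resp.Status)
}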
?   	github.com/hashicorp/consul/command/helpers	[no test files]
=== RUN   TestInfoCommand_noTabs
=== PAUSE TestInfoCommand_noTabs
=== RUN   TestInfoCommand
=== PAUSE TestInfoCommand
=== CONT  TestInfoCommand_noTabs
=== CONT  TestInfoCommand
--- PASS: TestInfoCommand_noTabs (0.00s)
WARNING: bootstrap = true: do not enable unless necessary
TestInfoCommand - 2019/12/06 06:34:28.708447 [WARN] agent: Node name "Node 3de33ffd-704c-f67b-4541-f046e833337a" will not be discoverable via DNS due to invalid characters. Valid characters include all alpha-numerics and dashes.
TestInfoCommand - 2019/12/06 06:34:28.709252 [DEBUG] tlsutil: Update with version 1
TestInfoCommand - 2019/12/06 06:34:28.716318 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
2019/12/06 06:34:30 [INFO]  raft: Initial configuration (index=1): [{Suffrage:Voter ID:3de33ffd-704c-f67b-4541-f046e833337a Address:127.0.0.1:23506}]
2019/12/06 06:34:30 [INFO]  raft: Node at 127.0.0.1:23506 [Follower] entering Follower state (Leader: "")
TestInfoCommand - 2019/12/06 06:34:30.188582 [INFO] serf: EventMemberJoin: Node 3de33ffd-704c-f67b-4541-f046e833337a.dc1 127.0.0.1
TestInfoCommand - 2019/12/06 06:34:30.192372 [INFO] serf: EventMemberJoin: Node 3de33ffd-704c-f67b-4541-f046e833337a 127.0.0.1
TestInfoCommand - 2019/12/06 06:34:30.193473 [INFO] consul: Adding LAN server Node 3de33ffd-704c-f67b-4541-f046e833337a (Addr: tcp/127.0.0.1:23506) (DC: dc1)
TestInfoCommand - 2019/12/06 06:34:30.193973 [INFO] consul: Handled member-join event for server "Node 3de33ffd-704c-f67b-4541-f046e833337a.dc1" in area "wan"
TestInfoCommand - 2019/12/06 06:34:30.194502 [INFO] agent: Started DNS server 127.0.0.1:23501 (tcp)
TestInfoCommand - 2019/12/06 06:34:30.194582 [INFO] agent: Started DNS server 127.0.0.1:23501 (udp)
TestInfoCommand - 2019/12/06 06:34:30.197567 [INFO] agent: Started HTTP server on 127.0.0.1:23502 (tcp)
TestInfoCommand - 2019/12/06 06:34:30.197745 [INFO] agent: started state syncer
2019/12/06 06:34:30 [WARN]  raft: Heartbeat timeout from "" reached, starting election
2019/12/06 06:34:30 [INFO]  raft: Node at 127.0.0.1:23506 [Candidate] entering Candidate state in term 2
2019/12/06 06:34:30 [INFO]  raft: Election won. Tally: 1
2019/12/06 06:34:30 [INFO]  raft: Node at 127.0.0.1:23506 [Leader] entering Leader state
TestInfoCommand - 2019/12/06 06:34:30.941866 [INFO] consul: cluster leadership acquired
TestInfoCommand - 2019/12/06 06:34:30.942482 [INFO] consul: New leader elected: Node 3de33ffd-704c-f67b-4541-f046e833337a
TestInfoCommand - 2019/12/06 06:34:32.149504 [INFO] agent: Synced node info
TestInfoCommand - 2019/12/06 06:34:32.158230 [DEBUG] http: Request GET /v1/agent/self (975.842679ms) from=127.0.0.1:55290
TestInfoCommand - 2019/12/06 06:34:32.174704 [INFO] agent: Requesting shutdown
TestInfoCommand - 2019/12/06 06:34:32.174847 [INFO] consul: shutting down server
TestInfoCommand - 2019/12/06 06:34:32.174945 [WARN] serf: Shutdown without a Leave
TestInfoCommand - 2019/12/06 06:34:32.289790 [WARN] serf: Shutdown without a Leave
TestInfoCommand - 2019/12/06 06:34:32.519788 [INFO] manager: shutting down
TestInfoCommand - 2019/12/06 06:34:32.711781 [ERR] agent: failed to sync remote state: No cluster leader
TestInfoCommand - 2019/12/06 06:34:33.014789 [ERR] consul: failed to establish leadership: leadership lost while committing log
TestInfoCommand - 2019/12/06 06:34:33.014999 [ERR] consul: failed to transfer leadership attempt 0/3: raft is already shutdown
TestInfoCommand - 2019/12/06 06:34:33.015068 [ERR] consul: failed to transfer leadership attempt 1/3: raft is already shutdown
TestInfoCommand - 2019/12/06 06:34:33.015123 [ERR] consul: failed to transfer leadership attempt 2/3: raft is already shutdown
TestInfoCommand - 2019/12/06 06:34:33.015183 [ERR] consul: failed to transfer leadership in 3 attempts
TestInfoCommand - 2019/12/06 06:34:33.015029 [INFO] agent: consul server down
TestInfoCommand - 2019/12/06 06:34:33.015295 [INFO] agent: shutdown complete
TestInfoCommand - 2019/12/06 06:34:33.015344 [INFO] agent: Stopping DNS server 127.0.0.1:23501 (tcp)
TestInfoCommand - 2019/12/06 06:34:33.015505 [INFO] agent: Stopping DNS server 127.0.0.1:23501 (udp)
TestInfoCommand - 2019/12/06 06:34:33.015695 [INFO] agent: Stopping HTTP server 127.0.0.1:23502 (tcp)
TestInfoCommand - 2019/12/06 06:34:33.016246 [INFO] agent: Waiting for endpoints to shut down
TestInfoCommand - 2019/12/06 06:34:33.016330 [INFO] agent: Endpoints down
--- PASS: TestInfoCommand (4.38s)
PASS
ok  	github.com/hashicorp/consul/command/info	4.842s
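The info test above resolves its data through GET /v1/agent/self, as shown in the request line in the log. A hedged sketch of the same request, assuming a locally running agent on 127.0.0.1:8500, is:

// info_sketch.go - hypothetical sketch of the request behind the info test above
// (GET /v1/agent/self). The agent address is an assumption.
package main

import (
	"fmt"
	"io"
	"net/http"
)

func main() {
	resp, err := http.Get("http://127.0.0.1:8500/v1/agent/self")
	if err != nil {
		panic(err)
	}
	defer resp.Body.Close()

	// The endpoint returns a JSON document describing the local agent; here we
	// only report its size and status rather than parsing it.
	body, err := io.ReadAll(resp.Body)
	if err != nil {
		panic(err)
	}
	fmt.Printf("agent self (%s): %d bytes\n", resp.Status, len(body))
}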
=== RUN   TestCommand_noTabs
=== PAUSE TestCommand_noTabs
=== CONT  TestCommand_noTabs
--- PASS: TestCommand_noTabs (0.00s)
PASS
ok  	github.com/hashicorp/consul/command/intention	0.066s
=== RUN   TestCommand_noTabs
=== PAUSE TestCommand_noTabs
=== RUN   TestCommand_Validation
=== PAUSE TestCommand_Validation
=== RUN   TestCommand
=== PAUSE TestCommand
=== CONT  TestCommand_noTabs
=== CONT  TestCommand
=== CONT  TestCommand_Validation
=== RUN   TestCommand_Validation/3_args
=== RUN   TestCommand_Validation/0_args
--- PASS: TestCommand_noTabs (0.01s)
=== RUN   TestCommand_Validation/1_args
--- PASS: TestCommand_Validation (0.02s)
    --- PASS: TestCommand_Validation/3_args (0.00s)
    --- PASS: TestCommand_Validation/0_args (0.00s)
    --- PASS: TestCommand_Validation/1_args (0.01s)
WARNING: bootstrap = true: do not enable unless necessary
TestCommand - 2019/12/06 06:34:38.969617 [WARN] agent: Node name "Node 22af8617-3974-7067-a8a6-b6dcb96f53c3" will not be discoverable via DNS due to invalid characters. Valid characters include all alpha-numerics and dashes.
TestCommand - 2019/12/06 06:34:38.971020 [DEBUG] tlsutil: Update with version 1
TestCommand - 2019/12/06 06:34:38.981979 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
2019/12/06 06:34:40 [INFO]  raft: Initial configuration (index=1): [{Suffrage:Voter ID:22af8617-3974-7067-a8a6-b6dcb96f53c3 Address:127.0.0.1:28006}]
2019/12/06 06:34:40 [INFO]  raft: Node at 127.0.0.1:28006 [Follower] entering Follower state (Leader: "")
TestCommand - 2019/12/06 06:34:40.061573 [INFO] serf: EventMemberJoin: Node 22af8617-3974-7067-a8a6-b6dcb96f53c3.dc1 127.0.0.1
TestCommand - 2019/12/06 06:34:40.065356 [INFO] serf: EventMemberJoin: Node 22af8617-3974-7067-a8a6-b6dcb96f53c3 127.0.0.1
TestCommand - 2019/12/06 06:34:40.066926 [INFO] consul: Adding LAN server Node 22af8617-3974-7067-a8a6-b6dcb96f53c3 (Addr: tcp/127.0.0.1:28006) (DC: dc1)
TestCommand - 2019/12/06 06:34:40.067223 [INFO] consul: Handled member-join event for server "Node 22af8617-3974-7067-a8a6-b6dcb96f53c3.dc1" in area "wan"
TestCommand - 2019/12/06 06:34:40.075420 [INFO] agent: Started DNS server 127.0.0.1:28001 (tcp)
TestCommand - 2019/12/06 06:34:40.075935 [INFO] agent: Started DNS server 127.0.0.1:28001 (udp)
TestCommand - 2019/12/06 06:34:40.078863 [INFO] agent: Started HTTP server on 127.0.0.1:28002 (tcp)
TestCommand - 2019/12/06 06:34:40.079034 [INFO] agent: started state syncer
2019/12/06 06:34:40 [WARN]  raft: Heartbeat timeout from "" reached, starting election
2019/12/06 06:34:40 [INFO]  raft: Node at 127.0.0.1:28006 [Candidate] entering Candidate state in term 2
2019/12/06 06:34:40 [INFO]  raft: Election won. Tally: 1
2019/12/06 06:34:40 [INFO]  raft: Node at 127.0.0.1:28006 [Leader] entering Leader state
TestCommand - 2019/12/06 06:34:40.598748 [INFO] consul: cluster leadership acquired
TestCommand - 2019/12/06 06:34:40.599499 [INFO] consul: New leader elected: Node 22af8617-3974-7067-a8a6-b6dcb96f53c3
TestCommand - 2019/12/06 06:34:40.924620 [INFO] agent: Synced node info
TestCommand - 2019/12/06 06:34:41.773615 [DEBUG] http: Request POST /v1/connect/intentions (914.432251ms) from=127.0.0.1:58038
TestCommand - 2019/12/06 06:34:41.801392 [DEBUG] http: Request GET /v1/connect/intentions/check?destination=db&source=foo&source-type=consul (1.468034ms) from=127.0.0.1:58040
TestCommand - 2019/12/06 06:34:41.811054 [DEBUG] http: Request GET /v1/connect/intentions/check?destination=db&source=web&source-type=consul (1.604704ms) from=127.0.0.1:58042
TestCommand - 2019/12/06 06:34:41.813140 [INFO] agent: Requesting shutdown
TestCommand - 2019/12/06 06:34:41.813235 [INFO] consul: shutting down server
TestCommand - 2019/12/06 06:34:41.813351 [WARN] serf: Shutdown without a Leave
TestCommand - 2019/12/06 06:34:42.149755 [WARN] serf: Shutdown without a Leave
TestCommand - 2019/12/06 06:34:42.281762 [INFO] manager: shutting down
TestCommand - 2019/12/06 06:34:42.331771 [ERR] consul: failed to establish leadership: leadership lost while committing log
TestCommand - 2019/12/06 06:34:42.332022 [INFO] agent: consul server down
TestCommand - 2019/12/06 06:34:42.332078 [INFO] agent: shutdown complete
TestCommand - 2019/12/06 06:34:42.332138 [INFO] agent: Stopping DNS server 127.0.0.1:28001 (tcp)
TestCommand - 2019/12/06 06:34:42.332285 [INFO] agent: Stopping DNS server 127.0.0.1:28001 (udp)
TestCommand - 2019/12/06 06:34:42.332447 [INFO] agent: Stopping HTTP server 127.0.0.1:28002 (tcp)
TestCommand - 2019/12/06 06:34:42.333414 [INFO] agent: Waiting for endpoints to shut down
TestCommand - 2019/12/06 06:34:42.333510 [INFO] agent: Endpoints down
--- PASS: TestCommand (3.50s)
PASS
ok  	github.com/hashicorp/consul/command/intention/check	3.834s
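The intention check tests above issue GET /v1/connect/intentions/check with source, destination and source-type query parameters, as seen in the request lines. A small sketch of the same query, with the agent address assumed and the parameter values taken from the log, could be:

// intention_check_sketch.go - hypothetical sketch of the check query exercised above
// (GET /v1/connect/intentions/check?source=...&destination=...&source-type=consul).
package main

import (
	"encoding/json"
	"fmt"
	"net/http"
	"net/url"
)

func main() {
	// Query parameters mirror those in the logged requests.
	q := url.Values{}
	q.Set("source", "web")
	q.Set("destination", "db")
	q.Set("source-type", "consul")

	resp, err := http.Get("http://127.0.0.1:8500/v1/connect/intentions/check?" + q.Encode())
	if err != nil {
		panic(err)
	}
	defer resp.Body.Close()

	// Decode the response generically; the exact field set is not shown in the log.
	var out map[string]interface{}
	if err := json.NewDecoder(resp.Body).Decode(&out); err != nil {
		panic(err)
	}
	fmt.Println("check result:", out)
}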
=== RUN   TestCommand_noTabs
=== PAUSE TestCommand_noTabs
=== RUN   TestCommand_Validation
=== PAUSE TestCommand_Validation
=== RUN   TestCommand
=== PAUSE TestCommand
=== RUN   TestCommand_deny
=== PAUSE TestCommand_deny
=== RUN   TestCommand_meta
=== PAUSE TestCommand_meta
=== RUN   TestCommand_File
=== PAUSE TestCommand_File
=== RUN   TestCommand_FileNoExist
=== PAUSE TestCommand_FileNoExist
=== RUN   TestCommand_replace
=== PAUSE TestCommand_replace
=== CONT  TestCommand_noTabs
=== CONT  TestCommand_File
=== CONT  TestCommand_meta
=== CONT  TestCommand_replace
--- PASS: TestCommand_noTabs (0.00s)
=== CONT  TestCommand_FileNoExist
WARNING: bootstrap = true: do not enable unless necessary
TestCommand_FileNoExist - 2019/12/06 06:34:50.696955 [WARN] agent: Node name "Node 42d4db25-e5ff-0aee-5c88-da34271c899a" will not be discoverable via DNS due to invalid characters. Valid characters include all alpha-numerics and dashes.
WARNING: bootstrap = true: do not enable unless necessary
TestCommand_File - 2019/12/06 06:34:50.703851 [WARN] agent: Node name "Node 00f6826c-15f1-7b77-0062-7e5000331b8b" will not be discoverable via DNS due to invalid characters. Valid characters include all alpha-numerics and dashes.
TestCommand_File - 2019/12/06 06:34:50.704343 [DEBUG] tlsutil: Update with version 1
TestCommand_FileNoExist - 2019/12/06 06:34:50.709665 [DEBUG] tlsutil: Update with version 1
TestCommand_FileNoExist - 2019/12/06 06:34:50.713303 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
TestCommand_File - 2019/12/06 06:34:50.718839 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
WARNING: bootstrap = true: do not enable unless necessary
TestCommand_replace - 2019/12/06 06:34:50.720808 [WARN] agent: Node name "Node c0428e52-887d-ab5d-9af2-98c3c740ef1f" will not be discoverable via DNS due to invalid characters. Valid characters include all alpha-numerics and dashes.
TestCommand_replace - 2019/12/06 06:34:50.721191 [DEBUG] tlsutil: Update with version 1
TestCommand_replace - 2019/12/06 06:34:50.731509 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
WARNING: bootstrap = true: do not enable unless necessary
TestCommand_meta - 2019/12/06 06:34:50.774449 [WARN] agent: Node name "Node ba8b1201-4d77-73f1-4160-cdf4144b344b" will not be discoverable via DNS due to invalid characters. Valid characters include all alpha-numerics and dashes.
TestCommand_meta - 2019/12/06 06:34:50.775111 [DEBUG] tlsutil: Update with version 1
TestCommand_meta - 2019/12/06 06:34:50.780133 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
2019/12/06 06:34:51 [INFO]  raft: Initial configuration (index=1): [{Suffrage:Voter ID:42d4db25-e5ff-0aee-5c88-da34271c899a Address:127.0.0.1:17524}]
2019/12/06 06:34:51 [INFO]  raft: Node at 127.0.0.1:17524 [Follower] entering Follower state (Leader: "")
TestCommand_FileNoExist - 2019/12/06 06:34:51.645489 [INFO] serf: EventMemberJoin: Node 42d4db25-e5ff-0aee-5c88-da34271c899a.dc1 127.0.0.1
TestCommand_FileNoExist - 2019/12/06 06:34:51.649939 [INFO] serf: EventMemberJoin: Node 42d4db25-e5ff-0aee-5c88-da34271c899a 127.0.0.1
TestCommand_FileNoExist - 2019/12/06 06:34:51.651139 [INFO] consul: Handled member-join event for server "Node 42d4db25-e5ff-0aee-5c88-da34271c899a.dc1" in area "wan"
TestCommand_FileNoExist - 2019/12/06 06:34:51.651495 [INFO] consul: Adding LAN server Node 42d4db25-e5ff-0aee-5c88-da34271c899a (Addr: tcp/127.0.0.1:17524) (DC: dc1)
TestCommand_FileNoExist - 2019/12/06 06:34:51.652080 [INFO] agent: Started DNS server 127.0.0.1:17519 (tcp)
TestCommand_FileNoExist - 2019/12/06 06:34:51.652622 [INFO] agent: Started DNS server 127.0.0.1:17519 (udp)
TestCommand_FileNoExist - 2019/12/06 06:34:51.657581 [INFO] agent: Started HTTP server on 127.0.0.1:17520 (tcp)
TestCommand_FileNoExist - 2019/12/06 06:34:51.657798 [INFO] agent: started state syncer
2019/12/06 06:34:51 [WARN]  raft: Heartbeat timeout from "" reached, starting election
2019/12/06 06:34:51 [INFO]  raft: Node at 127.0.0.1:17524 [Candidate] entering Candidate state in term 2
2019/12/06 06:34:51 [INFO]  raft: Initial configuration (index=1): [{Suffrage:Voter ID:00f6826c-15f1-7b77-0062-7e5000331b8b Address:127.0.0.1:17512}]
2019/12/06 06:34:51 [INFO]  raft: Node at 127.0.0.1:17512 [Follower] entering Follower state (Leader: "")
TestCommand_File - 2019/12/06 06:34:51.937661 [INFO] serf: EventMemberJoin: Node 00f6826c-15f1-7b77-0062-7e5000331b8b.dc1 127.0.0.1
TestCommand_File - 2019/12/06 06:34:51.943231 [INFO] serf: EventMemberJoin: Node 00f6826c-15f1-7b77-0062-7e5000331b8b 127.0.0.1
TestCommand_File - 2019/12/06 06:34:51.945301 [INFO] consul: Adding LAN server Node 00f6826c-15f1-7b77-0062-7e5000331b8b (Addr: tcp/127.0.0.1:17512) (DC: dc1)
TestCommand_File - 2019/12/06 06:34:51.945695 [INFO] consul: Handled member-join event for server "Node 00f6826c-15f1-7b77-0062-7e5000331b8b.dc1" in area "wan"
TestCommand_File - 2019/12/06 06:34:51.947945 [INFO] agent: Started DNS server 127.0.0.1:17507 (tcp)
TestCommand_File - 2019/12/06 06:34:51.948161 [INFO] agent: Started DNS server 127.0.0.1:17507 (udp)
TestCommand_File - 2019/12/06 06:34:51.952923 [INFO] agent: Started HTTP server on 127.0.0.1:17508 (tcp)
TestCommand_File - 2019/12/06 06:34:51.953079 [INFO] agent: started state syncer
2019/12/06 06:34:51 [WARN]  raft: Heartbeat timeout from "" reached, starting election
2019/12/06 06:34:51 [INFO]  raft: Node at 127.0.0.1:17512 [Candidate] entering Candidate state in term 2
2019/12/06 06:34:52 [INFO]  raft: Initial configuration (index=1): [{Suffrage:Voter ID:ba8b1201-4d77-73f1-4160-cdf4144b344b Address:127.0.0.1:17518}]
2019/12/06 06:34:52 [INFO]  raft: Initial configuration (index=1): [{Suffrage:Voter ID:c0428e52-887d-ab5d-9af2-98c3c740ef1f Address:127.0.0.1:17506}]
2019/12/06 06:34:52 [INFO]  raft: Node at 127.0.0.1:17518 [Follower] entering Follower state (Leader: "")
2019/12/06 06:34:52 [INFO]  raft: Node at 127.0.0.1:17506 [Follower] entering Follower state (Leader: "")
TestCommand_replace - 2019/12/06 06:34:52.084310 [INFO] serf: EventMemberJoin: Node c0428e52-887d-ab5d-9af2-98c3c740ef1f.dc1 127.0.0.1
TestCommand_meta - 2019/12/06 06:34:52.085615 [INFO] serf: EventMemberJoin: Node ba8b1201-4d77-73f1-4160-cdf4144b344b.dc1 127.0.0.1
TestCommand_meta - 2019/12/06 06:34:52.094069 [INFO] serf: EventMemberJoin: Node ba8b1201-4d77-73f1-4160-cdf4144b344b 127.0.0.1
TestCommand_meta - 2019/12/06 06:34:52.104037 [INFO] consul: Adding LAN server Node ba8b1201-4d77-73f1-4160-cdf4144b344b (Addr: tcp/127.0.0.1:17518) (DC: dc1)
TestCommand_meta - 2019/12/06 06:34:52.104506 [INFO] consul: Handled member-join event for server "Node ba8b1201-4d77-73f1-4160-cdf4144b344b.dc1" in area "wan"
TestCommand_replace - 2019/12/06 06:34:52.109583 [INFO] serf: EventMemberJoin: Node c0428e52-887d-ab5d-9af2-98c3c740ef1f 127.0.0.1
TestCommand_replace - 2019/12/06 06:34:52.116428 [INFO] consul: Adding LAN server Node c0428e52-887d-ab5d-9af2-98c3c740ef1f (Addr: tcp/127.0.0.1:17506) (DC: dc1)
TestCommand_replace - 2019/12/06 06:34:52.118825 [INFO] consul: Handled member-join event for server "Node c0428e52-887d-ab5d-9af2-98c3c740ef1f.dc1" in area "wan"
TestCommand_replace - 2019/12/06 06:34:52.119429 [INFO] agent: Started DNS server 127.0.0.1:17501 (udp)
TestCommand_replace - 2019/12/06 06:34:52.119834 [INFO] agent: Started DNS server 127.0.0.1:17501 (tcp)
TestCommand_meta - 2019/12/06 06:34:52.120499 [INFO] agent: Started DNS server 127.0.0.1:17513 (tcp)
TestCommand_meta - 2019/12/06 06:34:52.120579 [INFO] agent: Started DNS server 127.0.0.1:17513 (udp)
2019/12/06 06:34:52 [WARN]  raft: Heartbeat timeout from "" reached, starting election
2019/12/06 06:34:52 [INFO]  raft: Node at 127.0.0.1:17506 [Candidate] entering Candidate state in term 2
TestCommand_meta - 2019/12/06 06:34:52.128310 [INFO] agent: Started HTTP server on 127.0.0.1:17514 (tcp)
TestCommand_meta - 2019/12/06 06:34:52.128410 [INFO] agent: started state syncer
TestCommand_replace - 2019/12/06 06:34:52.129885 [INFO] agent: Started HTTP server on 127.0.0.1:17502 (tcp)
TestCommand_replace - 2019/12/06 06:34:52.130013 [INFO] agent: started state syncer
2019/12/06 06:34:52 [WARN]  raft: Heartbeat timeout from "" reached, starting election
2019/12/06 06:34:52 [INFO]  raft: Node at 127.0.0.1:17518 [Candidate] entering Candidate state in term 2
2019/12/06 06:34:52 [INFO]  raft: Election won. Tally: 1
2019/12/06 06:34:52 [INFO]  raft: Node at 127.0.0.1:17524 [Leader] entering Leader state
TestCommand_FileNoExist - 2019/12/06 06:34:52.408369 [INFO] consul: cluster leadership acquired
TestCommand_FileNoExist - 2019/12/06 06:34:52.408927 [INFO] consul: New leader elected: Node 42d4db25-e5ff-0aee-5c88-da34271c899a
TestCommand_FileNoExist - 2019/12/06 06:34:52.422505 [INFO] agent: Requesting shutdown
TestCommand_FileNoExist - 2019/12/06 06:34:52.422635 [INFO] consul: shutting down server
TestCommand_FileNoExist - 2019/12/06 06:34:52.422694 [WARN] serf: Shutdown without a Leave
TestCommand_FileNoExist - 2019/12/06 06:34:52.423107 [ERR] agent: failed to sync remote state: No cluster leader
TestCommand_FileNoExist - 2019/12/06 06:34:52.507615 [WARN] serf: Shutdown without a Leave
TestCommand_FileNoExist - 2019/12/06 06:34:52.590292 [INFO] manager: shutting down
2019/12/06 06:34:52 [INFO]  raft: Election won. Tally: 1
2019/12/06 06:34:52 [INFO]  raft: Node at 127.0.0.1:17512 [Leader] entering Leader state
TestCommand_File - 2019/12/06 06:34:52.593411 [INFO] consul: cluster leadership acquired
TestCommand_File - 2019/12/06 06:34:52.593871 [INFO] consul: New leader elected: Node 00f6826c-15f1-7b77-0062-7e5000331b8b
2019/12/06 06:34:52 [INFO]  raft: Election won. Tally: 1
2019/12/06 06:34:52 [INFO]  raft: Node at 127.0.0.1:17506 [Leader] entering Leader state
TestCommand_replace - 2019/12/06 06:34:52.695359 [INFO] consul: cluster leadership acquired
TestCommand_replace - 2019/12/06 06:34:52.695854 [INFO] consul: New leader elected: Node c0428e52-887d-ab5d-9af2-98c3c740ef1f
2019/12/06 06:34:52 [INFO]  raft: Election won. Tally: 1
2019/12/06 06:34:52 [INFO]  raft: Node at 127.0.0.1:17518 [Leader] entering Leader state
TestCommand_FileNoExist - 2019/12/06 06:34:52.781948 [ERR] consul: failed to wait for barrier: leadership lost while committing log
TestCommand_FileNoExist - 2019/12/06 06:34:52.782219 [INFO] agent: consul server down
TestCommand_meta - 2019/12/06 06:34:52.782269 [INFO] consul: cluster leadership acquired
TestCommand_FileNoExist - 2019/12/06 06:34:52.782272 [INFO] agent: shutdown complete
TestCommand_FileNoExist - 2019/12/06 06:34:52.782359 [INFO] agent: Stopping DNS server 127.0.0.1:17519 (tcp)
TestCommand_FileNoExist - 2019/12/06 06:34:52.782529 [INFO] agent: Stopping DNS server 127.0.0.1:17519 (udp)
TestCommand_meta - 2019/12/06 06:34:52.782692 [INFO] consul: New leader elected: Node ba8b1201-4d77-73f1-4160-cdf4144b344b
TestCommand_FileNoExist - 2019/12/06 06:34:52.782719 [INFO] agent: Stopping HTTP server 127.0.0.1:17520 (tcp)
TestCommand_FileNoExist - 2019/12/06 06:34:52.782949 [INFO] agent: Waiting for endpoints to shut down
TestCommand_FileNoExist - 2019/12/06 06:34:52.783033 [INFO] agent: Endpoints down
--- PASS: TestCommand_FileNoExist (2.27s)
=== CONT  TestCommand_deny
WARNING: bootstrap = true: do not enable unless necessary
TestCommand_deny - 2019/12/06 06:34:52.896249 [WARN] agent: Node name "Node a8368153-0a96-eed9-905d-84ed6fb9d122" will not be discoverable via DNS due to invalid characters. Valid characters include all alpha-numerics and dashes.
TestCommand_deny - 2019/12/06 06:34:52.896692 [DEBUG] tlsutil: Update with version 1
TestCommand_deny - 2019/12/06 06:34:52.899575 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
TestCommand_File - 2019/12/06 06:34:52.983898 [INFO] agent: Synced node info
TestCommand_File - 2019/12/06 06:34:52.987362 [DEBUG] http: Request POST /v1/connect/intentions (300.986061ms) from=127.0.0.1:33936
TestCommand_File - 2019/12/06 06:34:52.994604 [DEBUG] http: Request GET /v1/connect/intentions (2.534392ms) from=127.0.0.1:33938
TestCommand_File - 2019/12/06 06:34:52.997147 [INFO] agent: Requesting shutdown
TestCommand_File - 2019/12/06 06:34:52.997250 [INFO] consul: shutting down server
TestCommand_File - 2019/12/06 06:34:52.997297 [WARN] serf: Shutdown without a Leave
TestCommand_File - 2019/12/06 06:34:53.083162 [WARN] serf: Shutdown without a Leave
TestCommand_File - 2019/12/06 06:34:53.173565 [INFO] manager: shutting down
TestCommand_File - 2019/12/06 06:34:53.177414 [INFO] agent: consul server down
TestCommand_File - 2019/12/06 06:34:53.177495 [INFO] agent: shutdown complete
TestCommand_File - 2019/12/06 06:34:53.177550 [INFO] agent: Stopping DNS server 127.0.0.1:17507 (tcp)
TestCommand_File - 2019/12/06 06:34:53.177684 [INFO] agent: Stopping DNS server 127.0.0.1:17507 (udp)
TestCommand_File - 2019/12/06 06:34:53.177833 [INFO] agent: Stopping HTTP server 127.0.0.1:17508 (tcp)
TestCommand_File - 2019/12/06 06:34:53.178538 [INFO] agent: Waiting for endpoints to shut down
TestCommand_File - 2019/12/06 06:34:53.178634 [ERR] autopilot: failed to initialize config: leadership lost while committing log
TestCommand_File - 2019/12/06 06:34:53.179092 [ERR] consul: failed to establish leadership: raft is already shutdown
TestCommand_File - 2019/12/06 06:34:53.179360 [INFO] agent: Endpoints down
--- PASS: TestCommand_File (2.68s)
=== CONT  TestCommand
TestCommand_replace - 2019/12/06 06:34:53.260467 [INFO] agent: Synced node info
TestCommand_replace - 2019/12/06 06:34:53.264112 [DEBUG] agent: Node info in sync
TestCommand_replace - 2019/12/06 06:34:53.270210 [DEBUG] http: Request POST /v1/connect/intentions (232.784794ms) from=127.0.0.1:43418
TestCommand_replace - 2019/12/06 06:34:53.276380 [DEBUG] http: Request GET /v1/connect/intentions (1.104692ms) from=127.0.0.1:43422
TestCommand_meta - 2019/12/06 06:34:53.342652 [DEBUG] http: Request POST /v1/connect/intentions (272.164718ms) from=127.0.0.1:59518
TestCommand_meta - 2019/12/06 06:34:53.348063 [DEBUG] http: Request GET /v1/connect/intentions (1.318697ms) from=127.0.0.1:59524
TestCommand_meta - 2019/12/06 06:34:53.351683 [INFO] agent: Synced node info
TestCommand_meta - 2019/12/06 06:34:53.356498 [INFO] agent: Requesting shutdown
TestCommand_meta - 2019/12/06 06:34:53.356603 [INFO] consul: shutting down server
TestCommand_meta - 2019/12/06 06:34:53.356647 [WARN] serf: Shutdown without a Leave
WARNING: bootstrap = true: do not enable unless necessary
TestCommand - 2019/12/06 06:34:53.373695 [WARN] agent: Node name "Node d9fdf5c7-5a8c-c516-81b7-4c854ab5e151" will not be discoverable via DNS due to invalid characters. Valid characters include all alpha-numerics and dashes.
TestCommand - 2019/12/06 06:34:53.374591 [DEBUG] tlsutil: Update with version 1
TestCommand - 2019/12/06 06:34:53.377260 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
TestCommand_meta - 2019/12/06 06:34:53.449695 [WARN] serf: Shutdown without a Leave
TestCommand_meta - 2019/12/06 06:34:53.529426 [INFO] manager: shutting down
TestCommand_replace - 2019/12/06 06:34:53.869059 [ERR] http: Request POST /v1/connect/intentions, error: duplicate intention found: ALLOW default/foo => default/bar (ID: dc1ade00-fbdb-ff30-7ce2-f0815f38e995, Precedence: 9) from=127.0.0.1:43424
TestCommand_replace - 2019/12/06 06:34:53.870303 [DEBUG] http: Request POST /v1/connect/intentions (584.41371ms) from=127.0.0.1:43424
TestCommand_replace - 2019/12/06 06:34:53.880731 [DEBUG] http: Request GET /v1/connect/intentions (1.754708ms) from=127.0.0.1:43428
TestCommand_meta - 2019/12/06 06:34:54.015433 [INFO] agent: consul server down
TestCommand_meta - 2019/12/06 06:34:54.015540 [INFO] agent: shutdown complete
TestCommand_meta - 2019/12/06 06:34:54.015605 [INFO] agent: Stopping DNS server 127.0.0.1:17513 (tcp)
TestCommand_meta - 2019/12/06 06:34:54.015770 [INFO] agent: Stopping DNS server 127.0.0.1:17513 (udp)
TestCommand_meta - 2019/12/06 06:34:54.015940 [INFO] agent: Stopping HTTP server 127.0.0.1:17514 (tcp)
TestCommand_meta - 2019/12/06 06:34:54.016707 [INFO] agent: Waiting for endpoints to shut down
TestCommand_meta - 2019/12/06 06:34:54.016815 [ERR] consul: failed to establish leadership: leadership lost while committing log
TestCommand_meta - 2019/12/06 06:34:54.017000 [INFO] agent: Endpoints down
--- PASS: TestCommand_meta (3.52s)
=== CONT  TestCommand_Validation
TestCommand_meta - 2019/12/06 06:34:54.021600 [ERR] consul: failed to transfer leadership attempt 0/3: raft is already shutdown
TestCommand_meta - 2019/12/06 06:34:54.021697 [ERR] consul: failed to transfer leadership attempt 1/3: raft is already shutdown
=== RUN   TestCommand_Validation/-allow_and_-deny
--- PASS: TestCommand_Validation (0.01s)
    --- PASS: TestCommand_Validation/-allow_and_-deny (0.00s)
2019/12/06 06:34:54 [INFO]  raft: Initial configuration (index=1): [{Suffrage:Voter ID:a8368153-0a96-eed9-905d-84ed6fb9d122 Address:127.0.0.1:17530}]
2019/12/06 06:34:54 [INFO]  raft: Node at 127.0.0.1:17530 [Follower] entering Follower state (Leader: "")
TestCommand_deny - 2019/12/06 06:34:54.238772 [INFO] serf: EventMemberJoin: Node a8368153-0a96-eed9-905d-84ed6fb9d122.dc1 127.0.0.1
TestCommand_deny - 2019/12/06 06:34:54.268611 [INFO] serf: EventMemberJoin: Node a8368153-0a96-eed9-905d-84ed6fb9d122 127.0.0.1
TestCommand_deny - 2019/12/06 06:34:54.278843 [INFO] consul: Adding LAN server Node a8368153-0a96-eed9-905d-84ed6fb9d122 (Addr: tcp/127.0.0.1:17530) (DC: dc1)
TestCommand_deny - 2019/12/06 06:34:54.283199 [INFO] consul: Handled member-join event for server "Node a8368153-0a96-eed9-905d-84ed6fb9d122.dc1" in area "wan"
TestCommand_deny - 2019/12/06 06:34:54.286272 [INFO] agent: Started DNS server 127.0.0.1:17525 (udp)
TestCommand_deny - 2019/12/06 06:34:54.288273 [INFO] agent: Started DNS server 127.0.0.1:17525 (tcp)
2019/12/06 06:34:54 [WARN]  raft: Heartbeat timeout from "" reached, starting election
2019/12/06 06:34:54 [INFO]  raft: Node at 127.0.0.1:17530 [Candidate] entering Candidate state in term 2
TestCommand_deny - 2019/12/06 06:34:54.299269 [INFO] agent: Started HTTP server on 127.0.0.1:17526 (tcp)
TestCommand_deny - 2019/12/06 06:34:54.299612 [INFO] agent: started state syncer
TestCommand_replace - 2019/12/06 06:34:54.311599 [DEBUG] http: Request PUT /v1/connect/intentions/dc1ade00-fbdb-ff30-7ce2-f0815f38e995 (421.140546ms) from=127.0.0.1:43428
TestCommand_replace - 2019/12/06 06:34:54.315093 [DEBUG] http: Request GET /v1/connect/intentions (1.069358ms) from=127.0.0.1:43422
TestCommand_replace - 2019/12/06 06:34:54.317887 [INFO] agent: Requesting shutdown
TestCommand_replace - 2019/12/06 06:34:54.317974 [INFO] consul: shutting down server
TestCommand_replace - 2019/12/06 06:34:54.318021 [WARN] serf: Shutdown without a Leave
TestCommand_replace - 2019/12/06 06:34:54.490639 [WARN] serf: Shutdown without a Leave
TestCommand_replace - 2019/12/06 06:34:54.599398 [INFO] manager: shutting down
TestCommand_replace - 2019/12/06 06:34:54.599852 [INFO] agent: consul server down
TestCommand_replace - 2019/12/06 06:34:54.599907 [INFO] agent: shutdown complete
TestCommand_replace - 2019/12/06 06:34:54.599960 [INFO] agent: Stopping DNS server 127.0.0.1:17501 (tcp)
TestCommand_replace - 2019/12/06 06:34:54.600099 [INFO] agent: Stopping DNS server 127.0.0.1:17501 (udp)
TestCommand_replace - 2019/12/06 06:34:54.600239 [INFO] agent: Stopping HTTP server 127.0.0.1:17502 (tcp)
TestCommand_replace - 2019/12/06 06:34:54.601080 [INFO] agent: Waiting for endpoints to shut down
TestCommand_replace - 2019/12/06 06:34:54.601204 [INFO] agent: Endpoints down
--- PASS: TestCommand_replace (4.10s)
TestCommand_replace - 2019/12/06 06:34:54.628185 [ERR] consul: failed to establish leadership: error generating CA root certificate: raft is already shutdown
2019/12/06 06:34:54 [INFO]  raft: Initial configuration (index=1): [{Suffrage:Voter ID:d9fdf5c7-5a8c-c516-81b7-4c854ab5e151 Address:127.0.0.1:17536}]
2019/12/06 06:34:54 [INFO]  raft: Node at 127.0.0.1:17536 [Follower] entering Follower state (Leader: "")
TestCommand - 2019/12/06 06:34:54.677769 [INFO] serf: EventMemberJoin: Node d9fdf5c7-5a8c-c516-81b7-4c854ab5e151.dc1 127.0.0.1
TestCommand - 2019/12/06 06:34:54.681516 [INFO] serf: EventMemberJoin: Node d9fdf5c7-5a8c-c516-81b7-4c854ab5e151 127.0.0.1
TestCommand - 2019/12/06 06:34:54.682345 [INFO] consul: Adding LAN server Node d9fdf5c7-5a8c-c516-81b7-4c854ab5e151 (Addr: tcp/127.0.0.1:17536) (DC: dc1)
TestCommand - 2019/12/06 06:34:54.682572 [INFO] consul: Handled member-join event for server "Node d9fdf5c7-5a8c-c516-81b7-4c854ab5e151.dc1" in area "wan"
TestCommand - 2019/12/06 06:34:54.682789 [INFO] agent: Started DNS server 127.0.0.1:17531 (udp)
TestCommand - 2019/12/06 06:34:54.682932 [INFO] agent: Started DNS server 127.0.0.1:17531 (tcp)
TestCommand - 2019/12/06 06:34:54.685558 [INFO] agent: Started HTTP server on 127.0.0.1:17532 (tcp)
TestCommand - 2019/12/06 06:34:54.685678 [INFO] agent: started state syncer
2019/12/06 06:34:54 [WARN]  raft: Heartbeat timeout from "" reached, starting election
2019/12/06 06:34:54 [INFO]  raft: Node at 127.0.0.1:17536 [Candidate] entering Candidate state in term 2
2019/12/06 06:34:54 [INFO]  raft: Election won. Tally: 1
2019/12/06 06:34:54 [INFO]  raft: Node at 127.0.0.1:17530 [Leader] entering Leader state
TestCommand_deny - 2019/12/06 06:34:54.823905 [INFO] consul: cluster leadership acquired
TestCommand_deny - 2019/12/06 06:34:54.824434 [INFO] consul: New leader elected: Node a8368153-0a96-eed9-905d-84ed6fb9d122
2019/12/06 06:34:55 [INFO]  raft: Election won. Tally: 1
2019/12/06 06:34:55 [INFO]  raft: Node at 127.0.0.1:17536 [Leader] entering Leader state
TestCommand - 2019/12/06 06:34:55.257878 [INFO] consul: cluster leadership acquired
TestCommand - 2019/12/06 06:34:55.258339 [INFO] consul: New leader elected: Node d9fdf5c7-5a8c-c516-81b7-4c854ab5e151
TestCommand_deny - 2019/12/06 06:34:55.351199 [DEBUG] http: Request POST /v1/connect/intentions (261.418132ms) from=127.0.0.1:39096
TestCommand_deny - 2019/12/06 06:34:55.351618 [INFO] agent: Synced node info
TestCommand_deny - 2019/12/06 06:34:55.358530 [DEBUG] http: Request GET /v1/connect/intentions (1.097026ms) from=127.0.0.1:39098
TestCommand_deny - 2019/12/06 06:34:55.360886 [INFO] agent: Requesting shutdown
TestCommand_deny - 2019/12/06 06:34:55.361000 [INFO] consul: shutting down server
TestCommand_deny - 2019/12/06 06:34:55.361052 [WARN] serf: Shutdown without a Leave
TestCommand_deny - 2019/12/06 06:34:55.531846 [WARN] serf: Shutdown without a Leave
TestCommand_deny - 2019/12/06 06:34:55.535568 [DEBUG] agent: Node info in sync
TestCommand_deny - 2019/12/06 06:34:55.535683 [DEBUG] agent: Node info in sync
TestCommand_deny - 2019/12/06 06:34:55.607032 [INFO] manager: shutting down
TestCommand_deny - 2019/12/06 06:34:55.607585 [INFO] agent: consul server down
TestCommand_deny - 2019/12/06 06:34:55.607631 [INFO] agent: shutdown complete
TestCommand_deny - 2019/12/06 06:34:55.607676 [INFO] agent: Stopping DNS server 127.0.0.1:17525 (tcp)
TestCommand_deny - 2019/12/06 06:34:55.607797 [INFO] agent: Stopping DNS server 127.0.0.1:17525 (udp)
TestCommand_deny - 2019/12/06 06:34:55.607938 [INFO] agent: Stopping HTTP server 127.0.0.1:17526 (tcp)
TestCommand_deny - 2019/12/06 06:34:55.608517 [ERR] consul: failed to establish leadership: raft is already shutdown
TestCommand_deny - 2019/12/06 06:34:55.608568 [INFO] agent: Waiting for endpoints to shut down
TestCommand_deny - 2019/12/06 06:34:55.608670 [INFO] agent: Endpoints down
--- PASS: TestCommand_deny (2.83s)
TestCommand - 2019/12/06 06:34:55.611896 [DEBUG] http: Request POST /v1/connect/intentions (253.875955ms) from=127.0.0.1:50576
TestCommand - 2019/12/06 06:34:55.612761 [INFO] agent: Synced node info
TestCommand - 2019/12/06 06:34:55.612864 [DEBUG] agent: Node info in sync
TestCommand - 2019/12/06 06:34:55.618516 [DEBUG] http: Request GET /v1/connect/intentions (908.021µs) from=127.0.0.1:50578
TestCommand - 2019/12/06 06:34:55.622355 [INFO] agent: Requesting shutdown
TestCommand - 2019/12/06 06:34:55.622451 [INFO] consul: shutting down server
TestCommand - 2019/12/06 06:34:55.622494 [WARN] serf: Shutdown without a Leave
TestCommand - 2019/12/06 06:34:55.773466 [WARN] serf: Shutdown without a Leave
TestCommand - 2019/12/06 06:34:55.898903 [INFO] manager: shutting down
TestCommand - 2019/12/06 06:34:56.090292 [ERR] consul: failed to establish leadership: leadership lost while committing log
TestCommand - 2019/12/06 06:34:56.090535 [ERR] consul: failed to transfer leadership attempt 0/3: raft is already shutdown
TestCommand - 2019/12/06 06:34:56.090570 [INFO] agent: consul server down
TestCommand - 2019/12/06 06:34:56.090600 [ERR] consul: failed to transfer leadership attempt 1/3: raft is already shutdown
TestCommand - 2019/12/06 06:34:56.090663 [ERR] consul: failed to transfer leadership attempt 2/3: raft is already shutdown
TestCommand - 2019/12/06 06:34:56.090711 [ERR] consul: failed to transfer leadership in 3 attempts
TestCommand - 2019/12/06 06:34:56.090616 [INFO] agent: shutdown complete
TestCommand - 2019/12/06 06:34:56.090857 [INFO] agent: Stopping DNS server 127.0.0.1:17531 (tcp)
TestCommand - 2019/12/06 06:34:56.091013 [INFO] agent: Stopping DNS server 127.0.0.1:17531 (udp)
TestCommand - 2019/12/06 06:34:56.091196 [INFO] agent: Stopping HTTP server 127.0.0.1:17532 (tcp)
TestCommand - 2019/12/06 06:34:56.092066 [INFO] agent: Waiting for endpoints to shut down
TestCommand - 2019/12/06 06:34:56.092197 [INFO] agent: Endpoints down
--- PASS: TestCommand (2.91s)
PASS
ok  	github.com/hashicorp/consul/command/intention/create	6.333s
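The create tests above POST to /v1/connect/intentions. The JSON field names in the sketch below (SourceName, DestinationName, Action) are assumptions based on the Consul intentions API rather than anything shown in the log; the agent address is likewise assumed:

// intention_create_sketch.go - hypothetical sketch of the create call exercised above
// (POST /v1/connect/intentions). Field names and agent address are assumptions.
package main

import (
	"bytes"
	"encoding/json"
	"fmt"
	"net/http"
)

func main() {
	// Assumed request body for creating an allow intention from "foo" to "bar".
	payload := map[string]string{
		"SourceName":      "foo",
		"DestinationName": "bar",
		"Action":          "allow",
	}
	body, err := json.Marshal(payload)
	if err != nil {
		panic(err)
	}

	resp, err := http.Post("http://127.0.0.1:8500/v1/connect/intentions", "application/json", bytes.NewReader(body))
	if err != nil {
		panic(err)
	}
	defer resp.Body.Close()
	fmt.Println("create status:", resp.Status)
}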
=== RUN   TestCommand_noTabs
=== PAUSE TestCommand_noTabs
=== RUN   TestCommand_Validation
=== PAUSE TestCommand_Validation
=== RUN   TestCommand
=== PAUSE TestCommand
=== CONT  TestCommand_noTabs
--- PASS: TestCommand_noTabs (0.01s)
=== CONT  TestCommand
=== CONT  TestCommand_Validation
=== RUN   TestCommand_Validation/0_args
=== RUN   TestCommand_Validation/3_args
--- PASS: TestCommand_Validation (0.01s)
    --- PASS: TestCommand_Validation/0_args (0.00s)
    --- PASS: TestCommand_Validation/3_args (0.00s)
WARNING: bootstrap = true: do not enable unless necessary
TestCommand - 2019/12/06 06:35:25.118500 [WARN] agent: Node name "Node 6e184947-2128-eafd-f2d0-5899984043b6" will not be discoverable via DNS due to invalid characters. Valid characters include all alpha-numerics and dashes.
TestCommand - 2019/12/06 06:35:25.120214 [DEBUG] tlsutil: Update with version 1
TestCommand - 2019/12/06 06:35:25.135734 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
2019/12/06 06:35:26 [INFO]  raft: Initial configuration (index=1): [{Suffrage:Voter ID:6e184947-2128-eafd-f2d0-5899984043b6 Address:127.0.0.1:34006}]
2019/12/06 06:35:26 [INFO]  raft: Node at 127.0.0.1:34006 [Follower] entering Follower state (Leader: "")
TestCommand - 2019/12/06 06:35:26.512056 [INFO] serf: EventMemberJoin: Node 6e184947-2128-eafd-f2d0-5899984043b6.dc1 127.0.0.1
TestCommand - 2019/12/06 06:35:26.516174 [INFO] serf: EventMemberJoin: Node 6e184947-2128-eafd-f2d0-5899984043b6 127.0.0.1
TestCommand - 2019/12/06 06:35:26.518944 [INFO] agent: Started DNS server 127.0.0.1:34001 (udp)
TestCommand - 2019/12/06 06:35:26.520743 [INFO] consul: Adding LAN server Node 6e184947-2128-eafd-f2d0-5899984043b6 (Addr: tcp/127.0.0.1:34006) (DC: dc1)
TestCommand - 2019/12/06 06:35:26.521030 [INFO] consul: Handled member-join event for server "Node 6e184947-2128-eafd-f2d0-5899984043b6.dc1" in area "wan"
TestCommand - 2019/12/06 06:35:26.521650 [INFO] agent: Started DNS server 127.0.0.1:34001 (tcp)
TestCommand - 2019/12/06 06:35:26.524939 [INFO] agent: Started HTTP server on 127.0.0.1:34002 (tcp)
TestCommand - 2019/12/06 06:35:26.525115 [INFO] agent: started state syncer
2019/12/06 06:35:26 [WARN]  raft: Heartbeat timeout from "" reached, starting election
2019/12/06 06:35:26 [INFO]  raft: Node at 127.0.0.1:34006 [Candidate] entering Candidate state in term 2
2019/12/06 06:35:28 [INFO]  raft: Election won. Tally: 1
2019/12/06 06:35:28 [INFO]  raft: Node at 127.0.0.1:34006 [Leader] entering Leader state
TestCommand - 2019/12/06 06:35:28.524442 [INFO] consul: cluster leadership acquired
TestCommand - 2019/12/06 06:35:28.524933 [INFO] consul: New leader elected: Node 6e184947-2128-eafd-f2d0-5899984043b6
TestCommand - 2019/12/06 06:35:29.075640 [INFO] agent: Synced node info
TestCommand - 2019/12/06 06:35:29.879535 [DEBUG] http: Request POST /v1/connect/intentions (1.05906851s) from=127.0.0.1:55978
TestCommand - 2019/12/06 06:35:29.899302 [DEBUG] http: Request GET /v1/connect/intentions (2.471058ms) from=127.0.0.1:55980
TestCommand - 2019/12/06 06:35:30.476733 [DEBUG] http: Request DELETE /v1/connect/intentions/41ddf9fd-8bdc-7102-9bb9-f4292512b7c0 (572.662433ms) from=127.0.0.1:55980
TestCommand - 2019/12/06 06:35:30.483506 [DEBUG] http: Request GET /v1/connect/intentions (1.306698ms) from=127.0.0.1:55978
TestCommand - 2019/12/06 06:35:30.487203 [INFO] agent: Requesting shutdown
TestCommand - 2019/12/06 06:35:30.487314 [INFO] consul: shutting down server
TestCommand - 2019/12/06 06:35:30.487363 [WARN] serf: Shutdown without a Leave
TestCommand - 2019/12/06 06:35:30.740853 [WARN] serf: Shutdown without a Leave
TestCommand - 2019/12/06 06:35:30.885899 [INFO] manager: shutting down
TestCommand - 2019/12/06 06:35:31.025843 [ERR] agent: failed to sync remote state: No cluster leader
TestCommand - 2019/12/06 06:35:31.040826 [ERR] consul: failed to establish leadership: error generating CA root certificate: leadership lost while committing log
TestCommand - 2019/12/06 06:35:31.041153 [INFO] agent: consul server down
TestCommand - 2019/12/06 06:35:31.041214 [INFO] agent: shutdown complete
TestCommand - 2019/12/06 06:35:31.041276 [INFO] agent: Stopping DNS server 127.0.0.1:34001 (tcp)
TestCommand - 2019/12/06 06:35:31.041423 [INFO] agent: Stopping DNS server 127.0.0.1:34001 (udp)
TestCommand - 2019/12/06 06:35:31.041603 [INFO] agent: Stopping HTTP server 127.0.0.1:34002 (tcp)
TestCommand - 2019/12/06 06:35:31.042207 [INFO] agent: Waiting for endpoints to shut down
TestCommand - 2019/12/06 06:35:31.042329 [INFO] agent: Endpoints down
--- PASS: TestCommand (6.02s)
PASS
ok  	github.com/hashicorp/consul/command/intention/delete	7.106s
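The delete test above removes an intention via DELETE /v1/connect/intentions/<id>. A minimal sketch, reusing the UUID from the log purely as a placeholder and assuming an agent on 127.0.0.1:8500:

// intention_delete_sketch.go - hypothetical sketch of the delete call exercised above
// (DELETE /v1/connect/intentions/<id>). The ID is a placeholder taken from the log.
package main

import (
	"fmt"
	"net/http"
)

func main() {
	id := "41ddf9fd-8bdc-7102-9bb9-f4292512b7c0"
	req, err := http.NewRequest(http.MethodDelete, "http://127.0.0.1:8500/v1/connect/intentions/"+id, nil)
	if err != nil {
		panic(err)
	}
	resp, err := http.DefaultClient.Do(req)
	if err != nil {
		panic(err)
	}
	defer resp.Body.Close()
	fmt.Println("delete status:", resp.Status)
}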
=== RUN   TestFinder
=== PAUSE TestFinder
=== CONT  TestFinder
WARNING: bootstrap = true: do not enable unless necessary
TestFinder - 2019/12/06 06:35:35.335017 [WARN] agent: Node name "Node 75f24088-9829-dec4-6ad7-2a89f1373e21" will not be discoverable via DNS due to invalid characters. Valid characters include all alpha-numerics and dashes.
TestFinder - 2019/12/06 06:35:35.336502 [DEBUG] tlsutil: Update with version 1
TestFinder - 2019/12/06 06:35:35.343629 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
2019/12/06 06:35:36 [INFO]  raft: Initial configuration (index=1): [{Suffrage:Voter ID:75f24088-9829-dec4-6ad7-2a89f1373e21 Address:127.0.0.1:19006}]
2019/12/06 06:35:36 [INFO]  raft: Node at 127.0.0.1:19006 [Follower] entering Follower state (Leader: "")
TestFinder - 2019/12/06 06:35:36.782948 [INFO] serf: EventMemberJoin: Node 75f24088-9829-dec4-6ad7-2a89f1373e21.dc1 127.0.0.1
TestFinder - 2019/12/06 06:35:36.788341 [INFO] serf: EventMemberJoin: Node 75f24088-9829-dec4-6ad7-2a89f1373e21 127.0.0.1
TestFinder - 2019/12/06 06:35:36.791386 [INFO] agent: Started DNS server 127.0.0.1:19001 (udp)
TestFinder - 2019/12/06 06:35:36.792497 [INFO] consul: Handled member-join event for server "Node 75f24088-9829-dec4-6ad7-2a89f1373e21.dc1" in area "wan"
TestFinder - 2019/12/06 06:35:36.795099 [INFO] consul: Adding LAN server Node 75f24088-9829-dec4-6ad7-2a89f1373e21 (Addr: tcp/127.0.0.1:19006) (DC: dc1)
TestFinder - 2019/12/06 06:35:36.795607 [INFO] agent: Started DNS server 127.0.0.1:19001 (tcp)
TestFinder - 2019/12/06 06:35:36.798295 [INFO] agent: Started HTTP server on 127.0.0.1:19002 (tcp)
TestFinder - 2019/12/06 06:35:36.798447 [INFO] agent: started state syncer
2019/12/06 06:35:36 [WARN]  raft: Heartbeat timeout from "" reached, starting election
2019/12/06 06:35:36 [INFO]  raft: Node at 127.0.0.1:19006 [Candidate] entering Candidate state in term 2
2019/12/06 06:35:39 [INFO]  raft: Election won. Tally: 1
2019/12/06 06:35:39 [INFO]  raft: Node at 127.0.0.1:19006 [Leader] entering Leader state
TestFinder - 2019/12/06 06:35:39.749741 [INFO] consul: cluster leadership acquired
TestFinder - 2019/12/06 06:35:39.750378 [INFO] consul: New leader elected: Node 75f24088-9829-dec4-6ad7-2a89f1373e21
TestFinder - 2019/12/06 06:35:40.275236 [INFO] agent: Synced node info
TestFinder - 2019/12/06 06:35:40.277635 [DEBUG] http: Request POST /v1/connect/intentions (316.235085ms) from=127.0.0.1:53064
TestFinder - 2019/12/06 06:35:40.285118 [DEBUG] http: Request GET /v1/connect/intentions (4.802446ms) from=127.0.0.1:53064
TestFinder - 2019/12/06 06:35:40.295012 [INFO] agent: Requesting shutdown
TestFinder - 2019/12/06 06:35:40.295097 [INFO] consul: shutting down server
TestFinder - 2019/12/06 06:35:40.295145 [WARN] serf: Shutdown without a Leave
TestFinder - 2019/12/06 06:35:40.482602 [WARN] serf: Shutdown without a Leave
TestFinder - 2019/12/06 06:35:40.657932 [INFO] manager: shutting down
TestFinder - 2019/12/06 06:35:40.907868 [INFO] agent: consul server down
TestFinder - 2019/12/06 06:35:40.907940 [INFO] agent: shutdown complete
TestFinder - 2019/12/06 06:35:40.908002 [INFO] agent: Stopping DNS server 127.0.0.1:19001 (tcp)
TestFinder - 2019/12/06 06:35:40.908149 [INFO] agent: Stopping DNS server 127.0.0.1:19001 (udp)
TestFinder - 2019/12/06 06:35:40.908303 [INFO] agent: Stopping HTTP server 127.0.0.1:19002 (tcp)
TestFinder - 2019/12/06 06:35:40.908828 [INFO] agent: Waiting for endpoints to shut down
TestFinder - 2019/12/06 06:35:40.908925 [ERR] consul: failed to establish leadership: leadership lost while committing log
TestFinder - 2019/12/06 06:35:40.909057 [INFO] agent: Endpoints down
--- PASS: TestFinder (5.64s)
TestFinder - 2019/12/06 06:35:40.909223 [ERR] consul: failed to transfer leadership attempt 0/3: raft is already shutdown
PASS
ok  	github.com/hashicorp/consul/command/intention/finder	5.961s
=== RUN   TestCommand_noTabs
=== PAUSE TestCommand_noTabs
=== RUN   TestCommand_Validation
=== PAUSE TestCommand_Validation
=== RUN   TestCommand_id
=== PAUSE TestCommand_id
=== RUN   TestCommand_srcDst
=== PAUSE TestCommand_srcDst
=== CONT  TestCommand_noTabs
=== CONT  TestCommand_Validation
=== CONT  TestCommand_id
=== CONT  TestCommand_srcDst
=== RUN   TestCommand_Validation/0_args
=== RUN   TestCommand_Validation/3_args
--- PASS: TestCommand_noTabs (0.01s)
--- PASS: TestCommand_Validation (0.01s)
    --- PASS: TestCommand_Validation/0_args (0.00s)
    --- PASS: TestCommand_Validation/3_args (0.00s)
WARNING: bootstrap = true: do not enable unless necessary
TestCommand_srcDst - 2019/12/06 06:35:51.081631 [WARN] agent: Node name "Node 79ad462a-d0b8-0cf6-83db-c4a6afc88f29" will not be discoverable via DNS due to invalid characters. Valid characters include all alpha-numerics and dashes.
TestCommand_srcDst - 2019/12/06 06:35:51.082433 [DEBUG] tlsutil: Update with version 1
WARNING: bootstrap = true: do not enable unless necessary
TestCommand_id - 2019/12/06 06:35:51.091502 [WARN] agent: Node name "Node 5088713b-e215-9a0b-834d-1b4a25ecf4ff" will not be discoverable via DNS due to invalid characters. Valid characters include all alpha-numerics and dashes.
TestCommand_id - 2019/12/06 06:35:51.091937 [DEBUG] tlsutil: Update with version 1
TestCommand_srcDst - 2019/12/06 06:35:51.098383 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
TestCommand_id - 2019/12/06 06:35:51.105304 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
2019/12/06 06:35:52 [INFO]  raft: Initial configuration (index=1): [{Suffrage:Voter ID:5088713b-e215-9a0b-834d-1b4a25ecf4ff Address:127.0.0.1:35506}]
2019/12/06 06:35:52 [INFO]  raft: Node at 127.0.0.1:35506 [Follower] entering Follower state (Leader: "")
TestCommand_id - 2019/12/06 06:35:52.062411 [INFO] serf: EventMemberJoin: Node 5088713b-e215-9a0b-834d-1b4a25ecf4ff.dc1 127.0.0.1
2019/12/06 06:35:52 [INFO]  raft: Initial configuration (index=1): [{Suffrage:Voter ID:79ad462a-d0b8-0cf6-83db-c4a6afc88f29 Address:127.0.0.1:35512}]
2019/12/06 06:35:52 [INFO]  raft: Node at 127.0.0.1:35512 [Follower] entering Follower state (Leader: "")
TestCommand_id - 2019/12/06 06:35:52.067163 [INFO] serf: EventMemberJoin: Node 5088713b-e215-9a0b-834d-1b4a25ecf4ff 127.0.0.1
TestCommand_id - 2019/12/06 06:35:52.068202 [INFO] consul: Adding LAN server Node 5088713b-e215-9a0b-834d-1b4a25ecf4ff (Addr: tcp/127.0.0.1:35506) (DC: dc1)
TestCommand_id - 2019/12/06 06:35:52.068490 [INFO] consul: Handled member-join event for server "Node 5088713b-e215-9a0b-834d-1b4a25ecf4ff.dc1" in area "wan"
TestCommand_srcDst - 2019/12/06 06:35:52.068977 [INFO] serf: EventMemberJoin: Node 79ad462a-d0b8-0cf6-83db-c4a6afc88f29.dc1 127.0.0.1
TestCommand_id - 2019/12/06 06:35:52.069329 [INFO] agent: Started DNS server 127.0.0.1:35501 (tcp)
TestCommand_id - 2019/12/06 06:35:52.069674 [INFO] agent: Started DNS server 127.0.0.1:35501 (udp)
TestCommand_id - 2019/12/06 06:35:52.078805 [INFO] agent: Started HTTP server on 127.0.0.1:35502 (tcp)
TestCommand_id - 2019/12/06 06:35:52.078976 [INFO] agent: started state syncer
TestCommand_srcDst - 2019/12/06 06:35:52.080572 [INFO] serf: EventMemberJoin: Node 79ad462a-d0b8-0cf6-83db-c4a6afc88f29 127.0.0.1
TestCommand_srcDst - 2019/12/06 06:35:52.084488 [INFO] agent: Started DNS server 127.0.0.1:35507 (udp)
TestCommand_srcDst - 2019/12/06 06:35:52.085021 [INFO] consul: Adding LAN server Node 79ad462a-d0b8-0cf6-83db-c4a6afc88f29 (Addr: tcp/127.0.0.1:35512) (DC: dc1)
TestCommand_srcDst - 2019/12/06 06:35:52.089776 [INFO] consul: Handled member-join event for server "Node 79ad462a-d0b8-0cf6-83db-c4a6afc88f29.dc1" in area "wan"
TestCommand_srcDst - 2019/12/06 06:35:52.098008 [INFO] agent: Started DNS server 127.0.0.1:35507 (tcp)
TestCommand_srcDst - 2019/12/06 06:35:52.106593 [INFO] agent: Started HTTP server on 127.0.0.1:35508 (tcp)
TestCommand_srcDst - 2019/12/06 06:35:52.106747 [INFO] agent: started state syncer
2019/12/06 06:35:52 [WARN]  raft: Heartbeat timeout from "" reached, starting election
2019/12/06 06:35:52 [INFO]  raft: Node at 127.0.0.1:35506 [Candidate] entering Candidate state in term 2
2019/12/06 06:35:52 [WARN]  raft: Heartbeat timeout from "" reached, starting election
2019/12/06 06:35:52 [INFO]  raft: Node at 127.0.0.1:35512 [Candidate] entering Candidate state in term 2
2019/12/06 06:35:53 [INFO]  raft: Election won. Tally: 1
2019/12/06 06:35:53 [INFO]  raft: Node at 127.0.0.1:35512 [Leader] entering Leader state
2019/12/06 06:35:53 [INFO]  raft: Election won. Tally: 1
2019/12/06 06:35:53 [INFO]  raft: Node at 127.0.0.1:35506 [Leader] entering Leader state
TestCommand_srcDst - 2019/12/06 06:35:53.016716 [INFO] consul: cluster leadership acquired
TestCommand_srcDst - 2019/12/06 06:35:53.017254 [INFO] consul: New leader elected: Node 79ad462a-d0b8-0cf6-83db-c4a6afc88f29
TestCommand_id - 2019/12/06 06:35:53.017602 [INFO] consul: cluster leadership acquired
TestCommand_id - 2019/12/06 06:35:53.017979 [INFO] consul: New leader elected: Node 5088713b-e215-9a0b-834d-1b4a25ecf4ff
TestCommand_id - 2019/12/06 06:35:53.411087 [DEBUG] http: Request POST /v1/connect/intentions (208.097881ms) from=127.0.0.1:55472
TestCommand_id - 2019/12/06 06:35:53.414806 [INFO] agent: Synced node info
TestCommand_id - 2019/12/06 06:35:53.414943 [DEBUG] agent: Node info in sync
TestCommand_id - 2019/12/06 06:35:53.479542 [DEBUG] http: Request GET /v1/connect/intentions/a2ebf76e-b679-299d-960d-8a398528bf0b (14.305669ms) from=127.0.0.1:55474
TestCommand_id - 2019/12/06 06:35:53.491299 [INFO] agent: Requesting shutdown
TestCommand_id - 2019/12/06 06:35:53.491410 [INFO] consul: shutting down server
TestCommand_id - 2019/12/06 06:35:53.491461 [WARN] serf: Shutdown without a Leave
TestCommand_id - 2019/12/06 06:35:53.632932 [WARN] serf: Shutdown without a Leave
TestCommand_srcDst - 2019/12/06 06:35:53.634818 [INFO] agent: Synced node info
TestCommand_srcDst - 2019/12/06 06:35:53.634932 [DEBUG] agent: Node info in sync
TestCommand_srcDst - 2019/12/06 06:35:53.636225 [DEBUG] http: Request POST /v1/connect/intentions (489.903825ms) from=127.0.0.1:44362
TestCommand_srcDst - 2019/12/06 06:35:53.642806 [DEBUG] http: Request GET /v1/connect/intentions (1.223696ms) from=127.0.0.1:44368
TestCommand_srcDst - 2019/12/06 06:35:53.647465 [DEBUG] http: Request GET /v1/connect/intentions/b3001579-1da0-01b4-3249-f4785840c67d (787.352µs) from=127.0.0.1:44368
TestCommand_srcDst - 2019/12/06 06:35:53.650042 [INFO] agent: Requesting shutdown
TestCommand_srcDst - 2019/12/06 06:35:53.650307 [INFO] consul: shutting down server
TestCommand_srcDst - 2019/12/06 06:35:53.650471 [WARN] serf: Shutdown without a Leave
TestCommand_id - 2019/12/06 06:35:53.725619 [INFO] manager: shutting down
TestCommand_srcDst - 2019/12/06 06:35:53.726150 [WARN] serf: Shutdown without a Leave
TestCommand_srcDst - 2019/12/06 06:35:54.033376 [INFO] manager: shutting down
TestCommand_id - 2019/12/06 06:35:54.109238 [ERR] consul: failed to establish leadership: leadership lost while committing log
TestCommand_id - 2019/12/06 06:35:54.109490 [ERR] consul: failed to transfer leadership attempt 0/3: raft is already shutdown
TestCommand_id - 2019/12/06 06:35:54.109549 [INFO] agent: consul server down
TestCommand_id - 2019/12/06 06:35:54.109598 [INFO] agent: shutdown complete
TestCommand_id - 2019/12/06 06:35:54.109674 [INFO] agent: Stopping DNS server 127.0.0.1:35501 (tcp)
TestCommand_id - 2019/12/06 06:35:54.109847 [INFO] agent: Stopping DNS server 127.0.0.1:35501 (udp)
TestCommand_id - 2019/12/06 06:35:54.110029 [INFO] agent: Stopping HTTP server 127.0.0.1:35502 (tcp)
TestCommand_id - 2019/12/06 06:35:54.109568 [ERR] consul: failed to transfer leadership attempt 1/3: raft is already shutdown
TestCommand_id - 2019/12/06 06:35:54.110806 [INFO] agent: Waiting for endpoints to shut down
TestCommand_id - 2019/12/06 06:35:54.110936 [INFO] agent: Endpoints down
--- PASS: TestCommand_id (3.19s)
TestCommand_srcDst - 2019/12/06 06:35:54.182860 [ERR] consul: failed to establish leadership: leadership lost while committing log
TestCommand_srcDst - 2019/12/06 06:35:54.183177 [INFO] agent: consul server down
TestCommand_srcDst - 2019/12/06 06:35:54.183229 [INFO] agent: shutdown complete
TestCommand_srcDst - 2019/12/06 06:35:54.183285 [INFO] agent: Stopping DNS server 127.0.0.1:35507 (tcp)
TestCommand_srcDst - 2019/12/06 06:35:54.183431 [INFO] agent: Stopping DNS server 127.0.0.1:35507 (udp)
TestCommand_srcDst - 2019/12/06 06:35:54.183588 [INFO] agent: Stopping HTTP server 127.0.0.1:35508 (tcp)
TestCommand_srcDst - 2019/12/06 06:35:54.184849 [INFO] agent: Waiting for endpoints to shut down
TestCommand_srcDst - 2019/12/06 06:35:54.184957 [INFO] agent: Endpoints down
--- PASS: TestCommand_srcDst (3.27s)
PASS
ok  	github.com/hashicorp/consul/command/intention/get	3.803s
=== RUN   TestCommand_noTabs
=== PAUSE TestCommand_noTabs
=== RUN   TestCommand_Validation
=== PAUSE TestCommand_Validation
=== RUN   TestCommand_matchDst
=== PAUSE TestCommand_matchDst
=== RUN   TestCommand_matchSource
=== PAUSE TestCommand_matchSource
=== CONT  TestCommand_noTabs
=== CONT  TestCommand_matchDst
=== CONT  TestCommand_Validation
=== RUN   TestCommand_Validation/0_args
=== RUN   TestCommand_Validation/3_args
=== RUN   TestCommand_Validation/both_source_and_dest
=== CONT  TestCommand_matchSource
--- PASS: TestCommand_noTabs (0.00s)
--- PASS: TestCommand_Validation (0.01s)
    --- PASS: TestCommand_Validation/0_args (0.00s)
    --- PASS: TestCommand_Validation/3_args (0.00s)
    --- PASS: TestCommand_Validation/both_source_and_dest (0.00s)
WARNING: bootstrap = true: do not enable unless necessary
TestCommand_matchSource - 2019/12/06 06:35:56.500652 [WARN] agent: Node name "Node bc29814d-6632-2c45-2ea3-7ca53545ee2a" will not be discoverable via DNS due to invalid characters. Valid characters include all alpha-numerics and dashes.
TestCommand_matchSource - 2019/12/06 06:35:56.501559 [DEBUG] tlsutil: Update with version 1
TestCommand_matchSource - 2019/12/06 06:35:56.508273 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
WARNING: bootstrap = true: do not enable unless necessary
TestCommand_matchDst - 2019/12/06 06:35:56.565928 [WARN] agent: Node name "Node 81fc1f6c-3083-0c65-a22d-b6a16bf4d715" will not be discoverable via DNS due to invalid characters. Valid characters include all alpha-numerics and dashes.
TestCommand_matchDst - 2019/12/06 06:35:56.566610 [DEBUG] tlsutil: Update with version 1
TestCommand_matchDst - 2019/12/06 06:35:56.569083 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
2019/12/06 06:35:57 [INFO]  raft: Initial configuration (index=1): [{Suffrage:Voter ID:bc29814d-6632-2c45-2ea3-7ca53545ee2a Address:127.0.0.1:22012}]
2019/12/06 06:35:57 [INFO]  raft: Node at 127.0.0.1:22012 [Follower] entering Follower state (Leader: "")
2019/12/06 06:35:57 [INFO]  raft: Initial configuration (index=1): [{Suffrage:Voter ID:81fc1f6c-3083-0c65-a22d-b6a16bf4d715 Address:127.0.0.1:22006}]
2019/12/06 06:35:57 [INFO]  raft: Node at 127.0.0.1:22006 [Follower] entering Follower state (Leader: "")
TestCommand_matchDst - 2019/12/06 06:35:57.774854 [INFO] serf: EventMemberJoin: Node 81fc1f6c-3083-0c65-a22d-b6a16bf4d715.dc1 127.0.0.1
TestCommand_matchSource - 2019/12/06 06:35:57.778906 [INFO] serf: EventMemberJoin: Node bc29814d-6632-2c45-2ea3-7ca53545ee2a.dc1 127.0.0.1
TestCommand_matchSource - 2019/12/06 06:35:57.788925 [INFO] serf: EventMemberJoin: Node bc29814d-6632-2c45-2ea3-7ca53545ee2a 127.0.0.1
TestCommand_matchSource - 2019/12/06 06:35:57.793398 [INFO] agent: Started DNS server 127.0.0.1:22007 (udp)
TestCommand_matchSource - 2019/12/06 06:35:57.795969 [INFO] agent: Started DNS server 127.0.0.1:22007 (tcp)
TestCommand_matchSource - 2019/12/06 06:35:57.794270 [INFO] consul: Handled member-join event for server "Node bc29814d-6632-2c45-2ea3-7ca53545ee2a.dc1" in area "wan"
TestCommand_matchDst - 2019/12/06 06:35:57.791072 [INFO] serf: EventMemberJoin: Node 81fc1f6c-3083-0c65-a22d-b6a16bf4d715 127.0.0.1
TestCommand_matchDst - 2019/12/06 06:35:57.801040 [INFO] agent: Started DNS server 127.0.0.1:22001 (udp)
TestCommand_matchDst - 2019/12/06 06:35:57.801527 [INFO] consul: Adding LAN server Node 81fc1f6c-3083-0c65-a22d-b6a16bf4d715 (Addr: tcp/127.0.0.1:22006) (DC: dc1)
TestCommand_matchDst - 2019/12/06 06:35:57.801740 [INFO] consul: Handled member-join event for server "Node 81fc1f6c-3083-0c65-a22d-b6a16bf4d715.dc1" in area "wan"
TestCommand_matchDst - 2019/12/06 06:35:57.802277 [INFO] agent: Started DNS server 127.0.0.1:22001 (tcp)
TestCommand_matchDst - 2019/12/06 06:35:57.805875 [INFO] agent: Started HTTP server on 127.0.0.1:22002 (tcp)
TestCommand_matchSource - 2019/12/06 06:35:57.795843 [INFO] consul: Adding LAN server Node bc29814d-6632-2c45-2ea3-7ca53545ee2a (Addr: tcp/127.0.0.1:22012) (DC: dc1)
TestCommand_matchDst - 2019/12/06 06:35:57.806521 [INFO] agent: started state syncer
TestCommand_matchSource - 2019/12/06 06:35:57.808151 [INFO] agent: Started HTTP server on 127.0.0.1:22008 (tcp)
TestCommand_matchSource - 2019/12/06 06:35:57.808720 [INFO] agent: started state syncer
2019/12/06 06:35:57 [WARN]  raft: Heartbeat timeout from "" reached, starting election
2019/12/06 06:35:57 [INFO]  raft: Node at 127.0.0.1:22006 [Candidate] entering Candidate state in term 2
2019/12/06 06:35:57 [WARN]  raft: Heartbeat timeout from "" reached, starting election
2019/12/06 06:35:57 [INFO]  raft: Node at 127.0.0.1:22012 [Candidate] entering Candidate state in term 2
2019/12/06 06:35:58 [INFO]  raft: Election won. Tally: 1
2019/12/06 06:35:58 [INFO]  raft: Node at 127.0.0.1:22012 [Leader] entering Leader state
TestCommand_matchSource - 2019/12/06 06:35:58.341578 [INFO] consul: cluster leadership acquired
TestCommand_matchSource - 2019/12/06 06:35:58.342198 [INFO] consul: New leader elected: Node bc29814d-6632-2c45-2ea3-7ca53545ee2a
2019/12/06 06:35:58 [INFO]  raft: Election won. Tally: 1
2019/12/06 06:35:58 [INFO]  raft: Node at 127.0.0.1:22006 [Leader] entering Leader state
TestCommand_matchDst - 2019/12/06 06:35:58.343200 [INFO] consul: cluster leadership acquired
TestCommand_matchDst - 2019/12/06 06:35:58.343647 [INFO] consul: New leader elected: Node 81fc1f6c-3083-0c65-a22d-b6a16bf4d715
TestCommand_matchDst - 2019/12/06 06:35:58.657590 [INFO] agent: Synced node info
TestCommand_matchDst - 2019/12/06 06:35:58.657728 [DEBUG] agent: Node info in sync
TestCommand_matchDst - 2019/12/06 06:35:58.671024 [DEBUG] http: Request POST /v1/connect/intentions (236.668552ms) from=127.0.0.1:43686
TestCommand_matchSource - 2019/12/06 06:35:58.827443 [DEBUG] http: Request POST /v1/connect/intentions (286.698059ms) from=127.0.0.1:52282
TestCommand_matchSource - 2019/12/06 06:35:58.828620 [INFO] agent: Synced node info
TestCommand_matchDst - 2019/12/06 06:35:59.186583 [DEBUG] http: Request POST /v1/connect/intentions (508.455927ms) from=127.0.0.1:43686
TestCommand_matchSource - 2019/12/06 06:35:59.543315 [DEBUG] http: Request POST /v1/connect/intentions (710.541668ms) from=127.0.0.1:52282
TestCommand_matchDst - 2019/12/06 06:35:59.752359 [DEBUG] http: Request POST /v1/connect/intentions (561.500838ms) from=127.0.0.1:43686
TestCommand_matchDst - 2019/12/06 06:35:59.833570 [DEBUG] http: Request GET /v1/connect/intentions/match?by=destination&name=db (23.394882ms) from=127.0.0.1:43690
TestCommand_matchDst - 2019/12/06 06:35:59.838637 [INFO] agent: Requesting shutdown
TestCommand_matchDst - 2019/12/06 06:35:59.838744 [INFO] consul: shutting down server
TestCommand_matchDst - 2019/12/06 06:35:59.838792 [WARN] serf: Shutdown without a Leave
TestCommand_matchDst - 2019/12/06 06:35:59.941263 [WARN] serf: Shutdown without a Leave
TestCommand_matchDst - 2019/12/06 06:36:00.007895 [INFO] manager: shutting down
TestCommand_matchDst - 2019/12/06 06:36:00.008733 [INFO] agent: consul server down
TestCommand_matchDst - 2019/12/06 06:36:00.008790 [INFO] agent: shutdown complete
TestCommand_matchDst - 2019/12/06 06:36:00.008846 [INFO] agent: Stopping DNS server 127.0.0.1:22001 (tcp)
TestCommand_matchDst - 2019/12/06 06:36:00.009071 [INFO] agent: Stopping DNS server 127.0.0.1:22001 (udp)
TestCommand_matchDst - 2019/12/06 06:36:00.009522 [INFO] agent: Stopping HTTP server 127.0.0.1:22002 (tcp)
TestCommand_matchSource - 2019/12/06 06:36:00.009677 [DEBUG] http: Request POST /v1/connect/intentions (463.585874ms) from=127.0.0.1:52282
TestCommand_matchDst - 2019/12/06 06:36:00.010545 [INFO] agent: Waiting for endpoints to shut down
TestCommand_matchDst - 2019/12/06 06:36:00.010627 [INFO] agent: Endpoints down
--- PASS: TestCommand_matchDst (3.70s)
TestCommand_matchSource - 2019/12/06 06:36:00.017957 [DEBUG] http: Request GET /v1/connect/intentions/match?by=source&name=foo (1.350698ms) from=127.0.0.1:52286
TestCommand_matchSource - 2019/12/06 06:36:00.020339 [INFO] agent: Requesting shutdown
TestCommand_matchSource - 2019/12/06 06:36:00.020443 [INFO] consul: shutting down server
TestCommand_matchSource - 2019/12/06 06:36:00.020510 [WARN] serf: Shutdown without a Leave
TestCommand_matchDst - 2019/12/06 06:36:00.047101 [ERR] consul: failed to establish leadership: error generating CA root certificate: raft is already shutdown
TestCommand_matchSource - 2019/12/06 06:36:00.191188 [WARN] serf: Shutdown without a Leave
TestCommand_matchSource - 2019/12/06 06:36:00.324630 [INFO] manager: shutting down
TestCommand_matchSource - 2019/12/06 06:36:00.441455 [INFO] agent: consul server down
TestCommand_matchSource - 2019/12/06 06:36:00.441527 [INFO] agent: shutdown complete
TestCommand_matchSource - 2019/12/06 06:36:00.441586 [INFO] agent: Stopping DNS server 127.0.0.1:22007 (tcp)
TestCommand_matchSource - 2019/12/06 06:36:00.441725 [INFO] agent: Stopping DNS server 127.0.0.1:22007 (udp)
TestCommand_matchSource - 2019/12/06 06:36:00.441871 [ERR] consul: failed to establish leadership: error generating CA root certificate: leadership lost while committing log
TestCommand_matchSource - 2019/12/06 06:36:00.442145 [INFO] agent: Stopping HTTP server 127.0.0.1:22008 (tcp)
TestCommand_matchSource - 2019/12/06 06:36:00.442868 [INFO] agent: Waiting for endpoints to shut down
TestCommand_matchSource - 2019/12/06 06:36:00.442959 [INFO] agent: Endpoints down
--- PASS: TestCommand_matchSource (4.12s)
PASS
ok  	github.com/hashicorp/consul/command/intention/match	4.542s
=== RUN   TestJoinCommand_noTabs
=== PAUSE TestJoinCommand_noTabs
=== RUN   TestJoinCommandJoin_lan
=== PAUSE TestJoinCommandJoin_lan
=== RUN   TestJoinCommand_wan
=== PAUSE TestJoinCommand_wan
=== RUN   TestJoinCommand_noAddrs
=== PAUSE TestJoinCommand_noAddrs
=== CONT  TestJoinCommand_noTabs
=== CONT  TestJoinCommand_wan
=== CONT  TestJoinCommand_noAddrs
--- PASS: TestJoinCommand_noAddrs (0.00s)
=== CONT  TestJoinCommandJoin_lan
--- PASS: TestJoinCommand_noTabs (0.00s)
WARNING: bootstrap = true: do not enable unless necessary
TestJoinCommand_wan - 2019/12/06 06:36:26.647087 [WARN] agent: Node name "Node 08ed1afd-fc46-3171-e266-def9417f876d" will not be discoverable via DNS due to invalid characters. Valid characters include all alpha-numerics and dashes.
TestJoinCommand_wan - 2019/12/06 06:36:26.650368 [DEBUG] tlsutil: Update with version 1
TestJoinCommand_wan - 2019/12/06 06:36:26.667822 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
WARNING: bootstrap = true: do not enable unless necessary
TestJoinCommandJoin_lan - 2019/12/06 06:36:26.706961 [WARN] agent: Node name "Node 6860f072-5355-f5ec-7d98-e6b5f55556fc" will not be discoverable via DNS due to invalid characters. Valid characters include all alpha-numerics and dashes.
TestJoinCommandJoin_lan - 2019/12/06 06:36:26.707968 [DEBUG] tlsutil: Update with version 1
TestJoinCommandJoin_lan - 2019/12/06 06:36:26.712594 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
2019/12/06 06:36:28 [INFO]  raft: Initial configuration (index=1): [{Suffrage:Voter ID:08ed1afd-fc46-3171-e266-def9417f876d Address:127.0.0.1:29506}]
2019/12/06 06:36:28 [INFO]  raft: Node at 127.0.0.1:29506 [Follower] entering Follower state (Leader: "")
TestJoinCommand_wan - 2019/12/06 06:36:28.313614 [INFO] serf: EventMemberJoin: Node 08ed1afd-fc46-3171-e266-def9417f876d.dc1 127.0.0.1
TestJoinCommand_wan - 2019/12/06 06:36:28.327089 [INFO] serf: EventMemberJoin: Node 08ed1afd-fc46-3171-e266-def9417f876d 127.0.0.1
TestJoinCommand_wan - 2019/12/06 06:36:28.329856 [INFO] agent: Started DNS server 127.0.0.1:29501 (udp)
TestJoinCommand_wan - 2019/12/06 06:36:28.330867 [INFO] consul: Adding LAN server Node 08ed1afd-fc46-3171-e266-def9417f876d (Addr: tcp/127.0.0.1:29506) (DC: dc1)
TestJoinCommand_wan - 2019/12/06 06:36:28.331388 [INFO] consul: Handled member-join event for server "Node 08ed1afd-fc46-3171-e266-def9417f876d.dc1" in area "wan"
TestJoinCommand_wan - 2019/12/06 06:36:28.331815 [INFO] agent: Started DNS server 127.0.0.1:29501 (tcp)
TestJoinCommand_wan - 2019/12/06 06:36:28.334755 [INFO] agent: Started HTTP server on 127.0.0.1:29502 (tcp)
TestJoinCommand_wan - 2019/12/06 06:36:28.335602 [INFO] agent: started state syncer
2019/12/06 06:36:28 [WARN]  raft: Heartbeat timeout from "" reached, starting election
2019/12/06 06:36:28 [INFO]  raft: Node at 127.0.0.1:29506 [Candidate] entering Candidate state in term 2
2019/12/06 06:36:28 [INFO]  raft: Initial configuration (index=1): [{Suffrage:Voter ID:6860f072-5355-f5ec-7d98-e6b5f55556fc Address:127.0.0.1:29512}]
2019/12/06 06:36:28 [INFO]  raft: Node at 127.0.0.1:29512 [Follower] entering Follower state (Leader: "")
TestJoinCommandJoin_lan - 2019/12/06 06:36:28.413253 [INFO] serf: EventMemberJoin: Node 6860f072-5355-f5ec-7d98-e6b5f55556fc.dc1 127.0.0.1
TestJoinCommandJoin_lan - 2019/12/06 06:36:28.416987 [INFO] serf: EventMemberJoin: Node 6860f072-5355-f5ec-7d98-e6b5f55556fc 127.0.0.1
TestJoinCommandJoin_lan - 2019/12/06 06:36:28.417934 [INFO] consul: Handled member-join event for server "Node 6860f072-5355-f5ec-7d98-e6b5f55556fc.dc1" in area "wan"
TestJoinCommandJoin_lan - 2019/12/06 06:36:28.418253 [INFO] consul: Adding LAN server Node 6860f072-5355-f5ec-7d98-e6b5f55556fc (Addr: tcp/127.0.0.1:29512) (DC: dc1)
TestJoinCommandJoin_lan - 2019/12/06 06:36:28.418664 [INFO] agent: Started DNS server 127.0.0.1:29507 (udp)
TestJoinCommandJoin_lan - 2019/12/06 06:36:28.418879 [INFO] agent: Started DNS server 127.0.0.1:29507 (tcp)
TestJoinCommandJoin_lan - 2019/12/06 06:36:28.421353 [INFO] agent: Started HTTP server on 127.0.0.1:29508 (tcp)
TestJoinCommandJoin_lan - 2019/12/06 06:36:28.421466 [INFO] agent: started state syncer
2019/12/06 06:36:28 [WARN]  raft: Heartbeat timeout from "" reached, starting election
2019/12/06 06:36:28 [INFO]  raft: Node at 127.0.0.1:29512 [Candidate] entering Candidate state in term 2
2019/12/06 06:36:29 [INFO]  raft: Election won. Tally: 1
2019/12/06 06:36:29 [INFO]  raft: Node at 127.0.0.1:29506 [Leader] entering Leader state
TestJoinCommand_wan - 2019/12/06 06:36:29.242412 [INFO] consul: cluster leadership acquired
TestJoinCommand_wan - 2019/12/06 06:36:29.242976 [INFO] consul: New leader elected: Node 08ed1afd-fc46-3171-e266-def9417f876d
2019/12/06 06:36:29 [INFO]  raft: Election won. Tally: 1
2019/12/06 06:36:29 [INFO]  raft: Node at 127.0.0.1:29512 [Leader] entering Leader state
TestJoinCommandJoin_lan - 2019/12/06 06:36:29.317822 [INFO] consul: cluster leadership acquired
TestJoinCommandJoin_lan - 2019/12/06 06:36:29.318556 [INFO] consul: New leader elected: Node 6860f072-5355-f5ec-7d98-e6b5f55556fc
WARNING: bootstrap = true: do not enable unless necessary
TestJoinCommandJoin_lan - 2019/12/06 06:36:29.557199 [WARN] agent: Node name "Node 1b851efd-4b64-2a32-2f4f-5c7738bea518" will not be discoverable via DNS due to invalid characters. Valid characters include all alpha-numerics and dashes.
TestJoinCommandJoin_lan - 2019/12/06 06:36:29.557783 [DEBUG] tlsutil: Update with version 1
TestJoinCommandJoin_lan - 2019/12/06 06:36:29.563646 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
WARNING: bootstrap = true: do not enable unless necessary
TestJoinCommand_wan - 2019/12/06 06:36:29.618663 [WARN] agent: Node name "Node b9060d46-58a1-7995-edd5-2fa742b4b041" will not be discoverable via DNS due to invalid characters. Valid characters include all alpha-numerics and dashes.
TestJoinCommand_wan - 2019/12/06 06:36:29.619130 [DEBUG] tlsutil: Update with version 1
TestJoinCommand_wan - 2019/12/06 06:36:29.627120 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
TestJoinCommandJoin_lan - 2019/12/06 06:36:30.144738 [INFO] agent: Synced node info
TestJoinCommandJoin_lan - 2019/12/06 06:36:30.144844 [DEBUG] agent: Node info in sync
TestJoinCommand_wan - 2019/12/06 06:36:30.617689 [INFO] agent: Synced node info
TestJoinCommand_wan - 2019/12/06 06:36:30.891840 [DEBUG] agent: Node info in sync
TestJoinCommand_wan - 2019/12/06 06:36:30.891965 [DEBUG] agent: Node info in sync
2019/12/06 06:36:30 [INFO]  raft: Initial configuration (index=1): [{Suffrage:Voter ID:b9060d46-58a1-7995-edd5-2fa742b4b041 Address:127.0.0.1:29524}]
2019/12/06 06:36:30 [INFO]  raft: Initial configuration (index=1): [{Suffrage:Voter ID:1b851efd-4b64-2a32-2f4f-5c7738bea518 Address:127.0.0.1:29518}]
2019/12/06 06:36:30 [INFO]  raft: Node at 127.0.0.1:29524 [Follower] entering Follower state (Leader: "")
2019/12/06 06:36:30 [INFO]  raft: Node at 127.0.0.1:29518 [Follower] entering Follower state (Leader: "")
TestJoinCommandJoin_lan - 2019/12/06 06:36:30.965240 [INFO] serf: EventMemberJoin: Node 1b851efd-4b64-2a32-2f4f-5c7738bea518.dc1 127.0.0.1
TestJoinCommand_wan - 2019/12/06 06:36:30.965240 [INFO] serf: EventMemberJoin: Node b9060d46-58a1-7995-edd5-2fa742b4b041.dc1 127.0.0.1
TestJoinCommand_wan - 2019/12/06 06:36:30.971876 [INFO] serf: EventMemberJoin: Node b9060d46-58a1-7995-edd5-2fa742b4b041 127.0.0.1
TestJoinCommand_wan - 2019/12/06 06:36:30.976166 [INFO] consul: Handled member-join event for server "Node b9060d46-58a1-7995-edd5-2fa742b4b041.dc1" in area "wan"
TestJoinCommand_wan - 2019/12/06 06:36:30.976618 [INFO] consul: Adding LAN server Node b9060d46-58a1-7995-edd5-2fa742b4b041 (Addr: tcp/127.0.0.1:29524) (DC: dc1)
TestJoinCommand_wan - 2019/12/06 06:36:30.978627 [INFO] agent: Started DNS server 127.0.0.1:29519 (tcp)
TestJoinCommand_wan - 2019/12/06 06:36:30.979117 [INFO] agent: Started DNS server 127.0.0.1:29519 (udp)
TestJoinCommand_wan - 2019/12/06 06:36:30.981971 [INFO] agent: Started HTTP server on 127.0.0.1:29520 (tcp)
TestJoinCommand_wan - 2019/12/06 06:36:30.982217 [INFO] agent: started state syncer
TestJoinCommandJoin_lan - 2019/12/06 06:36:30.984338 [INFO] serf: EventMemberJoin: Node 1b851efd-4b64-2a32-2f4f-5c7738bea518 127.0.0.1
TestJoinCommandJoin_lan - 2019/12/06 06:36:30.985994 [INFO] agent: Started DNS server 127.0.0.1:29513 (udp)
TestJoinCommandJoin_lan - 2019/12/06 06:36:30.986614 [INFO] consul: Adding LAN server Node 1b851efd-4b64-2a32-2f4f-5c7738bea518 (Addr: tcp/127.0.0.1:29518) (DC: dc1)
TestJoinCommandJoin_lan - 2019/12/06 06:36:30.986866 [INFO] consul: Handled member-join event for server "Node 1b851efd-4b64-2a32-2f4f-5c7738bea518.dc1" in area "wan"
TestJoinCommandJoin_lan - 2019/12/06 06:36:30.987938 [INFO] agent: Started DNS server 127.0.0.1:29513 (tcp)
TestJoinCommandJoin_lan - 2019/12/06 06:36:30.991765 [INFO] agent: Started HTTP server on 127.0.0.1:29514 (tcp)
TestJoinCommandJoin_lan - 2019/12/06 06:36:30.991882 [INFO] agent: started state syncer
2019/12/06 06:36:31 [WARN]  raft: Heartbeat timeout from "" reached, starting election
2019/12/06 06:36:31 [INFO]  raft: Node at 127.0.0.1:29524 [Candidate] entering Candidate state in term 2
2019/12/06 06:36:31 [WARN]  raft: Heartbeat timeout from "" reached, starting election
2019/12/06 06:36:31 [INFO]  raft: Node at 127.0.0.1:29518 [Candidate] entering Candidate state in term 2
TestJoinCommandJoin_lan - 2019/12/06 06:36:31.384318 [INFO] connect: initialized primary datacenter CA with provider "consul"
TestJoinCommandJoin_lan - 2019/12/06 06:36:31.384819 [DEBUG] consul: Skipping self join check for "Node 6860f072-5355-f5ec-7d98-e6b5f55556fc" since the cluster is too small
TestJoinCommandJoin_lan - 2019/12/06 06:36:31.384995 [INFO] consul: member 'Node 6860f072-5355-f5ec-7d98-e6b5f55556fc' joined, marking health alive
TestJoinCommandJoin_lan - 2019/12/06 06:36:31.425348 [DEBUG] agent: Node info in sync
TestJoinCommand_wan - 2019/12/06 06:36:31.901180 [INFO] connect: initialized primary datacenter CA with provider "consul"
TestJoinCommand_wan - 2019/12/06 06:36:31.901972 [DEBUG] consul: Skipping self join check for "Node 08ed1afd-fc46-3171-e266-def9417f876d" since the cluster is too small
TestJoinCommand_wan - 2019/12/06 06:36:31.902241 [INFO] consul: member 'Node 08ed1afd-fc46-3171-e266-def9417f876d' joined, marking health alive
2019/12/06 06:36:31 [INFO]  raft: Election won. Tally: 1
2019/12/06 06:36:31 [INFO]  raft: Node at 127.0.0.1:29518 [Leader] entering Leader state
TestJoinCommandJoin_lan - 2019/12/06 06:36:31.907763 [INFO] consul: cluster leadership acquired
TestJoinCommandJoin_lan - 2019/12/06 06:36:31.908273 [INFO] consul: New leader elected: Node 1b851efd-4b64-2a32-2f4f-5c7738bea518
TestJoinCommandJoin_lan - 2019/12/06 06:36:32.091554 [INFO] agent: (LAN) joining: [127.0.0.1:29516]
TestJoinCommandJoin_lan - 2019/12/06 06:36:32.092868 [DEBUG] memberlist: Stream connection from=127.0.0.1:38726
TestJoinCommandJoin_lan - 2019/12/06 06:36:32.093207 [DEBUG] memberlist: Initiating push/pull sync with: 127.0.0.1:29516
TestJoinCommandJoin_lan - 2019/12/06 06:36:32.128389 [INFO] serf: EventMemberJoin: Node 6860f072-5355-f5ec-7d98-e6b5f55556fc 127.0.0.1
TestJoinCommandJoin_lan - 2019/12/06 06:36:32.129297 [INFO] consul: Adding LAN server Node 6860f072-5355-f5ec-7d98-e6b5f55556fc (Addr: tcp/127.0.0.1:29512) (DC: dc1)
TestJoinCommandJoin_lan - 2019/12/06 06:36:32.131340 [DEBUG] memberlist: Stream connection from=127.0.0.1:36530
TestJoinCommandJoin_lan - 2019/12/06 06:36:32.131708 [DEBUG] memberlist: Initiating push/pull sync with: 127.0.0.1:29511
TestJoinCommandJoin_lan - 2019/12/06 06:36:32.137230 [INFO] serf: EventMemberJoin: Node 6860f072-5355-f5ec-7d98-e6b5f55556fc.dc1 127.0.0.1
TestJoinCommandJoin_lan - 2019/12/06 06:36:32.137869 [DEBUG] consul: Successfully performed flood-join for "Node 6860f072-5355-f5ec-7d98-e6b5f55556fc" at 127.0.0.1:29511
TestJoinCommandJoin_lan - 2019/12/06 06:36:32.138192 [INFO] consul: Handled member-join event for server "Node 6860f072-5355-f5ec-7d98-e6b5f55556fc.dc1" in area "wan"
TestJoinCommandJoin_lan - 2019/12/06 06:36:32.140002 [INFO] serf: EventMemberJoin: Node 1b851efd-4b64-2a32-2f4f-5c7738bea518.dc1 127.0.0.1
TestJoinCommandJoin_lan - 2019/12/06 06:36:32.140900 [INFO] serf: EventMemberJoin: Node 1b851efd-4b64-2a32-2f4f-5c7738bea518 127.0.0.1
TestJoinCommandJoin_lan - 2019/12/06 06:36:32.141065 [INFO] consul: Handled member-join event for server "Node 1b851efd-4b64-2a32-2f4f-5c7738bea518.dc1" in area "wan"
TestJoinCommandJoin_lan - 2019/12/06 06:36:32.141544 [INFO] consul: New leader elected: Node 6860f072-5355-f5ec-7d98-e6b5f55556fc
TestJoinCommandJoin_lan - 2019/12/06 06:36:32.141837 [INFO] consul: Adding LAN server Node 1b851efd-4b64-2a32-2f4f-5c7738bea518 (Addr: tcp/127.0.0.1:29518) (DC: dc1)
TestJoinCommandJoin_lan - 2019/12/06 06:36:32.142379 [ERR] consul: 'Node 1b851efd-4b64-2a32-2f4f-5c7738bea518' and 'Node 6860f072-5355-f5ec-7d98-e6b5f55556fc' are both in bootstrap mode. Only one node should be in bootstrap mode, not adding Raft peer.
TestJoinCommandJoin_lan - 2019/12/06 06:36:32.142540 [INFO] consul: member 'Node 1b851efd-4b64-2a32-2f4f-5c7738bea518' joined, marking health alive
TestJoinCommandJoin_lan - 2019/12/06 06:36:32.146746 [INFO] agent: (LAN) joined: 1
TestJoinCommandJoin_lan - 2019/12/06 06:36:32.146849 [DEBUG] agent: systemd notify failed: No socket
TestJoinCommandJoin_lan - 2019/12/06 06:36:32.146947 [DEBUG] http: Request PUT /v1/agent/join/127.0.0.1:29516 (55.4143ms) from=127.0.0.1:40952
TestJoinCommandJoin_lan - 2019/12/06 06:36:32.147967 [INFO] agent: Requesting shutdown
TestJoinCommandJoin_lan - 2019/12/06 06:36:32.148042 [INFO] consul: shutting down server
TestJoinCommandJoin_lan - 2019/12/06 06:36:32.148097 [WARN] serf: Shutdown without a Leave
TestJoinCommandJoin_lan - 2019/12/06 06:36:32.148464 [ERR] agent: failed to sync remote state: No cluster leader
2019/12/06 06:36:32 [INFO]  raft: Election won. Tally: 1
2019/12/06 06:36:32 [INFO]  raft: Node at 127.0.0.1:29524 [Leader] entering Leader state
TestJoinCommand_wan - 2019/12/06 06:36:32.332496 [INFO] consul: cluster leadership acquired
TestJoinCommand_wan - 2019/12/06 06:36:32.332999 [INFO] consul: New leader elected: Node b9060d46-58a1-7995-edd5-2fa742b4b041
TestJoinCommandJoin_lan - 2019/12/06 06:36:32.471891 [DEBUG] serf: messageJoinType: Node 1b851efd-4b64-2a32-2f4f-5c7738bea518.dc1
TestJoinCommand_wan - 2019/12/06 06:36:32.556275 [INFO] agent: (WAN) joining: [127.0.0.1:29523]
TestJoinCommand_wan - 2019/12/06 06:36:32.557136 [DEBUG] memberlist: Initiating push/pull sync with: 127.0.0.1:29523
TestJoinCommand_wan - 2019/12/06 06:36:32.557190 [DEBUG] memberlist: Stream connection from=127.0.0.1:43942
TestJoinCommand_wan - 2019/12/06 06:36:32.560376 [INFO] serf: EventMemberJoin: Node 08ed1afd-fc46-3171-e266-def9417f876d.dc1 127.0.0.1
TestJoinCommand_wan - 2019/12/06 06:36:32.560810 [INFO] serf: EventMemberJoin: Node b9060d46-58a1-7995-edd5-2fa742b4b041.dc1 127.0.0.1
TestJoinCommand_wan - 2019/12/06 06:36:32.560820 [INFO] consul: Handled member-join event for server "Node 08ed1afd-fc46-3171-e266-def9417f876d.dc1" in area "wan"
TestJoinCommand_wan - 2019/12/06 06:36:32.561261 [INFO] agent: (WAN) joined: 1
TestJoinCommand_wan - 2019/12/06 06:36:32.561342 [DEBUG] http: Request PUT /v1/agent/join/127.0.0.1:29523?wan=1 (5.09012ms) from=127.0.0.1:32948
TestJoinCommand_wan - 2019/12/06 06:36:32.561995 [INFO] agent: Requesting shutdown
TestJoinCommand_wan - 2019/12/06 06:36:32.562072 [INFO] consul: shutting down server
TestJoinCommand_wan - 2019/12/06 06:36:32.562129 [WARN] serf: Shutdown without a Leave
TestJoinCommand_wan - 2019/12/06 06:36:32.561878 [INFO] consul: Handled member-join event for server "Node b9060d46-58a1-7995-edd5-2fa742b4b041.dc1" in area "wan"
TestJoinCommand_wan - 2019/12/06 06:36:32.816989 [DEBUG] serf: messageJoinType: Node 08ed1afd-fc46-3171-e266-def9417f876d.dc1
TestJoinCommandJoin_lan - 2019/12/06 06:36:32.915759 [DEBUG] serf: messageJoinType: Node 1b851efd-4b64-2a32-2f4f-5c7738bea518.dc1
TestJoinCommand_wan - 2019/12/06 06:36:32.968833 [DEBUG] serf: messageJoinType: Node 08ed1afd-fc46-3171-e266-def9417f876d.dc1
TestJoinCommandJoin_lan - 2019/12/06 06:36:32.971599 [DEBUG] serf: messageJoinType: Node 1b851efd-4b64-2a32-2f4f-5c7738bea518.dc1
TestJoinCommand_wan - 2019/12/06 06:36:33.320728 [DEBUG] serf: messageJoinType: Node 08ed1afd-fc46-3171-e266-def9417f876d.dc1
TestJoinCommand_wan - 2019/12/06 06:36:33.321330 [DEBUG] serf: messageJoinType: Node 08ed1afd-fc46-3171-e266-def9417f876d.dc1
TestJoinCommand_wan - 2019/12/06 06:36:33.321454 [DEBUG] serf: messageJoinType: Node 08ed1afd-fc46-3171-e266-def9417f876d.dc1
TestJoinCommandJoin_lan - 2019/12/06 06:36:33.419267 [DEBUG] serf: messageJoinType: Node 1b851efd-4b64-2a32-2f4f-5c7738bea518.dc1
TestJoinCommandJoin_lan - 2019/12/06 06:36:33.419414 [DEBUG] serf: messageJoinType: Node 1b851efd-4b64-2a32-2f4f-5c7738bea518.dc1
TestJoinCommandJoin_lan - 2019/12/06 06:36:33.420908 [DEBUG] serf: messageJoinType: Node 1b851efd-4b64-2a32-2f4f-5c7738bea518.dc1
TestJoinCommand_wan - 2019/12/06 06:36:33.469365 [DEBUG] serf: messageJoinType: Node 08ed1afd-fc46-3171-e266-def9417f876d.dc1
TestJoinCommandJoin_lan - 2019/12/06 06:36:33.472113 [DEBUG] serf: messageJoinType: Node 1b851efd-4b64-2a32-2f4f-5c7738bea518.dc1
TestJoinCommand_wan - 2019/12/06 06:36:33.824737 [DEBUG] serf: messageJoinType: Node 08ed1afd-fc46-3171-e266-def9417f876d.dc1
TestJoinCommandJoin_lan - 2019/12/06 06:36:33.914523 [DEBUG] serf: messageJoinType: Node 1b851efd-4b64-2a32-2f4f-5c7738bea518.dc1
TestJoinCommandJoin_lan - 2019/12/06 06:36:33.922639 [DEBUG] memberlist: Failed ping: Node 1b851efd-4b64-2a32-2f4f-5c7738bea518 (timeout reached)
TestJoinCommandJoin_lan - 2019/12/06 06:36:33.958466 [WARN] serf: Shutdown without a Leave
TestJoinCommand_wan - 2019/12/06 06:36:33.961091 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
TestJoinCommand_wan - 2019/12/06 06:36:33.966659 [WARN] serf: Shutdown without a Leave
TestJoinCommandJoin_lan - 2019/12/06 06:36:34.052750 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
TestJoinCommandJoin_lan - 2019/12/06 06:36:34.053524 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
TestJoinCommandJoin_lan - 2019/12/06 06:36:34.055834 [WARN] consul: error getting server health from "Node 1b851efd-4b64-2a32-2f4f-5c7738bea518": rpc error making call: EOF
TestJoinCommand_wan - 2019/12/06 06:36:34.241803 [INFO] manager: shutting down
TestJoinCommandJoin_lan - 2019/12/06 06:36:34.241803 [INFO] manager: shutting down
TestJoinCommandJoin_lan - 2019/12/06 06:36:34.241905 [WARN] agent: Syncing node info failed. leadership lost while committing log
TestJoinCommandJoin_lan - 2019/12/06 06:36:34.241960 [ERR] agent: failed to sync remote state: leadership lost while committing log
TestJoinCommandJoin_lan - 2019/12/06 06:36:34.241979 [ERR] consul: failed to wait for barrier: leadership lost while committing log
TestJoinCommandJoin_lan - 2019/12/06 06:36:34.242233 [INFO] agent: consul server down
TestJoinCommandJoin_lan - 2019/12/06 06:36:34.242299 [INFO] agent: shutdown complete
TestJoinCommandJoin_lan - 2019/12/06 06:36:34.242362 [INFO] agent: Stopping DNS server 127.0.0.1:29513 (tcp)
TestJoinCommandJoin_lan - 2019/12/06 06:36:34.242532 [INFO] agent: Stopping DNS server 127.0.0.1:29513 (udp)
TestJoinCommandJoin_lan - 2019/12/06 06:36:34.242724 [INFO] agent: Stopping HTTP server 127.0.0.1:29514 (tcp)
TestJoinCommandJoin_lan - 2019/12/06 06:36:34.243008 [INFO] agent: Waiting for endpoints to shut down
TestJoinCommandJoin_lan - 2019/12/06 06:36:34.243098 [INFO] agent: Endpoints down
TestJoinCommandJoin_lan - 2019/12/06 06:36:34.243148 [INFO] agent: Requesting shutdown
TestJoinCommandJoin_lan - 2019/12/06 06:36:34.243204 [INFO] consul: shutting down server
TestJoinCommandJoin_lan - 2019/12/06 06:36:34.243247 [WARN] serf: Shutdown without a Leave
TestJoinCommand_wan - 2019/12/06 06:36:34.341773 [ERR] consul: failed to wait for barrier: leadership lost while committing log
TestJoinCommand_wan - 2019/12/06 06:36:34.341852 [WARN] agent: Syncing node info failed. leadership lost while committing log
TestJoinCommand_wan - 2019/12/06 06:36:34.341921 [ERR] agent: failed to sync remote state: leadership lost while committing log
TestJoinCommand_wan - 2019/12/06 06:36:34.342060 [INFO] agent: consul server down
TestJoinCommand_wan - 2019/12/06 06:36:34.342115 [INFO] agent: shutdown complete
TestJoinCommand_wan - 2019/12/06 06:36:34.342166 [INFO] agent: Stopping DNS server 127.0.0.1:29519 (tcp)
TestJoinCommand_wan - 2019/12/06 06:36:34.342302 [INFO] agent: Stopping DNS server 127.0.0.1:29519 (udp)
TestJoinCommand_wan - 2019/12/06 06:36:34.342451 [INFO] agent: Stopping HTTP server 127.0.0.1:29520 (tcp)
TestJoinCommand_wan - 2019/12/06 06:36:34.342662 [INFO] agent: Waiting for endpoints to shut down
TestJoinCommand_wan - 2019/12/06 06:36:34.342749 [INFO] agent: Endpoints down
TestJoinCommand_wan - 2019/12/06 06:36:34.342791 [INFO] agent: Requesting shutdown
TestJoinCommand_wan - 2019/12/06 06:36:34.342851 [INFO] consul: shutting down server
TestJoinCommand_wan - 2019/12/06 06:36:34.342894 [WARN] serf: Shutdown without a Leave
TestJoinCommandJoin_lan - 2019/12/06 06:36:34.421734 [INFO] memberlist: Suspect Node 1b851efd-4b64-2a32-2f4f-5c7738bea518 has failed, no acks received
TestJoinCommandJoin_lan - 2019/12/06 06:36:34.458401 [WARN] serf: Shutdown without a Leave
TestJoinCommand_wan - 2019/12/06 06:36:34.459420 [WARN] serf: Shutdown without a Leave
TestJoinCommandJoin_lan - 2019/12/06 06:36:34.516839 [INFO] manager: shutting down
TestJoinCommandJoin_lan - 2019/12/06 06:36:34.517726 [INFO] agent: consul server down
TestJoinCommandJoin_lan - 2019/12/06 06:36:34.517793 [INFO] agent: shutdown complete
TestJoinCommandJoin_lan - 2019/12/06 06:36:34.517854 [INFO] agent: Stopping DNS server 127.0.0.1:29507 (tcp)
TestJoinCommandJoin_lan - 2019/12/06 06:36:34.518003 [INFO] agent: Stopping DNS server 127.0.0.1:29507 (udp)
TestJoinCommandJoin_lan - 2019/12/06 06:36:34.518183 [INFO] agent: Stopping HTTP server 127.0.0.1:29508 (tcp)
TestJoinCommandJoin_lan - 2019/12/06 06:36:34.518839 [INFO] agent: Waiting for endpoints to shut down
TestJoinCommandJoin_lan - 2019/12/06 06:36:34.518936 [INFO] agent: Endpoints down
--- PASS: TestJoinCommandJoin_lan (8.04s)
TestJoinCommand_wan - 2019/12/06 06:36:34.567049 [INFO] manager: shutting down
TestJoinCommand_wan - 2019/12/06 06:36:34.568069 [INFO] agent: consul server down
TestJoinCommand_wan - 2019/12/06 06:36:34.568151 [INFO] agent: shutdown complete
TestJoinCommand_wan - 2019/12/06 06:36:34.568211 [INFO] agent: Stopping DNS server 127.0.0.1:29501 (tcp)
TestJoinCommand_wan - 2019/12/06 06:36:34.568359 [INFO] agent: Stopping DNS server 127.0.0.1:29501 (udp)
TestJoinCommand_wan - 2019/12/06 06:36:34.568529 [INFO] agent: Stopping HTTP server 127.0.0.1:29502 (tcp)
TestJoinCommand_wan - 2019/12/06 06:36:34.569055 [INFO] agent: Waiting for endpoints to shut down
TestJoinCommand_wan - 2019/12/06 06:36:34.569147 [INFO] agent: Endpoints down
--- PASS: TestJoinCommand_wan (8.09s)
PASS
ok  	github.com/hashicorp/consul/command/join	8.611s
=== RUN   TestKeygenCommand_noTabs
=== PAUSE TestKeygenCommand_noTabs
=== RUN   TestKeygenCommand
=== PAUSE TestKeygenCommand
=== CONT  TestKeygenCommand_noTabs
--- PASS: TestKeygenCommand_noTabs (0.00s)
=== CONT  TestKeygenCommand
--- PASS: TestKeygenCommand (0.00s)
PASS
ok  	github.com/hashicorp/consul/command/keygen	0.113s
=== RUN   TestKeyringCommand_noTabs
=== PAUSE TestKeyringCommand_noTabs
=== RUN   TestKeyringCommand
=== PAUSE TestKeyringCommand
=== RUN   TestKeyringCommand_help
=== PAUSE TestKeyringCommand_help
=== RUN   TestKeyringCommand_failedConnection
=== PAUSE TestKeyringCommand_failedConnection
=== RUN   TestKeyringCommand_invalidRelayFactor
=== PAUSE TestKeyringCommand_invalidRelayFactor
=== CONT  TestKeyringCommand_noTabs
=== CONT  TestKeyringCommand_failedConnection
--- PASS: TestKeyringCommand_noTabs (0.00s)
=== CONT  TestKeyringCommand_help
=== CONT  TestKeyringCommand_invalidRelayFactor
=== CONT  TestKeyringCommand
--- PASS: TestKeyringCommand_invalidRelayFactor (0.00s)
--- PASS: TestKeyringCommand_failedConnection (0.01s)
--- PASS: TestKeyringCommand_help (0.01s)
WARNING: bootstrap = true: do not enable unless necessary
TestKeyringCommand - 2019/12/06 06:36:40.821366 [WARN] agent: Node name "Node e572a91e-a68d-ffca-de57-c383af65b033" will not be discoverable via DNS due to invalid characters. Valid characters include all alpha-numerics and dashes.
TestKeyringCommand - 2019/12/06 06:36:40.826849 [DEBUG] tlsutil: Update with version 1
TestKeyringCommand - 2019/12/06 06:36:40.835397 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
2019/12/06 06:36:42 [INFO]  raft: Initial configuration (index=1): [{Suffrage:Voter ID:e572a91e-a68d-ffca-de57-c383af65b033 Address:127.0.0.1:14506}]
2019/12/06 06:36:42 [INFO]  raft: Node at 127.0.0.1:14506 [Follower] entering Follower state (Leader: "")
TestKeyringCommand - 2019/12/06 06:36:42.339117 [INFO] serf: EventMemberJoin: Node e572a91e-a68d-ffca-de57-c383af65b033.dc1 127.0.0.1
TestKeyringCommand - 2019/12/06 06:36:42.345454 [INFO] serf: EventMemberJoin: Node e572a91e-a68d-ffca-de57-c383af65b033 127.0.0.1
TestKeyringCommand - 2019/12/06 06:36:42.349415 [INFO] consul: Adding LAN server Node e572a91e-a68d-ffca-de57-c383af65b033 (Addr: tcp/127.0.0.1:14506) (DC: dc1)
TestKeyringCommand - 2019/12/06 06:36:42.350111 [INFO] consul: Handled member-join event for server "Node e572a91e-a68d-ffca-de57-c383af65b033.dc1" in area "wan"
TestKeyringCommand - 2019/12/06 06:36:42.352164 [INFO] agent: Started DNS server 127.0.0.1:14501 (tcp)
TestKeyringCommand - 2019/12/06 06:36:42.352854 [INFO] agent: Started DNS server 127.0.0.1:14501 (udp)
TestKeyringCommand - 2019/12/06 06:36:42.355400 [INFO] agent: Started HTTP server on 127.0.0.1:14502 (tcp)
TestKeyringCommand - 2019/12/06 06:36:42.355583 [INFO] agent: started state syncer
2019/12/06 06:36:42 [WARN]  raft: Heartbeat timeout from "" reached, starting election
2019/12/06 06:36:42 [INFO]  raft: Node at 127.0.0.1:14506 [Candidate] entering Candidate state in term 2
2019/12/06 06:36:42 [INFO]  raft: Election won. Tally: 1
2019/12/06 06:36:42 [INFO]  raft: Node at 127.0.0.1:14506 [Leader] entering Leader state
TestKeyringCommand - 2019/12/06 06:36:42.800673 [INFO] consul: cluster leadership acquired
TestKeyringCommand - 2019/12/06 06:36:42.801260 [INFO] consul: New leader elected: Node e572a91e-a68d-ffca-de57-c383af65b033
TestKeyringCommand - 2019/12/06 06:36:42.935350 [INFO] serf: Received list-keys query
TestKeyringCommand - 2019/12/06 06:36:42.937154 [DEBUG] serf: messageQueryResponseType: Node e572a91e-a68d-ffca-de57-c383af65b033.dc1
TestKeyringCommand - 2019/12/06 06:36:42.937580 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
TestKeyringCommand - 2019/12/06 06:36:42.940829 [INFO] serf: Received list-keys query
TestKeyringCommand - 2019/12/06 06:36:42.942536 [DEBUG] serf: messageQueryResponseType: Node e572a91e-a68d-ffca-de57-c383af65b033
TestKeyringCommand - 2019/12/06 06:36:42.945600 [DEBUG] http: Request GET /v1/operator/keyring (11.068259ms) from=127.0.0.1:40874
TestKeyringCommand - 2019/12/06 06:36:42.963199 [INFO] serf: Received install-key query
TestKeyringCommand - 2019/12/06 06:36:43.148704 [DEBUG] serf: messageQueryResponseType: Node e572a91e-a68d-ffca-de57-c383af65b033.dc1
TestKeyringCommand - 2019/12/06 06:36:43.150519 [INFO] serf: Received install-key query
TestKeyringCommand - 2019/12/06 06:36:43.153255 [DEBUG] serf: messageQueryResponseType: Node e572a91e-a68d-ffca-de57-c383af65b033
TestKeyringCommand - 2019/12/06 06:36:43.154313 [DEBUG] http: Request POST /v1/operator/keyring (192.191508ms) from=127.0.0.1:40876
TestKeyringCommand - 2019/12/06 06:36:43.159939 [INFO] serf: Received list-keys query
TestKeyringCommand - 2019/12/06 06:36:43.162587 [DEBUG] serf: messageQueryResponseType: Node e572a91e-a68d-ffca-de57-c383af65b033.dc1
TestKeyringCommand - 2019/12/06 06:36:43.163955 [INFO] serf: Received list-keys query
TestKeyringCommand - 2019/12/06 06:36:43.165760 [DEBUG] serf: messageQueryResponseType: Node e572a91e-a68d-ffca-de57-c383af65b033
TestKeyringCommand - 2019/12/06 06:36:43.168423 [DEBUG] http: Request GET /v1/operator/keyring (9.182549ms) from=127.0.0.1:40878
TestKeyringCommand - 2019/12/06 06:36:43.180092 [INFO] serf: Received use-key query
TestKeyringCommand - 2019/12/06 06:36:43.182806 [DEBUG] serf: messageQueryResponseType: Node e572a91e-a68d-ffca-de57-c383af65b033.dc1
TestKeyringCommand - 2019/12/06 06:36:43.184658 [INFO] serf: Received use-key query
TestKeyringCommand - 2019/12/06 06:36:43.186772 [DEBUG] serf: messageQueryResponseType: Node e572a91e-a68d-ffca-de57-c383af65b033
TestKeyringCommand - 2019/12/06 06:36:43.187747 [DEBUG] http: Request PUT /v1/operator/keyring (8.78054ms) from=127.0.0.1:40880
TestKeyringCommand - 2019/12/06 06:36:43.203986 [INFO] serf: Received remove-key query
TestKeyringCommand - 2019/12/06 06:36:43.211440 [DEBUG] serf: messageQueryResponseType: Node e572a91e-a68d-ffca-de57-c383af65b033.dc1
TestKeyringCommand - 2019/12/06 06:36:43.230424 [INFO] serf: Received remove-key query
TestKeyringCommand - 2019/12/06 06:36:43.233253 [DEBUG] serf: messageQueryResponseType: Node e572a91e-a68d-ffca-de57-c383af65b033
TestKeyringCommand - 2019/12/06 06:36:43.242491 [DEBUG] http: Request DELETE /v1/operator/keyring (39.575595ms) from=127.0.0.1:40882
TestKeyringCommand - 2019/12/06 06:36:43.281404 [INFO] serf: Received list-keys query
TestKeyringCommand - 2019/12/06 06:36:43.283557 [DEBUG] serf: messageQueryResponseType: Node e572a91e-a68d-ffca-de57-c383af65b033.dc1
TestKeyringCommand - 2019/12/06 06:36:43.284999 [INFO] serf: Received list-keys query
TestKeyringCommand - 2019/12/06 06:36:43.287140 [DEBUG] serf: messageQueryResponseType: Node e572a91e-a68d-ffca-de57-c383af65b033
TestKeyringCommand - 2019/12/06 06:36:43.291678 [DEBUG] http: Request GET /v1/operator/keyring (11.512603ms) from=127.0.0.1:40884
TestKeyringCommand - 2019/12/06 06:36:43.294108 [INFO] agent: Requesting shutdown
TestKeyringCommand - 2019/12/06 06:36:43.294277 [INFO] consul: shutting down server
TestKeyringCommand - 2019/12/06 06:36:43.294329 [WARN] serf: Shutdown without a Leave
TestKeyringCommand - 2019/12/06 06:36:43.825167 [WARN] serf: Shutdown without a Leave
TestKeyringCommand - 2019/12/06 06:36:43.826891 [INFO] agent: Synced node info
TestKeyringCommand - 2019/12/06 06:36:43.827021 [DEBUG] agent: Node info in sync
TestKeyringCommand - 2019/12/06 06:36:43.900269 [INFO] manager: shutting down
TestKeyringCommand - 2019/12/06 06:36:44.650291 [ERR] autopilot: failed to initialize config: leadership lost while committing log
TestKeyringCommand - 2019/12/06 06:36:44.650986 [ERR] consul: failed to establish leadership: raft is already shutdown
TestKeyringCommand - 2019/12/06 06:36:44.651618 [INFO] agent: consul server down
TestKeyringCommand - 2019/12/06 06:36:44.651667 [INFO] agent: shutdown complete
TestKeyringCommand - 2019/12/06 06:36:44.651718 [INFO] agent: Stopping DNS server 127.0.0.1:14501 (tcp)
TestKeyringCommand - 2019/12/06 06:36:44.651854 [INFO] agent: Stopping DNS server 127.0.0.1:14501 (udp)
TestKeyringCommand - 2019/12/06 06:36:44.651994 [INFO] agent: Stopping HTTP server 127.0.0.1:14502 (tcp)
TestKeyringCommand - 2019/12/06 06:36:44.653463 [INFO] agent: Waiting for endpoints to shut down
TestKeyringCommand - 2019/12/06 06:36:44.653674 [INFO] agent: Endpoints down
--- PASS: TestKeyringCommand (3.94s)
PASS
ok  	github.com/hashicorp/consul/command/keyring	4.246s
=== RUN   TestKVCommand_noTabs
=== PAUSE TestKVCommand_noTabs
=== CONT  TestKVCommand_noTabs
--- PASS: TestKVCommand_noTabs (0.00s)
PASS
ok  	github.com/hashicorp/consul/command/kv	0.043s
=== RUN   TestKVDeleteCommand_noTabs
=== PAUSE TestKVDeleteCommand_noTabs
=== RUN   TestKVDeleteCommand_Validation
=== PAUSE TestKVDeleteCommand_Validation
=== RUN   TestKVDeleteCommand
=== PAUSE TestKVDeleteCommand
=== RUN   TestKVDeleteCommand_Recurse
=== PAUSE TestKVDeleteCommand_Recurse
=== RUN   TestKVDeleteCommand_CAS
=== PAUSE TestKVDeleteCommand_CAS
=== CONT  TestKVDeleteCommand_noTabs
=== CONT  TestKVDeleteCommand
=== CONT  TestKVDeleteCommand_CAS
=== CONT  TestKVDeleteCommand_Recurse
=== CONT  TestKVDeleteCommand_Validation
--- PASS: TestKVDeleteCommand_noTabs (0.03s)
--- PASS: TestKVDeleteCommand_Validation (0.03s)
WARNING: bootstrap = true: do not enable unless necessary
TestKVDeleteCommand_Recurse - 2019/12/06 06:36:48.188781 [WARN] agent: Node name "Node 39e02ac2-f56a-1f0d-2aff-c8455ea1a6d6" will not be discoverable via DNS due to invalid characters. Valid characters include all alpha-numerics and dashes.
TestKVDeleteCommand_Recurse - 2019/12/06 06:36:48.199106 [DEBUG] tlsutil: Update with version 1
WARNING: bootstrap = true: do not enable unless necessary
TestKVDeleteCommand_CAS - 2019/12/06 06:36:48.203364 [WARN] agent: Node name "Node 0f05ad82-7592-7ede-e6a4-9ca469305ead" will not be discoverable via DNS due to invalid characters. Valid characters include all alpha-numerics and dashes.
TestKVDeleteCommand_CAS - 2019/12/06 06:36:48.204539 [DEBUG] tlsutil: Update with version 1
TestKVDeleteCommand_CAS - 2019/12/06 06:36:48.216378 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
TestKVDeleteCommand_Recurse - 2019/12/06 06:36:48.216375 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
WARNING: bootstrap = true: do not enable unless necessary
TestKVDeleteCommand - 2019/12/06 06:36:48.224058 [WARN] agent: Node name "Node 77eb7dba-2803-6973-bd9b-3cc5337cecfb" will not be discoverable via DNS due to invalid characters. Valid characters include all alpha-numerics and dashes.
TestKVDeleteCommand - 2019/12/06 06:36:48.224633 [DEBUG] tlsutil: Update with version 1
TestKVDeleteCommand - 2019/12/06 06:36:48.232332 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
2019/12/06 06:36:49 [INFO]  raft: Initial configuration (index=1): [{Suffrage:Voter ID:39e02ac2-f56a-1f0d-2aff-c8455ea1a6d6 Address:127.0.0.1:29512}]
2019/12/06 06:36:49 [INFO]  raft: Node at 127.0.0.1:29512 [Follower] entering Follower state (Leader: "")
TestKVDeleteCommand_Recurse - 2019/12/06 06:36:49.097165 [INFO] serf: EventMemberJoin: Node 39e02ac2-f56a-1f0d-2aff-c8455ea1a6d6.dc1 127.0.0.1
TestKVDeleteCommand_Recurse - 2019/12/06 06:36:49.100690 [INFO] serf: EventMemberJoin: Node 39e02ac2-f56a-1f0d-2aff-c8455ea1a6d6 127.0.0.1
TestKVDeleteCommand_Recurse - 2019/12/06 06:36:49.101798 [INFO] consul: Adding LAN server Node 39e02ac2-f56a-1f0d-2aff-c8455ea1a6d6 (Addr: tcp/127.0.0.1:29512) (DC: dc1)
TestKVDeleteCommand_Recurse - 2019/12/06 06:36:49.102477 [INFO] consul: Handled member-join event for server "Node 39e02ac2-f56a-1f0d-2aff-c8455ea1a6d6.dc1" in area "wan"
TestKVDeleteCommand_Recurse - 2019/12/06 06:36:49.103051 [INFO] agent: Started DNS server 127.0.0.1:29507 (udp)
TestKVDeleteCommand_Recurse - 2019/12/06 06:36:49.103130 [INFO] agent: Started DNS server 127.0.0.1:29507 (tcp)
TestKVDeleteCommand_Recurse - 2019/12/06 06:36:49.106643 [INFO] agent: Started HTTP server on 127.0.0.1:29508 (tcp)
TestKVDeleteCommand_Recurse - 2019/12/06 06:36:49.106770 [INFO] agent: started state syncer
2019/12/06 06:36:49 [WARN]  raft: Heartbeat timeout from "" reached, starting election
2019/12/06 06:36:49 [INFO]  raft: Node at 127.0.0.1:29512 [Candidate] entering Candidate state in term 2
2019/12/06 06:36:49 [INFO]  raft: Initial configuration (index=1): [{Suffrage:Voter ID:0f05ad82-7592-7ede-e6a4-9ca469305ead Address:127.0.0.1:29518}]
2019/12/06 06:36:49 [INFO]  raft: Node at 127.0.0.1:29518 [Follower] entering Follower state (Leader: "")
2019/12/06 06:36:49 [INFO]  raft: Initial configuration (index=1): [{Suffrage:Voter ID:77eb7dba-2803-6973-bd9b-3cc5337cecfb Address:127.0.0.1:29506}]
2019/12/06 06:36:49 [INFO]  raft: Node at 127.0.0.1:29506 [Follower] entering Follower state (Leader: "")
TestKVDeleteCommand_CAS - 2019/12/06 06:36:49.238511 [INFO] serf: EventMemberJoin: Node 0f05ad82-7592-7ede-e6a4-9ca469305ead.dc1 127.0.0.1
TestKVDeleteCommand - 2019/12/06 06:36:49.238885 [INFO] serf: EventMemberJoin: Node 77eb7dba-2803-6973-bd9b-3cc5337cecfb.dc1 127.0.0.1
TestKVDeleteCommand - 2019/12/06 06:36:49.242721 [INFO] serf: EventMemberJoin: Node 77eb7dba-2803-6973-bd9b-3cc5337cecfb 127.0.0.1
TestKVDeleteCommand_CAS - 2019/12/06 06:36:49.242892 [INFO] serf: EventMemberJoin: Node 0f05ad82-7592-7ede-e6a4-9ca469305ead 127.0.0.1
TestKVDeleteCommand_CAS - 2019/12/06 06:36:49.244528 [INFO] consul: Handled member-join event for server "Node 0f05ad82-7592-7ede-e6a4-9ca469305ead.dc1" in area "wan"
TestKVDeleteCommand_CAS - 2019/12/06 06:36:49.244821 [INFO] consul: Adding LAN server Node 0f05ad82-7592-7ede-e6a4-9ca469305ead (Addr: tcp/127.0.0.1:29518) (DC: dc1)
TestKVDeleteCommand - 2019/12/06 06:36:49.245020 [INFO] consul: Handled member-join event for server "Node 77eb7dba-2803-6973-bd9b-3cc5337cecfb.dc1" in area "wan"
TestKVDeleteCommand - 2019/12/06 06:36:49.245245 [INFO] consul: Adding LAN server Node 77eb7dba-2803-6973-bd9b-3cc5337cecfb (Addr: tcp/127.0.0.1:29506) (DC: dc1)
2019/12/06 06:36:49 [WARN]  raft: Heartbeat timeout from "" reached, starting election
2019/12/06 06:36:49 [INFO]  raft: Node at 127.0.0.1:29518 [Candidate] entering Candidate state in term 2
TestKVDeleteCommand - 2019/12/06 06:36:49.275437 [INFO] agent: Started DNS server 127.0.0.1:29501 (udp)
TestKVDeleteCommand - 2019/12/06 06:36:49.275747 [INFO] agent: Started DNS server 127.0.0.1:29501 (tcp)
TestKVDeleteCommand_CAS - 2019/12/06 06:36:49.275834 [INFO] agent: Started DNS server 127.0.0.1:29513 (udp)
TestKVDeleteCommand_CAS - 2019/12/06 06:36:49.275895 [INFO] agent: Started DNS server 127.0.0.1:29513 (tcp)
TestKVDeleteCommand_CAS - 2019/12/06 06:36:49.278222 [INFO] agent: Started HTTP server on 127.0.0.1:29514 (tcp)
TestKVDeleteCommand_CAS - 2019/12/06 06:36:49.278330 [INFO] agent: started state syncer
TestKVDeleteCommand - 2019/12/06 06:36:49.278438 [INFO] agent: Started HTTP server on 127.0.0.1:29502 (tcp)
TestKVDeleteCommand - 2019/12/06 06:36:49.278634 [INFO] agent: started state syncer
2019/12/06 06:36:49 [WARN]  raft: Heartbeat timeout from "" reached, starting election
2019/12/06 06:36:49 [INFO]  raft: Node at 127.0.0.1:29506 [Candidate] entering Candidate state in term 2
2019/12/06 06:36:49 [INFO]  raft: Election won. Tally: 1
2019/12/06 06:36:49 [INFO]  raft: Node at 127.0.0.1:29512 [Leader] entering Leader state
TestKVDeleteCommand_Recurse - 2019/12/06 06:36:49.742911 [INFO] consul: cluster leadership acquired
TestKVDeleteCommand_Recurse - 2019/12/06 06:36:49.743478 [INFO] consul: New leader elected: Node 39e02ac2-f56a-1f0d-2aff-c8455ea1a6d6
2019/12/06 06:36:49 [INFO]  raft: Election won. Tally: 1
2019/12/06 06:36:49 [INFO]  raft: Node at 127.0.0.1:29506 [Leader] entering Leader state
TestKVDeleteCommand - 2019/12/06 06:36:49.834665 [INFO] consul: cluster leadership acquired
TestKVDeleteCommand - 2019/12/06 06:36:49.835168 [INFO] consul: New leader elected: Node 77eb7dba-2803-6973-bd9b-3cc5337cecfb
2019/12/06 06:36:49 [INFO]  raft: Election won. Tally: 1
2019/12/06 06:36:49 [INFO]  raft: Node at 127.0.0.1:29518 [Leader] entering Leader state
TestKVDeleteCommand_CAS - 2019/12/06 06:36:49.974023 [INFO] consul: cluster leadership acquired
TestKVDeleteCommand_CAS - 2019/12/06 06:36:49.974584 [INFO] consul: New leader elected: Node 0f05ad82-7592-7ede-e6a4-9ca469305ead
TestKVDeleteCommand_Recurse - 2019/12/06 06:36:50.386880 [DEBUG] http: Request PUT /v1/kv/foo/a (354.273976ms) from=127.0.0.1:40982
TestKVDeleteCommand_Recurse - 2019/12/06 06:36:50.389146 [INFO] agent: Synced node info
TestKVDeleteCommand_Recurse - 2019/12/06 06:36:50.389346 [DEBUG] agent: Node info in sync
TestKVDeleteCommand - 2019/12/06 06:36:50.459852 [INFO] agent: Synced node info
TestKVDeleteCommand_CAS - 2019/12/06 06:36:50.459852 [INFO] agent: Synced node info
TestKVDeleteCommand_CAS - 2019/12/06 06:36:50.460119 [DEBUG] agent: Node info in sync
TestKVDeleteCommand_CAS - 2019/12/06 06:36:50.460645 [DEBUG] http: Request PUT /v1/kv/foo (480.00526ms) from=127.0.0.1:57966
TestKVDeleteCommand - 2019/12/06 06:36:50.463449 [DEBUG] http: Request PUT /v1/kv/foo (482.321647ms) from=127.0.0.1:32968
TestKVDeleteCommand - 2019/12/06 06:36:50.813664 [DEBUG] agent: Node info in sync
TestKVDeleteCommand - 2019/12/06 06:36:50.813787 [DEBUG] agent: Node info in sync
TestKVDeleteCommand_CAS - 2019/12/06 06:36:50.893542 [DEBUG] http: Request DELETE /v1/kv/foo?cas=1 (425.945658ms) from=127.0.0.1:57970
TestKVDeleteCommand_Recurse - 2019/12/06 06:36:50.906369 [DEBUG] http: Request PUT /v1/kv/foo/b (517.519806ms) from=127.0.0.1:40982
TestKVDeleteCommand_CAS - 2019/12/06 06:36:50.931640 [DEBUG] http: Request GET /v1/kv/foo (21.18283ms) from=127.0.0.1:57966
TestKVDeleteCommand - 2019/12/06 06:36:51.071450 [DEBUG] http: Request DELETE /v1/kv/foo (603.986835ms) from=127.0.0.1:32974
TestKVDeleteCommand - 2019/12/06 06:36:51.073835 [DEBUG] http: Request GET /v1/kv/foo (243.339µs) from=127.0.0.1:32968
TestKVDeleteCommand - 2019/12/06 06:36:51.074927 [INFO] agent: Requesting shutdown
TestKVDeleteCommand - 2019/12/06 06:36:51.075005 [INFO] consul: shutting down server
TestKVDeleteCommand - 2019/12/06 06:36:51.075055 [WARN] serf: Shutdown without a Leave
TestKVDeleteCommand - 2019/12/06 06:36:51.158755 [WARN] serf: Shutdown without a Leave
TestKVDeleteCommand - 2019/12/06 06:36:51.267050 [INFO] manager: shutting down
TestKVDeleteCommand - 2019/12/06 06:36:51.340497 [INFO] agent: consul server down
TestKVDeleteCommand - 2019/12/06 06:36:51.340896 [INFO] agent: shutdown complete
TestKVDeleteCommand - 2019/12/06 06:36:51.341138 [INFO] agent: Stopping DNS server 127.0.0.1:29501 (tcp)
TestKVDeleteCommand - 2019/12/06 06:36:51.341513 [INFO] agent: Stopping DNS server 127.0.0.1:29501 (udp)
TestKVDeleteCommand - 2019/12/06 06:36:51.341862 [INFO] agent: Stopping HTTP server 127.0.0.1:29502 (tcp)
TestKVDeleteCommand - 2019/12/06 06:36:51.340754 [ERR] consul: failed to establish leadership: leadership lost while committing log
TestKVDeleteCommand_CAS - 2019/12/06 06:36:51.343300 [DEBUG] http: Request DELETE /v1/kv/foo?cas=4 (406.55987ms) from=127.0.0.1:57974
TestKVDeleteCommand_CAS - 2019/12/06 06:36:51.346900 [DEBUG] http: Request GET /v1/kv/foo (230.338µs) from=127.0.0.1:57966
TestKVDeleteCommand - 2019/12/06 06:36:51.343888 [INFO] agent: Waiting for endpoints to shut down
TestKVDeleteCommand - 2019/12/06 06:36:51.347310 [INFO] agent: Endpoints down
--- PASS: TestKVDeleteCommand (3.35s)
TestKVDeleteCommand_CAS - 2019/12/06 06:36:51.348166 [INFO] agent: Requesting shutdown
TestKVDeleteCommand_CAS - 2019/12/06 06:36:51.348240 [INFO] consul: shutting down server
TestKVDeleteCommand_CAS - 2019/12/06 06:36:51.348282 [WARN] serf: Shutdown without a Leave
TestKVDeleteCommand_CAS - 2019/12/06 06:36:51.416951 [WARN] serf: Shutdown without a Leave
TestKVDeleteCommand_Recurse - 2019/12/06 06:36:51.423110 [DEBUG] http: Request PUT /v1/kv/food (498.888703ms) from=127.0.0.1:40982
TestKVDeleteCommand_CAS - 2019/12/06 06:36:51.471258 [DEBUG] agent: Node info in sync
TestKVDeleteCommand_CAS - 2019/12/06 06:36:51.508694 [INFO] manager: shutting down
TestKVDeleteCommand_CAS - 2019/12/06 06:36:51.567242 [INFO] agent: consul server down
TestKVDeleteCommand_CAS - 2019/12/06 06:36:51.570096 [INFO] agent: shutdown complete
TestKVDeleteCommand_CAS - 2019/12/06 06:36:51.567585 [ERR] consul: failed to establish leadership: error configuring provider: leadership lost while committing log
TestKVDeleteCommand_CAS - 2019/12/06 06:36:51.570936 [ERR] consul: failed to transfer leadership attempt 0/3: raft is already shutdown
TestKVDeleteCommand_CAS - 2019/12/06 06:36:51.571148 [ERR] consul: failed to transfer leadership attempt 1/3: raft is already shutdown
TestKVDeleteCommand_CAS - 2019/12/06 06:36:51.571324 [ERR] consul: failed to transfer leadership attempt 2/3: raft is already shutdown
TestKVDeleteCommand_CAS - 2019/12/06 06:36:51.571480 [ERR] consul: failed to transfer leadership in 3 attempts
TestKVDeleteCommand_CAS - 2019/12/06 06:36:51.570496 [INFO] agent: Stopping DNS server 127.0.0.1:29513 (tcp)
TestKVDeleteCommand_CAS - 2019/12/06 06:36:51.572814 [INFO] agent: Stopping DNS server 127.0.0.1:29513 (udp)
TestKVDeleteCommand_CAS - 2019/12/06 06:36:51.573201 [INFO] agent: Stopping HTTP server 127.0.0.1:29514 (tcp)
TestKVDeleteCommand_CAS - 2019/12/06 06:36:51.574587 [INFO] agent: Waiting for endpoints to shut down
TestKVDeleteCommand_CAS - 2019/12/06 06:36:51.574925 [INFO] agent: Endpoints down
--- PASS: TestKVDeleteCommand_CAS (3.58s)
TestKVDeleteCommand_Recurse - 2019/12/06 06:36:51.802946 [DEBUG] http: Request DELETE /v1/kv/foo?recurse= (372.355734ms) from=127.0.0.1:40990
TestKVDeleteCommand_Recurse - 2019/12/06 06:36:51.805465 [DEBUG] http: Request GET /v1/kv/foo/a (211.338µs) from=127.0.0.1:40982
TestKVDeleteCommand_Recurse - 2019/12/06 06:36:51.806775 [DEBUG] http: Request GET /v1/kv/foo/b (165.67µs) from=127.0.0.1:40982
TestKVDeleteCommand_Recurse - 2019/12/06 06:36:51.808608 [DEBUG] http: Request GET /v1/kv/food (283.007µs) from=127.0.0.1:40982
TestKVDeleteCommand_Recurse - 2019/12/06 06:36:51.809285 [INFO] agent: Requesting shutdown
TestKVDeleteCommand_Recurse - 2019/12/06 06:36:51.809356 [INFO] consul: shutting down server
TestKVDeleteCommand_Recurse - 2019/12/06 06:36:51.809398 [WARN] serf: Shutdown without a Leave
TestKVDeleteCommand_Recurse - 2019/12/06 06:36:51.892593 [WARN] serf: Shutdown without a Leave
TestKVDeleteCommand_Recurse - 2019/12/06 06:36:52.025411 [INFO] manager: shutting down
TestKVDeleteCommand_Recurse - 2019/12/06 06:36:52.083983 [INFO] agent: consul server down
TestKVDeleteCommand_Recurse - 2019/12/06 06:36:52.084066 [INFO] agent: shutdown complete
TestKVDeleteCommand_Recurse - 2019/12/06 06:36:52.084122 [INFO] agent: Stopping DNS server 127.0.0.1:29507 (tcp)
TestKVDeleteCommand_Recurse - 2019/12/06 06:36:52.084355 [INFO] agent: Stopping DNS server 127.0.0.1:29507 (udp)
TestKVDeleteCommand_Recurse - 2019/12/06 06:36:52.084506 [INFO] agent: Stopping HTTP server 127.0.0.1:29508 (tcp)
TestKVDeleteCommand_Recurse - 2019/12/06 06:36:52.085155 [INFO] agent: Waiting for endpoints to shut down
TestKVDeleteCommand_Recurse - 2019/12/06 06:36:52.085276 [ERR] consul: failed to establish leadership: error generating CA root certificate: leadership lost while committing log
TestKVDeleteCommand_Recurse - 2019/12/06 06:36:52.085458 [INFO] agent: Endpoints down
TestKVDeleteCommand_Recurse - 2019/12/06 06:36:52.085514 [ERR] consul: failed to transfer leadership attempt 0/3: raft is already shutdown
--- PASS: TestKVDeleteCommand_Recurse (4.09s)
TestKVDeleteCommand_Recurse - 2019/12/06 06:36:52.085578 [ERR] consul: failed to transfer leadership attempt 1/3: raft is already shutdown
PASS
ok  	github.com/hashicorp/consul/command/kv/del	4.351s
=== RUN   TestKVExportCommand_noTabs
=== PAUSE TestKVExportCommand_noTabs
=== RUN   TestKVExportCommand
=== PAUSE TestKVExportCommand
=== CONT  TestKVExportCommand_noTabs
--- PASS: TestKVExportCommand_noTabs (0.00s)
=== CONT  TestKVExportCommand
WARNING: bootstrap = true: do not enable unless necessary
TestKVExportCommand - 2019/12/06 06:36:52.395494 [WARN] agent: Node name "Node 7cb2d959-b228-a169-b88f-832d82cf20c0" will not be discoverable via DNS due to invalid characters. Valid characters include all alpha-numerics and dashes.
TestKVExportCommand - 2019/12/06 06:36:52.396417 [DEBUG] tlsutil: Update with version 1
TestKVExportCommand - 2019/12/06 06:36:52.403678 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
2019/12/06 06:36:53 [INFO]  raft: Initial configuration (index=1): [{Suffrage:Voter ID:7cb2d959-b228-a169-b88f-832d82cf20c0 Address:127.0.0.1:52006}]
2019/12/06 06:36:53 [INFO]  raft: Node at 127.0.0.1:52006 [Follower] entering Follower state (Leader: "")
TestKVExportCommand - 2019/12/06 06:36:53.232623 [INFO] serf: EventMemberJoin: Node 7cb2d959-b228-a169-b88f-832d82cf20c0.dc1 127.0.0.1
TestKVExportCommand - 2019/12/06 06:36:53.254446 [INFO] serf: EventMemberJoin: Node 7cb2d959-b228-a169-b88f-832d82cf20c0 127.0.0.1
TestKVExportCommand - 2019/12/06 06:36:53.258364 [INFO] consul: Adding LAN server Node 7cb2d959-b228-a169-b88f-832d82cf20c0 (Addr: tcp/127.0.0.1:52006) (DC: dc1)
TestKVExportCommand - 2019/12/06 06:36:53.258505 [INFO] consul: Handled member-join event for server "Node 7cb2d959-b228-a169-b88f-832d82cf20c0.dc1" in area "wan"
TestKVExportCommand - 2019/12/06 06:36:53.260971 [INFO] agent: Started DNS server 127.0.0.1:52001 (tcp)
TestKVExportCommand - 2019/12/06 06:36:53.261565 [INFO] agent: Started DNS server 127.0.0.1:52001 (udp)
TestKVExportCommand - 2019/12/06 06:36:53.265704 [INFO] agent: Started HTTP server on 127.0.0.1:52002 (tcp)
TestKVExportCommand - 2019/12/06 06:36:53.266105 [INFO] agent: started state syncer
2019/12/06 06:36:53 [WARN]  raft: Heartbeat timeout from "" reached, starting election
2019/12/06 06:36:53 [INFO]  raft: Node at 127.0.0.1:52006 [Candidate] entering Candidate state in term 2
2019/12/06 06:36:53 [INFO]  raft: Election won. Tally: 1
2019/12/06 06:36:53 [INFO]  raft: Node at 127.0.0.1:52006 [Leader] entering Leader state
TestKVExportCommand - 2019/12/06 06:36:53.675856 [INFO] consul: cluster leadership acquired
TestKVExportCommand - 2019/12/06 06:36:53.676390 [INFO] consul: New leader elected: Node 7cb2d959-b228-a169-b88f-832d82cf20c0
TestKVExportCommand - 2019/12/06 06:36:53.952981 [INFO] agent: Synced node info
TestKVExportCommand - 2019/12/06 06:36:53.953765 [DEBUG] http: Request PUT /v1/kv/foo/a (249.79986ms) from=127.0.0.1:37190
TestKVExportCommand - 2019/12/06 06:36:54.565245 [DEBUG] http: Request PUT /v1/kv/foo/b (607.768256ms) from=127.0.0.1:37190
TestKVExportCommand - 2019/12/06 06:36:54.894717 [DEBUG] http: Request PUT /v1/kv/foo/c (326.925002ms) from=127.0.0.1:37190
TestKVExportCommand - 2019/12/06 06:36:55.210599 [DEBUG] http: Request PUT /v1/kv/bar (313.660024ms) from=127.0.0.1:37190
TestKVExportCommand - 2019/12/06 06:36:55.215748 [DEBUG] http: Request GET /v1/kv/foo?recurse= (1.554704ms) from=127.0.0.1:37192
TestKVExportCommand - 2019/12/06 06:36:55.218015 [INFO] agent: Requesting shutdown
TestKVExportCommand - 2019/12/06 06:36:55.218129 [INFO] consul: shutting down server
TestKVExportCommand - 2019/12/06 06:36:55.218179 [WARN] serf: Shutdown without a Leave
TestKVExportCommand - 2019/12/06 06:36:55.358753 [WARN] serf: Shutdown without a Leave
TestKVExportCommand - 2019/12/06 06:36:55.409760 [INFO] connect: initialized primary datacenter CA with provider "consul"
TestKVExportCommand - 2019/12/06 06:36:55.410276 [DEBUG] consul: Skipping self join check for "Node 7cb2d959-b228-a169-b88f-832d82cf20c0" since the cluster is too small
TestKVExportCommand - 2019/12/06 06:36:55.410441 [INFO] consul: member 'Node 7cb2d959-b228-a169-b88f-832d82cf20c0' joined, marking health alive
TestKVExportCommand - 2019/12/06 06:36:55.483891 [INFO] manager: shutting down
TestKVExportCommand - 2019/12/06 06:36:55.617391 [INFO] agent: consul server down
TestKVExportCommand - 2019/12/06 06:36:55.617469 [INFO] agent: shutdown complete
TestKVExportCommand - 2019/12/06 06:36:55.617527 [INFO] agent: Stopping DNS server 127.0.0.1:52001 (tcp)
TestKVExportCommand - 2019/12/06 06:36:55.617391 [ERR] consul: failed to reconcile member: {Node 7cb2d959-b228-a169-b88f-832d82cf20c0 127.0.0.1 52004 map[acls:0 bootstrap:1 build:1.5.2: dc:dc1 id:7cb2d959-b228-a169-b88f-832d82cf20c0 port:52006 raft_vsn:3 role:consul segment: vsn:2 vsn_max:3 vsn_min:2 wan_join_port:52005] alive 1 5 2 2 5 4}: leadership lost while committing log
TestKVExportCommand - 2019/12/06 06:36:55.617675 [INFO] agent: Stopping DNS server 127.0.0.1:52001 (udp)
TestKVExportCommand - 2019/12/06 06:36:55.617827 [INFO] agent: Stopping HTTP server 127.0.0.1:52002 (tcp)
TestKVExportCommand - 2019/12/06 06:36:55.618532 [INFO] agent: Waiting for endpoints to shut down
TestKVExportCommand - 2019/12/06 06:36:55.618628 [INFO] agent: Endpoints down
--- PASS: TestKVExportCommand (3.38s)
PASS
ok  	github.com/hashicorp/consul/command/kv/exp	3.755s
=== RUN   TestKVGetCommand_noTabs
=== PAUSE TestKVGetCommand_noTabs
=== RUN   TestKVGetCommand_Validation
=== PAUSE TestKVGetCommand_Validation
=== RUN   TestKVGetCommand
=== PAUSE TestKVGetCommand
=== RUN   TestKVGetCommand_Base64
=== PAUSE TestKVGetCommand_Base64
=== RUN   TestKVGetCommand_Missing
=== PAUSE TestKVGetCommand_Missing
=== RUN   TestKVGetCommand_Empty
=== PAUSE TestKVGetCommand_Empty
=== RUN   TestKVGetCommand_Detailed
=== PAUSE TestKVGetCommand_Detailed
=== RUN   TestKVGetCommand_Keys
=== PAUSE TestKVGetCommand_Keys
=== RUN   TestKVGetCommand_Recurse
=== PAUSE TestKVGetCommand_Recurse
=== RUN   TestKVGetCommand_RecurseBase64
=== PAUSE TestKVGetCommand_RecurseBase64
=== RUN   TestKVGetCommand_DetailedBase64
--- SKIP: TestKVGetCommand_DetailedBase64 (0.00s)
    kv_get_test.go:338: DM-skipped
=== CONT  TestKVGetCommand_noTabs
--- PASS: TestKVGetCommand_noTabs (0.00s)
=== CONT  TestKVGetCommand_RecurseBase64
=== CONT  TestKVGetCommand_Empty
=== CONT  TestKVGetCommand_Base64
=== CONT  TestKVGetCommand_Recurse
WARNING: bootstrap = true: do not enable unless necessary
TestKVGetCommand_Empty - 2019/12/06 06:37:22.460038 [WARN] agent: Node name "Node c0653e07-c647-1d63-8e96-c14f757ff822" will not be discoverable via DNS due to invalid characters. Valid characters include all alpha-numerics and dashes.
TestKVGetCommand_Empty - 2019/12/06 06:37:22.461219 [DEBUG] tlsutil: Update with version 1
WARNING: bootstrap = true: do not enable unless necessary
TestKVGetCommand_Recurse - 2019/12/06 06:37:22.465546 [WARN] agent: Node name "Node be1c64f1-bc29-fba7-e9d9-254ad893b071" will not be discoverable via DNS due to invalid characters. Valid characters include all alpha-numerics and dashes.
TestKVGetCommand_Recurse - 2019/12/06 06:37:22.465978 [DEBUG] tlsutil: Update with version 1
TestKVGetCommand_Empty - 2019/12/06 06:37:22.476987 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
TestKVGetCommand_Recurse - 2019/12/06 06:37:22.476987 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
WARNING: bootstrap = true: do not enable unless necessary
TestKVGetCommand_RecurseBase64 - 2019/12/06 06:37:22.492365 [WARN] agent: Node name "Node 2e341d63-d2cc-5217-829a-8430dee883e3" will not be discoverable via DNS due to invalid characters. Valid characters include all alpha-numerics and dashes.
TestKVGetCommand_RecurseBase64 - 2019/12/06 06:37:22.500536 [DEBUG] tlsutil: Update with version 1
TestKVGetCommand_RecurseBase64 - 2019/12/06 06:37:22.510918 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
WARNING: bootstrap = true: do not enable unless necessary
TestKVGetCommand_Base64 - 2019/12/06 06:37:22.553571 [WARN] agent: Node name "Node 6d75f701-87a5-be0d-1754-70c299dbfa88" will not be discoverable via DNS due to invalid characters. Valid characters include all alpha-numerics and dashes.
TestKVGetCommand_Base64 - 2019/12/06 06:37:22.554026 [DEBUG] tlsutil: Update with version 1
TestKVGetCommand_Base64 - 2019/12/06 06:37:22.562089 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
2019/12/06 06:37:24 [INFO]  raft: Initial configuration (index=1): [{Suffrage:Voter ID:be1c64f1-bc29-fba7-e9d9-254ad893b071 Address:127.0.0.1:14518}]
2019/12/06 06:37:24 [INFO]  raft: Node at 127.0.0.1:14518 [Follower] entering Follower state (Leader: "")
TestKVGetCommand_Recurse - 2019/12/06 06:37:24.132808 [INFO] serf: EventMemberJoin: Node be1c64f1-bc29-fba7-e9d9-254ad893b071.dc1 127.0.0.1
TestKVGetCommand_Recurse - 2019/12/06 06:37:24.138135 [INFO] serf: EventMemberJoin: Node be1c64f1-bc29-fba7-e9d9-254ad893b071 127.0.0.1
TestKVGetCommand_Recurse - 2019/12/06 06:37:24.139889 [INFO] consul: Adding LAN server Node be1c64f1-bc29-fba7-e9d9-254ad893b071 (Addr: tcp/127.0.0.1:14518) (DC: dc1)
TestKVGetCommand_Recurse - 2019/12/06 06:37:24.140329 [INFO] consul: Handled member-join event for server "Node be1c64f1-bc29-fba7-e9d9-254ad893b071.dc1" in area "wan"
TestKVGetCommand_Recurse - 2019/12/06 06:37:24.142230 [INFO] agent: Started DNS server 127.0.0.1:14513 (tcp)
TestKVGetCommand_Recurse - 2019/12/06 06:37:24.143001 [INFO] agent: Started DNS server 127.0.0.1:14513 (udp)
TestKVGetCommand_Recurse - 2019/12/06 06:37:24.145825 [INFO] agent: Started HTTP server on 127.0.0.1:14514 (tcp)
TestKVGetCommand_Recurse - 2019/12/06 06:37:24.146187 [INFO] agent: started state syncer
2019/12/06 06:37:24 [WARN]  raft: Heartbeat timeout from "" reached, starting election
2019/12/06 06:37:24 [INFO]  raft: Node at 127.0.0.1:14518 [Candidate] entering Candidate state in term 2
2019/12/06 06:37:24 [INFO]  raft: Initial configuration (index=1): [{Suffrage:Voter ID:c0653e07-c647-1d63-8e96-c14f757ff822 Address:127.0.0.1:14506}]
2019/12/06 06:37:24 [INFO]  raft: Node at 127.0.0.1:14506 [Follower] entering Follower state (Leader: "")
2019/12/06 06:37:24 [INFO]  raft: Initial configuration (index=1): [{Suffrage:Voter ID:2e341d63-d2cc-5217-829a-8430dee883e3 Address:127.0.0.1:14524}]
TestKVGetCommand_Empty - 2019/12/06 06:37:24.230660 [INFO] serf: EventMemberJoin: Node c0653e07-c647-1d63-8e96-c14f757ff822.dc1 127.0.0.1
2019/12/06 06:37:24 [INFO]  raft: Initial configuration (index=1): [{Suffrage:Voter ID:6d75f701-87a5-be0d-1754-70c299dbfa88 Address:127.0.0.1:14512}]
2019/12/06 06:37:24 [INFO]  raft: Node at 127.0.0.1:14512 [Follower] entering Follower state (Leader: "")
2019/12/06 06:37:24 [INFO]  raft: Node at 127.0.0.1:14524 [Follower] entering Follower state (Leader: "")
TestKVGetCommand_Empty - 2019/12/06 06:37:24.234663 [INFO] serf: EventMemberJoin: Node c0653e07-c647-1d63-8e96-c14f757ff822 127.0.0.1
TestKVGetCommand_RecurseBase64 - 2019/12/06 06:37:24.237960 [INFO] serf: EventMemberJoin: Node 2e341d63-d2cc-5217-829a-8430dee883e3.dc1 127.0.0.1
TestKVGetCommand_Empty - 2019/12/06 06:37:24.240277 [INFO] consul: Adding LAN server Node c0653e07-c647-1d63-8e96-c14f757ff822 (Addr: tcp/127.0.0.1:14506) (DC: dc1)
TestKVGetCommand_Empty - 2019/12/06 06:37:24.240836 [INFO] consul: Handled member-join event for server "Node c0653e07-c647-1d63-8e96-c14f757ff822.dc1" in area "wan"
TestKVGetCommand_Empty - 2019/12/06 06:37:24.260348 [INFO] agent: Started DNS server 127.0.0.1:14501 (udp)
TestKVGetCommand_Empty - 2019/12/06 06:37:24.260758 [INFO] agent: Started DNS server 127.0.0.1:14501 (tcp)
TestKVGetCommand_Base64 - 2019/12/06 06:37:24.261650 [INFO] serf: EventMemberJoin: Node 6d75f701-87a5-be0d-1754-70c299dbfa88.dc1 127.0.0.1
TestKVGetCommand_Empty - 2019/12/06 06:37:24.268135 [INFO] agent: Started HTTP server on 127.0.0.1:14502 (tcp)
TestKVGetCommand_Empty - 2019/12/06 06:37:24.279857 [INFO] agent: started state syncer
TestKVGetCommand_RecurseBase64 - 2019/12/06 06:37:24.278128 [INFO] serf: EventMemberJoin: Node 2e341d63-d2cc-5217-829a-8430dee883e3 127.0.0.1
TestKVGetCommand_RecurseBase64 - 2019/12/06 06:37:24.283890 [INFO] agent: Started DNS server 127.0.0.1:14519 (udp)
TestKVGetCommand_RecurseBase64 - 2019/12/06 06:37:24.285417 [INFO] consul: Handled member-join event for server "Node 2e341d63-d2cc-5217-829a-8430dee883e3.dc1" in area "wan"
TestKVGetCommand_RecurseBase64 - 2019/12/06 06:37:24.290508 [INFO] consul: Adding LAN server Node 2e341d63-d2cc-5217-829a-8430dee883e3 (Addr: tcp/127.0.0.1:14524) (DC: dc1)
2019/12/06 06:37:24 [WARN]  raft: Heartbeat timeout from "" reached, starting election
2019/12/06 06:37:24 [INFO]  raft: Node at 127.0.0.1:14524 [Candidate] entering Candidate state in term 2
TestKVGetCommand_RecurseBase64 - 2019/12/06 06:37:24.310850 [INFO] agent: Started DNS server 127.0.0.1:14519 (tcp)
TestKVGetCommand_RecurseBase64 - 2019/12/06 06:37:24.313497 [INFO] agent: Started HTTP server on 127.0.0.1:14520 (tcp)
TestKVGetCommand_RecurseBase64 - 2019/12/06 06:37:24.313609 [INFO] agent: started state syncer
2019/12/06 06:37:24 [WARN]  raft: Heartbeat timeout from "" reached, starting election
2019/12/06 06:37:24 [INFO]  raft: Node at 127.0.0.1:14506 [Candidate] entering Candidate state in term 2
2019/12/06 06:37:24 [WARN]  raft: Heartbeat timeout from "" reached, starting election
2019/12/06 06:37:24 [INFO]  raft: Node at 127.0.0.1:14512 [Candidate] entering Candidate state in term 2
TestKVGetCommand_Base64 - 2019/12/06 06:37:24.323194 [INFO] serf: EventMemberJoin: Node 6d75f701-87a5-be0d-1754-70c299dbfa88 127.0.0.1
TestKVGetCommand_Base64 - 2019/12/06 06:37:24.325196 [INFO] consul: Adding LAN server Node 6d75f701-87a5-be0d-1754-70c299dbfa88 (Addr: tcp/127.0.0.1:14512) (DC: dc1)
TestKVGetCommand_Base64 - 2019/12/06 06:37:24.326051 [INFO] consul: Handled member-join event for server "Node 6d75f701-87a5-be0d-1754-70c299dbfa88.dc1" in area "wan"
TestKVGetCommand_Base64 - 2019/12/06 06:37:24.327682 [INFO] agent: Started DNS server 127.0.0.1:14507 (tcp)
TestKVGetCommand_Base64 - 2019/12/06 06:37:24.328118 [INFO] agent: Started DNS server 127.0.0.1:14507 (udp)
TestKVGetCommand_Base64 - 2019/12/06 06:37:24.330934 [INFO] agent: Started HTTP server on 127.0.0.1:14508 (tcp)
TestKVGetCommand_Base64 - 2019/12/06 06:37:24.331062 [INFO] agent: started state syncer
2019/12/06 06:37:24 [INFO]  raft: Election won. Tally: 1
2019/12/06 06:37:24 [INFO]  raft: Node at 127.0.0.1:14524 [Leader] entering Leader state
2019/12/06 06:37:24 [INFO]  raft: Election won. Tally: 1
2019/12/06 06:37:24 [INFO]  raft: Node at 127.0.0.1:14518 [Leader] entering Leader state
TestKVGetCommand_Recurse - 2019/12/06 06:37:24.956150 [INFO] consul: cluster leadership acquired
TestKVGetCommand_RecurseBase64 - 2019/12/06 06:37:24.956265 [INFO] consul: cluster leadership acquired
TestKVGetCommand_Recurse - 2019/12/06 06:37:24.956766 [INFO] consul: New leader elected: Node be1c64f1-bc29-fba7-e9d9-254ad893b071
TestKVGetCommand_RecurseBase64 - 2019/12/06 06:37:24.956874 [INFO] consul: New leader elected: Node 2e341d63-d2cc-5217-829a-8430dee883e3
2019/12/06 06:37:25 [INFO]  raft: Election won. Tally: 1
2019/12/06 06:37:25 [INFO]  raft: Node at 127.0.0.1:14506 [Leader] entering Leader state
2019/12/06 06:37:25 [INFO]  raft: Election won. Tally: 1
2019/12/06 06:37:25 [INFO]  raft: Node at 127.0.0.1:14512 [Leader] entering Leader state
TestKVGetCommand_Empty - 2019/12/06 06:37:25.237577 [INFO] consul: cluster leadership acquired
TestKVGetCommand_Empty - 2019/12/06 06:37:25.238017 [INFO] consul: New leader elected: Node c0653e07-c647-1d63-8e96-c14f757ff822
TestKVGetCommand_Base64 - 2019/12/06 06:37:25.238257 [INFO] consul: cluster leadership acquired
TestKVGetCommand_Base64 - 2019/12/06 06:37:25.238653 [INFO] consul: New leader elected: Node 6d75f701-87a5-be0d-1754-70c299dbfa88
TestKVGetCommand_RecurseBase64 - 2019/12/06 06:37:25.461015 [INFO] agent: Synced node info
TestKVGetCommand_RecurseBase64 - 2019/12/06 06:37:25.461532 [DEBUG] agent: Node info in sync
TestKVGetCommand_Recurse - 2019/12/06 06:37:25.461847 [DEBUG] http: Request PUT /v1/kv/foo/a (309.237587ms) from=127.0.0.1:44140
TestKVGetCommand_RecurseBase64 - 2019/12/06 06:37:25.469716 [DEBUG] http: Request PUT /v1/kv/foo/a (403.424796ms) from=127.0.0.1:46684
TestKVGetCommand_Recurse - 2019/12/06 06:37:25.472653 [INFO] agent: Synced node info
TestKVGetCommand_Empty - 2019/12/06 06:37:25.560643 [INFO] agent: Synced node info
TestKVGetCommand_Empty - 2019/12/06 06:37:25.560790 [DEBUG] agent: Node info in sync
TestKVGetCommand_Empty - 2019/12/06 06:37:25.565585 [DEBUG] http: Request PUT /v1/kv/empty (306.918533ms) from=127.0.0.1:40908
TestKVGetCommand_Empty - 2019/12/06 06:37:25.608396 [DEBUG] http: Request GET /v1/kv/empty (23.16521ms) from=127.0.0.1:40912
TestKVGetCommand_Empty - 2019/12/06 06:37:25.611505 [INFO] agent: Requesting shutdown
TestKVGetCommand_Empty - 2019/12/06 06:37:25.612040 [INFO] consul: shutting down server
TestKVGetCommand_Empty - 2019/12/06 06:37:25.612292 [WARN] serf: Shutdown without a Leave
TestKVGetCommand_Base64 - 2019/12/06 06:37:25.661943 [DEBUG] http: Request PUT /v1/kv/foo (342.400698ms) from=127.0.0.1:37044
TestKVGetCommand_Base64 - 2019/12/06 06:37:25.667917 [INFO] agent: Synced node info
TestKVGetCommand_Base64 - 2019/12/06 06:37:25.672808 [DEBUG] http: Request GET /v1/kv/foo (1.522036ms) from=127.0.0.1:37048
TestKVGetCommand_Base64 - 2019/12/06 06:37:25.675720 [INFO] agent: Requesting shutdown
TestKVGetCommand_Base64 - 2019/12/06 06:37:25.675832 [INFO] consul: shutting down server
TestKVGetCommand_Base64 - 2019/12/06 06:37:25.675885 [WARN] serf: Shutdown without a Leave
TestKVGetCommand_Empty - 2019/12/06 06:37:25.769392 [WARN] serf: Shutdown without a Leave
TestKVGetCommand_Empty - 2019/12/06 06:37:26.311785 [DEBUG] agent: Node info in sync
TestKVGetCommand_Base64 - 2019/12/06 06:37:26.540042 [DEBUG] agent: Node info in sync
TestKVGetCommand_Base64 - 2019/12/06 06:37:26.540166 [DEBUG] agent: Node info in sync
TestKVGetCommand_Empty - 2019/12/06 06:37:26.559409 [INFO] manager: shutting down
TestKVGetCommand_Empty - 2019/12/06 06:37:26.561808 [ERR] autopilot: failed to initialize config: leadership lost while committing log
TestKVGetCommand_Empty - 2019/12/06 06:37:26.562158 [INFO] agent: consul server down
TestKVGetCommand_Empty - 2019/12/06 06:37:26.562209 [INFO] agent: shutdown complete
TestKVGetCommand_Empty - 2019/12/06 06:37:26.562282 [INFO] agent: Stopping DNS server 127.0.0.1:14501 (tcp)
TestKVGetCommand_Empty - 2019/12/06 06:37:26.562244 [ERR] consul: failed to establish leadership: raft is already shutdown
TestKVGetCommand_Empty - 2019/12/06 06:37:26.562485 [INFO] agent: Stopping DNS server 127.0.0.1:14501 (udp)
TestKVGetCommand_Empty - 2019/12/06 06:37:26.562762 [INFO] agent: Stopping HTTP server 127.0.0.1:14502 (tcp)
TestKVGetCommand_Base64 - 2019/12/06 06:37:26.562783 [WARN] serf: Shutdown without a Leave
TestKVGetCommand_Empty - 2019/12/06 06:37:26.563819 [INFO] agent: Waiting for endpoints to shut down
TestKVGetCommand_Empty - 2019/12/06 06:37:26.563911 [INFO] agent: Endpoints down
--- PASS: TestKVGetCommand_Empty (4.31s)
=== CONT  TestKVGetCommand
WARNING: bootstrap = true: do not enable unless necessary
TestKVGetCommand - 2019/12/06 06:37:26.640390 [WARN] agent: Node name "Node 30fab450-8e8f-6cac-b2e7-ed1d6ae89ce7" will not be discoverable via DNS due to invalid characters. Valid characters include all alpha-numerics and dashes.
TestKVGetCommand - 2019/12/06 06:37:26.640935 [DEBUG] tlsutil: Update with version 1
TestKVGetCommand - 2019/12/06 06:37:26.643143 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
TestKVGetCommand_Base64 - 2019/12/06 06:37:26.736328 [INFO] manager: shutting down
TestKVGetCommand_RecurseBase64 - 2019/12/06 06:37:26.945291 [DEBUG] http: Request PUT /v1/kv/foo/b (1.472985218s) from=127.0.0.1:46684
TestKVGetCommand_Base64 - 2019/12/06 06:37:27.043769 [INFO] agent: consul server down
TestKVGetCommand_Base64 - 2019/12/06 06:37:27.043865 [INFO] agent: shutdown complete
TestKVGetCommand_Base64 - 2019/12/06 06:37:27.043932 [INFO] agent: Stopping DNS server 127.0.0.1:14507 (tcp)
TestKVGetCommand_Base64 - 2019/12/06 06:37:27.044153 [INFO] agent: Stopping DNS server 127.0.0.1:14507 (udp)
TestKVGetCommand_Base64 - 2019/12/06 06:37:27.044431 [INFO] agent: Stopping HTTP server 127.0.0.1:14508 (tcp)
TestKVGetCommand_Base64 - 2019/12/06 06:37:27.045207 [ERR] consul: failed to establish leadership: leadership lost while committing log
TestKVGetCommand_Base64 - 2019/12/06 06:37:27.045487 [ERR] consul: failed to transfer leadership attempt 0/3: raft is already shutdown
TestKVGetCommand_Base64 - 2019/12/06 06:37:27.045558 [ERR] consul: failed to transfer leadership attempt 1/3: raft is already shutdown
TestKVGetCommand_Base64 - 2019/12/06 06:37:27.046514 [INFO] agent: Waiting for endpoints to shut down
TestKVGetCommand_Base64 - 2019/12/06 06:37:27.046759 [INFO] agent: Endpoints down
--- PASS: TestKVGetCommand_Base64 (4.79s)
=== CONT  TestKVGetCommand_Validation
--- PASS: TestKVGetCommand_Validation (0.00s)
=== CONT  TestKVGetCommand_Missing
WARNING: bootstrap = true: do not enable unless necessary
TestKVGetCommand_Missing - 2019/12/06 06:37:27.118978 [WARN] agent: Node name "Node 68b75ba3-ee2c-aeb9-118e-6148db442877" will not be discoverable via DNS due to invalid characters. Valid characters include all alpha-numerics and dashes.
TestKVGetCommand_Missing - 2019/12/06 06:37:27.119896 [DEBUG] tlsutil: Update with version 1
TestKVGetCommand_Missing - 2019/12/06 06:37:27.122698 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
TestKVGetCommand_Recurse - 2019/12/06 06:37:27.232402 [DEBUG] http: Request PUT /v1/kv/foo/b (1.763965376s) from=127.0.0.1:44140
TestKVGetCommand_Recurse - 2019/12/06 06:37:27.356183 [DEBUG] agent: Node info in sync
TestKVGetCommand_Recurse - 2019/12/06 06:37:27.356343 [DEBUG] agent: Node info in sync
TestKVGetCommand_RecurseBase64 - 2019/12/06 06:37:27.517672 [DEBUG] http: Request PUT /v1/kv/foo/c (569.611028ms) from=127.0.0.1:46684
TestKVGetCommand_RecurseBase64 - 2019/12/06 06:37:27.522763 [DEBUG] http: Request GET /v1/kv/foo?recurse= (1.167361ms) from=127.0.0.1:46696
TestKVGetCommand_RecurseBase64 - 2019/12/06 06:37:27.524809 [INFO] agent: Requesting shutdown
TestKVGetCommand_RecurseBase64 - 2019/12/06 06:37:27.524926 [INFO] consul: shutting down server
TestKVGetCommand_RecurseBase64 - 2019/12/06 06:37:27.524977 [WARN] serf: Shutdown without a Leave
TestKVGetCommand_RecurseBase64 - 2019/12/06 06:37:27.620085 [DEBUG] agent: Node info in sync
TestKVGetCommand_RecurseBase64 - 2019/12/06 06:37:27.625885 [WARN] serf: Shutdown without a Leave
TestKVGetCommand_Recurse - 2019/12/06 06:37:27.627410 [DEBUG] http: Request PUT /v1/kv/foo/c (392.569875ms) from=127.0.0.1:44140
TestKVGetCommand_Recurse - 2019/12/06 06:37:27.634047 [DEBUG] http: Request GET /v1/kv/foo?recurse= (2.13005ms) from=127.0.0.1:44152
TestKVGetCommand_Recurse - 2019/12/06 06:37:27.640151 [INFO] agent: Requesting shutdown
TestKVGetCommand_Recurse - 2019/12/06 06:37:27.640286 [INFO] consul: shutting down server
TestKVGetCommand_Recurse - 2019/12/06 06:37:27.640398 [WARN] serf: Shutdown without a Leave
TestKVGetCommand_Recurse - 2019/12/06 06:37:27.751740 [WARN] serf: Shutdown without a Leave
TestKVGetCommand_RecurseBase64 - 2019/12/06 06:37:27.752436 [INFO] manager: shutting down
2019/12/06 06:37:27 [INFO]  raft: Initial configuration (index=1): [{Suffrage:Voter ID:30fab450-8e8f-6cac-b2e7-ed1d6ae89ce7 Address:127.0.0.1:14530}]
2019/12/06 06:37:27 [INFO]  raft: Node at 127.0.0.1:14530 [Follower] entering Follower state (Leader: "")
TestKVGetCommand - 2019/12/06 06:37:27.757195 [INFO] serf: EventMemberJoin: Node 30fab450-8e8f-6cac-b2e7-ed1d6ae89ce7.dc1 127.0.0.1
TestKVGetCommand - 2019/12/06 06:37:27.762853 [INFO] serf: EventMemberJoin: Node 30fab450-8e8f-6cac-b2e7-ed1d6ae89ce7 127.0.0.1
TestKVGetCommand - 2019/12/06 06:37:27.765373 [INFO] consul: Adding LAN server Node 30fab450-8e8f-6cac-b2e7-ed1d6ae89ce7 (Addr: tcp/127.0.0.1:14530) (DC: dc1)
TestKVGetCommand - 2019/12/06 06:37:27.767256 [INFO] consul: Handled member-join event for server "Node 30fab450-8e8f-6cac-b2e7-ed1d6ae89ce7.dc1" in area "wan"
TestKVGetCommand - 2019/12/06 06:37:27.778684 [INFO] agent: Started DNS server 127.0.0.1:14525 (udp)
TestKVGetCommand - 2019/12/06 06:37:27.781598 [INFO] agent: Started DNS server 127.0.0.1:14525 (tcp)
TestKVGetCommand - 2019/12/06 06:37:27.792217 [INFO] agent: Started HTTP server on 127.0.0.1:14526 (tcp)
TestKVGetCommand - 2019/12/06 06:37:27.792480 [INFO] agent: started state syncer
2019/12/06 06:37:27 [WARN]  raft: Heartbeat timeout from "" reached, starting election
2019/12/06 06:37:27 [INFO]  raft: Node at 127.0.0.1:14530 [Candidate] entering Candidate state in term 2
TestKVGetCommand_RecurseBase64 - 2019/12/06 06:37:27.860524 [INFO] agent: consul server down
TestKVGetCommand_RecurseBase64 - 2019/12/06 06:37:27.860589 [INFO] agent: shutdown complete
TestKVGetCommand_RecurseBase64 - 2019/12/06 06:37:27.860649 [INFO] agent: Stopping DNS server 127.0.0.1:14519 (tcp)
TestKVGetCommand_RecurseBase64 - 2019/12/06 06:37:27.860790 [INFO] agent: Stopping DNS server 127.0.0.1:14519 (udp)
TestKVGetCommand_RecurseBase64 - 2019/12/06 06:37:27.860954 [INFO] agent: Stopping HTTP server 127.0.0.1:14520 (tcp)
TestKVGetCommand_RecurseBase64 - 2019/12/06 06:37:27.861678 [INFO] agent: Waiting for endpoints to shut down
TestKVGetCommand_RecurseBase64 - 2019/12/06 06:37:27.861820 [ERR] consul: failed to establish leadership: error generating CA root certificate: leadership lost while committing log
TestKVGetCommand_RecurseBase64 - 2019/12/06 06:37:27.862021 [INFO] agent: Endpoints down
--- PASS: TestKVGetCommand_RecurseBase64 (5.61s)
=== CONT  TestKVGetCommand_Keys
TestKVGetCommand_Recurse - 2019/12/06 06:37:27.865700 [INFO] manager: shutting down
TestKVGetCommand_RecurseBase64 - 2019/12/06 06:37:27.866264 [ERR] consul: failed to transfer leadership attempt 0/3: raft is already shutdown
WARNING: bootstrap = true: do not enable unless necessary
TestKVGetCommand_Keys - 2019/12/06 06:37:27.932905 [WARN] agent: Node name "Node 9fb520de-89e1-93a6-6840-552754fbe2ec" will not be discoverable via DNS due to invalid characters. Valid characters include all alpha-numerics and dashes.
TestKVGetCommand_Keys - 2019/12/06 06:37:27.933475 [DEBUG] tlsutil: Update with version 1
TestKVGetCommand_Recurse - 2019/12/06 06:37:27.935429 [INFO] agent: consul server down
TestKVGetCommand_Recurse - 2019/12/06 06:37:27.935503 [INFO] agent: shutdown complete
TestKVGetCommand_Recurse - 2019/12/06 06:37:27.935568 [INFO] agent: Stopping DNS server 127.0.0.1:14513 (tcp)
TestKVGetCommand_Recurse - 2019/12/06 06:37:27.935631 [ERR] consul: failed to establish leadership: error generating CA root certificate: leadership lost while committing log
TestKVGetCommand_Recurse - 2019/12/06 06:37:27.935712 [INFO] agent: Stopping DNS server 127.0.0.1:14513 (udp)
TestKVGetCommand_Recurse - 2019/12/06 06:37:27.935822 [ERR] consul: failed to transfer leadership attempt 0/3: raft is already shutdown
TestKVGetCommand_Recurse - 2019/12/06 06:37:27.935851 [INFO] agent: Stopping HTTP server 127.0.0.1:14514 (tcp)
TestKVGetCommand_Recurse - 2019/12/06 06:37:27.935876 [ERR] consul: failed to transfer leadership attempt 1/3: raft is already shutdown
TestKVGetCommand_Recurse - 2019/12/06 06:37:27.935928 [ERR] consul: failed to transfer leadership attempt 2/3: raft is already shutdown
TestKVGetCommand_Recurse - 2019/12/06 06:37:27.935980 [ERR] consul: failed to transfer leadership in 3 attempts
TestKVGetCommand_Recurse - 2019/12/06 06:37:27.936527 [INFO] agent: Waiting for endpoints to shut down
TestKVGetCommand_Recurse - 2019/12/06 06:37:27.936609 [INFO] agent: Endpoints down
--- PASS: TestKVGetCommand_Recurse (5.67s)
=== CONT  TestKVGetCommand_Detailed
TestKVGetCommand_Keys - 2019/12/06 06:37:27.952261 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
WARNING: bootstrap = true: do not enable unless necessary
TestKVGetCommand_Detailed - 2019/12/06 06:37:28.014996 [WARN] agent: Node name "Node 4de63ef8-d5f8-100e-4c8c-353a90274c4c" will not be discoverable via DNS due to invalid characters. Valid characters include all alpha-numerics and dashes.
TestKVGetCommand_Detailed - 2019/12/06 06:37:28.015625 [DEBUG] tlsutil: Update with version 1
TestKVGetCommand_Detailed - 2019/12/06 06:37:28.018290 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
2019/12/06 06:37:28 [INFO]  raft: Initial configuration (index=1): [{Suffrage:Voter ID:68b75ba3-ee2c-aeb9-118e-6148db442877 Address:127.0.0.1:14536}]
2019/12/06 06:37:28 [INFO]  raft: Node at 127.0.0.1:14536 [Follower] entering Follower state (Leader: "")
TestKVGetCommand_Missing - 2019/12/06 06:37:28.480847 [INFO] serf: EventMemberJoin: Node 68b75ba3-ee2c-aeb9-118e-6148db442877.dc1 127.0.0.1
TestKVGetCommand_Missing - 2019/12/06 06:37:28.484100 [INFO] serf: EventMemberJoin: Node 68b75ba3-ee2c-aeb9-118e-6148db442877 127.0.0.1
TestKVGetCommand_Missing - 2019/12/06 06:37:28.484980 [INFO] consul: Adding LAN server Node 68b75ba3-ee2c-aeb9-118e-6148db442877 (Addr: tcp/127.0.0.1:14536) (DC: dc1)
TestKVGetCommand_Missing - 2019/12/06 06:37:28.485289 [INFO] consul: Handled member-join event for server "Node 68b75ba3-ee2c-aeb9-118e-6148db442877.dc1" in area "wan"
TestKVGetCommand_Missing - 2019/12/06 06:37:28.485375 [INFO] agent: Started DNS server 127.0.0.1:14531 (udp)
TestKVGetCommand_Missing - 2019/12/06 06:37:28.485771 [INFO] agent: Started DNS server 127.0.0.1:14531 (tcp)
TestKVGetCommand_Missing - 2019/12/06 06:37:28.488154 [INFO] agent: Started HTTP server on 127.0.0.1:14532 (tcp)
TestKVGetCommand_Missing - 2019/12/06 06:37:28.488234 [INFO] agent: started state syncer
2019/12/06 06:37:28 [WARN]  raft: Heartbeat timeout from "" reached, starting election
2019/12/06 06:37:28 [INFO]  raft: Node at 127.0.0.1:14536 [Candidate] entering Candidate state in term 2
2019/12/06 06:37:28 [INFO]  raft: Election won. Tally: 1
2019/12/06 06:37:28 [INFO]  raft: Node at 127.0.0.1:14530 [Leader] entering Leader state
TestKVGetCommand - 2019/12/06 06:37:28.770329 [INFO] consul: cluster leadership acquired
TestKVGetCommand - 2019/12/06 06:37:28.770812 [INFO] consul: New leader elected: Node 30fab450-8e8f-6cac-b2e7-ed1d6ae89ce7
TestKVGetCommand - 2019/12/06 06:37:29.127526 [INFO] agent: Synced node info
2019/12/06 06:37:29 [INFO]  raft: Election won. Tally: 1
2019/12/06 06:37:29 [INFO]  raft: Node at 127.0.0.1:14536 [Leader] entering Leader state
TestKVGetCommand_Missing - 2019/12/06 06:37:29.128680 [INFO] consul: cluster leadership acquired
TestKVGetCommand_Missing - 2019/12/06 06:37:29.129105 [INFO] consul: New leader elected: Node 68b75ba3-ee2c-aeb9-118e-6148db442877
TestKVGetCommand_Missing - 2019/12/06 06:37:29.151161 [DEBUG] http: Request GET /v1/kv/not-a-real-key (465.344µs) from=127.0.0.1:33434
TestKVGetCommand_Missing - 2019/12/06 06:37:29.151925 [INFO] agent: Requesting shutdown
TestKVGetCommand_Missing - 2019/12/06 06:37:29.151999 [INFO] consul: shutting down server
TestKVGetCommand_Missing - 2019/12/06 06:37:29.152045 [WARN] serf: Shutdown without a Leave
TestKVGetCommand_Missing - 2019/12/06 06:37:29.152167 [ERR] agent: failed to sync remote state: No cluster leader
2019/12/06 06:37:29 [INFO]  raft: Initial configuration (index=1): [{Suffrage:Voter ID:9fb520de-89e1-93a6-6840-552754fbe2ec Address:127.0.0.1:14542}]
2019/12/06 06:37:29 [INFO]  raft: Node at 127.0.0.1:14542 [Follower] entering Follower state (Leader: "")
TestKVGetCommand_Keys - 2019/12/06 06:37:29.234657 [INFO] serf: EventMemberJoin: Node 9fb520de-89e1-93a6-6840-552754fbe2ec.dc1 127.0.0.1
TestKVGetCommand_Keys - 2019/12/06 06:37:29.240552 [INFO] serf: EventMemberJoin: Node 9fb520de-89e1-93a6-6840-552754fbe2ec 127.0.0.1
TestKVGetCommand_Keys - 2019/12/06 06:37:29.241577 [INFO] consul: Adding LAN server Node 9fb520de-89e1-93a6-6840-552754fbe2ec (Addr: tcp/127.0.0.1:14542) (DC: dc1)
TestKVGetCommand_Keys - 2019/12/06 06:37:29.242099 [INFO] consul: Handled member-join event for server "Node 9fb520de-89e1-93a6-6840-552754fbe2ec.dc1" in area "wan"
TestKVGetCommand_Keys - 2019/12/06 06:37:29.242995 [INFO] agent: Started DNS server 127.0.0.1:14537 (tcp)
TestKVGetCommand_Keys - 2019/12/06 06:37:29.243614 [INFO] agent: Started DNS server 127.0.0.1:14537 (udp)
TestKVGetCommand_Keys - 2019/12/06 06:37:29.246408 [INFO] agent: Started HTTP server on 127.0.0.1:14538 (tcp)
TestKVGetCommand_Keys - 2019/12/06 06:37:29.246534 [INFO] agent: started state syncer
2019/12/06 06:37:29 [WARN]  raft: Heartbeat timeout from "" reached, starting election
2019/12/06 06:37:29 [INFO]  raft: Node at 127.0.0.1:14542 [Candidate] entering Candidate state in term 2
TestKVGetCommand_Missing - 2019/12/06 06:37:29.317601 [WARN] serf: Shutdown without a Leave
2019/12/06 06:37:29 [INFO]  raft: Initial configuration (index=1): [{Suffrage:Voter ID:4de63ef8-d5f8-100e-4c8c-353a90274c4c Address:127.0.0.1:14548}]
2019/12/06 06:37:29 [INFO]  raft: Node at 127.0.0.1:14548 [Follower] entering Follower state (Leader: "")
TestKVGetCommand_Missing - 2019/12/06 06:37:29.470296 [INFO] manager: shutting down
TestKVGetCommand - 2019/12/06 06:37:29.470342 [DEBUG] http: Request PUT /v1/kv/foo (480.744944ms) from=127.0.0.1:58032
TestKVGetCommand_Detailed - 2019/12/06 06:37:29.474824 [INFO] serf: EventMemberJoin: Node 4de63ef8-d5f8-100e-4c8c-353a90274c4c.dc1 127.0.0.1
TestKVGetCommand - 2019/12/06 06:37:29.477292 [DEBUG] http: Request GET /v1/kv/foo (3.00107ms) from=127.0.0.1:58036
TestKVGetCommand - 2019/12/06 06:37:29.480121 [INFO] agent: Requesting shutdown
TestKVGetCommand - 2019/12/06 06:37:29.480266 [INFO] consul: shutting down server
TestKVGetCommand - 2019/12/06 06:37:29.480330 [WARN] serf: Shutdown without a Leave
TestKVGetCommand_Detailed - 2019/12/06 06:37:29.480838 [INFO] serf: EventMemberJoin: Node 4de63ef8-d5f8-100e-4c8c-353a90274c4c 127.0.0.1
TestKVGetCommand_Detailed - 2019/12/06 06:37:29.483215 [INFO] consul: Adding LAN server Node 4de63ef8-d5f8-100e-4c8c-353a90274c4c (Addr: tcp/127.0.0.1:14548) (DC: dc1)
TestKVGetCommand_Detailed - 2019/12/06 06:37:29.483623 [INFO] agent: Started DNS server 127.0.0.1:14543 (udp)
TestKVGetCommand_Detailed - 2019/12/06 06:37:29.484040 [INFO] consul: Handled member-join event for server "Node 4de63ef8-d5f8-100e-4c8c-353a90274c4c.dc1" in area "wan"
TestKVGetCommand_Detailed - 2019/12/06 06:37:29.484747 [INFO] agent: Started DNS server 127.0.0.1:14543 (tcp)
TestKVGetCommand_Detailed - 2019/12/06 06:37:29.487664 [INFO] agent: Started HTTP server on 127.0.0.1:14544 (tcp)
TestKVGetCommand_Detailed - 2019/12/06 06:37:29.487999 [INFO] agent: started state syncer
2019/12/06 06:37:29 [WARN]  raft: Heartbeat timeout from "" reached, starting election
2019/12/06 06:37:29 [INFO]  raft: Node at 127.0.0.1:14548 [Candidate] entering Candidate state in term 2
TestKVGetCommand - 2019/12/06 06:37:29.750939 [WARN] serf: Shutdown without a Leave
TestKVGetCommand - 2019/12/06 06:37:29.942675 [INFO] manager: shutting down
TestKVGetCommand_Missing - 2019/12/06 06:37:29.943215 [INFO] agent: consul server down
TestKVGetCommand_Missing - 2019/12/06 06:37:29.943271 [INFO] agent: shutdown complete
TestKVGetCommand_Missing - 2019/12/06 06:37:29.943332 [INFO] agent: Stopping DNS server 127.0.0.1:14531 (tcp)
TestKVGetCommand - 2019/12/06 06:37:29.943466 [INFO] agent: consul server down
TestKVGetCommand_Missing - 2019/12/06 06:37:29.943488 [INFO] agent: Stopping DNS server 127.0.0.1:14531 (udp)
TestKVGetCommand - 2019/12/06 06:37:29.943512 [INFO] agent: shutdown complete
TestKVGetCommand - 2019/12/06 06:37:29.943603 [INFO] agent: Stopping DNS server 127.0.0.1:14525 (tcp)
TestKVGetCommand_Missing - 2019/12/06 06:37:29.943653 [INFO] agent: Stopping HTTP server 127.0.0.1:14532 (tcp)
TestKVGetCommand - 2019/12/06 06:37:29.943746 [INFO] agent: Stopping DNS server 127.0.0.1:14525 (udp)
TestKVGetCommand_Missing - 2019/12/06 06:37:29.943799 [ERR] consul: failed to wait for barrier: leadership lost while committing log
TestKVGetCommand - 2019/12/06 06:37:29.943886 [INFO] agent: Stopping HTTP server 127.0.0.1:14526 (tcp)
TestKVGetCommand_Missing - 2019/12/06 06:37:29.944260 [INFO] agent: Waiting for endpoints to shut down
TestKVGetCommand - 2019/12/06 06:37:29.944295 [ERR] consul: failed to establish leadership: raft is already shutdown
TestKVGetCommand_Missing - 2019/12/06 06:37:29.944397 [INFO] agent: Endpoints down
TestKVGetCommand - 2019/12/06 06:37:29.944446 [ERR] consul: failed to transfer leadership attempt 0/3: raft is already shutdown
--- PASS: TestKVGetCommand_Missing (2.89s)
TestKVGetCommand - 2019/12/06 06:37:29.944527 [INFO] agent: Waiting for endpoints to shut down
TestKVGetCommand - 2019/12/06 06:37:29.944565 [INFO] agent: Endpoints down
--- PASS: TestKVGetCommand (3.38s)
2019/12/06 06:37:30 [INFO]  raft: Election won. Tally: 1
2019/12/06 06:37:30 [INFO]  raft: Node at 127.0.0.1:14548 [Leader] entering Leader state
TestKVGetCommand_Detailed - 2019/12/06 06:37:30.369146 [INFO] consul: cluster leadership acquired
TestKVGetCommand_Detailed - 2019/12/06 06:37:30.369630 [INFO] consul: New leader elected: Node 4de63ef8-d5f8-100e-4c8c-353a90274c4c
2019/12/06 06:37:30 [INFO]  raft: Election won. Tally: 1
2019/12/06 06:37:30 [INFO]  raft: Node at 127.0.0.1:14542 [Leader] entering Leader state
TestKVGetCommand_Keys - 2019/12/06 06:37:30.453138 [INFO] consul: cluster leadership acquired
TestKVGetCommand_Keys - 2019/12/06 06:37:30.453666 [INFO] consul: New leader elected: Node 9fb520de-89e1-93a6-6840-552754fbe2ec
TestKVGetCommand_Detailed - 2019/12/06 06:37:30.971100 [INFO] agent: Synced node info
TestKVGetCommand_Detailed - 2019/12/06 06:37:30.971161 [DEBUG] http: Request PUT /v1/kv/foo (527.112031ms) from=127.0.0.1:49986
TestKVGetCommand_Detailed - 2019/12/06 06:37:30.972753 [DEBUG] agent: Node info in sync
TestKVGetCommand_Detailed - 2019/12/06 06:37:30.976661 [DEBUG] http: Request GET /v1/kv/foo (990.023µs) from=127.0.0.1:49990
TestKVGetCommand_Detailed - 2019/12/06 06:37:30.978605 [INFO] agent: Requesting shutdown
TestKVGetCommand_Detailed - 2019/12/06 06:37:30.978699 [INFO] consul: shutting down server
TestKVGetCommand_Detailed - 2019/12/06 06:37:30.978751 [WARN] serf: Shutdown without a Leave
TestKVGetCommand_Keys - 2019/12/06 06:37:31.044640 [DEBUG] http: Request PUT /v1/kv/foo/bar (435.183208ms) from=127.0.0.1:58412
TestKVGetCommand_Keys - 2019/12/06 06:37:31.045374 [INFO] agent: Synced node info
TestKVGetCommand_Detailed - 2019/12/06 06:37:31.135868 [WARN] serf: Shutdown without a Leave
TestKVGetCommand_Detailed - 2019/12/06 06:37:31.236292 [INFO] manager: shutting down
TestKVGetCommand_Keys - 2019/12/06 06:37:31.916033 [DEBUG] agent: Node info in sync
TestKVGetCommand_Keys - 2019/12/06 06:37:31.916137 [DEBUG] agent: Node info in sync
TestKVGetCommand_Detailed - 2019/12/06 06:37:32.192564 [ERR] consul: failed to establish leadership: leadership lost while committing log
TestKVGetCommand_Detailed - 2019/12/06 06:37:32.192795 [ERR] consul: failed to transfer leadership attempt 0/3: raft is already shutdown
TestKVGetCommand_Detailed - 2019/12/06 06:37:32.192874 [ERR] consul: failed to transfer leadership attempt 1/3: raft is already shutdown
TestKVGetCommand_Detailed - 2019/12/06 06:37:32.192828 [INFO] agent: consul server down
TestKVGetCommand_Detailed - 2019/12/06 06:37:32.192957 [INFO] agent: shutdown complete
TestKVGetCommand_Detailed - 2019/12/06 06:37:32.193018 [INFO] agent: Stopping DNS server 127.0.0.1:14543 (tcp)
TestKVGetCommand_Detailed - 2019/12/06 06:37:32.193161 [INFO] agent: Stopping DNS server 127.0.0.1:14543 (udp)
TestKVGetCommand_Detailed - 2019/12/06 06:37:32.193332 [INFO] agent: Stopping HTTP server 127.0.0.1:14544 (tcp)
TestKVGetCommand_Keys - 2019/12/06 06:37:32.193863 [DEBUG] http: Request PUT /v1/kv/foo/baz (1.146319555s) from=127.0.0.1:58412
TestKVGetCommand_Detailed - 2019/12/06 06:37:32.194426 [INFO] agent: Waiting for endpoints to shut down
TestKVGetCommand_Detailed - 2019/12/06 06:37:32.194522 [INFO] agent: Endpoints down
--- PASS: TestKVGetCommand_Detailed (4.26s)
TestKVGetCommand_Keys - 2019/12/06 06:37:32.413240 [DEBUG] http: Request PUT /v1/kv/foo/zip (217.350098ms) from=127.0.0.1:58412
TestKVGetCommand_Keys - 2019/12/06 06:37:32.418479 [DEBUG] http: Request GET /v1/kv/foo/?keys=&separator=%2F (1.917379ms) from=127.0.0.1:58416
TestKVGetCommand_Keys - 2019/12/06 06:37:32.419830 [INFO] agent: Requesting shutdown
TestKVGetCommand_Keys - 2019/12/06 06:37:32.419905 [INFO] consul: shutting down server
TestKVGetCommand_Keys - 2019/12/06 06:37:32.419951 [WARN] serf: Shutdown without a Leave
TestKVGetCommand_Keys - 2019/12/06 06:37:32.911098 [WARN] serf: Shutdown without a Leave
TestKVGetCommand_Keys - 2019/12/06 06:37:32.984449 [INFO] manager: shutting down
TestKVGetCommand_Keys - 2019/12/06 06:37:33.237499 [ERR] autopilot: Error updating cluster health: error getting Raft configuration raft is already shutdown
TestKVGetCommand_Keys - 2019/12/06 06:37:33.242723 [ERR] consul: failed to establish leadership: error generating CA root certificate: leadership lost while committing log
TestKVGetCommand_Keys - 2019/12/06 06:37:33.243044 [INFO] agent: consul server down
TestKVGetCommand_Keys - 2019/12/06 06:37:33.243096 [INFO] agent: shutdown complete
TestKVGetCommand_Keys - 2019/12/06 06:37:33.243154 [INFO] agent: Stopping DNS server 127.0.0.1:14537 (tcp)
TestKVGetCommand_Keys - 2019/12/06 06:37:33.243332 [INFO] agent: Stopping DNS server 127.0.0.1:14537 (udp)
TestKVGetCommand_Keys - 2019/12/06 06:37:33.243583 [INFO] agent: Stopping HTTP server 127.0.0.1:14538 (tcp)
TestKVGetCommand_Keys - 2019/12/06 06:37:33.244677 [INFO] agent: Waiting for endpoints to shut down
TestKVGetCommand_Keys - 2019/12/06 06:37:33.244810 [INFO] agent: Endpoints down
--- PASS: TestKVGetCommand_Keys (5.38s)
PASS
ok  	github.com/hashicorp/consul/command/kv/get	11.310s
=== RUN   TestKVImportCommand_noTabs
=== PAUSE TestKVImportCommand_noTabs
=== RUN   TestKVImportCommand
=== PAUSE TestKVImportCommand
=== CONT  TestKVImportCommand_noTabs
=== CONT  TestKVImportCommand
--- PASS: TestKVImportCommand_noTabs (0.00s)
WARNING: bootstrap = true: do not enable unless necessary
TestKVImportCommand - 2019/12/06 06:37:33.881062 [WARN] agent: Node name "Node 47a6e393-3eb2-643c-ea59-44d0932f3377" will not be discoverable via DNS due to invalid characters. Valid characters include all alpha-numerics and dashes.
TestKVImportCommand - 2019/12/06 06:37:33.882265 [DEBUG] tlsutil: Update with version 1
TestKVImportCommand - 2019/12/06 06:37:33.889699 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
2019/12/06 06:37:35 [INFO]  raft: Initial configuration (index=1): [{Suffrage:Voter ID:47a6e393-3eb2-643c-ea59-44d0932f3377 Address:127.0.0.1:20506}]
TestKVImportCommand - 2019/12/06 06:37:35.153603 [INFO] serf: EventMemberJoin: Node 47a6e393-3eb2-643c-ea59-44d0932f3377.dc1 127.0.0.1
2019/12/06 06:37:35 [INFO]  raft: Node at 127.0.0.1:20506 [Follower] entering Follower state (Leader: "")
TestKVImportCommand - 2019/12/06 06:37:35.185690 [INFO] serf: EventMemberJoin: Node 47a6e393-3eb2-643c-ea59-44d0932f3377 127.0.0.1
TestKVImportCommand - 2019/12/06 06:37:35.188171 [INFO] consul: Adding LAN server Node 47a6e393-3eb2-643c-ea59-44d0932f3377 (Addr: tcp/127.0.0.1:20506) (DC: dc1)
TestKVImportCommand - 2019/12/06 06:37:35.191968 [INFO] consul: Handled member-join event for server "Node 47a6e393-3eb2-643c-ea59-44d0932f3377.dc1" in area "wan"
TestKVImportCommand - 2019/12/06 06:37:35.199431 [INFO] agent: Started DNS server 127.0.0.1:20501 (udp)
TestKVImportCommand - 2019/12/06 06:37:35.199549 [INFO] agent: Started DNS server 127.0.0.1:20501 (tcp)
TestKVImportCommand - 2019/12/06 06:37:35.202955 [INFO] agent: Started HTTP server on 127.0.0.1:20502 (tcp)
TestKVImportCommand - 2019/12/06 06:37:35.203149 [INFO] agent: started state syncer
2019/12/06 06:37:35 [WARN]  raft: Heartbeat timeout from "" reached, starting election
2019/12/06 06:37:35 [INFO]  raft: Node at 127.0.0.1:20506 [Candidate] entering Candidate state in term 2
2019/12/06 06:37:35 [INFO]  raft: Election won. Tally: 1
2019/12/06 06:37:35 [INFO]  raft: Node at 127.0.0.1:20506 [Leader] entering Leader state
TestKVImportCommand - 2019/12/06 06:37:35.744935 [INFO] consul: cluster leadership acquired
TestKVImportCommand - 2019/12/06 06:37:35.745691 [INFO] consul: New leader elected: Node 47a6e393-3eb2-643c-ea59-44d0932f3377
TestKVImportCommand - 2019/12/06 06:37:36.468489 [INFO] agent: Synced node info
TestKVImportCommand - 2019/12/06 06:37:36.468600 [DEBUG] agent: Node info in sync
TestKVImportCommand - 2019/12/06 06:37:36.470815 [DEBUG] http: Request PUT /v1/kv/foo (669.447369ms) from=127.0.0.1:48586
TestKVImportCommand - 2019/12/06 06:37:36.953882 [DEBUG] agent: Node info in sync
TestKVImportCommand - 2019/12/06 06:37:37.362280 [DEBUG] http: Request PUT /v1/kv/foo/a (889.580866ms) from=127.0.0.1:48586
TestKVImportCommand - 2019/12/06 06:37:37.366968 [DEBUG] http: Request GET /v1/kv/foo (1.560703ms) from=127.0.0.1:48588
TestKVImportCommand - 2019/12/06 06:37:37.371252 [DEBUG] http: Request GET /v1/kv/foo/a (1.215362ms) from=127.0.0.1:48588
TestKVImportCommand - 2019/12/06 06:37:37.373073 [INFO] agent: Requesting shutdown
TestKVImportCommand - 2019/12/06 06:37:37.373326 [INFO] consul: shutting down server
TestKVImportCommand - 2019/12/06 06:37:37.373620 [WARN] serf: Shutdown without a Leave
TestKVImportCommand - 2019/12/06 06:37:37.509431 [WARN] serf: Shutdown without a Leave
TestKVImportCommand - 2019/12/06 06:37:37.584528 [INFO] manager: shutting down
TestKVImportCommand - 2019/12/06 06:37:37.717843 [ERR] consul: failed to establish leadership: error configuring provider: leadership lost while committing log
TestKVImportCommand - 2019/12/06 06:37:37.718108 [ERR] consul: failed to transfer leadership attempt 0/3: raft is already shutdown
TestKVImportCommand - 2019/12/06 06:37:37.718175 [ERR] consul: failed to transfer leadership attempt 1/3: raft is already shutdown
TestKVImportCommand - 2019/12/06 06:37:37.718624 [INFO] agent: consul server down
TestKVImportCommand - 2019/12/06 06:37:37.718803 [INFO] agent: shutdown complete
TestKVImportCommand - 2019/12/06 06:37:37.718950 [INFO] agent: Stopping DNS server 127.0.0.1:20501 (tcp)
TestKVImportCommand - 2019/12/06 06:37:37.719415 [INFO] agent: Stopping DNS server 127.0.0.1:20501 (udp)
TestKVImportCommand - 2019/12/06 06:37:37.719859 [INFO] agent: Stopping HTTP server 127.0.0.1:20502 (tcp)
TestKVImportCommand - 2019/12/06 06:37:37.721138 [INFO] agent: Waiting for endpoints to shut down
TestKVImportCommand - 2019/12/06 06:37:37.721392 [INFO] agent: Endpoints down
--- PASS: TestKVImportCommand (4.34s)
PASS
ok  	github.com/hashicorp/consul/command/kv/imp	4.590s
?   	github.com/hashicorp/consul/command/kv/impexp	[no test files]
=== RUN   TestKVPutCommand_noTabs
=== PAUSE TestKVPutCommand_noTabs
=== RUN   TestKVPutCommand_Validation
=== PAUSE TestKVPutCommand_Validation
=== RUN   TestKVPutCommand
=== PAUSE TestKVPutCommand
=== RUN   TestKVPutCommand_EmptyDataQuoted
--- SKIP: TestKVPutCommand_EmptyDataQuoted (0.00s)
    kv_put_test.go:108: DM-skipped
=== RUN   TestKVPutCommand_Base64
=== PAUSE TestKVPutCommand_Base64
=== RUN   TestKVPutCommand_File
=== PAUSE TestKVPutCommand_File
=== RUN   TestKVPutCommand_FileNoExist
=== PAUSE TestKVPutCommand_FileNoExist
=== RUN   TestKVPutCommand_Stdin
=== PAUSE TestKVPutCommand_Stdin
=== RUN   TestKVPutCommand_NegativeVal
=== PAUSE TestKVPutCommand_NegativeVal
=== RUN   TestKVPutCommand_Flags
=== PAUSE TestKVPutCommand_Flags
=== RUN   TestKVPutCommand_CAS
=== PAUSE TestKVPutCommand_CAS
=== CONT  TestKVPutCommand_noTabs
=== CONT  TestKVPutCommand_Stdin
=== CONT  TestKVPutCommand_Flags
=== CONT  TestKVPutCommand_CAS
=== CONT  TestKVPutCommand_NegativeVal
--- PASS: TestKVPutCommand_noTabs (0.03s)
WARNING: bootstrap = true: do not enable unless necessary
TestKVPutCommand_CAS - 2019/12/06 06:37:40.884419 [WARN] agent: Node name "Node 511c882c-ac4c-6cde-cb8e-e8dd3e0b2459" will not be discoverable via DNS due to invalid characters. Valid characters include all alpha-numerics and dashes.
TestKVPutCommand_CAS - 2019/12/06 06:37:40.885561 [DEBUG] tlsutil: Update with version 1
WARNING: bootstrap = true: do not enable unless necessary
TestKVPutCommand_Stdin - 2019/12/06 06:37:40.897287 [WARN] agent: Node name "Node 8db51472-05f2-0a9a-06bc-858c4e06d43b" will not be discoverable via DNS due to invalid characters. Valid characters include all alpha-numerics and dashes.
TestKVPutCommand_Stdin - 2019/12/06 06:37:40.897809 [DEBUG] tlsutil: Update with version 1
WARNING: bootstrap = true: do not enable unless necessary
TestKVPutCommand_Flags - 2019/12/06 06:37:40.940977 [WARN] agent: Node name "Node 74de0358-bd74-4084-4e0b-a8dbdc33630e" will not be discoverable via DNS due to invalid characters. Valid characters include all alpha-numerics and dashes.
WARNING: bootstrap = true: do not enable unless necessary
TestKVPutCommand_Flags - 2019/12/06 06:37:40.941759 [DEBUG] tlsutil: Update with version 1
TestKVPutCommand_NegativeVal - 2019/12/06 06:37:40.941959 [WARN] agent: Node name "Node 505397f0-5574-f7de-edee-fc4ac79f0864" will not be discoverable via DNS due to invalid characters. Valid characters include all alpha-numerics and dashes.
TestKVPutCommand_NegativeVal - 2019/12/06 06:37:40.942431 [DEBUG] tlsutil: Update with version 1
TestKVPutCommand_CAS - 2019/12/06 06:37:40.944930 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
TestKVPutCommand_Stdin - 2019/12/06 06:37:40.945486 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
TestKVPutCommand_NegativeVal - 2019/12/06 06:37:40.945782 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
TestKVPutCommand_Flags - 2019/12/06 06:37:40.946040 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
2019/12/06 06:37:41 [INFO]  raft: Initial configuration (index=1): [{Suffrage:Voter ID:505397f0-5574-f7de-edee-fc4ac79f0864 Address:127.0.0.1:19024}]
2019/12/06 06:37:41 [INFO]  raft: Node at 127.0.0.1:19024 [Follower] entering Follower state (Leader: "")
2019/12/06 06:37:41 [INFO]  raft: Initial configuration (index=1): [{Suffrage:Voter ID:511c882c-ac4c-6cde-cb8e-e8dd3e0b2459 Address:127.0.0.1:19018}]
2019/12/06 06:37:41 [INFO]  raft: Node at 127.0.0.1:19018 [Follower] entering Follower state (Leader: "")
TestKVPutCommand_NegativeVal - 2019/12/06 06:37:41.932074 [INFO] serf: EventMemberJoin: Node 505397f0-5574-f7de-edee-fc4ac79f0864.dc1 127.0.0.1
TestKVPutCommand_NegativeVal - 2019/12/06 06:37:41.936146 [INFO] serf: EventMemberJoin: Node 505397f0-5574-f7de-edee-fc4ac79f0864 127.0.0.1
TestKVPutCommand_NegativeVal - 2019/12/06 06:37:41.938138 [INFO] consul: Adding LAN server Node 505397f0-5574-f7de-edee-fc4ac79f0864 (Addr: tcp/127.0.0.1:19024) (DC: dc1)
TestKVPutCommand_NegativeVal - 2019/12/06 06:37:41.940487 [INFO] consul: Handled member-join event for server "Node 505397f0-5574-f7de-edee-fc4ac79f0864.dc1" in area "wan"
TestKVPutCommand_CAS - 2019/12/06 06:37:41.941203 [INFO] serf: EventMemberJoin: Node 511c882c-ac4c-6cde-cb8e-e8dd3e0b2459.dc1 127.0.0.1
TestKVPutCommand_NegativeVal - 2019/12/06 06:37:41.944848 [INFO] agent: Started DNS server 127.0.0.1:19019 (tcp)
TestKVPutCommand_NegativeVal - 2019/12/06 06:37:41.946322 [INFO] agent: Started DNS server 127.0.0.1:19019 (udp)
TestKVPutCommand_CAS - 2019/12/06 06:37:41.952017 [INFO] serf: EventMemberJoin: Node 511c882c-ac4c-6cde-cb8e-e8dd3e0b2459 127.0.0.1
TestKVPutCommand_NegativeVal - 2019/12/06 06:37:41.954244 [INFO] agent: Started HTTP server on 127.0.0.1:19020 (tcp)
TestKVPutCommand_NegativeVal - 2019/12/06 06:37:41.954555 [INFO] agent: started state syncer
2019/12/06 06:37:41 [WARN]  raft: Heartbeat timeout from "" reached, starting election
2019/12/06 06:37:41 [INFO]  raft: Node at 127.0.0.1:19024 [Candidate] entering Candidate state in term 2
TestKVPutCommand_CAS - 2019/12/06 06:37:41.958918 [INFO] consul: Adding LAN server Node 511c882c-ac4c-6cde-cb8e-e8dd3e0b2459 (Addr: tcp/127.0.0.1:19018) (DC: dc1)
TestKVPutCommand_CAS - 2019/12/06 06:37:41.960290 [INFO] agent: Started DNS server 127.0.0.1:19013 (udp)
TestKVPutCommand_CAS - 2019/12/06 06:37:41.960550 [INFO] consul: Handled member-join event for server "Node 511c882c-ac4c-6cde-cb8e-e8dd3e0b2459.dc1" in area "wan"
TestKVPutCommand_CAS - 2019/12/06 06:37:41.961061 [INFO] agent: Started DNS server 127.0.0.1:19013 (tcp)
TestKVPutCommand_CAS - 2019/12/06 06:37:41.963452 [INFO] agent: Started HTTP server on 127.0.0.1:19014 (tcp)
TestKVPutCommand_CAS - 2019/12/06 06:37:41.963629 [INFO] agent: started state syncer
2019/12/06 06:37:41 [WARN]  raft: Heartbeat timeout from "" reached, starting election
2019/12/06 06:37:41 [INFO]  raft: Node at 127.0.0.1:19018 [Candidate] entering Candidate state in term 2
2019/12/06 06:37:42 [INFO]  raft: Initial configuration (index=1): [{Suffrage:Voter ID:74de0358-bd74-4084-4e0b-a8dbdc33630e Address:127.0.0.1:19012}]
2019/12/06 06:37:42 [INFO]  raft: Node at 127.0.0.1:19012 [Follower] entering Follower state (Leader: "")
2019/12/06 06:37:42 [WARN]  raft: Heartbeat timeout from "" reached, starting election
2019/12/06 06:37:42 [INFO]  raft: Node at 127.0.0.1:19012 [Candidate] entering Candidate state in term 2
TestKVPutCommand_Flags - 2019/12/06 06:37:42.061521 [INFO] serf: EventMemberJoin: Node 74de0358-bd74-4084-4e0b-a8dbdc33630e.dc1 127.0.0.1
TestKVPutCommand_Flags - 2019/12/06 06:37:42.065376 [INFO] serf: EventMemberJoin: Node 74de0358-bd74-4084-4e0b-a8dbdc33630e 127.0.0.1
TestKVPutCommand_Flags - 2019/12/06 06:37:42.066495 [INFO] consul: Adding LAN server Node 74de0358-bd74-4084-4e0b-a8dbdc33630e (Addr: tcp/127.0.0.1:19012) (DC: dc1)
TestKVPutCommand_Flags - 2019/12/06 06:37:42.066892 [INFO] consul: Handled member-join event for server "Node 74de0358-bd74-4084-4e0b-a8dbdc33630e.dc1" in area "wan"
TestKVPutCommand_Flags - 2019/12/06 06:37:42.067094 [INFO] agent: Started DNS server 127.0.0.1:19007 (udp)
TestKVPutCommand_Flags - 2019/12/06 06:37:42.067416 [INFO] agent: Started DNS server 127.0.0.1:19007 (tcp)
TestKVPutCommand_Flags - 2019/12/06 06:37:42.071769 [INFO] agent: Started HTTP server on 127.0.0.1:19008 (tcp)
TestKVPutCommand_Flags - 2019/12/06 06:37:42.072183 [INFO] agent: started state syncer
2019/12/06 06:37:42 [INFO]  raft: Initial configuration (index=1): [{Suffrage:Voter ID:8db51472-05f2-0a9a-06bc-858c4e06d43b Address:127.0.0.1:19006}]
2019/12/06 06:37:42 [INFO]  raft: Node at 127.0.0.1:19006 [Follower] entering Follower state (Leader: "")
TestKVPutCommand_Stdin - 2019/12/06 06:37:42.135180 [INFO] serf: EventMemberJoin: Node 8db51472-05f2-0a9a-06bc-858c4e06d43b.dc1 127.0.0.1
TestKVPutCommand_Stdin - 2019/12/06 06:37:42.141925 [INFO] serf: EventMemberJoin: Node 8db51472-05f2-0a9a-06bc-858c4e06d43b 127.0.0.1
TestKVPutCommand_Stdin - 2019/12/06 06:37:42.149752 [INFO] consul: Handled member-join event for server "Node 8db51472-05f2-0a9a-06bc-858c4e06d43b.dc1" in area "wan"
TestKVPutCommand_Stdin - 2019/12/06 06:37:42.149852 [INFO] consul: Adding LAN server Node 8db51472-05f2-0a9a-06bc-858c4e06d43b (Addr: tcp/127.0.0.1:19006) (DC: dc1)
TestKVPutCommand_Stdin - 2019/12/06 06:37:42.151934 [INFO] agent: Started DNS server 127.0.0.1:19001 (udp)
TestKVPutCommand_Stdin - 2019/12/06 06:37:42.152159 [INFO] agent: Started DNS server 127.0.0.1:19001 (tcp)
TestKVPutCommand_Stdin - 2019/12/06 06:37:42.154872 [INFO] agent: Started HTTP server on 127.0.0.1:19002 (tcp)
TestKVPutCommand_Stdin - 2019/12/06 06:37:42.155117 [INFO] agent: started state syncer
2019/12/06 06:37:42 [WARN]  raft: Heartbeat timeout from "" reached, starting election
2019/12/06 06:37:42 [INFO]  raft: Node at 127.0.0.1:19006 [Candidate] entering Candidate state in term 2
2019/12/06 06:37:42 [INFO]  raft: Election won. Tally: 1
2019/12/06 06:37:42 [INFO]  raft: Node at 127.0.0.1:19024 [Leader] entering Leader state
2019/12/06 06:37:42 [INFO]  raft: Election won. Tally: 1
2019/12/06 06:37:42 [INFO]  raft: Node at 127.0.0.1:19018 [Leader] entering Leader state
TestKVPutCommand_NegativeVal - 2019/12/06 06:37:42.604979 [INFO] consul: cluster leadership acquired
2019/12/06 06:37:42 [INFO]  raft: Election won. Tally: 1
2019/12/06 06:37:42 [INFO]  raft: Node at 127.0.0.1:19012 [Leader] entering Leader state
TestKVPutCommand_Flags - 2019/12/06 06:37:42.605283 [INFO] consul: cluster leadership acquired
TestKVPutCommand_NegativeVal - 2019/12/06 06:37:42.605488 [INFO] consul: New leader elected: Node 505397f0-5574-f7de-edee-fc4ac79f0864
TestKVPutCommand_Flags - 2019/12/06 06:37:42.605548 [INFO] consul: New leader elected: Node 74de0358-bd74-4084-4e0b-a8dbdc33630e
TestKVPutCommand_CAS - 2019/12/06 06:37:42.605863 [INFO] consul: cluster leadership acquired
TestKVPutCommand_CAS - 2019/12/06 06:37:42.606248 [INFO] consul: New leader elected: Node 511c882c-ac4c-6cde-cb8e-e8dd3e0b2459
2019/12/06 06:37:42 [INFO]  raft: Election won. Tally: 1
2019/12/06 06:37:42 [INFO]  raft: Node at 127.0.0.1:19006 [Leader] entering Leader state
TestKVPutCommand_Stdin - 2019/12/06 06:37:42.687219 [INFO] consul: cluster leadership acquired
TestKVPutCommand_Stdin - 2019/12/06 06:37:42.687625 [INFO] consul: New leader elected: Node 8db51472-05f2-0a9a-06bc-858c4e06d43b
TestKVPutCommand_Flags - 2019/12/06 06:37:42.918666 [INFO] agent: Synced node info
TestKVPutCommand_Flags - 2019/12/06 06:37:42.918791 [DEBUG] agent: Node info in sync
TestKVPutCommand_NegativeVal - 2019/12/06 06:37:42.922428 [INFO] agent: Synced node info
TestKVPutCommand_CAS - 2019/12/06 06:37:42.931615 [DEBUG] http: Request PUT /v1/kv/foo (241.848673ms) from=127.0.0.1:43686
TestKVPutCommand_CAS - 2019/12/06 06:37:42.933072 [INFO] agent: Synced node info
TestKVPutCommand_CAS - 2019/12/06 06:37:42.933196 [DEBUG] agent: Node info in sync
TestKVPutCommand_CAS - 2019/12/06 06:37:43.020574 [DEBUG] agent: Node info in sync
TestKVPutCommand_Stdin - 2019/12/06 06:37:43.072483 [DEBUG] http: Request PUT /v1/kv/foo (349.744537ms) from=127.0.0.1:53166
TestKVPutCommand_Stdin - 2019/12/06 06:37:43.088758 [DEBUG] http: Request GET /v1/kv/foo (10.204573ms) from=127.0.0.1:53174
TestKVPutCommand_Stdin - 2019/12/06 06:37:43.110656 [INFO] agent: Requesting shutdown
TestKVPutCommand_Stdin - 2019/12/06 06:37:43.110865 [INFO] consul: shutting down server
TestKVPutCommand_Stdin - 2019/12/06 06:37:43.110992 [WARN] serf: Shutdown without a Leave
TestKVPutCommand_Stdin - 2019/12/06 06:37:43.226061 [WARN] serf: Shutdown without a Leave
TestKVPutCommand_Stdin - 2019/12/06 06:37:43.295944 [INFO] manager: shutting down
TestKVPutCommand_Stdin - 2019/12/06 06:37:43.296110 [ERR] autopilot: failed to initialize config: raft is already shutdown
TestKVPutCommand_Stdin - 2019/12/06 06:37:43.296409 [ERR] consul: failed to establish leadership: raft is already shutdown
TestKVPutCommand_Stdin - 2019/12/06 06:37:43.297618 [INFO] agent: consul server down
TestKVPutCommand_Stdin - 2019/12/06 06:37:43.297669 [INFO] agent: shutdown complete
TestKVPutCommand_Stdin - 2019/12/06 06:37:43.297718 [INFO] agent: Stopping DNS server 127.0.0.1:19001 (tcp)
TestKVPutCommand_Stdin - 2019/12/06 06:37:43.297842 [INFO] agent: Stopping DNS server 127.0.0.1:19001 (udp)
TestKVPutCommand_Stdin - 2019/12/06 06:37:43.297977 [INFO] agent: Stopping HTTP server 127.0.0.1:19002 (tcp)
TestKVPutCommand_Stdin - 2019/12/06 06:37:43.298668 [INFO] agent: Waiting for endpoints to shut down
TestKVPutCommand_Stdin - 2019/12/06 06:37:43.298813 [WARN] agent: Syncing node info failed. leadership lost while committing log
TestKVPutCommand_Stdin - 2019/12/06 06:37:43.298863 [ERR] agent: failed to sync remote state: leadership lost while committing log
TestKVPutCommand_Stdin - 2019/12/06 06:37:43.299005 [INFO] agent: Endpoints down
--- PASS: TestKVPutCommand_Stdin (2.58s)
=== CONT  TestKVPutCommand_Base64
TestKVPutCommand_NegativeVal - 2019/12/06 06:37:43.300318 [DEBUG] http: Request PUT /v1/kv/foo (389.51647ms) from=127.0.0.1:45362
TestKVPutCommand_NegativeVal - 2019/12/06 06:37:43.304353 [DEBUG] http: Request GET /v1/kv/foo (946.689µs) from=127.0.0.1:45370
TestKVPutCommand_NegativeVal - 2019/12/06 06:37:43.305768 [INFO] agent: Requesting shutdown
TestKVPutCommand_NegativeVal - 2019/12/06 06:37:43.305845 [INFO] consul: shutting down server
TestKVPutCommand_NegativeVal - 2019/12/06 06:37:43.305893 [WARN] serf: Shutdown without a Leave
TestKVPutCommand_Flags - 2019/12/06 06:37:43.393806 [DEBUG] http: Request PUT /v1/kv/foo?flags=12345 (463.69921ms) from=127.0.0.1:57972
TestKVPutCommand_Flags - 2019/12/06 06:37:43.398621 [DEBUG] http: Request GET /v1/kv/foo (1.84071ms) from=127.0.0.1:57980
TestKVPutCommand_Flags - 2019/12/06 06:37:43.400224 [INFO] agent: Requesting shutdown
TestKVPutCommand_Flags - 2019/12/06 06:37:43.400319 [INFO] consul: shutting down server
TestKVPutCommand_Flags - 2019/12/06 06:37:43.400361 [WARN] serf: Shutdown without a Leave
WARNING: bootstrap = true: do not enable unless necessary
TestKVPutCommand_Base64 - 2019/12/06 06:37:43.422126 [WARN] agent: Node name "Node feeb5ea3-1c8a-399b-f52c-ab32fd976ead" will not be discoverable via DNS due to invalid characters. Valid characters include all alpha-numerics and dashes.
TestKVPutCommand_Base64 - 2019/12/06 06:37:43.424253 [DEBUG] tlsutil: Update with version 1
TestKVPutCommand_Base64 - 2019/12/06 06:37:43.427930 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
TestKVPutCommand_NegativeVal - 2019/12/06 06:37:43.501121 [WARN] serf: Shutdown without a Leave
TestKVPutCommand_Flags - 2019/12/06 06:37:43.503017 [WARN] serf: Shutdown without a Leave
TestKVPutCommand_NegativeVal - 2019/12/06 06:37:43.576264 [INFO] manager: shutting down
TestKVPutCommand_CAS - 2019/12/06 06:37:43.578298 [DEBUG] http: Request PUT /v1/kv/foo?cas=123 (639.862675ms) from=127.0.0.1:43694
TestKVPutCommand_Flags - 2019/12/06 06:37:43.580797 [INFO] manager: shutting down
TestKVPutCommand_CAS - 2019/12/06 06:37:43.583287 [DEBUG] http: Request GET /v1/kv/foo (749.351µs) from=127.0.0.1:43686
TestKVPutCommand_NegativeVal - 2019/12/06 06:37:43.659555 [ERR] consul: failed to establish leadership: leadership lost while committing log
TestKVPutCommand_NegativeVal - 2019/12/06 06:37:43.659763 [ERR] consul: failed to transfer leadership attempt 0/3: raft is already shutdown
TestKVPutCommand_Flags - 2019/12/06 06:37:43.659799 [INFO] agent: consul server down
TestKVPutCommand_NegativeVal - 2019/12/06 06:37:43.659827 [ERR] consul: failed to transfer leadership attempt 1/3: raft is already shutdown
TestKVPutCommand_Flags - 2019/12/06 06:37:43.659840 [INFO] agent: shutdown complete
TestKVPutCommand_Flags - 2019/12/06 06:37:43.659904 [ERR] consul: failed to establish leadership: leadership lost while committing log
TestKVPutCommand_Flags - 2019/12/06 06:37:43.659912 [INFO] agent: Stopping DNS server 127.0.0.1:19007 (tcp)
TestKVPutCommand_Flags - 2019/12/06 06:37:43.660096 [INFO] agent: Stopping DNS server 127.0.0.1:19007 (udp)
TestKVPutCommand_NegativeVal - 2019/12/06 06:37:43.660173 [INFO] agent: consul server down
TestKVPutCommand_NegativeVal - 2019/12/06 06:37:43.660210 [INFO] agent: shutdown complete
TestKVPutCommand_Flags - 2019/12/06 06:37:43.660230 [INFO] agent: Stopping HTTP server 127.0.0.1:19008 (tcp)
TestKVPutCommand_NegativeVal - 2019/12/06 06:37:43.660255 [INFO] agent: Stopping DNS server 127.0.0.1:19019 (tcp)
TestKVPutCommand_NegativeVal - 2019/12/06 06:37:43.660372 [INFO] agent: Stopping DNS server 127.0.0.1:19019 (udp)
TestKVPutCommand_NegativeVal - 2019/12/06 06:37:43.660513 [INFO] agent: Stopping HTTP server 127.0.0.1:19020 (tcp)
TestKVPutCommand_Flags - 2019/12/06 06:37:43.660898 [INFO] agent: Waiting for endpoints to shut down
TestKVPutCommand_Flags - 2019/12/06 06:37:43.661000 [INFO] agent: Endpoints down
TestKVPutCommand_NegativeVal - 2019/12/06 06:37:43.661045 [INFO] agent: Waiting for endpoints to shut down
--- PASS: TestKVPutCommand_Flags (2.94s)
=== CONT  TestKVPutCommand_FileNoExist
TestKVPutCommand_NegativeVal - 2019/12/06 06:37:43.661183 [INFO] agent: Endpoints down
--- PASS: TestKVPutCommand_NegativeVal (2.92s)
=== CONT  TestKVPutCommand_File
=== CONT  TestKVPutCommand
--- PASS: TestKVPutCommand_FileNoExist (0.01s)
WARNING: bootstrap = true: do not enable unless necessary
TestKVPutCommand - 2019/12/06 06:37:43.750004 [WARN] agent: Node name "Node c1ae4a74-9f87-28a9-fc9d-c348beff5c26" will not be discoverable via DNS due to invalid characters. Valid characters include all alpha-numerics and dashes.
TestKVPutCommand - 2019/12/06 06:37:43.750677 [DEBUG] tlsutil: Update with version 1
TestKVPutCommand - 2019/12/06 06:37:43.753058 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
WARNING: bootstrap = true: do not enable unless necessary
TestKVPutCommand_File - 2019/12/06 06:37:43.783991 [WARN] agent: Node name "Node e2429411-73fc-6d1a-22ff-beec1b53792a" will not be discoverable via DNS due to invalid characters. Valid characters include all alpha-numerics and dashes.
TestKVPutCommand_File - 2019/12/06 06:37:43.784789 [DEBUG] tlsutil: Update with version 1
TestKVPutCommand_File - 2019/12/06 06:37:43.787530 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
TestKVPutCommand_CAS - 2019/12/06 06:37:44.023132 [DEBUG] http: Request PUT /v1/kv/foo?cas=4 (436.049228ms) from=127.0.0.1:43702
TestKVPutCommand_CAS - 2019/12/06 06:37:44.027410 [DEBUG] http: Request GET /v1/kv/foo (1.868377ms) from=127.0.0.1:43686
TestKVPutCommand_CAS - 2019/12/06 06:37:44.029569 [INFO] agent: Requesting shutdown
TestKVPutCommand_CAS - 2019/12/06 06:37:44.029678 [INFO] consul: shutting down server
TestKVPutCommand_CAS - 2019/12/06 06:37:44.029720 [WARN] serf: Shutdown without a Leave
TestKVPutCommand_CAS - 2019/12/06 06:37:44.094898 [WARN] serf: Shutdown without a Leave
TestKVPutCommand_CAS - 2019/12/06 06:37:44.179802 [INFO] manager: shutting down
TestKVPutCommand_CAS - 2019/12/06 06:37:44.180253 [INFO] agent: consul server down
TestKVPutCommand_CAS - 2019/12/06 06:37:44.180299 [INFO] agent: shutdown complete
TestKVPutCommand_CAS - 2019/12/06 06:37:44.180353 [INFO] agent: Stopping DNS server 127.0.0.1:19013 (tcp)
TestKVPutCommand_CAS - 2019/12/06 06:37:44.180476 [INFO] agent: Stopping DNS server 127.0.0.1:19013 (udp)
TestKVPutCommand_CAS - 2019/12/06 06:37:44.180611 [INFO] agent: Stopping HTTP server 127.0.0.1:19014 (tcp)
TestKVPutCommand_CAS - 2019/12/06 06:37:44.181324 [INFO] agent: Waiting for endpoints to shut down
TestKVPutCommand_CAS - 2019/12/06 06:37:44.181490 [INFO] agent: Endpoints down
--- PASS: TestKVPutCommand_CAS (3.46s)
=== CONT  TestKVPutCommand_Validation
TestKVPutCommand_CAS - 2019/12/06 06:37:44.227666 [ERR] consul: failed to establish leadership: error generating CA root certificate: raft is already shutdown
--- PASS: TestKVPutCommand_Validation (0.10s)
2019/12/06 06:37:44 [INFO]  raft: Initial configuration (index=1): [{Suffrage:Voter ID:feeb5ea3-1c8a-399b-f52c-ab32fd976ead Address:127.0.0.1:19030}]
2019/12/06 06:37:44 [INFO]  raft: Node at 127.0.0.1:19030 [Follower] entering Follower state (Leader: "")
TestKVPutCommand_Base64 - 2019/12/06 06:37:44.455350 [INFO] serf: EventMemberJoin: Node feeb5ea3-1c8a-399b-f52c-ab32fd976ead.dc1 127.0.0.1
TestKVPutCommand_Base64 - 2019/12/06 06:37:44.460917 [INFO] serf: EventMemberJoin: Node feeb5ea3-1c8a-399b-f52c-ab32fd976ead 127.0.0.1
TestKVPutCommand_Base64 - 2019/12/06 06:37:44.462469 [INFO] consul: Adding LAN server Node feeb5ea3-1c8a-399b-f52c-ab32fd976ead (Addr: tcp/127.0.0.1:19030) (DC: dc1)
TestKVPutCommand_Base64 - 2019/12/06 06:37:44.463140 [INFO] consul: Handled member-join event for server "Node feeb5ea3-1c8a-399b-f52c-ab32fd976ead.dc1" in area "wan"
TestKVPutCommand_Base64 - 2019/12/06 06:37:44.464817 [INFO] agent: Started DNS server 127.0.0.1:19025 (tcp)
TestKVPutCommand_Base64 - 2019/12/06 06:37:44.464915 [INFO] agent: Started DNS server 127.0.0.1:19025 (udp)
TestKVPutCommand_Base64 - 2019/12/06 06:37:44.467355 [INFO] agent: Started HTTP server on 127.0.0.1:19026 (tcp)
TestKVPutCommand_Base64 - 2019/12/06 06:37:44.467750 [INFO] agent: started state syncer
2019/12/06 06:37:44 [WARN]  raft: Heartbeat timeout from "" reached, starting election
2019/12/06 06:37:44 [INFO]  raft: Node at 127.0.0.1:19030 [Candidate] entering Candidate state in term 2
2019/12/06 06:37:44 [INFO]  raft: Initial configuration (index=1): [{Suffrage:Voter ID:e2429411-73fc-6d1a-22ff-beec1b53792a Address:127.0.0.1:19036}]
2019/12/06 06:37:44 [INFO]  raft: Node at 127.0.0.1:19036 [Follower] entering Follower state (Leader: "")
TestKVPutCommand_File - 2019/12/06 06:37:44.814514 [INFO] serf: EventMemberJoin: Node e2429411-73fc-6d1a-22ff-beec1b53792a.dc1 127.0.0.1
TestKVPutCommand_File - 2019/12/06 06:37:44.820190 [INFO] serf: EventMemberJoin: Node e2429411-73fc-6d1a-22ff-beec1b53792a 127.0.0.1
TestKVPutCommand_File - 2019/12/06 06:37:44.822077 [INFO] consul: Adding LAN server Node e2429411-73fc-6d1a-22ff-beec1b53792a (Addr: tcp/127.0.0.1:19036) (DC: dc1)
TestKVPutCommand_File - 2019/12/06 06:37:44.822730 [INFO] consul: Handled member-join event for server "Node e2429411-73fc-6d1a-22ff-beec1b53792a.dc1" in area "wan"
TestKVPutCommand_File - 2019/12/06 06:37:44.824431 [INFO] agent: Started DNS server 127.0.0.1:19031 (tcp)
TestKVPutCommand_File - 2019/12/06 06:37:44.825081 [INFO] agent: Started DNS server 127.0.0.1:19031 (udp)
TestKVPutCommand_File - 2019/12/06 06:37:44.827744 [INFO] agent: Started HTTP server on 127.0.0.1:19032 (tcp)
TestKVPutCommand_File - 2019/12/06 06:37:44.827845 [INFO] agent: started state syncer
2019/12/06 06:37:44 [WARN]  raft: Heartbeat timeout from "" reached, starting election
2019/12/06 06:37:44 [INFO]  raft: Node at 127.0.0.1:19036 [Candidate] entering Candidate state in term 2
2019/12/06 06:37:44 [INFO]  raft: Initial configuration (index=1): [{Suffrage:Voter ID:c1ae4a74-9f87-28a9-fc9d-c348beff5c26 Address:127.0.0.1:19042}]
TestKVPutCommand - 2019/12/06 06:37:44.988554 [INFO] serf: EventMemberJoin: Node c1ae4a74-9f87-28a9-fc9d-c348beff5c26.dc1 127.0.0.1
2019/12/06 06:37:44 [INFO]  raft: Node at 127.0.0.1:19042 [Follower] entering Follower state (Leader: "")
TestKVPutCommand - 2019/12/06 06:37:45.001248 [INFO] serf: EventMemberJoin: Node c1ae4a74-9f87-28a9-fc9d-c348beff5c26 127.0.0.1
TestKVPutCommand - 2019/12/06 06:37:45.002968 [INFO] consul: Adding LAN server Node c1ae4a74-9f87-28a9-fc9d-c348beff5c26 (Addr: tcp/127.0.0.1:19042) (DC: dc1)
TestKVPutCommand - 2019/12/06 06:37:45.003821 [INFO] consul: Handled member-join event for server "Node c1ae4a74-9f87-28a9-fc9d-c348beff5c26.dc1" in area "wan"
TestKVPutCommand - 2019/12/06 06:37:45.007510 [INFO] agent: Started DNS server 127.0.0.1:19037 (tcp)
TestKVPutCommand - 2019/12/06 06:37:45.007750 [INFO] agent: Started DNS server 127.0.0.1:19037 (udp)
TestKVPutCommand - 2019/12/06 06:37:45.011816 [INFO] agent: Started HTTP server on 127.0.0.1:19038 (tcp)
TestKVPutCommand - 2019/12/06 06:37:45.011924 [INFO] agent: started state syncer
2019/12/06 06:37:45 [WARN]  raft: Heartbeat timeout from "" reached, starting election
2019/12/06 06:37:45 [INFO]  raft: Node at 127.0.0.1:19042 [Candidate] entering Candidate state in term 2
2019/12/06 06:37:45 [INFO]  raft: Election won. Tally: 1
2019/12/06 06:37:45 [INFO]  raft: Node at 127.0.0.1:19030 [Leader] entering Leader state
TestKVPutCommand_Base64 - 2019/12/06 06:37:45.096971 [INFO] consul: cluster leadership acquired
TestKVPutCommand_Base64 - 2019/12/06 06:37:45.097409 [INFO] consul: New leader elected: Node feeb5ea3-1c8a-399b-f52c-ab32fd976ead
TestKVPutCommand_Base64 - 2019/12/06 06:37:45.420407 [INFO] agent: Synced node info
TestKVPutCommand_Base64 - 2019/12/06 06:37:45.421371 [DEBUG] http: Request PUT /v1/kv/foo (191.019481ms) from=127.0.0.1:52510
TestKVPutCommand_Base64 - 2019/12/06 06:37:45.425553 [DEBUG] http: Request GET /v1/kv/foo (897.022µs) from=127.0.0.1:52512
2019/12/06 06:37:45 [INFO]  raft: Election won. Tally: 1
2019/12/06 06:37:45 [INFO]  raft: Node at 127.0.0.1:19036 [Leader] entering Leader state
TestKVPutCommand_File - 2019/12/06 06:37:45.429153 [INFO] consul: cluster leadership acquired
TestKVPutCommand_File - 2019/12/06 06:37:45.429638 [INFO] consul: New leader elected: Node e2429411-73fc-6d1a-22ff-beec1b53792a
TestKVPutCommand_Base64 - 2019/12/06 06:37:45.431631 [INFO] agent: Requesting shutdown
TestKVPutCommand_Base64 - 2019/12/06 06:37:45.431893 [INFO] consul: shutting down server
TestKVPutCommand_Base64 - 2019/12/06 06:37:45.432434 [WARN] serf: Shutdown without a Leave
TestKVPutCommand_Base64 - 2019/12/06 06:37:45.517823 [WARN] serf: Shutdown without a Leave
2019/12/06 06:37:45 [INFO]  raft: Election won. Tally: 1
2019/12/06 06:37:45 [INFO]  raft: Node at 127.0.0.1:19042 [Leader] entering Leader state
TestKVPutCommand - 2019/12/06 06:37:45.596831 [INFO] consul: cluster leadership acquired
TestKVPutCommand - 2019/12/06 06:37:45.597244 [INFO] consul: New leader elected: Node c1ae4a74-9f87-28a9-fc9d-c348beff5c26
TestKVPutCommand_Base64 - 2019/12/06 06:37:45.598541 [INFO] manager: shutting down
TestKVPutCommand_File - 2019/12/06 06:37:45.760889 [INFO] agent: Synced node info
TestKVPutCommand_File - 2019/12/06 06:37:45.763895 [DEBUG] http: Request PUT /v1/kv/foo (246.72112ms) from=127.0.0.1:39868
TestKVPutCommand_File - 2019/12/06 06:37:45.768462 [DEBUG] http: Request GET /v1/kv/foo (988.023µs) from=127.0.0.1:39872
TestKVPutCommand_File - 2019/12/06 06:37:45.771211 [INFO] agent: Requesting shutdown
TestKVPutCommand_File - 2019/12/06 06:37:45.771312 [INFO] consul: shutting down server
TestKVPutCommand_File - 2019/12/06 06:37:45.771355 [WARN] serf: Shutdown without a Leave
TestKVPutCommand_Base64 - 2019/12/06 06:37:45.859596 [ERR] consul: failed to establish leadership: leadership lost while committing log
TestKVPutCommand_Base64 - 2019/12/06 06:37:45.859804 [ERR] consul: failed to transfer leadership attempt 0/3: raft is already shutdown
TestKVPutCommand_Base64 - 2019/12/06 06:37:45.859841 [INFO] agent: consul server down
TestKVPutCommand_Base64 - 2019/12/06 06:37:45.859869 [ERR] consul: failed to transfer leadership attempt 1/3: raft is already shutdown
TestKVPutCommand_Base64 - 2019/12/06 06:37:45.859884 [INFO] agent: shutdown complete
TestKVPutCommand_Base64 - 2019/12/06 06:37:45.859977 [INFO] agent: Stopping DNS server 127.0.0.1:19025 (tcp)
TestKVPutCommand_Base64 - 2019/12/06 06:37:45.860114 [INFO] agent: Stopping DNS server 127.0.0.1:19025 (udp)
TestKVPutCommand_Base64 - 2019/12/06 06:37:45.860257 [INFO] agent: Stopping HTTP server 127.0.0.1:19026 (tcp)
TestKVPutCommand_Base64 - 2019/12/06 06:37:45.860923 [INFO] agent: Waiting for endpoints to shut down
TestKVPutCommand_Base64 - 2019/12/06 06:37:45.861038 [INFO] agent: Endpoints down
--- PASS: TestKVPutCommand_Base64 (2.56s)
TestKVPutCommand_File - 2019/12/06 06:37:45.926169 [WARN] serf: Shutdown without a Leave
TestKVPutCommand_File - 2019/12/06 06:37:46.001225 [INFO] manager: shutting down
TestKVPutCommand - 2019/12/06 06:37:46.068949 [INFO] agent: Synced node info
TestKVPutCommand - 2019/12/06 06:37:46.069081 [DEBUG] agent: Node info in sync
TestKVPutCommand_File - 2019/12/06 06:37:46.071746 [INFO] agent: consul server down
TestKVPutCommand_File - 2019/12/06 06:37:46.071806 [INFO] agent: shutdown complete
TestKVPutCommand_File - 2019/12/06 06:37:46.071855 [INFO] agent: Stopping DNS server 127.0.0.1:19031 (tcp)
TestKVPutCommand_File - 2019/12/06 06:37:46.071984 [INFO] agent: Stopping DNS server 127.0.0.1:19031 (udp)
TestKVPutCommand_File - 2019/12/06 06:37:46.072129 [INFO] agent: Stopping HTTP server 127.0.0.1:19032 (tcp)
TestKVPutCommand_File - 2019/12/06 06:37:46.072666 [INFO] agent: Waiting for endpoints to shut down
TestKVPutCommand_File - 2019/12/06 06:37:46.074460 [ERR] autopilot: failed to initialize config: leadership lost while committing log
TestKVPutCommand_File - 2019/12/06 06:37:46.074551 [INFO] agent: Endpoints down
--- PASS: TestKVPutCommand_File (2.41s)
TestKVPutCommand_File - 2019/12/06 06:37:46.074864 [ERR] consul: failed to establish leadership: raft is already shutdown
TestKVPutCommand_File - 2019/12/06 06:37:46.074988 [ERR] consul: failed to transfer leadership attempt 0/3: raft is already shutdown
TestKVPutCommand_File - 2019/12/06 06:37:46.075041 [ERR] consul: failed to transfer leadership attempt 1/3: raft is already shutdown
TestKVPutCommand - 2019/12/06 06:37:46.076808 [DEBUG] http: Request PUT /v1/kv/foo (430.360094ms) from=127.0.0.1:60518
TestKVPutCommand - 2019/12/06 06:37:46.080734 [DEBUG] http: Request GET /v1/kv/foo (803.019µs) from=127.0.0.1:60522
TestKVPutCommand - 2019/12/06 06:37:46.084578 [INFO] agent: Requesting shutdown
TestKVPutCommand - 2019/12/06 06:37:46.084675 [INFO] consul: shutting down server
TestKVPutCommand - 2019/12/06 06:37:46.084720 [WARN] serf: Shutdown without a Leave
TestKVPutCommand - 2019/12/06 06:37:46.151142 [WARN] serf: Shutdown without a Leave
TestKVPutCommand - 2019/12/06 06:37:46.226230 [INFO] manager: shutting down
TestKVPutCommand - 2019/12/06 06:37:46.284785 [INFO] agent: consul server down
TestKVPutCommand - 2019/12/06 06:37:46.284861 [INFO] agent: shutdown complete
TestKVPutCommand - 2019/12/06 06:37:46.284915 [INFO] agent: Stopping DNS server 127.0.0.1:19037 (tcp)
TestKVPutCommand - 2019/12/06 06:37:46.285041 [INFO] agent: Stopping DNS server 127.0.0.1:19037 (udp)
TestKVPutCommand - 2019/12/06 06:37:46.285180 [INFO] agent: Stopping HTTP server 127.0.0.1:19038 (tcp)
TestKVPutCommand - 2019/12/06 06:37:46.285830 [INFO] agent: Waiting for endpoints to shut down
TestKVPutCommand - 2019/12/06 06:37:46.285921 [ERR] autopilot: failed to initialize config: leadership lost while committing log
TestKVPutCommand - 2019/12/06 06:37:46.286266 [ERR] consul: failed to establish leadership: raft is already shutdown
TestKVPutCommand - 2019/12/06 06:37:46.286450 [INFO] agent: Endpoints down
--- PASS: TestKVPutCommand (2.62s)
PASS
ok  	github.com/hashicorp/consul/command/kv/put	5.859s
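The kv put tests above drive the agent's KV store over its HTTP API; the DEBUG lines show the raw requests ("PUT /v1/kv/foo", "?flags=12345", "?cas=..."). A minimal sketch of those calls in Go, assuming a hypothetical agent at 127.0.0.1:8500 (the test agents above bind per-test ports), not the consul CLI itself:

package main

import (
	"bytes"
	"fmt"
	"io"
	"net/http"
)

// kvPut issues a raw PUT against the agent's KV endpoint, the same kind
// of request the tests above log as "PUT /v1/kv/foo?...".
func kvPut(agent, key string, value []byte, query string) error {
	url := fmt.Sprintf("http://%s/v1/kv/%s%s", agent, key, query)
	req, err := http.NewRequest(http.MethodPut, url, bytes.NewReader(value))
	if err != nil {
		return err
	}
	resp, err := http.DefaultClient.Do(req)
	if err != nil {
		return err
	}
	defer resp.Body.Close()
	body, _ := io.ReadAll(resp.Body)
	// The endpoint answers "true" or "false"; for ?cas= writes, "false"
	// means the check-and-set index did not match.
	fmt.Printf("PUT %s -> %s %s\n", url, resp.Status, body)
	return nil
}

func main() {
	agent := "127.0.0.1:8500" // placeholder; the test agents use random ports
	_ = kvPut(agent, "foo", []byte("bar"), "")             // plain put (TestKVPutCommand)
	_ = kvPut(agent, "foo", []byte("bar"), "?flags=12345") // with flags (TestKVPutCommand_Flags)
	_ = kvPut(agent, "foo", []byte("baz"), "?cas=0")       // check-and-set (TestKVPutCommand_CAS)
	_ = kvPut(agent, "foo", []byte("-1"), "")              // negative value (TestKVPutCommand_NegativeVal)
}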
=== RUN   TestLeaveCommand_noTabs
=== PAUSE TestLeaveCommand_noTabs
=== RUN   TestLeaveCommand
=== PAUSE TestLeaveCommand
=== RUN   TestLeaveCommand_FailOnNonFlagArgs
=== PAUSE TestLeaveCommand_FailOnNonFlagArgs
=== CONT  TestLeaveCommand_noTabs
=== CONT  TestLeaveCommand_FailOnNonFlagArgs
=== CONT  TestLeaveCommand
--- PASS: TestLeaveCommand_noTabs (0.00s)
WARNING: bootstrap = true: do not enable unless necessary
TestLeaveCommand_FailOnNonFlagArgs - 2019/12/06 06:37:47.215726 [WARN] agent: Node name "Node 3133824e-b239-170f-2ed0-725104c4bcbb" will not be discoverable via DNS due to invalid characters. Valid characters include all alpha-numerics and dashes.
TestLeaveCommand_FailOnNonFlagArgs - 2019/12/06 06:37:47.216764 [DEBUG] tlsutil: Update with version 1
TestLeaveCommand_FailOnNonFlagArgs - 2019/12/06 06:37:47.240578 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
WARNING: bootstrap = true: do not enable unless necessary
TestLeaveCommand - 2019/12/06 06:37:47.257035 [WARN] agent: Node name "Node eccd5ff9-1e09-c5c4-da13-43991b65f096" will not be discoverable via DNS due to invalid characters. Valid characters include all alpha-numerics and dashes.
TestLeaveCommand - 2019/12/06 06:37:47.257711 [DEBUG] tlsutil: Update with version 1
TestLeaveCommand - 2019/12/06 06:37:47.268973 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
2019/12/06 06:37:48 [INFO]  raft: Initial configuration (index=1): [{Suffrage:Voter ID:3133824e-b239-170f-2ed0-725104c4bcbb Address:127.0.0.1:38512}]
2019/12/06 06:37:48 [INFO]  raft: Node at 127.0.0.1:38512 [Follower] entering Follower state (Leader: "")
TestLeaveCommand_FailOnNonFlagArgs - 2019/12/06 06:37:48.781197 [INFO] serf: EventMemberJoin: Node 3133824e-b239-170f-2ed0-725104c4bcbb.dc1 127.0.0.1
TestLeaveCommand_FailOnNonFlagArgs - 2019/12/06 06:37:48.785516 [INFO] serf: EventMemberJoin: Node 3133824e-b239-170f-2ed0-725104c4bcbb 127.0.0.1
TestLeaveCommand_FailOnNonFlagArgs - 2019/12/06 06:37:48.786380 [INFO] consul: Adding LAN server Node 3133824e-b239-170f-2ed0-725104c4bcbb (Addr: tcp/127.0.0.1:38512) (DC: dc1)
TestLeaveCommand_FailOnNonFlagArgs - 2019/12/06 06:37:48.786744 [INFO] consul: Handled member-join event for server "Node 3133824e-b239-170f-2ed0-725104c4bcbb.dc1" in area "wan"
TestLeaveCommand_FailOnNonFlagArgs - 2019/12/06 06:37:48.787800 [INFO] agent: Started DNS server 127.0.0.1:38507 (tcp)
TestLeaveCommand_FailOnNonFlagArgs - 2019/12/06 06:37:48.788385 [INFO] agent: Started DNS server 127.0.0.1:38507 (udp)
TestLeaveCommand_FailOnNonFlagArgs - 2019/12/06 06:37:48.791306 [INFO] agent: Started HTTP server on 127.0.0.1:38508 (tcp)
TestLeaveCommand_FailOnNonFlagArgs - 2019/12/06 06:37:48.791466 [INFO] agent: started state syncer
2019/12/06 06:37:48 [WARN]  raft: Heartbeat timeout from "" reached, starting election
2019/12/06 06:37:48 [INFO]  raft: Node at 127.0.0.1:38512 [Candidate] entering Candidate state in term 2
2019/12/06 06:37:48 [INFO]  raft: Initial configuration (index=1): [{Suffrage:Voter ID:eccd5ff9-1e09-c5c4-da13-43991b65f096 Address:127.0.0.1:38506}]
2019/12/06 06:37:48 [INFO]  raft: Node at 127.0.0.1:38506 [Follower] entering Follower state (Leader: "")
TestLeaveCommand - 2019/12/06 06:37:48.847060 [INFO] serf: EventMemberJoin: Node eccd5ff9-1e09-c5c4-da13-43991b65f096.dc1 127.0.0.1
TestLeaveCommand - 2019/12/06 06:37:48.850691 [INFO] serf: EventMemberJoin: Node eccd5ff9-1e09-c5c4-da13-43991b65f096 127.0.0.1
TestLeaveCommand - 2019/12/06 06:37:48.851755 [INFO] consul: Adding LAN server Node eccd5ff9-1e09-c5c4-da13-43991b65f096 (Addr: tcp/127.0.0.1:38506) (DC: dc1)
TestLeaveCommand - 2019/12/06 06:37:48.852064 [INFO] consul: Handled member-join event for server "Node eccd5ff9-1e09-c5c4-da13-43991b65f096.dc1" in area "wan"
TestLeaveCommand - 2019/12/06 06:37:48.864655 [INFO] agent: Started DNS server 127.0.0.1:38501 (udp)
TestLeaveCommand - 2019/12/06 06:37:48.864756 [INFO] agent: Started DNS server 127.0.0.1:38501 (tcp)
TestLeaveCommand - 2019/12/06 06:37:48.869532 [INFO] agent: Started HTTP server on 127.0.0.1:38502 (tcp)
TestLeaveCommand - 2019/12/06 06:37:48.869714 [INFO] agent: started state syncer
2019/12/06 06:37:48 [WARN]  raft: Heartbeat timeout from "" reached, starting election
2019/12/06 06:37:48 [INFO]  raft: Node at 127.0.0.1:38506 [Candidate] entering Candidate state in term 2
2019/12/06 06:37:49 [INFO]  raft: Election won. Tally: 1
2019/12/06 06:37:49 [INFO]  raft: Node at 127.0.0.1:38506 [Leader] entering Leader state
2019/12/06 06:37:49 [INFO]  raft: Election won. Tally: 1
2019/12/06 06:37:49 [INFO]  raft: Node at 127.0.0.1:38512 [Leader] entering Leader state
TestLeaveCommand_FailOnNonFlagArgs - 2019/12/06 06:37:49.536708 [INFO] consul: cluster leadership acquired
TestLeaveCommand - 2019/12/06 06:37:49.537101 [INFO] consul: cluster leadership acquired
TestLeaveCommand_FailOnNonFlagArgs - 2019/12/06 06:37:49.537234 [INFO] consul: New leader elected: Node 3133824e-b239-170f-2ed0-725104c4bcbb
TestLeaveCommand - 2019/12/06 06:37:49.537631 [INFO] consul: New leader elected: Node eccd5ff9-1e09-c5c4-da13-43991b65f096
TestLeaveCommand - 2019/12/06 06:37:49.590886 [INFO] consul: server starting leave
TestLeaveCommand_FailOnNonFlagArgs - 2019/12/06 06:37:49.707088 [INFO] agent: Requesting shutdown
TestLeaveCommand_FailOnNonFlagArgs - 2019/12/06 06:37:49.707216 [INFO] consul: shutting down server
TestLeaveCommand_FailOnNonFlagArgs - 2019/12/06 06:37:49.707271 [WARN] serf: Shutdown without a Leave
TestLeaveCommand_FailOnNonFlagArgs - 2019/12/06 06:37:49.707384 [ERR] agent: failed to sync remote state: No cluster leader
TestLeaveCommand_FailOnNonFlagArgs - 2019/12/06 06:37:50.762852 [WARN] serf: Shutdown without a Leave
TestLeaveCommand - 2019/12/06 06:37:50.851768 [INFO] serf: EventMemberLeave: Node eccd5ff9-1e09-c5c4-da13-43991b65f096.dc1 127.0.0.1
TestLeaveCommand - 2019/12/06 06:37:50.852092 [INFO] consul: Handled member-leave event for server "Node eccd5ff9-1e09-c5c4-da13-43991b65f096.dc1" in area "wan"
TestLeaveCommand - 2019/12/06 06:37:50.852160 [INFO] manager: shutting down
TestLeaveCommand_FailOnNonFlagArgs - 2019/12/06 06:37:50.926326 [INFO] manager: shutting down
TestLeaveCommand_FailOnNonFlagArgs - 2019/12/06 06:37:50.926400 [ERR] consul: failed to wait for barrier: raft is already shutdown
TestLeaveCommand_FailOnNonFlagArgs - 2019/12/06 06:37:50.926979 [INFO] agent: consul server down
TestLeaveCommand_FailOnNonFlagArgs - 2019/12/06 06:37:50.927031 [INFO] agent: shutdown complete
TestLeaveCommand_FailOnNonFlagArgs - 2019/12/06 06:37:50.927096 [INFO] agent: Stopping DNS server 127.0.0.1:38507 (tcp)
TestLeaveCommand_FailOnNonFlagArgs - 2019/12/06 06:37:50.927231 [INFO] agent: Stopping DNS server 127.0.0.1:38507 (udp)
TestLeaveCommand_FailOnNonFlagArgs - 2019/12/06 06:37:50.927383 [INFO] agent: Stopping HTTP server 127.0.0.1:38508 (tcp)
TestLeaveCommand_FailOnNonFlagArgs - 2019/12/06 06:37:50.927566 [INFO] agent: Waiting for endpoints to shut down
TestLeaveCommand_FailOnNonFlagArgs - 2019/12/06 06:37:50.927628 [INFO] agent: Endpoints down
--- PASS: TestLeaveCommand_FailOnNonFlagArgs (3.88s)
TestLeaveCommand - 2019/12/06 06:37:51.002449 [INFO] agent: Synced node info
TestLeaveCommand - 2019/12/06 06:37:51.002574 [DEBUG] agent: Node info in sync
TestLeaveCommand - 2019/12/06 06:37:51.802077 [INFO] connect: initialized primary datacenter CA with provider "consul"
TestLeaveCommand - 2019/12/06 06:37:51.802569 [DEBUG] consul: Skipping self join check for "Node eccd5ff9-1e09-c5c4-da13-43991b65f096" since the cluster is too small
TestLeaveCommand - 2019/12/06 06:37:51.802739 [INFO] consul: member 'Node eccd5ff9-1e09-c5c4-da13-43991b65f096' joined, marking health alive
TestLeaveCommand - 2019/12/06 06:37:52.645075 [DEBUG] agent: Skipping remote check "serfHealth" since it is managed automatically
TestLeaveCommand - 2019/12/06 06:37:52.645165 [DEBUG] agent: Node info in sync
TestLeaveCommand - 2019/12/06 06:37:53.144498 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
TestLeaveCommand - 2019/12/06 06:37:53.852368 [INFO] serf: EventMemberLeave: Node eccd5ff9-1e09-c5c4-da13-43991b65f096 127.0.0.1
TestLeaveCommand - 2019/12/06 06:37:53.852709 [INFO] consul: Removing LAN server Node eccd5ff9-1e09-c5c4-da13-43991b65f096 (Addr: tcp/127.0.0.1:38506) (DC: dc1)
TestLeaveCommand - 2019/12/06 06:37:53.852923 [WARN] consul: deregistering self (Node eccd5ff9-1e09-c5c4-da13-43991b65f096) should be done by follower
TestLeaveCommand - 2019/12/06 06:37:55.143926 [ERR] autopilot: Error updating cluster health: error getting server raft protocol versions: No servers found
TestLeaveCommand - 2019/12/06 06:37:56.852843 [INFO] consul: Waiting 5s to drain RPC traffic
TestLeaveCommand - 2019/12/06 06:37:57.143955 [ERR] autopilot: Error updating cluster health: error getting server raft protocol versions: No servers found
TestLeaveCommand - 2019/12/06 06:37:59.144002 [ERR] autopilot: Error updating cluster health: error getting server raft protocol versions: No servers found
TestLeaveCommand - 2019/12/06 06:38:01.143929 [ERR] autopilot: Error updating cluster health: error getting server raft protocol versions: No servers found
TestLeaveCommand - 2019/12/06 06:38:01.143956 [ERR] autopilot: Error promoting servers: error getting server raft protocol versions: No servers found
TestLeaveCommand - 2019/12/06 06:38:01.853255 [INFO] agent: Requesting shutdown
TestLeaveCommand - 2019/12/06 06:38:01.853394 [INFO] consul: shutting down server
TestLeaveCommand - 2019/12/06 06:38:02.010489 [INFO] agent: consul server down
TestLeaveCommand - 2019/12/06 06:38:02.010573 [INFO] agent: shutdown complete
TestLeaveCommand - 2019/12/06 06:38:02.010673 [DEBUG] http: Request PUT /v1/agent/leave (12.419764984s) from=127.0.0.1:47660
TestLeaveCommand - 2019/12/06 06:38:02.011629 [INFO] agent: Stopping DNS server 127.0.0.1:38501 (tcp)
TestLeaveCommand - 2019/12/06 06:38:02.011823 [INFO] agent: Stopping DNS server 127.0.0.1:38501 (udp)
TestLeaveCommand - 2019/12/06 06:38:02.011984 [INFO] agent: Stopping HTTP server 127.0.0.1:38502 (tcp)
TestLeaveCommand - 2019/12/06 06:38:02.012472 [INFO] agent: Waiting for endpoints to shut down
TestLeaveCommand - 2019/12/06 06:38:02.012564 [INFO] agent: Endpoints down
--- PASS: TestLeaveCommand (14.97s)
PASS
ok  	github.com/hashicorp/consul/command/leave	15.570s
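TestLeaveCommand above exercises a graceful leave through the agent API; the 12.4-second "PUT /v1/agent/leave" request in the log spans the serf leave, the 5s RPC drain and the shutdown. A minimal sketch of that single call, again assuming a hypothetical agent address rather than the test agent's random port:

package main

import (
	"fmt"
	"net/http"
)

func main() {
	// Placeholder address; the test agent above listened on 127.0.0.1:38502.
	req, err := http.NewRequest(http.MethodPut, "http://127.0.0.1:8500/v1/agent/leave", nil)
	if err != nil {
		panic(err)
	}
	// The call only returns once the agent has left serf, drained RPC
	// traffic and shut down, which is why the logged request took ~12.4s.
	resp, err := http.DefaultClient.Do(req)
	if err != nil {
		fmt.Println("leave request failed:", err)
		return
	}
	defer resp.Body.Close()
	fmt.Println("agent leave:", resp.Status)
}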
=== RUN   TestLockCommand_noTabs
=== PAUSE TestLockCommand_noTabs
=== RUN   TestLockCommand_BadArgs
=== PAUSE TestLockCommand_BadArgs
=== RUN   TestLockCommand
=== PAUSE TestLockCommand
=== RUN   TestLockCommand_NoShell
=== PAUSE TestLockCommand_NoShell
=== RUN   TestLockCommand_TryLock
=== PAUSE TestLockCommand_TryLock
=== RUN   TestLockCommand_TrySemaphore
=== PAUSE TestLockCommand_TrySemaphore
=== RUN   TestLockCommand_MonitorRetry_Lock_Default
=== PAUSE TestLockCommand_MonitorRetry_Lock_Default
=== RUN   TestLockCommand_MonitorRetry_Semaphore_Default
=== PAUSE TestLockCommand_MonitorRetry_Semaphore_Default
=== RUN   TestLockCommand_MonitorRetry_Lock_Arg
=== PAUSE TestLockCommand_MonitorRetry_Lock_Arg
=== RUN   TestLockCommand_MonitorRetry_Semaphore_Arg
=== PAUSE TestLockCommand_MonitorRetry_Semaphore_Arg
=== RUN   TestLockCommand_ChildExitCode
--- SKIP: TestLockCommand_ChildExitCode (0.00s)
    lock_test.go:302: DM-skipped
=== CONT  TestLockCommand_noTabs
=== CONT  TestLockCommand_MonitorRetry_Lock_Default
=== CONT  TestLockCommand_MonitorRetry_Semaphore_Arg
=== CONT  TestLockCommand_MonitorRetry_Lock_Arg
--- PASS: TestLockCommand_noTabs (0.02s)
=== CONT  TestLockCommand_MonitorRetry_Semaphore_Default
WARNING: bootstrap = true: do not enable unless necessary
TestLockCommand_MonitorRetry_Semaphore_Default - 2019/12/06 06:38:21.468668 [WARN] agent: Node name "Node ec220bd1-cbaf-ae76-bfcb-12ba8c1fed6b" will not be discoverable via DNS due to invalid characters. Valid characters include all alpha-numerics and dashes.
TestLockCommand_MonitorRetry_Semaphore_Default - 2019/12/06 06:38:21.469772 [DEBUG] tlsutil: Update with version 1
TestLockCommand_MonitorRetry_Semaphore_Default - 2019/12/06 06:38:21.479903 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
WARNING: bootstrap = true: do not enable unless necessary
TestLockCommand_MonitorRetry_Semaphore_Arg - 2019/12/06 06:38:21.481716 [WARN] agent: Node name "Node 4ecac839-9566-efd3-a2ca-fe44edcf61a9" will not be discoverable via DNS due to invalid characters. Valid characters include all alpha-numerics and dashes.
TestLockCommand_MonitorRetry_Semaphore_Arg - 2019/12/06 06:38:21.482122 [DEBUG] tlsutil: Update with version 1
TestLockCommand_MonitorRetry_Semaphore_Arg - 2019/12/06 06:38:21.484375 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
WARNING: bootstrap = true: do not enable unless necessary
TestLockCommand_MonitorRetry_Lock_Default - 2019/12/06 06:38:21.500816 [WARN] agent: Node name "Node 4c23ac63-62e8-30a3-3e1f-9acdbcb9de90" will not be discoverable via DNS due to invalid characters. Valid characters include all alpha-numerics and dashes.
TestLockCommand_MonitorRetry_Lock_Default - 2019/12/06 06:38:21.502279 [DEBUG] tlsutil: Update with version 1
WARNING: bootstrap = true: do not enable unless necessary
TestLockCommand_MonitorRetry_Lock_Arg - 2019/12/06 06:38:21.525792 [WARN] agent: Node name "Node 85af6b82-797e-cd14-430f-902a3cf9f16a" will not be discoverable via DNS due to invalid characters. Valid characters include all alpha-numerics and dashes.
TestLockCommand_MonitorRetry_Lock_Default - 2019/12/06 06:38:21.526052 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
TestLockCommand_MonitorRetry_Lock_Arg - 2019/12/06 06:38:21.526322 [DEBUG] tlsutil: Update with version 1
TestLockCommand_MonitorRetry_Lock_Arg - 2019/12/06 06:38:21.529391 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
2019/12/06 06:38:22 [INFO]  raft: Initial configuration (index=1): [{Suffrage:Voter ID:ec220bd1-cbaf-ae76-bfcb-12ba8c1fed6b Address:127.0.0.1:52018}]
2019/12/06 06:38:22 [INFO]  raft: Initial configuration (index=1): [{Suffrage:Voter ID:4ecac839-9566-efd3-a2ca-fe44edcf61a9 Address:127.0.0.1:52012}]
2019/12/06 06:38:22 [INFO]  raft: Node at 127.0.0.1:52018 [Follower] entering Follower state (Leader: "")
2019/12/06 06:38:22 [INFO]  raft: Node at 127.0.0.1:52012 [Follower] entering Follower state (Leader: "")
2019/12/06 06:38:22 [WARN]  raft: Heartbeat timeout from "" reached, starting election
2019/12/06 06:38:22 [INFO]  raft: Node at 127.0.0.1:52018 [Candidate] entering Candidate state in term 2
2019/12/06 06:38:22 [WARN]  raft: Heartbeat timeout from "" reached, starting election
2019/12/06 06:38:22 [INFO]  raft: Node at 127.0.0.1:52012 [Candidate] entering Candidate state in term 2
TestLockCommand_MonitorRetry_Semaphore_Arg - 2019/12/06 06:38:23.239087 [INFO] serf: EventMemberJoin: Node 4ecac839-9566-efd3-a2ca-fe44edcf61a9.dc1 127.0.0.1
TestLockCommand_MonitorRetry_Semaphore_Default - 2019/12/06 06:38:23.242667 [INFO] serf: EventMemberJoin: Node ec220bd1-cbaf-ae76-bfcb-12ba8c1fed6b.dc1 127.0.0.1
TestLockCommand_MonitorRetry_Semaphore_Arg - 2019/12/06 06:38:23.256058 [INFO] serf: EventMemberJoin: Node 4ecac839-9566-efd3-a2ca-fe44edcf61a9 127.0.0.1
TestLockCommand_MonitorRetry_Semaphore_Arg - 2019/12/06 06:38:23.266867 [INFO] agent: Started DNS server 127.0.0.1:52007 (udp)
TestLockCommand_MonitorRetry_Semaphore_Arg - 2019/12/06 06:38:23.267472 [INFO] agent: Started DNS server 127.0.0.1:52007 (tcp)
TestLockCommand_MonitorRetry_Semaphore_Arg - 2019/12/06 06:38:23.267631 [INFO] consul: Adding LAN server Node 4ecac839-9566-efd3-a2ca-fe44edcf61a9 (Addr: tcp/127.0.0.1:52012) (DC: dc1)
TestLockCommand_MonitorRetry_Semaphore_Arg - 2019/12/06 06:38:23.268103 [INFO] consul: Handled member-join event for server "Node 4ecac839-9566-efd3-a2ca-fe44edcf61a9.dc1" in area "wan"
TestLockCommand_MonitorRetry_Semaphore_Default - 2019/12/06 06:38:23.269937 [INFO] serf: EventMemberJoin: Node ec220bd1-cbaf-ae76-bfcb-12ba8c1fed6b 127.0.0.1
TestLockCommand_MonitorRetry_Semaphore_Default - 2019/12/06 06:38:23.272168 [INFO] consul: Adding LAN server Node ec220bd1-cbaf-ae76-bfcb-12ba8c1fed6b (Addr: tcp/127.0.0.1:52018) (DC: dc1)
TestLockCommand_MonitorRetry_Semaphore_Default - 2019/12/06 06:38:23.273293 [INFO] consul: Handled member-join event for server "Node ec220bd1-cbaf-ae76-bfcb-12ba8c1fed6b.dc1" in area "wan"
TestLockCommand_MonitorRetry_Semaphore_Default - 2019/12/06 06:38:23.273638 [INFO] agent: Started DNS server 127.0.0.1:52013 (tcp)
TestLockCommand_MonitorRetry_Semaphore_Arg - 2019/12/06 06:38:23.273806 [INFO] agent: Started HTTP server on 127.0.0.1:52008 (tcp)
TestLockCommand_MonitorRetry_Semaphore_Arg - 2019/12/06 06:38:23.274080 [INFO] agent: started state syncer
TestLockCommand_MonitorRetry_Semaphore_Default - 2019/12/06 06:38:23.274903 [INFO] agent: Started DNS server 127.0.0.1:52013 (udp)
TestLockCommand_MonitorRetry_Semaphore_Default - 2019/12/06 06:38:23.277283 [INFO] agent: Started HTTP server on 127.0.0.1:52014 (tcp)
TestLockCommand_MonitorRetry_Semaphore_Default - 2019/12/06 06:38:23.277386 [INFO] agent: started state syncer
2019/12/06 06:38:24 [INFO]  raft: Initial configuration (index=1): [{Suffrage:Voter ID:85af6b82-797e-cd14-430f-902a3cf9f16a Address:127.0.0.1:52024}]
2019/12/06 06:38:24 [INFO]  raft: Node at 127.0.0.1:52024 [Follower] entering Follower state (Leader: "")
2019/12/06 06:38:24 [INFO]  raft: Initial configuration (index=1): [{Suffrage:Voter ID:4c23ac63-62e8-30a3-3e1f-9acdbcb9de90 Address:127.0.0.1:52006}]
TestLockCommand_MonitorRetry_Lock_Arg - 2019/12/06 06:38:24.076942 [INFO] serf: EventMemberJoin: Node 85af6b82-797e-cd14-430f-902a3cf9f16a.dc1 127.0.0.1
TestLockCommand_MonitorRetry_Lock_Default - 2019/12/06 06:38:24.078361 [INFO] serf: EventMemberJoin: Node 4c23ac63-62e8-30a3-3e1f-9acdbcb9de90.dc1 127.0.0.1
2019/12/06 06:38:24 [INFO]  raft: Node at 127.0.0.1:52006 [Follower] entering Follower state (Leader: "")
TestLockCommand_MonitorRetry_Lock_Arg - 2019/12/06 06:38:24.083100 [INFO] serf: EventMemberJoin: Node 85af6b82-797e-cd14-430f-902a3cf9f16a 127.0.0.1
TestLockCommand_MonitorRetry_Lock_Arg - 2019/12/06 06:38:24.084347 [INFO] consul: Handled member-join event for server "Node 85af6b82-797e-cd14-430f-902a3cf9f16a.dc1" in area "wan"
TestLockCommand_MonitorRetry_Lock_Arg - 2019/12/06 06:38:24.084683 [INFO] consul: Adding LAN server Node 85af6b82-797e-cd14-430f-902a3cf9f16a (Addr: tcp/127.0.0.1:52024) (DC: dc1)
TestLockCommand_MonitorRetry_Lock_Default - 2019/12/06 06:38:24.088265 [INFO] serf: EventMemberJoin: Node 4c23ac63-62e8-30a3-3e1f-9acdbcb9de90 127.0.0.1
TestLockCommand_MonitorRetry_Lock_Default - 2019/12/06 06:38:24.098245 [INFO] consul: Adding LAN server Node 4c23ac63-62e8-30a3-3e1f-9acdbcb9de90 (Addr: tcp/127.0.0.1:52006) (DC: dc1)
TestLockCommand_MonitorRetry_Lock_Arg - 2019/12/06 06:38:24.100373 [INFO] agent: Started DNS server 127.0.0.1:52019 (udp)
TestLockCommand_MonitorRetry_Lock_Default - 2019/12/06 06:38:24.101817 [INFO] consul: Handled member-join event for server "Node 4c23ac63-62e8-30a3-3e1f-9acdbcb9de90.dc1" in area "wan"
TestLockCommand_MonitorRetry_Lock_Arg - 2019/12/06 06:38:24.108299 [INFO] agent: Started DNS server 127.0.0.1:52019 (tcp)
TestLockCommand_MonitorRetry_Lock_Default - 2019/12/06 06:38:24.110329 [INFO] agent: Started DNS server 127.0.0.1:52001 (tcp)
TestLockCommand_MonitorRetry_Lock_Default - 2019/12/06 06:38:24.112622 [INFO] agent: Started DNS server 127.0.0.1:52001 (udp)
TestLockCommand_MonitorRetry_Lock_Default - 2019/12/06 06:38:24.117725 [INFO] agent: Started HTTP server on 127.0.0.1:52002 (tcp)
TestLockCommand_MonitorRetry_Lock_Default - 2019/12/06 06:38:24.117867 [INFO] agent: started state syncer
2019/12/06 06:38:24 [WARN]  raft: Heartbeat timeout from "" reached, starting election
2019/12/06 06:38:24 [INFO]  raft: Node at 127.0.0.1:52024 [Candidate] entering Candidate state in term 2
TestLockCommand_MonitorRetry_Lock_Arg - 2019/12/06 06:38:24.124228 [INFO] agent: Started HTTP server on 127.0.0.1:52020 (tcp)
TestLockCommand_MonitorRetry_Lock_Arg - 2019/12/06 06:38:24.124388 [INFO] agent: started state syncer
2019/12/06 06:38:24 [WARN]  raft: Heartbeat timeout from "" reached, starting election
2019/12/06 06:38:24 [INFO]  raft: Node at 127.0.0.1:52006 [Candidate] entering Candidate state in term 2
2019/12/06 06:38:24 [INFO]  raft: Election won. Tally: 1
2019/12/06 06:38:24 [INFO]  raft: Node at 127.0.0.1:52012 [Leader] entering Leader state
2019/12/06 06:38:24 [INFO]  raft: Election won. Tally: 1
2019/12/06 06:38:24 [INFO]  raft: Node at 127.0.0.1:52018 [Leader] entering Leader state
TestLockCommand_MonitorRetry_Semaphore_Arg - 2019/12/06 06:38:24.661601 [INFO] consul: cluster leadership acquired
TestLockCommand_MonitorRetry_Semaphore_Default - 2019/12/06 06:38:24.661929 [INFO] consul: cluster leadership acquired
TestLockCommand_MonitorRetry_Semaphore_Arg - 2019/12/06 06:38:24.662121 [INFO] consul: New leader elected: Node 4ecac839-9566-efd3-a2ca-fe44edcf61a9
TestLockCommand_MonitorRetry_Semaphore_Default - 2019/12/06 06:38:24.662304 [INFO] consul: New leader elected: Node ec220bd1-cbaf-ae76-bfcb-12ba8c1fed6b
2019/12/06 06:38:24 [INFO]  raft: Election won. Tally: 1
2019/12/06 06:38:24 [INFO]  raft: Node at 127.0.0.1:52024 [Leader] entering Leader state
TestLockCommand_MonitorRetry_Lock_Arg - 2019/12/06 06:38:24.735718 [INFO] consul: cluster leadership acquired
TestLockCommand_MonitorRetry_Lock_Arg - 2019/12/06 06:38:24.736269 [INFO] consul: New leader elected: Node 85af6b82-797e-cd14-430f-902a3cf9f16a
2019/12/06 06:38:24 [INFO]  raft: Election won. Tally: 1
2019/12/06 06:38:24 [INFO]  raft: Node at 127.0.0.1:52006 [Leader] entering Leader state
TestLockCommand_MonitorRetry_Lock_Default - 2019/12/06 06:38:24.819555 [INFO] consul: cluster leadership acquired
TestLockCommand_MonitorRetry_Lock_Default - 2019/12/06 06:38:24.820115 [INFO] consul: New leader elected: Node 4c23ac63-62e8-30a3-3e1f-9acdbcb9de90
TestLockCommand_MonitorRetry_Semaphore_Arg - 2019/12/06 06:38:25.003067 [INFO] agent: Synced node info
TestLockCommand_MonitorRetry_Lock_Arg - 2019/12/06 06:38:25.137142 [INFO] agent: Synced node info
TestLockCommand_MonitorRetry_Semaphore_Default - 2019/12/06 06:38:25.269320 [INFO] agent: Synced node info
TestLockCommand_MonitorRetry_Lock_Arg - 2019/12/06 06:38:25.467355 [DEBUG] agent: Node info in sync
TestLockCommand_MonitorRetry_Lock_Arg - 2019/12/06 06:38:25.467487 [DEBUG] agent: Node info in sync
TestLockCommand_MonitorRetry_Lock_Default - 2019/12/06 06:38:25.478637 [INFO] agent: Synced node info
TestLockCommand_MonitorRetry_Lock_Default - 2019/12/06 06:38:25.478763 [DEBUG] agent: Node info in sync
TestLockCommand_MonitorRetry_Semaphore_Default - 2019/12/06 06:38:26.757476 [DEBUG] agent: Node info in sync
TestLockCommand_MonitorRetry_Semaphore_Default - 2019/12/06 06:38:26.757652 [DEBUG] agent: Node info in sync
TestLockCommand_MonitorRetry_Semaphore_Arg - 2019/12/06 06:38:27.266390 [DEBUG] agent: Node info in sync
TestLockCommand_MonitorRetry_Semaphore_Arg - 2019/12/06 06:38:27.266512 [DEBUG] agent: Node info in sync
TestLockCommand_MonitorRetry_Lock_Default - 2019/12/06 06:38:27.273087 [DEBUG] agent: Node info in sync
TestLockCommand_MonitorRetry_Semaphore_Arg - 2019/12/06 06:38:27.375888 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
TestLockCommand_MonitorRetry_Semaphore_Arg - 2019/12/06 06:38:27.686785 [INFO] connect: initialized primary datacenter CA with provider "consul"
TestLockCommand_MonitorRetry_Semaphore_Arg - 2019/12/06 06:38:27.687248 [DEBUG] consul: Skipping self join check for "Node 4ecac839-9566-efd3-a2ca-fe44edcf61a9" since the cluster is too small
TestLockCommand_MonitorRetry_Semaphore_Arg - 2019/12/06 06:38:27.687399 [INFO] consul: member 'Node 4ecac839-9566-efd3-a2ca-fe44edcf61a9' joined, marking health alive
TestLockCommand_MonitorRetry_Semaphore_Default - 2019/12/06 06:38:27.689547 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
TestLockCommand_MonitorRetry_Lock_Arg - 2019/12/06 06:38:27.690559 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
TestLockCommand_MonitorRetry_Lock_Default - 2019/12/06 06:38:27.902141 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
TestLockCommand_MonitorRetry_Semaphore_Default - 2019/12/06 06:38:27.904318 [INFO] connect: initialized primary datacenter CA with provider "consul"
TestLockCommand_MonitorRetry_Semaphore_Default - 2019/12/06 06:38:27.904758 [DEBUG] consul: Skipping self join check for "Node ec220bd1-cbaf-ae76-bfcb-12ba8c1fed6b" since the cluster is too small
TestLockCommand_MonitorRetry_Semaphore_Default - 2019/12/06 06:38:27.904918 [INFO] consul: member 'Node ec220bd1-cbaf-ae76-bfcb-12ba8c1fed6b' joined, marking health alive
TestLockCommand_MonitorRetry_Lock_Arg - 2019/12/06 06:38:27.908096 [INFO] connect: initialized primary datacenter CA with provider "consul"
TestLockCommand_MonitorRetry_Lock_Arg - 2019/12/06 06:38:27.908494 [DEBUG] consul: Skipping self join check for "Node 85af6b82-797e-cd14-430f-902a3cf9f16a" since the cluster is too small
TestLockCommand_MonitorRetry_Lock_Arg - 2019/12/06 06:38:27.908647 [INFO] consul: member 'Node 85af6b82-797e-cd14-430f-902a3cf9f16a' joined, marking health alive
TestLockCommand_MonitorRetry_Semaphore_Arg - 2019/12/06 06:38:27.929747 [DEBUG] http: Request GET /v1/agent/self (11.202262ms) from=127.0.0.1:52916
TestLockCommand_MonitorRetry_Lock_Default - 2019/12/06 06:38:28.086532 [INFO] connect: initialized primary datacenter CA with provider "consul"
TestLockCommand_MonitorRetry_Lock_Default - 2019/12/06 06:38:28.086993 [DEBUG] consul: Skipping self join check for "Node 4c23ac63-62e8-30a3-3e1f-9acdbcb9de90" since the cluster is too small
TestLockCommand_MonitorRetry_Lock_Default - 2019/12/06 06:38:28.087158 [INFO] consul: member 'Node 4c23ac63-62e8-30a3-3e1f-9acdbcb9de90' joined, marking health alive
TestLockCommand_MonitorRetry_Lock_Arg - 2019/12/06 06:38:28.106695 [DEBUG] http: Request GET /v1/agent/self (13.289312ms) from=127.0.0.1:46188
TestLockCommand_MonitorRetry_Semaphore_Default - 2019/12/06 06:38:28.114296 [DEBUG] http: Request GET /v1/agent/self (12.726965ms) from=127.0.0.1:58032
TestLockCommand_MonitorRetry_Semaphore_Arg - 2019/12/06 06:38:28.306155 [DEBUG] http: Request PUT /v1/session/create (307.402877ms) from=127.0.0.1:52916
TestLockCommand_MonitorRetry_Semaphore_Default - 2019/12/06 06:38:28.411770 [DEBUG] http: Request PUT /v1/session/create (285.254691ms) from=127.0.0.1:58032
TestLockCommand_MonitorRetry_Lock_Arg - 2019/12/06 06:38:28.412871 [DEBUG] http: Request PUT /v1/session/create (280.750919ms) from=127.0.0.1:46188
TestLockCommand_MonitorRetry_Lock_Arg - 2019/12/06 06:38:28.421468 [DEBUG] http: Request GET /v1/kv/test/prefix/.lock?wait=15000ms (267.006µs) from=127.0.0.1:46188
TestLockCommand_MonitorRetry_Lock_Default - 2019/12/06 06:38:28.426691 [DEBUG] http: Request GET /v1/agent/self (6.316148ms) from=127.0.0.1:37268
TestLockCommand_MonitorRetry_Semaphore_Arg - 2019/12/06 06:38:28.520943 [DEBUG] http: Request PUT /v1/kv/test/prefix/5096e31a-6440-f112-6b7c-4f0c72a40c14?acquire=5096e31a-6440-f112-6b7c-4f0c72a40c14&flags=16210313421097356768 (212.008639ms) from=127.0.0.1:52916
TestLockCommand_MonitorRetry_Semaphore_Arg - 2019/12/06 06:38:28.536239 [DEBUG] http: Request GET /v1/kv/test/prefix?recurse=&wait=15000ms (1.309364ms) from=127.0.0.1:52916
TestLockCommand_MonitorRetry_Semaphore_Default - 2019/12/06 06:38:28.695081 [DEBUG] http: Request PUT /v1/kv/test/prefix/61b5b338-c0e6-9e2f-7314-cd36e2c9fee0?acquire=61b5b338-c0e6-9e2f-7314-cd36e2c9fee0&flags=16210313421097356768 (274.926782ms) from=127.0.0.1:58032
TestLockCommand_MonitorRetry_Lock_Default - 2019/12/06 06:38:28.696823 [DEBUG] http: Request PUT /v1/session/create (253.454945ms) from=127.0.0.1:37268
TestLockCommand_MonitorRetry_Lock_Arg - 2019/12/06 06:38:28.697221 [DEBUG] http: Request PUT /v1/kv/test/prefix/.lock?acquire=e15c80b8-dfa7-7ad9-84fb-28722a10150b&flags=3304740253564472344 (274.154764ms) from=127.0.0.1:46188
TestLockCommand_MonitorRetry_Lock_Default - 2019/12/06 06:38:28.712810 [DEBUG] http: Request GET /v1/kv/test/prefix/.lock?wait=15000ms (386.009µs) from=127.0.0.1:37268
TestLockCommand_MonitorRetry_Semaphore_Default - 2019/12/06 06:38:28.724437 [DEBUG] http: Request GET /v1/kv/test/prefix?recurse=&wait=15000ms (18.18576ms) from=127.0.0.1:58032
TestLockCommand_MonitorRetry_Lock_Arg - 2019/12/06 06:38:28.725923 [DEBUG] http: Request GET /v1/kv/test/prefix/.lock?consistent= (9.305218ms) from=127.0.0.1:46188
TestLockCommand_MonitorRetry_Semaphore_Arg - 2019/12/06 06:38:28.972767 [DEBUG] http: Request PUT /v1/kv/test/prefix/.lock?cas=0&flags=16210313421097356768 (411.027641ms) from=127.0.0.1:52916
TestLockCommand_MonitorRetry_Semaphore_Arg - 2019/12/06 06:38:28.983837 [DEBUG] http: Request GET /v1/kv/test/prefix?consistent=&recurse= (3.979094ms) from=127.0.0.1:52916
TestLockCommand_MonitorRetry_Semaphore_Arg - 2019/12/06 06:38:29.032727 [DEBUG] http: Request GET /v1/kv/test/prefix/.lock (1.523369ms) from=127.0.0.1:52926
TestLockCommand_MonitorRetry_Lock_Default - 2019/12/06 06:38:29.081731 [DEBUG] http: Request PUT /v1/kv/test/prefix/.lock?acquire=710cab2e-d056-72bc-edb8-42a182739cbe&flags=3304740253564472344 (359.790106ms) from=127.0.0.1:37268
TestLockCommand_MonitorRetry_Semaphore_Default - 2019/12/06 06:38:29.084340 [DEBUG] http: Request PUT /v1/kv/test/prefix/.lock?cas=0&flags=16210313421097356768 (355.520005ms) from=127.0.0.1:58032
TestLockCommand_MonitorRetry_Lock_Default - 2019/12/06 06:38:29.100769 [DEBUG] http: Request GET /v1/kv/test/prefix/.lock?consistent= (5.7118ms) from=127.0.0.1:37268
TestLockCommand_MonitorRetry_Semaphore_Default - 2019/12/06 06:38:29.101342 [DEBUG] http: Request GET /v1/kv/test/prefix?consistent=&recurse= (6.348149ms) from=127.0.0.1:58032
TestLockCommand_MonitorRetry_Semaphore_Default - 2019/12/06 06:38:29.173581 [DEBUG] http: Request GET /v1/kv/test/prefix/.lock (1.374366ms) from=127.0.0.1:58042
TestLockCommand_MonitorRetry_Semaphore_Arg - 2019/12/06 06:38:29.278177 [DEBUG] http: Request PUT /v1/kv/test/prefix/.lock?cas=13&flags=16210313421097356768 (242.514688ms) from=127.0.0.1:52926
TestLockCommand_MonitorRetry_Semaphore_Arg - 2019/12/06 06:38:29.280922 [DEBUG] http: Request GET /v1/kv/test/prefix?consistent=&index=13&recurse= (292.592529ms) from=127.0.0.1:52916
TestLockCommand_MonitorRetry_Lock_Arg - 2019/12/06 06:38:29.289486 [DEBUG] http: Request GET /v1/kv/test/prefix/.lock?consistent=&index=12 (558.34743ms) from=127.0.0.1:46188
TestLockCommand_MonitorRetry_Lock_Arg - 2019/12/06 06:38:29.293348 [DEBUG] http: Request PUT /v1/kv/test/prefix/.lock?flags=3304740253564472344&release=e15c80b8-dfa7-7ad9-84fb-28722a10150b (391.179842ms) from=127.0.0.1:46194
TestLockCommand_MonitorRetry_Lock_Arg - 2019/12/06 06:38:29.321026 [DEBUG] http: Request GET /v1/kv/test/prefix/.lock (5.590798ms) from=127.0.0.1:46194
TestLockCommand_MonitorRetry_Lock_Default - 2019/12/06 06:38:29.423110 [DEBUG] http: Request PUT /v1/kv/test/prefix/.lock?flags=3304740253564472344&release=710cab2e-d056-72bc-edb8-42a182739cbe (288.393764ms) from=127.0.0.1:37274
TestLockCommand_MonitorRetry_Lock_Default - 2019/12/06 06:38:29.424002 [DEBUG] http: Request GET /v1/kv/test/prefix/.lock?consistent=&index=12 (317.21044ms) from=127.0.0.1:37268
TestLockCommand_MonitorRetry_Semaphore_Default - 2019/12/06 06:38:29.425249 [DEBUG] http: Request PUT /v1/kv/test/prefix/.lock?cas=13&flags=16210313421097356768 (249.004507ms) from=127.0.0.1:58042
TestLockCommand_MonitorRetry_Semaphore_Default - 2019/12/06 06:38:29.427303 [DEBUG] http: Request GET /v1/kv/test/prefix?consistent=&index=13&recurse= (321.698879ms) from=127.0.0.1:58032
TestLockCommand_MonitorRetry_Lock_Default - 2019/12/06 06:38:29.430709 [DEBUG] http: Request GET /v1/kv/test/prefix/.lock (2.227052ms) from=127.0.0.1:37268
TestLockCommand_MonitorRetry_Semaphore_Arg - 2019/12/06 06:38:29.755305 [DEBUG] http: Request DELETE /v1/kv/test/prefix/5096e31a-6440-f112-6b7c-4f0c72a40c14 (474.751135ms) from=127.0.0.1:52926
TestLockCommand_MonitorRetry_Lock_Arg - 2019/12/06 06:38:29.758558 [DEBUG] http: Request PUT /v1/session/destroy/e15c80b8-dfa7-7ad9-84fb-28722a10150b (462.770521ms) from=127.0.0.1:46188
TestLockCommand_MonitorRetry_Semaphore_Arg - 2019/12/06 06:38:29.760817 [DEBUG] http: Request GET /v1/kv/test/prefix?recurse= (1.201694ms) from=127.0.0.1:52926
TestLockCommand_MonitorRetry_Semaphore_Default - 2019/12/06 06:38:29.879575 [DEBUG] http: Request DELETE /v1/kv/test/prefix/61b5b338-c0e6-9e2f-7314-cd36e2c9fee0 (451.259918ms) from=127.0.0.1:58042
TestLockCommand_MonitorRetry_Semaphore_Default - 2019/12/06 06:38:29.884428 [DEBUG] http: Request GET /v1/kv/test/prefix?recurse= (927.689µs) from=127.0.0.1:58042
TestLockCommand_MonitorRetry_Lock_Arg - 2019/12/06 06:38:29.962296 [DEBUG] http: Request DELETE /v1/kv/test/prefix/.lock?cas=13 (636.730268ms) from=127.0.0.1:46194
TestLockCommand_MonitorRetry_Lock_Arg - 2019/12/06 06:38:29.964429 [INFO] agent: Requesting shutdown
TestLockCommand_MonitorRetry_Lock_Arg - 2019/12/06 06:38:29.964520 [INFO] consul: shutting down server
TestLockCommand_MonitorRetry_Lock_Arg - 2019/12/06 06:38:29.964587 [WARN] serf: Shutdown without a Leave
TestLockCommand_MonitorRetry_Lock_Arg - 2019/12/06 06:38:30.126856 [WARN] serf: Shutdown without a Leave
TestLockCommand_MonitorRetry_Semaphore_Arg - 2019/12/06 06:38:30.129137 [DEBUG] http: Request DELETE /v1/kv/test/prefix/.lock?cas=14 (365.602909ms) from=127.0.0.1:52926
TestLockCommand_MonitorRetry_Semaphore_Arg - 2019/12/06 06:38:30.131083 [DEBUG] http: Request PUT /v1/session/destroy/5096e31a-6440-f112-6b7c-4f0c72a40c14 (373.04375ms) from=127.0.0.1:52916
TestLockCommand_MonitorRetry_Semaphore_Arg - 2019/12/06 06:38:30.132136 [INFO] agent: Requesting shutdown
TestLockCommand_MonitorRetry_Semaphore_Arg - 2019/12/06 06:38:30.132210 [INFO] consul: shutting down server
TestLockCommand_MonitorRetry_Semaphore_Arg - 2019/12/06 06:38:30.132256 [WARN] serf: Shutdown without a Leave
TestLockCommand_MonitorRetry_Lock_Default - 2019/12/06 06:38:30.135193 [DEBUG] http: Request PUT /v1/session/destroy/710cab2e-d056-72bc-edb8-42a182739cbe (703.979846ms) from=127.0.0.1:37274
TestLockCommand_MonitorRetry_Lock_Default - 2019/12/06 06:38:30.136986 [DEBUG] http: Request DELETE /v1/kv/test/prefix/.lock?cas=13 (704.069181ms) from=127.0.0.1:37268
TestLockCommand_MonitorRetry_Lock_Default - 2019/12/06 06:38:30.138347 [INFO] agent: Requesting shutdown
TestLockCommand_MonitorRetry_Lock_Default - 2019/12/06 06:38:30.138567 [INFO] consul: shutting down server
TestLockCommand_MonitorRetry_Lock_Default - 2019/12/06 06:38:30.138721 [WARN] serf: Shutdown without a Leave
TestLockCommand_MonitorRetry_Semaphore_Default - 2019/12/06 06:38:30.142660 [DEBUG] http: Request PUT /v1/session/destroy/61b5b338-c0e6-9e2f-7314-cd36e2c9fee0 (257.867715ms) from=127.0.0.1:58032
TestLockCommand_MonitorRetry_Lock_Arg - 2019/12/06 06:38:30.210281 [INFO] manager: shutting down
TestLockCommand_MonitorRetry_Lock_Arg - 2019/12/06 06:38:30.211136 [INFO] agent: consul server down
TestLockCommand_MonitorRetry_Lock_Arg - 2019/12/06 06:38:30.211195 [INFO] agent: shutdown complete
TestLockCommand_MonitorRetry_Lock_Arg - 2019/12/06 06:38:30.211251 [INFO] agent: Stopping DNS server 127.0.0.1:52019 (tcp)
TestLockCommand_MonitorRetry_Lock_Arg - 2019/12/06 06:38:30.211391 [INFO] agent: Stopping DNS server 127.0.0.1:52019 (udp)
TestLockCommand_MonitorRetry_Lock_Arg - 2019/12/06 06:38:30.211527 [INFO] agent: Stopping HTTP server 127.0.0.1:52020 (tcp)
TestLockCommand_MonitorRetry_Lock_Arg - 2019/12/06 06:38:30.212534 [INFO] agent: Waiting for endpoints to shut down
TestLockCommand_MonitorRetry_Lock_Arg - 2019/12/06 06:38:30.212624 [INFO] agent: Endpoints down
--- PASS: TestLockCommand_MonitorRetry_Lock_Arg (8.99s)
=== CONT  TestLockCommand_NoShell
WARNING: bootstrap = true: do not enable unless necessary
TestLockCommand_NoShell - 2019/12/06 06:38:30.272437 [WARN] agent: Node name "Node d783b0be-762d-bc9b-e9b3-e637b4c97a1d" will not be discoverable via DNS due to invalid characters. Valid characters include all alpha-numerics and dashes.
TestLockCommand_NoShell - 2019/12/06 06:38:30.272925 [DEBUG] tlsutil: Update with version 1
TestLockCommand_NoShell - 2019/12/06 06:38:30.275498 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
TestLockCommand_MonitorRetry_Semaphore_Arg - 2019/12/06 06:38:30.276797 [WARN] serf: Shutdown without a Leave
TestLockCommand_MonitorRetry_Lock_Default - 2019/12/06 06:38:30.284813 [WARN] serf: Shutdown without a Leave
TestLockCommand_MonitorRetry_Lock_Default - 2019/12/06 06:38:30.377019 [INFO] manager: shutting down
TestLockCommand_MonitorRetry_Semaphore_Arg - 2019/12/06 06:38:30.377006 [INFO] manager: shutting down
TestLockCommand_MonitorRetry_Semaphore_Arg - 2019/12/06 06:38:30.377787 [INFO] agent: consul server down
TestLockCommand_MonitorRetry_Semaphore_Arg - 2019/12/06 06:38:30.377851 [INFO] agent: shutdown complete
TestLockCommand_MonitorRetry_Semaphore_Arg - 2019/12/06 06:38:30.377912 [INFO] agent: Stopping DNS server 127.0.0.1:52007 (tcp)
TestLockCommand_MonitorRetry_Lock_Default - 2019/12/06 06:38:30.377948 [INFO] agent: consul server down
TestLockCommand_MonitorRetry_Lock_Default - 2019/12/06 06:38:30.377993 [INFO] agent: shutdown complete
TestLockCommand_MonitorRetry_Lock_Default - 2019/12/06 06:38:30.378051 [INFO] agent: Stopping DNS server 127.0.0.1:52001 (tcp)
TestLockCommand_MonitorRetry_Semaphore_Arg - 2019/12/06 06:38:30.378078 [INFO] agent: Stopping DNS server 127.0.0.1:52007 (udp)
TestLockCommand_MonitorRetry_Lock_Default - 2019/12/06 06:38:30.378240 [INFO] agent: Stopping DNS server 127.0.0.1:52001 (udp)
TestLockCommand_MonitorRetry_Lock_Default - 2019/12/06 06:38:30.378455 [INFO] agent: Stopping HTTP server 127.0.0.1:52002 (tcp)
TestLockCommand_MonitorRetry_Semaphore_Arg - 2019/12/06 06:38:30.378256 [INFO] agent: Stopping HTTP server 127.0.0.1:52008 (tcp)
TestLockCommand_MonitorRetry_Semaphore_Default - 2019/12/06 06:38:30.378635 [DEBUG] http: Request DELETE /v1/kv/test/prefix/.lock?cas=14 (491.29219ms) from=127.0.0.1:58042
TestLockCommand_MonitorRetry_Lock_Default - 2019/12/06 06:38:30.379381 [INFO] agent: Waiting for endpoints to shut down
TestLockCommand_MonitorRetry_Semaphore_Arg - 2019/12/06 06:38:30.379461 [INFO] agent: Waiting for endpoints to shut down
TestLockCommand_MonitorRetry_Lock_Default - 2019/12/06 06:38:30.379529 [INFO] agent: Endpoints down
TestLockCommand_MonitorRetry_Semaphore_Arg - 2019/12/06 06:38:30.379577 [INFO] agent: Endpoints down
--- PASS: TestLockCommand_MonitorRetry_Lock_Default (9.16s)
=== CONT  TestLockCommand_TryLock
TestLockCommand_MonitorRetry_Semaphore_Default - 2019/12/06 06:38:30.381275 [INFO] agent: Requesting shutdown
TestLockCommand_MonitorRetry_Semaphore_Default - 2019/12/06 06:38:30.381361 [INFO] consul: shutting down server
TestLockCommand_MonitorRetry_Semaphore_Default - 2019/12/06 06:38:30.381412 [WARN] serf: Shutdown without a Leave
--- PASS: TestLockCommand_MonitorRetry_Semaphore_Arg (9.16s)
=== CONT  TestLockCommand_TrySemaphore
WARNING: bootstrap = true: do not enable unless necessary
TestLockCommand_TryLock - 2019/12/06 06:38:30.440188 [WARN] agent: Node name "Node 2b8b3e0b-599b-a80b-8629-c90ceac4abe7" will not be discoverable via DNS due to invalid characters. Valid characters include all alpha-numerics and dashes.
TestLockCommand_TryLock - 2019/12/06 06:38:30.440638 [DEBUG] tlsutil: Update with version 1
TestLockCommand_TryLock - 2019/12/06 06:38:30.443956 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
WARNING: bootstrap = true: do not enable unless necessary
TestLockCommand_TrySemaphore - 2019/12/06 06:38:30.452638 [WARN] agent: Node name "Node cc455ca4-bdbb-552a-5c18-b656e49038bc" will not be discoverable via DNS due to invalid characters. Valid characters include all alpha-numerics and dashes.
TestLockCommand_TrySemaphore - 2019/12/06 06:38:30.453125 [DEBUG] tlsutil: Update with version 1
TestLockCommand_TrySemaphore - 2019/12/06 06:38:30.455304 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
TestLockCommand_MonitorRetry_Semaphore_Default - 2019/12/06 06:38:30.487613 [WARN] serf: Shutdown without a Leave
TestLockCommand_MonitorRetry_Semaphore_Default - 2019/12/06 06:38:30.680887 [INFO] manager: shutting down
TestLockCommand_MonitorRetry_Semaphore_Default - 2019/12/06 06:38:30.681658 [INFO] agent: consul server down
TestLockCommand_MonitorRetry_Semaphore_Default - 2019/12/06 06:38:30.681713 [INFO] agent: shutdown complete
TestLockCommand_MonitorRetry_Semaphore_Default - 2019/12/06 06:38:30.681769 [INFO] agent: Stopping DNS server 127.0.0.1:52013 (tcp)
TestLockCommand_MonitorRetry_Semaphore_Default - 2019/12/06 06:38:30.681912 [INFO] agent: Stopping DNS server 127.0.0.1:52013 (udp)
TestLockCommand_MonitorRetry_Semaphore_Default - 2019/12/06 06:38:30.682056 [INFO] agent: Stopping HTTP server 127.0.0.1:52014 (tcp)
TestLockCommand_MonitorRetry_Semaphore_Default - 2019/12/06 06:38:30.682596 [INFO] agent: Waiting for endpoints to shut down
TestLockCommand_MonitorRetry_Semaphore_Default - 2019/12/06 06:38:30.682729 [INFO] agent: Endpoints down
--- PASS: TestLockCommand_MonitorRetry_Semaphore_Default (9.44s)
=== CONT  TestLockCommand
WARNING: bootstrap = true: do not enable unless necessary
TestLockCommand - 2019/12/06 06:38:30.745560 [WARN] agent: Node name "Node bec52376-9a12-caf2-a646-fdd24853473f" will not be discoverable via DNS due to invalid characters. Valid characters include all alpha-numerics and dashes.
TestLockCommand - 2019/12/06 06:38:30.746743 [DEBUG] tlsutil: Update with version 1
TestLockCommand - 2019/12/06 06:38:30.748920 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
2019/12/06 06:38:31 [INFO]  raft: Initial configuration (index=1): [{Suffrage:Voter ID:d783b0be-762d-bc9b-e9b3-e637b4c97a1d Address:127.0.0.1:52030}]
2019/12/06 06:38:31 [INFO]  raft: Node at 127.0.0.1:52030 [Follower] entering Follower state (Leader: "")
TestLockCommand_NoShell - 2019/12/06 06:38:31.333597 [INFO] serf: EventMemberJoin: Node d783b0be-762d-bc9b-e9b3-e637b4c97a1d.dc1 127.0.0.1
TestLockCommand_NoShell - 2019/12/06 06:38:31.338484 [INFO] serf: EventMemberJoin: Node d783b0be-762d-bc9b-e9b3-e637b4c97a1d 127.0.0.1
TestLockCommand_NoShell - 2019/12/06 06:38:31.340166 [INFO] consul: Adding LAN server Node d783b0be-762d-bc9b-e9b3-e637b4c97a1d (Addr: tcp/127.0.0.1:52030) (DC: dc1)
TestLockCommand_NoShell - 2019/12/06 06:38:31.340734 [INFO] consul: Handled member-join event for server "Node d783b0be-762d-bc9b-e9b3-e637b4c97a1d.dc1" in area "wan"
TestLockCommand_NoShell - 2019/12/06 06:38:31.342643 [INFO] agent: Started DNS server 127.0.0.1:52025 (tcp)
TestLockCommand_NoShell - 2019/12/06 06:38:31.342833 [INFO] agent: Started DNS server 127.0.0.1:52025 (udp)
TestLockCommand_NoShell - 2019/12/06 06:38:31.345431 [INFO] agent: Started HTTP server on 127.0.0.1:52026 (tcp)
TestLockCommand_NoShell - 2019/12/06 06:38:31.345535 [INFO] agent: started state syncer
2019/12/06 06:38:31 [WARN]  raft: Heartbeat timeout from "" reached, starting election
2019/12/06 06:38:31 [INFO]  raft: Node at 127.0.0.1:52030 [Candidate] entering Candidate state in term 2
2019/12/06 06:38:31 [INFO]  raft: Initial configuration (index=1): [{Suffrage:Voter ID:2b8b3e0b-599b-a80b-8629-c90ceac4abe7 Address:127.0.0.1:52036}]
2019/12/06 06:38:31 [INFO]  raft: Node at 127.0.0.1:52036 [Follower] entering Follower state (Leader: "")
TestLockCommand_TryLock - 2019/12/06 06:38:31.432188 [INFO] serf: EventMemberJoin: Node 2b8b3e0b-599b-a80b-8629-c90ceac4abe7.dc1 127.0.0.1
TestLockCommand_TryLock - 2019/12/06 06:38:31.437426 [INFO] serf: EventMemberJoin: Node 2b8b3e0b-599b-a80b-8629-c90ceac4abe7 127.0.0.1
TestLockCommand_TryLock - 2019/12/06 06:38:31.438716 [INFO] consul: Adding LAN server Node 2b8b3e0b-599b-a80b-8629-c90ceac4abe7 (Addr: tcp/127.0.0.1:52036) (DC: dc1)
TestLockCommand_TryLock - 2019/12/06 06:38:31.438963 [INFO] consul: Handled member-join event for server "Node 2b8b3e0b-599b-a80b-8629-c90ceac4abe7.dc1" in area "wan"
TestLockCommand_TryLock - 2019/12/06 06:38:31.440956 [INFO] agent: Started DNS server 127.0.0.1:52031 (tcp)
TestLockCommand_TryLock - 2019/12/06 06:38:31.441517 [INFO] agent: Started DNS server 127.0.0.1:52031 (udp)
TestLockCommand_TryLock - 2019/12/06 06:38:31.443916 [INFO] agent: Started HTTP server on 127.0.0.1:52032 (tcp)
TestLockCommand_TryLock - 2019/12/06 06:38:31.444019 [INFO] agent: started state syncer
2019/12/06 06:38:31 [WARN]  raft: Heartbeat timeout from "" reached, starting election
2019/12/06 06:38:31 [INFO]  raft: Node at 127.0.0.1:52036 [Candidate] entering Candidate state in term 2
2019/12/06 06:38:31 [INFO]  raft: Initial configuration (index=1): [{Suffrage:Voter ID:cc455ca4-bdbb-552a-5c18-b656e49038bc Address:127.0.0.1:52042}]
TestLockCommand_TrySemaphore - 2019/12/06 06:38:31.598272 [INFO] serf: EventMemberJoin: Node cc455ca4-bdbb-552a-5c18-b656e49038bc.dc1 127.0.0.1
2019/12/06 06:38:31 [INFO]  raft: Node at 127.0.0.1:52042 [Follower] entering Follower state (Leader: "")
TestLockCommand_TrySemaphore - 2019/12/06 06:38:31.602617 [INFO] serf: EventMemberJoin: Node cc455ca4-bdbb-552a-5c18-b656e49038bc 127.0.0.1
TestLockCommand_TrySemaphore - 2019/12/06 06:38:31.603146 [INFO] consul: Handled member-join event for server "Node cc455ca4-bdbb-552a-5c18-b656e49038bc.dc1" in area "wan"
TestLockCommand_TrySemaphore - 2019/12/06 06:38:31.603436 [INFO] consul: Adding LAN server Node cc455ca4-bdbb-552a-5c18-b656e49038bc (Addr: tcp/127.0.0.1:52042) (DC: dc1)
TestLockCommand_TrySemaphore - 2019/12/06 06:38:31.603742 [INFO] agent: Started DNS server 127.0.0.1:52037 (tcp)
TestLockCommand_TrySemaphore - 2019/12/06 06:38:31.603810 [INFO] agent: Started DNS server 127.0.0.1:52037 (udp)
TestLockCommand_TrySemaphore - 2019/12/06 06:38:31.606263 [INFO] agent: Started HTTP server on 127.0.0.1:52038 (tcp)
TestLockCommand_TrySemaphore - 2019/12/06 06:38:31.606366 [INFO] agent: started state syncer
2019/12/06 06:38:31 [WARN]  raft: Heartbeat timeout from "" reached, starting election
2019/12/06 06:38:31 [INFO]  raft: Node at 127.0.0.1:52042 [Candidate] entering Candidate state in term 2
2019/12/06 06:38:31 [INFO]  raft: Initial configuration (index=1): [{Suffrage:Voter ID:bec52376-9a12-caf2-a646-fdd24853473f Address:127.0.0.1:52048}]
2019/12/06 06:38:31 [INFO]  raft: Node at 127.0.0.1:52048 [Follower] entering Follower state (Leader: "")
TestLockCommand - 2019/12/06 06:38:31.847957 [INFO] serf: EventMemberJoin: Node bec52376-9a12-caf2-a646-fdd24853473f.dc1 127.0.0.1
TestLockCommand - 2019/12/06 06:38:31.855707 [INFO] serf: EventMemberJoin: Node bec52376-9a12-caf2-a646-fdd24853473f 127.0.0.1
TestLockCommand - 2019/12/06 06:38:31.856892 [INFO] consul: Adding LAN server Node bec52376-9a12-caf2-a646-fdd24853473f (Addr: tcp/127.0.0.1:52048) (DC: dc1)
TestLockCommand - 2019/12/06 06:38:31.857121 [INFO] agent: Started DNS server 127.0.0.1:52043 (udp)
TestLockCommand - 2019/12/06 06:38:31.857190 [INFO] consul: Handled member-join event for server "Node bec52376-9a12-caf2-a646-fdd24853473f.dc1" in area "wan"
TestLockCommand - 2019/12/06 06:38:31.857470 [INFO] agent: Started DNS server 127.0.0.1:52043 (tcp)
TestLockCommand - 2019/12/06 06:38:31.859986 [INFO] agent: Started HTTP server on 127.0.0.1:52044 (tcp)
TestLockCommand - 2019/12/06 06:38:31.860149 [INFO] agent: started state syncer
2019/12/06 06:38:31 [WARN]  raft: Heartbeat timeout from "" reached, starting election
2019/12/06 06:38:31 [INFO]  raft: Node at 127.0.0.1:52048 [Candidate] entering Candidate state in term 2
2019/12/06 06:38:32 [INFO]  raft: Election won. Tally: 1
2019/12/06 06:38:32 [INFO]  raft: Node at 127.0.0.1:52030 [Leader] entering Leader state
TestLockCommand_NoShell - 2019/12/06 06:38:32.036753 [INFO] consul: cluster leadership acquired
TestLockCommand_NoShell - 2019/12/06 06:38:32.037160 [INFO] consul: New leader elected: Node d783b0be-762d-bc9b-e9b3-e637b4c97a1d
2019/12/06 06:38:32 [INFO]  raft: Election won. Tally: 1
2019/12/06 06:38:32 [INFO]  raft: Node at 127.0.0.1:52036 [Leader] entering Leader state
TestLockCommand_TryLock - 2019/12/06 06:38:32.119380 [INFO] consul: cluster leadership acquired
TestLockCommand_TryLock - 2019/12/06 06:38:32.119844 [INFO] consul: New leader elected: Node 2b8b3e0b-599b-a80b-8629-c90ceac4abe7
2019/12/06 06:38:32 [INFO]  raft: Election won. Tally: 1
2019/12/06 06:38:32 [INFO]  raft: Node at 127.0.0.1:52042 [Leader] entering Leader state
TestLockCommand_TrySemaphore - 2019/12/06 06:38:32.221187 [INFO] consul: cluster leadership acquired
TestLockCommand_TrySemaphore - 2019/12/06 06:38:32.221613 [INFO] consul: New leader elected: Node cc455ca4-bdbb-552a-5c18-b656e49038bc
TestLockCommand_NoShell - 2019/12/06 06:38:32.379298 [INFO] agent: Synced node info
TestLockCommand_NoShell - 2019/12/06 06:38:32.379416 [DEBUG] agent: Node info in sync
2019/12/06 06:38:32 [INFO]  raft: Election won. Tally: 1
2019/12/06 06:38:32 [INFO]  raft: Node at 127.0.0.1:52048 [Leader] entering Leader state
TestLockCommand_TryLock - 2019/12/06 06:38:32.477757 [INFO] agent: Synced node info
TestLockCommand - 2019/12/06 06:38:32.480162 [INFO] consul: cluster leadership acquired
TestLockCommand - 2019/12/06 06:38:32.480546 [INFO] consul: New leader elected: Node bec52376-9a12-caf2-a646-fdd24853473f
TestLockCommand_TrySemaphore - 2019/12/06 06:38:32.580677 [INFO] agent: Synced node info
TestLockCommand_TrySemaphore - 2019/12/06 06:38:32.580807 [DEBUG] agent: Node info in sync
TestLockCommand - 2019/12/06 06:38:33.037786 [INFO] agent: Synced node info
TestLockCommand_TrySemaphore - 2019/12/06 06:38:33.295030 [DEBUG] agent: Node info in sync
TestLockCommand_NoShell - 2019/12/06 06:38:33.440028 [DEBUG] agent: Node info in sync
TestLockCommand_TryLock - 2019/12/06 06:38:33.735831 [INFO] connect: initialized primary datacenter CA with provider "consul"
TestLockCommand_NoShell - 2019/12/06 06:38:33.735831 [INFO] connect: initialized primary datacenter CA with provider "consul"
TestLockCommand_TryLock - 2019/12/06 06:38:33.736294 [DEBUG] consul: Skipping self join check for "Node 2b8b3e0b-599b-a80b-8629-c90ceac4abe7" since the cluster is too small
TestLockCommand_NoShell - 2019/12/06 06:38:33.736329 [DEBUG] consul: Skipping self join check for "Node d783b0be-762d-bc9b-e9b3-e637b4c97a1d" since the cluster is too small
TestLockCommand_TryLock - 2019/12/06 06:38:33.736441 [INFO] consul: member 'Node 2b8b3e0b-599b-a80b-8629-c90ceac4abe7' joined, marking health alive
TestLockCommand_NoShell - 2019/12/06 06:38:33.736454 [INFO] consul: member 'Node d783b0be-762d-bc9b-e9b3-e637b4c97a1d' joined, marking health alive
TestLockCommand_TrySemaphore - 2019/12/06 06:38:34.085823 [INFO] connect: initialized primary datacenter CA with provider "consul"
TestLockCommand_TrySemaphore - 2019/12/06 06:38:34.086247 [DEBUG] consul: Skipping self join check for "Node cc455ca4-bdbb-552a-5c18-b656e49038bc" since the cluster is too small
TestLockCommand_TrySemaphore - 2019/12/06 06:38:34.086384 [INFO] consul: member 'Node cc455ca4-bdbb-552a-5c18-b656e49038bc' joined, marking health alive
TestLockCommand_NoShell - 2019/12/06 06:38:34.108743 [DEBUG] http: Request GET /v1/agent/self (12.666297ms) from=127.0.0.1:49568
TestLockCommand_TryLock - 2019/12/06 06:38:34.116224 [DEBUG] http: Request GET /v1/agent/self (6.548487ms) from=127.0.0.1:54972
TestLockCommand - 2019/12/06 06:38:34.328322 [INFO] connect: initialized primary datacenter CA with provider "consul"
TestLockCommand - 2019/12/06 06:38:34.328760 [DEBUG] consul: Skipping self join check for "Node bec52376-9a12-caf2-a646-fdd24853473f" since the cluster is too small
TestLockCommand - 2019/12/06 06:38:34.328905 [INFO] consul: member 'Node bec52376-9a12-caf2-a646-fdd24853473f' joined, marking health alive
TestLockCommand_TrySemaphore - 2019/12/06 06:38:34.354677 [DEBUG] http: Request GET /v1/agent/self (7.591845ms) from=127.0.0.1:38020
TestLockCommand_TryLock - 2019/12/06 06:38:34.436924 [DEBUG] http: Request PUT /v1/session/create (300.355378ms) from=127.0.0.1:54972
TestLockCommand_NoShell - 2019/12/06 06:38:34.438051 [DEBUG] http: Request PUT /v1/session/create (311.354637ms) from=127.0.0.1:49568
TestLockCommand_NoShell - 2019/12/06 06:38:34.441295 [DEBUG] http: Request GET /v1/kv/test/prefix/.lock?wait=15000ms (236.339µs) from=127.0.0.1:49568
TestLockCommand_TryLock - 2019/12/06 06:38:34.441403 [DEBUG] http: Request GET /v1/kv/test/prefix/.lock?wait=10000ms (197.004µs) from=127.0.0.1:54972
TestLockCommand - 2019/12/06 06:38:34.528777 [DEBUG] http: Request GET /v1/agent/self (7.587845ms) from=127.0.0.1:32972
TestLockCommand_TryLock - 2019/12/06 06:38:34.529732 [DEBUG] agent: Skipping remote check "serfHealth" since it is managed automatically
TestLockCommand_TryLock - 2019/12/06 06:38:34.530160 [DEBUG] agent: Node info in sync
TestLockCommand_TryLock - 2019/12/06 06:38:34.530266 [DEBUG] agent: Node info in sync
TestLockCommand_TrySemaphore - 2019/12/06 06:38:34.611917 [DEBUG] http: Request PUT /v1/session/create (245.871101ms) from=127.0.0.1:38020
TestLockCommand_NoShell - 2019/12/06 06:38:34.770895 [DEBUG] http: Request PUT /v1/kv/test/prefix/.lock?acquire=db2787ab-9ed2-5e51-2d91-5de9a01c9721&flags=3304740253564472344 (328.295701ms) from=127.0.0.1:49568
TestLockCommand_NoShell - 2019/12/06 06:38:34.771457 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
TestLockCommand_NoShell - 2019/12/06 06:38:34.788907 [DEBUG] http: Request GET /v1/kv/test/prefix/.lock?consistent= (1.617372ms) from=127.0.0.1:49568
TestLockCommand_TryLock - 2019/12/06 06:38:34.790099 [DEBUG] http: Request PUT /v1/kv/test/prefix/.lock?acquire=9fe1aa1b-2129-ac96-113d-81ec9f649bf5&flags=3304740253564472344 (328.295701ms) from=127.0.0.1:54972
TestLockCommand_TryLock - 2019/12/06 06:38:34.791538 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
TestLockCommand - 2019/12/06 06:38:34.808864 [DEBUG] http: Request PUT /v1/session/create (267.661612ms) from=127.0.0.1:32972
TestLockCommand - 2019/12/06 06:38:34.812742 [DEBUG] http: Request GET /v1/kv/test/prefix/.lock?wait=15000ms (321.341µs) from=127.0.0.1:32972
TestLockCommand_TryLock - 2019/12/06 06:38:34.813093 [DEBUG] http: Request GET /v1/kv/test/prefix/.lock?consistent= (8.275527ms) from=127.0.0.1:54972
TestLockCommand_TrySemaphore - 2019/12/06 06:38:34.874600 [DEBUG] http: Request PUT /v1/kv/test/prefix/401cb618-72d5-4b0b-311d-8ff4d49ef660?acquire=401cb618-72d5-4b0b-311d-8ff4d49ef660&flags=16210313421097356768 (251.740905ms) from=127.0.0.1:38020
TestLockCommand_TrySemaphore - 2019/12/06 06:38:34.875674 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
TestLockCommand_TrySemaphore - 2019/12/06 06:38:34.878280 [DEBUG] http: Request GET /v1/kv/test/prefix?recurse=&wait=10000ms (1.048691ms) from=127.0.0.1:38020
TestLockCommand_NoShell - 2019/12/06 06:38:35.053941 [DEBUG] http: Request PUT /v1/kv/test/prefix/.lock?flags=3304740253564472344&release=db2787ab-9ed2-5e51-2d91-5de9a01c9721 (244.761408ms) from=127.0.0.1:49576
TestLockCommand_NoShell - 2019/12/06 06:38:35.055110 [DEBUG] http: Request GET /v1/kv/test/prefix/.lock?consistent=&index=12 (251.776239ms) from=127.0.0.1:49568
TestLockCommand_TryLock - 2019/12/06 06:38:35.058053 [DEBUG] http: Request PUT /v1/kv/test/prefix/.lock?flags=3304740253564472344&release=9fe1aa1b-2129-ac96-113d-81ec9f649bf5 (236.88389ms) from=127.0.0.1:54980
TestLockCommand - 2019/12/06 06:38:35.058094 [DEBUG] http: Request PUT /v1/kv/test/prefix/.lock?acquire=ef5c153c-1fd2-24e8-c5d5-bac358004706&flags=3304740253564472344 (243.279039ms) from=127.0.0.1:32972
TestLockCommand_TrySemaphore - 2019/12/06 06:38:35.061476 [DEBUG] http: Request PUT /v1/kv/test/prefix/.lock?cas=0&flags=16210313421097356768 (178.286516ms) from=127.0.0.1:38020
TestLockCommand_TryLock - 2019/12/06 06:38:35.064643 [DEBUG] http: Request GET /v1/kv/test/prefix/.lock (1.648706ms) from=127.0.0.1:54980
TestLockCommand_NoShell - 2019/12/06 06:38:35.078944 [DEBUG] http: Request GET /v1/kv/test/prefix/.lock (12.104284ms) from=127.0.0.1:49576
TestLockCommand_TryLock - 2019/12/06 06:38:35.081228 [DEBUG] http: Request GET /v1/kv/test/prefix/.lock?consistent=&index=12 (264.659874ms) from=127.0.0.1:54972
TestLockCommand - 2019/12/06 06:38:35.099402 [DEBUG] http: Request GET /v1/kv/test/prefix/.lock?consistent= (9.216549ms) from=127.0.0.1:32972
TestLockCommand_TrySemaphore - 2019/12/06 06:38:35.110289 [DEBUG] http: Request GET /v1/kv/test/prefix?consistent=&recurse= (10.040235ms) from=127.0.0.1:38020
TestLockCommand_TrySemaphore - 2019/12/06 06:38:35.163754 [DEBUG] http: Request GET /v1/kv/test/prefix/.lock (1.632372ms) from=127.0.0.1:38032
TestLockCommand - 2019/12/06 06:38:35.402542 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
TestLockCommand - 2019/12/06 06:38:35.404154 [DEBUG] http: Request PUT /v1/kv/test/prefix/.lock?flags=3304740253564472344&release=ef5c153c-1fd2-24e8-c5d5-bac358004706 (281.06926ms) from=127.0.0.1:32980
TestLockCommand - 2019/12/06 06:38:35.405608 [DEBUG] http: Request GET /v1/kv/test/prefix/.lock?consistent=&index=12 (301.03206ms) from=127.0.0.1:32972
TestLockCommand_TrySemaphore - 2019/12/06 06:38:35.406667 [DEBUG] http: Request PUT /v1/kv/test/prefix/.lock?cas=13&flags=16210313421097356768 (239.504618ms) from=127.0.0.1:38032
TestLockCommand_TrySemaphore - 2019/12/06 06:38:35.410104 [DEBUG] http: Request GET /v1/kv/test/prefix?consistent=&index=13&recurse= (293.679222ms) from=127.0.0.1:38020
TestLockCommand - 2019/12/06 06:38:35.413737 [DEBUG] http: Request GET /v1/kv/test/prefix/.lock (1.163027ms) from=127.0.0.1:32980
TestLockCommand_TryLock - 2019/12/06 06:38:35.516545 [DEBUG] http: Request DELETE /v1/kv/test/prefix/.lock?cas=13 (433.58317ms) from=127.0.0.1:54980
TestLockCommand_TryLock - 2019/12/06 06:38:35.518186 [DEBUG] http: Request PUT /v1/session/destroy/9fe1aa1b-2129-ac96-113d-81ec9f649bf5 (454.283989ms) from=127.0.0.1:54982
TestLockCommand_TryLock - 2019/12/06 06:38:35.520091 [INFO] agent: Requesting shutdown
TestLockCommand_TryLock - 2019/12/06 06:38:35.520174 [INFO] consul: shutting down server
TestLockCommand_TryLock - 2019/12/06 06:38:35.520238 [WARN] serf: Shutdown without a Leave
TestLockCommand_TrySemaphore - 2019/12/06 06:38:35.612085 [DEBUG] http: Request DELETE /v1/kv/test/prefix/401cb618-72d5-4b0b-311d-8ff4d49ef660 (196.065933ms) from=127.0.0.1:38032
TestLockCommand_NoShell - 2019/12/06 06:38:35.613921 [DEBUG] http: Request DELETE /v1/kv/test/prefix/.lock?cas=13 (505.706195ms) from=127.0.0.1:49576
TestLockCommand_NoShell - 2019/12/06 06:38:35.614820 [DEBUG] http: Request PUT /v1/session/destroy/db2787ab-9ed2-5e51-2d91-5de9a01c9721 (521.84024ms) from=127.0.0.1:49568
TestLockCommand_TryLock - 2019/12/06 06:38:35.615767 [WARN] serf: Shutdown without a Leave
TestLockCommand - 2019/12/06 06:38:35.616479 [DEBUG] http: Request PUT /v1/session/destroy/ef5c153c-1fd2-24e8-c5d5-bac358004706 (204.825805ms) from=127.0.0.1:32972
TestLockCommand_TrySemaphore - 2019/12/06 06:38:35.619811 [DEBUG] http: Request GET /v1/kv/test/prefix?recurse= (1.199028ms) from=127.0.0.1:38032
TestLockCommand_NoShell - 2019/12/06 06:38:35.622500 [INFO] agent: Requesting shutdown
TestLockCommand_NoShell - 2019/12/06 06:38:35.622595 [INFO] consul: shutting down server
TestLockCommand_NoShell - 2019/12/06 06:38:35.622667 [WARN] serf: Shutdown without a Leave
TestLockCommand - 2019/12/06 06:38:35.682958 [DEBUG] agent: Skipping remote check "serfHealth" since it is managed automatically
TestLockCommand - 2019/12/06 06:38:35.683044 [DEBUG] agent: Node info in sync
TestLockCommand - 2019/12/06 06:38:35.683127 [DEBUG] agent: Node info in sync
TestLockCommand_TryLock - 2019/12/06 06:38:35.772236 [INFO] manager: shutting down
TestLockCommand_NoShell - 2019/12/06 06:38:35.772847 [WARN] serf: Shutdown without a Leave
TestLockCommand_TryLock - 2019/12/06 06:38:35.773134 [INFO] agent: consul server down
TestLockCommand_TryLock - 2019/12/06 06:38:35.773182 [INFO] agent: shutdown complete
TestLockCommand_TryLock - 2019/12/06 06:38:35.773265 [INFO] agent: Stopping DNS server 127.0.0.1:52031 (tcp)
TestLockCommand_TryLock - 2019/12/06 06:38:35.773413 [INFO] agent: Stopping DNS server 127.0.0.1:52031 (udp)
TestLockCommand_TryLock - 2019/12/06 06:38:35.773552 [INFO] agent: Stopping HTTP server 127.0.0.1:52032 (tcp)
TestLockCommand_TryLock - 2019/12/06 06:38:35.774381 [INFO] agent: Waiting for endpoints to shut down
TestLockCommand_TryLock - 2019/12/06 06:38:35.774646 [INFO] agent: Endpoints down
--- PASS: TestLockCommand_TryLock (5.40s)
=== CONT  TestLockCommand_BadArgs
--- PASS: TestLockCommand_BadArgs (0.01s)
TestLockCommand - 2019/12/06 06:38:36.039499 [DEBUG] http: Request DELETE /v1/kv/test/prefix/.lock?cas=13 (620.848562ms) from=127.0.0.1:32980
TestLockCommand - 2019/12/06 06:38:36.048353 [INFO] agent: Requesting shutdown
TestLockCommand - 2019/12/06 06:38:36.048470 [INFO] consul: shutting down server
TestLockCommand - 2019/12/06 06:38:36.048541 [WARN] serf: Shutdown without a Leave
TestLockCommand_NoShell - 2019/12/06 06:38:36.068546 [INFO] manager: shutting down
TestLockCommand_NoShell - 2019/12/06 06:38:36.084492 [INFO] agent: consul server down
TestLockCommand_NoShell - 2019/12/06 06:38:36.084631 [INFO] agent: shutdown complete
TestLockCommand_NoShell - 2019/12/06 06:38:36.084695 [INFO] agent: Stopping DNS server 127.0.0.1:52025 (tcp)
TestLockCommand_NoShell - 2019/12/06 06:38:36.084874 [INFO] agent: Stopping DNS server 127.0.0.1:52025 (udp)
TestLockCommand_TrySemaphore - 2019/12/06 06:38:36.071795 [DEBUG] http: Request DELETE /v1/kv/test/prefix/.lock?cas=14 (448.945197ms) from=127.0.0.1:38032
TestLockCommand_NoShell - 2019/12/06 06:38:36.085040 [INFO] agent: Stopping HTTP server 127.0.0.1:52026 (tcp)
TestLockCommand_NoShell - 2019/12/06 06:38:36.085721 [INFO] agent: Waiting for endpoints to shut down
TestLockCommand_NoShell - 2019/12/06 06:38:36.085905 [INFO] agent: Endpoints down
--- PASS: TestLockCommand_NoShell (5.87s)
TestLockCommand_TrySemaphore - 2019/12/06 06:38:36.086638 [INFO] agent: Requesting shutdown
TestLockCommand_TrySemaphore - 2019/12/06 06:38:36.086718 [INFO] consul: shutting down server
TestLockCommand_TrySemaphore - 2019/12/06 06:38:36.086772 [WARN] serf: Shutdown without a Leave
TestLockCommand_TrySemaphore - 2019/12/06 06:38:36.302303 [WARN] serf: Shutdown without a Leave
TestLockCommand - 2019/12/06 06:38:36.304680 [WARN] serf: Shutdown without a Leave
TestLockCommand - 2019/12/06 06:38:36.402074 [INFO] manager: shutting down
TestLockCommand_TrySemaphore - 2019/12/06 06:38:36.402074 [INFO] manager: shutting down
TestLockCommand_TrySemaphore - 2019/12/06 06:38:36.402090 [ERR] consul.session: Apply failed: leadership lost while committing log
TestLockCommand_TrySemaphore - 2019/12/06 06:38:36.402281 [ERR] http: Request PUT /v1/session/destroy/401cb618-72d5-4b0b-311d-8ff4d49ef660, error: leadership lost while committing log from=127.0.0.1:38020
TestLockCommand_TrySemaphore - 2019/12/06 06:38:36.403017 [INFO] agent: consul server down
TestLockCommand_TrySemaphore - 2019/12/06 06:38:36.403127 [INFO] agent: shutdown complete
TestLockCommand_TrySemaphore - 2019/12/06 06:38:36.403202 [INFO] agent: Stopping DNS server 127.0.0.1:52037 (tcp)
TestLockCommand - 2019/12/06 06:38:36.403016 [INFO] agent: consul server down
TestLockCommand - 2019/12/06 06:38:36.403267 [INFO] agent: shutdown complete
TestLockCommand_TrySemaphore - 2019/12/06 06:38:36.403211 [DEBUG] http: Request PUT /v1/session/destroy/401cb618-72d5-4b0b-311d-8ff4d49ef660 (776.119204ms) from=127.0.0.1:38020
TestLockCommand - 2019/12/06 06:38:36.403331 [INFO] agent: Stopping DNS server 127.0.0.1:52043 (tcp)
TestLockCommand_TrySemaphore - 2019/12/06 06:38:36.403462 [INFO] agent: Stopping DNS server 127.0.0.1:52037 (udp)
TestLockCommand - 2019/12/06 06:38:36.403491 [INFO] agent: Stopping DNS server 127.0.0.1:52043 (udp)
TestLockCommand_TrySemaphore - 2019/12/06 06:38:36.403657 [INFO] agent: Stopping HTTP server 127.0.0.1:52038 (tcp)
TestLockCommand - 2019/12/06 06:38:36.403658 [INFO] agent: Stopping HTTP server 127.0.0.1:52044 (tcp)
TestLockCommand - 2019/12/06 06:38:36.404497 [INFO] agent: Waiting for endpoints to shut down
TestLockCommand - 2019/12/06 06:38:36.404617 [INFO] agent: Endpoints down
--- PASS: TestLockCommand (5.72s)
TestLockCommand_TrySemaphore - 2019/12/06 06:38:36.904469 [INFO] agent: Waiting for endpoints to shut down
TestLockCommand_TrySemaphore - 2019/12/06 06:38:36.904599 [INFO] agent: Endpoints down
--- PASS: TestLockCommand_TrySemaphore (6.52s)
PASS
ok  	github.com/hashicorp/consul/command/lock	15.990s
=== RUN   TestLoginCommand_noTabs
=== PAUSE TestLoginCommand_noTabs
=== RUN   TestLoginCommand
=== PAUSE TestLoginCommand
=== RUN   TestLoginCommand_k8s
=== PAUSE TestLoginCommand_k8s
=== CONT  TestLoginCommand_noTabs
=== CONT  TestLoginCommand_k8s
=== CONT  TestLoginCommand
--- PASS: TestLoginCommand_noTabs (0.02s)
WARNING: bootstrap = true: do not enable unless necessary
TestLoginCommand - 2019/12/06 06:38:25.187949 [WARN] agent: Node name "Node aa5b38ea-7b9e-b920-8fbd-ab1dce2d0142" will not be discoverable via DNS due to invalid characters. Valid characters include all alpha-numerics and dashes.
TestLoginCommand - 2019/12/06 06:38:25.191270 [DEBUG] tlsutil: Update with version 1
WARNING: bootstrap = true: do not enable unless necessary
TestLoginCommand_k8s - 2019/12/06 06:38:25.197262 [WARN] agent: Node name "Node 5fde79c7-06fb-7c9b-3991-f8c9235b07c3" will not be discoverable via DNS due to invalid characters. Valid characters include all alpha-numerics and dashes.
TestLoginCommand_k8s - 2019/12/06 06:38:25.197805 [DEBUG] tlsutil: Update with version 1
TestLoginCommand_k8s - 2019/12/06 06:38:25.202731 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
TestLoginCommand - 2019/12/06 06:38:25.205100 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
2019/12/06 06:38:27 [INFO]  raft: Initial configuration (index=1): [{Suffrage:Voter ID:aa5b38ea-7b9e-b920-8fbd-ab1dce2d0142 Address:127.0.0.1:28012}]
2019/12/06 06:38:27 [INFO]  raft: Initial configuration (index=1): [{Suffrage:Voter ID:5fde79c7-06fb-7c9b-3991-f8c9235b07c3 Address:127.0.0.1:28006}]
2019/12/06 06:38:27 [INFO]  raft: Node at 127.0.0.1:28006 [Follower] entering Follower state (Leader: "")
2019/12/06 06:38:27 [INFO]  raft: Node at 127.0.0.1:28012 [Follower] entering Follower state (Leader: "")
TestLoginCommand - 2019/12/06 06:38:27.800710 [INFO] serf: EventMemberJoin: Node aa5b38ea-7b9e-b920-8fbd-ab1dce2d0142.dc1 127.0.0.1
TestLoginCommand_k8s - 2019/12/06 06:38:27.802244 [INFO] serf: EventMemberJoin: Node 5fde79c7-06fb-7c9b-3991-f8c9235b07c3.dc1 127.0.0.1
TestLoginCommand_k8s - 2019/12/06 06:38:27.826600 [INFO] serf: EventMemberJoin: Node 5fde79c7-06fb-7c9b-3991-f8c9235b07c3 127.0.0.1
TestLoginCommand_k8s - 2019/12/06 06:38:27.827930 [INFO] consul: Adding LAN server Node 5fde79c7-06fb-7c9b-3991-f8c9235b07c3 (Addr: tcp/127.0.0.1:28006) (DC: dc1)
TestLoginCommand_k8s - 2019/12/06 06:38:27.828058 [INFO] consul: Handled member-join event for server "Node 5fde79c7-06fb-7c9b-3991-f8c9235b07c3.dc1" in area "wan"
TestLoginCommand_k8s - 2019/12/06 06:38:27.828572 [INFO] agent: Started DNS server 127.0.0.1:28001 (tcp)
2019/12/06 06:38:27 [WARN]  raft: Heartbeat timeout from "" reached, starting election
2019/12/06 06:38:27 [INFO]  raft: Node at 127.0.0.1:28012 [Candidate] entering Candidate state in term 2
TestLoginCommand_k8s - 2019/12/06 06:38:27.830350 [INFO] agent: Started DNS server 127.0.0.1:28001 (udp)
TestLoginCommand_k8s - 2019/12/06 06:38:27.833732 [INFO] agent: Started HTTP server on 127.0.0.1:28002 (tcp)
TestLoginCommand_k8s - 2019/12/06 06:38:27.833908 [INFO] agent: started state syncer
TestLoginCommand - 2019/12/06 06:38:27.834049 [INFO] serf: EventMemberJoin: Node aa5b38ea-7b9e-b920-8fbd-ab1dce2d0142 127.0.0.1
TestLoginCommand - 2019/12/06 06:38:27.835104 [INFO] consul: Adding LAN server Node aa5b38ea-7b9e-b920-8fbd-ab1dce2d0142 (Addr: tcp/127.0.0.1:28012) (DC: dc1)
TestLoginCommand - 2019/12/06 06:38:27.836339 [INFO] agent: Started DNS server 127.0.0.1:28007 (udp)
TestLoginCommand - 2019/12/06 06:38:27.836803 [INFO] consul: Handled member-join event for server "Node aa5b38ea-7b9e-b920-8fbd-ab1dce2d0142.dc1" in area "wan"
TestLoginCommand - 2019/12/06 06:38:27.839756 [INFO] agent: Started DNS server 127.0.0.1:28007 (tcp)
TestLoginCommand - 2019/12/06 06:38:27.842357 [INFO] agent: Started HTTP server on 127.0.0.1:28008 (tcp)
TestLoginCommand - 2019/12/06 06:38:27.842487 [INFO] agent: started state syncer
2019/12/06 06:38:27 [WARN]  raft: Heartbeat timeout from "" reached, starting election
2019/12/06 06:38:27 [INFO]  raft: Node at 127.0.0.1:28006 [Candidate] entering Candidate state in term 2
2019/12/06 06:38:28 [INFO]  raft: Election won. Tally: 1
2019/12/06 06:38:28 [INFO]  raft: Node at 127.0.0.1:28012 [Leader] entering Leader state
TestLoginCommand - 2019/12/06 06:38:28.522576 [INFO] consul: cluster leadership acquired
TestLoginCommand - 2019/12/06 06:38:28.523174 [INFO] consul: New leader elected: Node aa5b38ea-7b9e-b920-8fbd-ab1dce2d0142
TestLoginCommand - 2019/12/06 06:38:28.588649 [INFO] acl: initializing acls
2019/12/06 06:38:28 [INFO]  raft: Election won. Tally: 1
2019/12/06 06:38:28 [INFO]  raft: Node at 127.0.0.1:28006 [Leader] entering Leader state
TestLoginCommand_k8s - 2019/12/06 06:38:28.699425 [INFO] consul: cluster leadership acquired
TestLoginCommand_k8s - 2019/12/06 06:38:28.699843 [INFO] consul: New leader elected: Node 5fde79c7-06fb-7c9b-3991-f8c9235b07c3
TestLoginCommand - 2019/12/06 06:38:28.709341 [ERR] agent: failed to sync remote state: ACL not found
TestLoginCommand - 2019/12/06 06:38:28.918763 [ERR] agent: failed to sync remote state: ACL not found
TestLoginCommand_k8s - 2019/12/06 06:38:28.993343 [ERR] agent: failed to sync remote state: ACL not found
TestLoginCommand - 2019/12/06 06:38:29.079079 [INFO] consul: Created ACL 'global-management' policy
TestLoginCommand - 2019/12/06 06:38:29.079240 [WARN] consul: Configuring a non-UUID master token is deprecated
TestLoginCommand - 2019/12/06 06:38:29.081300 [INFO] acl: initializing acls
TestLoginCommand - 2019/12/06 06:38:29.081529 [WARN] consul: Configuring a non-UUID master token is deprecated
TestLoginCommand_k8s - 2019/12/06 06:38:29.379778 [INFO] acl: initializing acls
TestLoginCommand_k8s - 2019/12/06 06:38:29.420938 [INFO] acl: initializing acls
TestLoginCommand - 2019/12/06 06:38:29.962450 [INFO] consul: Bootstrapped ACL master token from configuration
TestLoginCommand - 2019/12/06 06:38:29.966540 [INFO] consul: Bootstrapped ACL master token from configuration
TestLoginCommand_k8s - 2019/12/06 06:38:30.132918 [INFO] consul: Created ACL 'global-management' policy
TestLoginCommand_k8s - 2019/12/06 06:38:30.133020 [WARN] consul: Configuring a non-UUID master token is deprecated
TestLoginCommand_k8s - 2019/12/06 06:38:30.134024 [INFO] consul: Created ACL 'global-management' policy
TestLoginCommand_k8s - 2019/12/06 06:38:30.134121 [WARN] consul: Configuring a non-UUID master token is deprecated
TestLoginCommand - 2019/12/06 06:38:30.278810 [INFO] consul: Created ACL anonymous token from configuration
TestLoginCommand - 2019/12/06 06:38:30.279907 [INFO] serf: EventMemberUpdate: Node aa5b38ea-7b9e-b920-8fbd-ab1dce2d0142
TestLoginCommand - 2019/12/06 06:38:30.280954 [INFO] serf: EventMemberUpdate: Node aa5b38ea-7b9e-b920-8fbd-ab1dce2d0142.dc1
TestLoginCommand_k8s - 2019/12/06 06:38:30.491163 [INFO] consul: Bootstrapped ACL master token from configuration
TestLoginCommand - 2019/12/06 06:38:30.770661 [INFO] consul: Created ACL anonymous token from configuration
TestLoginCommand - 2019/12/06 06:38:30.770741 [DEBUG] acl: transitioning out of legacy ACL mode
TestLoginCommand - 2019/12/06 06:38:30.771712 [INFO] serf: EventMemberUpdate: Node aa5b38ea-7b9e-b920-8fbd-ab1dce2d0142
TestLoginCommand - 2019/12/06 06:38:30.772361 [INFO] serf: EventMemberUpdate: Node aa5b38ea-7b9e-b920-8fbd-ab1dce2d0142.dc1
TestLoginCommand_k8s - 2019/12/06 06:38:30.953488 [INFO] consul: Created ACL anonymous token from configuration
TestLoginCommand_k8s - 2019/12/06 06:38:30.954413 [INFO] consul: Bootstrapped ACL master token from configuration
TestLoginCommand_k8s - 2019/12/06 06:38:30.954565 [DEBUG] acl: transitioning out of legacy ACL mode
TestLoginCommand_k8s - 2019/12/06 06:38:30.954755 [INFO] serf: EventMemberUpdate: Node 5fde79c7-06fb-7c9b-3991-f8c9235b07c3
TestLoginCommand_k8s - 2019/12/06 06:38:30.955398 [INFO] serf: EventMemberUpdate: Node 5fde79c7-06fb-7c9b-3991-f8c9235b07c3.dc1
TestLoginCommand_k8s - 2019/12/06 06:38:30.955707 [INFO] serf: EventMemberUpdate: Node 5fde79c7-06fb-7c9b-3991-f8c9235b07c3
TestLoginCommand_k8s - 2019/12/06 06:38:30.956425 [INFO] serf: EventMemberUpdate: Node 5fde79c7-06fb-7c9b-3991-f8c9235b07c3.dc1
TestLoginCommand - 2019/12/06 06:38:31.595990 [INFO] connect: initialized primary datacenter CA with provider "consul"
TestLoginCommand - 2019/12/06 06:38:31.596527 [DEBUG] consul: Skipping self join check for "Node aa5b38ea-7b9e-b920-8fbd-ab1dce2d0142" since the cluster is too small
TestLoginCommand - 2019/12/06 06:38:31.596630 [INFO] consul: member 'Node aa5b38ea-7b9e-b920-8fbd-ab1dce2d0142' joined, marking health alive
TestLoginCommand_k8s - 2019/12/06 06:38:31.844953 [INFO] agent: Synced node info
TestLoginCommand_k8s - 2019/12/06 06:38:31.845074 [DEBUG] agent: Node info in sync
TestLoginCommand - 2019/12/06 06:38:31.846883 [DEBUG] consul: Skipping self join check for "Node aa5b38ea-7b9e-b920-8fbd-ab1dce2d0142" since the cluster is too small
TestLoginCommand - 2019/12/06 06:38:31.847318 [DEBUG] consul: Skipping self join check for "Node aa5b38ea-7b9e-b920-8fbd-ab1dce2d0142" since the cluster is too small
=== RUN   TestLoginCommand/method_is_required
=== RUN   TestLoginCommand/token-sink-file_is_required
=== RUN   TestLoginCommand/bearer-token-file_is_required
=== RUN   TestLoginCommand/bearer-token-file_is_empty
=== RUN   TestLoginCommand/try_login_with_no_method_configured
TestLoginCommand - 2019/12/06 06:38:31.888856 [ERR] http: Request POST /v1/acl/login, error: ACL not found from=127.0.0.1:44288
TestLoginCommand - 2019/12/06 06:38:31.889851 [DEBUG] http: Request POST /v1/acl/login (1.852044ms) from=127.0.0.1:44288
TestLoginCommand - 2019/12/06 06:38:32.125405 [DEBUG] http: Request PUT /v1/acl/auth-method (202.809091ms) from=127.0.0.1:44290
=== RUN   TestLoginCommand/try_login_with_method_configured_but_no_binding_rules
TestLoginCommand - 2019/12/06 06:38:32.153534 [DEBUG] acl: updating cached auth method validator for "test"
TestLoginCommand - 2019/12/06 06:38:32.153837 [ERR] http: Request POST /v1/acl/login, error: Permission denied from=127.0.0.1:44292
TestLoginCommand - 2019/12/06 06:38:32.154557 [DEBUG] http: Request POST /v1/acl/login (1.767708ms) from=127.0.0.1:44292
TestLoginCommand_k8s - 2019/12/06 06:38:32.378309 [INFO] connect: initialized primary datacenter CA with provider "consul"
TestLoginCommand_k8s - 2019/12/06 06:38:32.380460 [DEBUG] consul: Skipping self join check for "Node 5fde79c7-06fb-7c9b-3991-f8c9235b07c3" since the cluster is too small
TestLoginCommand_k8s - 2019/12/06 06:38:32.381574 [INFO] consul: member 'Node 5fde79c7-06fb-7c9b-3991-f8c9235b07c3' joined, marking health alive
TestLoginCommand_k8s - 2019/12/06 06:38:32.385388 [DEBUG] http: Request PUT /v1/acl/auth-method (507.445569ms) from=127.0.0.1:58230
TestLoginCommand_k8s - 2019/12/06 06:38:32.397216 [DEBUG] acl: updating cached auth method validator for "k8s"
TestLoginCommand - 2019/12/06 06:38:32.400259 [DEBUG] http: Request PUT /v1/acl/binding-rule (242.344685ms) from=127.0.0.1:44290
=== RUN   TestLoginCommand/try_login_with_method_configured_and_binding_rules
TestLoginCommand - 2019/12/06 06:38:32.765871 [DEBUG] http: Request POST /v1/acl/login (356.978706ms) from=127.0.0.1:44294
TestLoginCommand - 2019/12/06 06:38:32.773359 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
TestLoginCommand_k8s - 2019/12/06 06:38:32.856795 [DEBUG] http: Request PUT /v1/acl/binding-rule (465.294914ms) from=127.0.0.1:58230
TestLoginCommand_k8s - 2019/12/06 06:38:32.858430 [DEBUG] consul: Skipping self join check for "Node 5fde79c7-06fb-7c9b-3991-f8c9235b07c3" since the cluster is too small
=== RUN   TestLoginCommand_k8s/try_login_with_method_configured_and_binding_rules
TestLoginCommand_k8s - 2019/12/06 06:38:32.859156 [DEBUG] consul: Skipping self join check for "Node 5fde79c7-06fb-7c9b-3991-f8c9235b07c3" since the cluster is too small
TestLoginCommand - 2019/12/06 06:38:32.945572 [INFO] agent: Requesting shutdown
TestLoginCommand - 2019/12/06 06:38:32.945669 [INFO] consul: shutting down server
TestLoginCommand - 2019/12/06 06:38:32.945714 [WARN] serf: Shutdown without a Leave
TestLoginCommand - 2019/12/06 06:38:33.035291 [WARN] serf: Shutdown without a Leave
TestLoginCommand - 2019/12/06 06:38:33.110264 [INFO] manager: shutting down
TestLoginCommand - 2019/12/06 06:38:33.111038 [INFO] agent: consul server down
TestLoginCommand - 2019/12/06 06:38:33.111101 [INFO] agent: shutdown complete
TestLoginCommand - 2019/12/06 06:38:33.111159 [INFO] agent: Stopping DNS server 127.0.0.1:28007 (tcp)
TestLoginCommand - 2019/12/06 06:38:33.111313 [INFO] agent: Stopping DNS server 127.0.0.1:28007 (udp)
TestLoginCommand - 2019/12/06 06:38:33.111456 [INFO] agent: Stopping HTTP server 127.0.0.1:28008 (tcp)
TestLoginCommand - 2019/12/06 06:38:33.112550 [INFO] agent: Waiting for endpoints to shut down
TestLoginCommand - 2019/12/06 06:38:33.112831 [INFO] agent: Endpoints down
--- PASS: TestLoginCommand (8.05s)
    --- PASS: TestLoginCommand/method_is_required (0.00s)
    --- PASS: TestLoginCommand/token-sink-file_is_required (0.00s)
    --- PASS: TestLoginCommand/bearer-token-file_is_required (0.00s)
    --- PASS: TestLoginCommand/bearer-token-file_is_empty (0.00s)
    --- PASS: TestLoginCommand/try_login_with_no_method_configured (0.01s)
    --- PASS: TestLoginCommand/try_login_with_method_configured_but_no_binding_rules (0.03s)
    --- PASS: TestLoginCommand/try_login_with_method_configured_and_binding_rules (0.54s)
TestLoginCommand_k8s - 2019/12/06 06:38:33.280334 [DEBUG] http: Request POST /v1/acl/login (393.533897ms) from=127.0.0.1:58240
TestLoginCommand_k8s - 2019/12/06 06:38:33.281856 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
TestLoginCommand_k8s - 2019/12/06 06:38:33.391932 [INFO] agent: Requesting shutdown
TestLoginCommand_k8s - 2019/12/06 06:38:33.392022 [INFO] consul: shutting down server
TestLoginCommand_k8s - 2019/12/06 06:38:33.392072 [WARN] serf: Shutdown without a Leave
TestLoginCommand_k8s - 2019/12/06 06:38:33.576885 [WARN] serf: Shutdown without a Leave
TestLoginCommand_k8s - 2019/12/06 06:38:33.660286 [INFO] manager: shutting down
TestLoginCommand_k8s - 2019/12/06 06:38:33.661076 [INFO] agent: consul server down
TestLoginCommand_k8s - 2019/12/06 06:38:33.661139 [INFO] agent: shutdown complete
TestLoginCommand_k8s - 2019/12/06 06:38:33.661201 [INFO] agent: Stopping DNS server 127.0.0.1:28001 (tcp)
TestLoginCommand_k8s - 2019/12/06 06:38:33.661365 [INFO] agent: Stopping DNS server 127.0.0.1:28001 (udp)
TestLoginCommand_k8s - 2019/12/06 06:38:33.661550 [INFO] agent: Stopping HTTP server 127.0.0.1:28002 (tcp)
TestLoginCommand_k8s - 2019/12/06 06:38:33.662195 [INFO] agent: Waiting for endpoints to shut down
TestLoginCommand_k8s - 2019/12/06 06:38:33.662313 [INFO] agent: Endpoints down
--- PASS: TestLoginCommand_k8s (8.59s)
    --- PASS: TestLoginCommand_k8s/try_login_with_method_configured_and_binding_rules (0.53s)
PASS
ok  	github.com/hashicorp/consul/command/login	8.887s
=== RUN   TestLogout_noTabs
=== PAUSE TestLogout_noTabs
=== RUN   TestLogoutCommand
=== PAUSE TestLogoutCommand
=== RUN   TestLogoutCommand_k8s
=== PAUSE TestLogoutCommand_k8s
=== CONT  TestLogout_noTabs
=== CONT  TestLogoutCommand
--- PASS: TestLogout_noTabs (0.00s)
=== CONT  TestLogoutCommand_k8s
WARNING: bootstrap = true: do not enable unless necessary
TestLogoutCommand - 2019/12/06 06:38:34.379046 [WARN] agent: Node name "Node 4e972c05-4288-aa3e-a614-b27175361fdf" will not be discoverable via DNS due to invalid characters. Valid characters include all alpha-numerics and dashes.
TestLogoutCommand - 2019/12/06 06:38:34.380207 [DEBUG] tlsutil: Update with version 1
TestLogoutCommand - 2019/12/06 06:38:34.388071 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
WARNING: bootstrap = true: do not enable unless necessary
TestLogoutCommand_k8s - 2019/12/06 06:38:34.414599 [WARN] agent: Node name "Node b0b2713f-855f-59cb-b908-116b9f3736f0" will not be discoverable via DNS due to invalid characters. Valid characters include all alpha-numerics and dashes.
TestLogoutCommand_k8s - 2019/12/06 06:38:34.415045 [DEBUG] tlsutil: Update with version 1
TestLogoutCommand_k8s - 2019/12/06 06:38:34.423776 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
2019/12/06 06:38:35 [INFO]  raft: Initial configuration (index=1): [{Suffrage:Voter ID:4e972c05-4288-aa3e-a614-b27175361fdf Address:127.0.0.1:14506}]
2019/12/06 06:38:35 [INFO]  raft: Node at 127.0.0.1:14506 [Follower] entering Follower state (Leader: "")
TestLogoutCommand - 2019/12/06 06:38:35.518065 [INFO] serf: EventMemberJoin: Node 4e972c05-4288-aa3e-a614-b27175361fdf.dc1 127.0.0.1
TestLogoutCommand - 2019/12/06 06:38:35.528433 [INFO] serf: EventMemberJoin: Node 4e972c05-4288-aa3e-a614-b27175361fdf 127.0.0.1
TestLogoutCommand - 2019/12/06 06:38:35.533316 [INFO] agent: Started DNS server 127.0.0.1:14501 (udp)
TestLogoutCommand - 2019/12/06 06:38:35.534545 [INFO] consul: Adding LAN server Node 4e972c05-4288-aa3e-a614-b27175361fdf (Addr: tcp/127.0.0.1:14506) (DC: dc1)
TestLogoutCommand - 2019/12/06 06:38:35.534923 [INFO] consul: Handled member-join event for server "Node 4e972c05-4288-aa3e-a614-b27175361fdf.dc1" in area "wan"
TestLogoutCommand - 2019/12/06 06:38:35.535602 [INFO] agent: Started DNS server 127.0.0.1:14501 (tcp)
TestLogoutCommand - 2019/12/06 06:38:35.538829 [INFO] agent: Started HTTP server on 127.0.0.1:14502 (tcp)
TestLogoutCommand - 2019/12/06 06:38:35.539168 [INFO] agent: started state syncer
2019/12/06 06:38:35 [WARN]  raft: Heartbeat timeout from "" reached, starting election
2019/12/06 06:38:35 [INFO]  raft: Node at 127.0.0.1:14506 [Candidate] entering Candidate state in term 2
2019/12/06 06:38:35 [INFO]  raft: Initial configuration (index=1): [{Suffrage:Voter ID:b0b2713f-855f-59cb-b908-116b9f3736f0 Address:127.0.0.1:14512}]
2019/12/06 06:38:35 [INFO]  raft: Node at 127.0.0.1:14512 [Follower] entering Follower state (Leader: "")
TestLogoutCommand_k8s - 2019/12/06 06:38:35.618471 [INFO] serf: EventMemberJoin: Node b0b2713f-855f-59cb-b908-116b9f3736f0.dc1 127.0.0.1
TestLogoutCommand_k8s - 2019/12/06 06:38:35.645931 [INFO] serf: EventMemberJoin: Node b0b2713f-855f-59cb-b908-116b9f3736f0 127.0.0.1
TestLogoutCommand_k8s - 2019/12/06 06:38:35.652101 [INFO] agent: Started DNS server 127.0.0.1:14507 (udp)
TestLogoutCommand_k8s - 2019/12/06 06:38:35.654510 [INFO] consul: Handled member-join event for server "Node b0b2713f-855f-59cb-b908-116b9f3736f0.dc1" in area "wan"
TestLogoutCommand_k8s - 2019/12/06 06:38:35.654739 [INFO] consul: Adding LAN server Node b0b2713f-855f-59cb-b908-116b9f3736f0 (Addr: tcp/127.0.0.1:14512) (DC: dc1)
TestLogoutCommand_k8s - 2019/12/06 06:38:35.656143 [INFO] agent: Started DNS server 127.0.0.1:14507 (tcp)
TestLogoutCommand_k8s - 2019/12/06 06:38:35.660223 [INFO] agent: Started HTTP server on 127.0.0.1:14508 (tcp)
TestLogoutCommand_k8s - 2019/12/06 06:38:35.660389 [INFO] agent: started state syncer
2019/12/06 06:38:35 [WARN]  raft: Heartbeat timeout from "" reached, starting election
2019/12/06 06:38:35 [INFO]  raft: Node at 127.0.0.1:14512 [Candidate] entering Candidate state in term 2
2019/12/06 06:38:36 [INFO]  raft: Election won. Tally: 1
2019/12/06 06:38:36 [INFO]  raft: Node at 127.0.0.1:14506 [Leader] entering Leader state
TestLogoutCommand - 2019/12/06 06:38:36.670239 [INFO] consul: cluster leadership acquired
TestLogoutCommand - 2019/12/06 06:38:36.670933 [INFO] consul: New leader elected: Node 4e972c05-4288-aa3e-a614-b27175361fdf
TestLogoutCommand - 2019/12/06 06:38:36.795086 [ERR] agent: failed to sync remote state: ACL not found
2019/12/06 06:38:36 [INFO]  raft: Election won. Tally: 1
2019/12/06 06:38:36 [INFO]  raft: Node at 127.0.0.1:14512 [Leader] entering Leader state
TestLogoutCommand_k8s - 2019/12/06 06:38:36.920805 [INFO] consul: cluster leadership acquired
TestLogoutCommand_k8s - 2019/12/06 06:38:36.921285 [INFO] consul: New leader elected: Node b0b2713f-855f-59cb-b908-116b9f3736f0
TestLogoutCommand - 2019/12/06 06:38:37.087075 [INFO] acl: initializing acls
TestLogoutCommand_k8s - 2019/12/06 06:38:37.140004 [ERR] agent: failed to sync remote state: ACL not found
TestLogoutCommand_k8s - 2019/12/06 06:38:37.152764 [ERR] agent: failed to sync remote state: ACL not found
TestLogoutCommand_k8s - 2019/12/06 06:38:37.208437 [INFO] acl: initializing acls
TestLogoutCommand - 2019/12/06 06:38:37.606527 [ERR] agent: failed to sync remote state: ACL not found
TestLogoutCommand_k8s - 2019/12/06 06:38:37.861372 [INFO] consul: Created ACL 'global-management' policy
TestLogoutCommand_k8s - 2019/12/06 06:38:37.861472 [WARN] consul: Configuring a non-UUID master token is deprecated
TestLogoutCommand_k8s - 2019/12/06 06:38:37.863162 [INFO] acl: initializing acls
TestLogoutCommand_k8s - 2019/12/06 06:38:37.863295 [WARN] consul: Configuring a non-UUID master token is deprecated
TestLogoutCommand - 2019/12/06 06:38:38.025019 [INFO] acl: initializing acls
TestLogoutCommand - 2019/12/06 06:38:38.025410 [INFO] consul: Created ACL 'global-management' policy
TestLogoutCommand - 2019/12/06 06:38:38.025470 [WARN] consul: Configuring a non-UUID master token is deprecated
TestLogoutCommand - 2019/12/06 06:38:38.494922 [INFO] consul: Bootstrapped ACL master token from configuration
TestLogoutCommand_k8s - 2019/12/06 06:38:38.495058 [INFO] consul: Bootstrapped ACL master token from configuration
TestLogoutCommand_k8s - 2019/12/06 06:38:38.495211 [INFO] consul: Bootstrapped ACL master token from configuration
TestLogoutCommand - 2019/12/06 06:38:38.495329 [INFO] consul: Created ACL 'global-management' policy
TestLogoutCommand - 2019/12/06 06:38:38.495382 [WARN] consul: Configuring a non-UUID master token is deprecated
TestLogoutCommand - 2019/12/06 06:38:38.661705 [INFO] consul: Created ACL anonymous token from configuration
TestLogoutCommand - 2019/12/06 06:38:38.662170 [DEBUG] acl: transitioning out of legacy ACL mode
TestLogoutCommand - 2019/12/06 06:38:38.663390 [INFO] serf: EventMemberUpdate: Node 4e972c05-4288-aa3e-a614-b27175361fdf
TestLogoutCommand - 2019/12/06 06:38:38.665850 [INFO] serf: EventMemberUpdate: Node 4e972c05-4288-aa3e-a614-b27175361fdf.dc1
TestLogoutCommand_k8s - 2019/12/06 06:38:38.937518 [INFO] consul: Created ACL anonymous token from configuration
TestLogoutCommand_k8s - 2019/12/06 06:38:38.938629 [INFO] serf: EventMemberUpdate: Node b0b2713f-855f-59cb-b908-116b9f3736f0
TestLogoutCommand_k8s - 2019/12/06 06:38:38.939759 [INFO] consul: Created ACL anonymous token from configuration
TestLogoutCommand_k8s - 2019/12/06 06:38:38.939849 [DEBUG] acl: transitioning out of legacy ACL mode
TestLogoutCommand_k8s - 2019/12/06 06:38:38.940261 [INFO] serf: EventMemberUpdate: Node b0b2713f-855f-59cb-b908-116b9f3736f0.dc1
TestLogoutCommand_k8s - 2019/12/06 06:38:38.940634 [INFO] serf: EventMemberUpdate: Node b0b2713f-855f-59cb-b908-116b9f3736f0
TestLogoutCommand_k8s - 2019/12/06 06:38:38.941343 [INFO] serf: EventMemberUpdate: Node b0b2713f-855f-59cb-b908-116b9f3736f0.dc1
TestLogoutCommand - 2019/12/06 06:38:39.019955 [INFO] consul: Created ACL anonymous token from configuration
TestLogoutCommand - 2019/12/06 06:38:39.020822 [INFO] serf: EventMemberUpdate: Node 4e972c05-4288-aa3e-a614-b27175361fdf
TestLogoutCommand - 2019/12/06 06:38:39.021538 [INFO] serf: EventMemberUpdate: Node 4e972c05-4288-aa3e-a614-b27175361fdf.dc1
TestLogoutCommand_k8s - 2019/12/06 06:38:40.452887 [INFO] connect: initialized primary datacenter CA with provider "consul"
TestLogoutCommand_k8s - 2019/12/06 06:38:40.453395 [DEBUG] consul: Skipping self join check for "Node b0b2713f-855f-59cb-b908-116b9f3736f0" since the cluster is too small
TestLogoutCommand_k8s - 2019/12/06 06:38:40.453503 [INFO] consul: member 'Node b0b2713f-855f-59cb-b908-116b9f3736f0' joined, marking health alive
TestLogoutCommand - 2019/12/06 06:38:40.544673 [INFO] connect: initialized primary datacenter CA with provider "consul"
TestLogoutCommand - 2019/12/06 06:38:40.545159 [DEBUG] consul: Skipping self join check for "Node 4e972c05-4288-aa3e-a614-b27175361fdf" since the cluster is too small
TestLogoutCommand - 2019/12/06 06:38:40.545256 [INFO] consul: member 'Node 4e972c05-4288-aa3e-a614-b27175361fdf' joined, marking health alive
TestLogoutCommand_k8s - 2019/12/06 06:38:40.771433 [DEBUG] consul: Skipping self join check for "Node b0b2713f-855f-59cb-b908-116b9f3736f0" since the cluster is too small
TestLogoutCommand_k8s - 2019/12/06 06:38:40.772058 [DEBUG] consul: Skipping self join check for "Node b0b2713f-855f-59cb-b908-116b9f3736f0" since the cluster is too small
=== RUN   TestLogoutCommand_k8s/no_token_specified
TestLogoutCommand_k8s - 2019/12/06 06:38:40.792515 [ERR] http: Request POST /v1/acl/logout, error: ACL not found from=127.0.0.1:37154
TestLogoutCommand_k8s - 2019/12/06 06:38:40.808169 [DEBUG] http: Request POST /v1/acl/logout (15.6337ms) from=127.0.0.1:37154
=== RUN   TestLogoutCommand_k8s/logout_of_deleted_token
TestLogoutCommand - 2019/12/06 06:38:40.831089 [DEBUG] consul: Skipping self join check for "Node 4e972c05-4288-aa3e-a614-b27175361fdf" since the cluster is too small
TestLogoutCommand - 2019/12/06 06:38:40.831998 [DEBUG] consul: Skipping self join check for "Node 4e972c05-4288-aa3e-a614-b27175361fdf" since the cluster is too small
TestLogoutCommand_k8s - 2019/12/06 06:38:40.844037 [ERR] http: Request POST /v1/acl/logout, error: ACL not found from=127.0.0.1:37156
TestLogoutCommand_k8s - 2019/12/06 06:38:40.845492 [DEBUG] http: Request POST /v1/acl/logout (1.82371ms) from=127.0.0.1:37156
=== RUN   TestLogoutCommand/no_token_specified
TestLogoutCommand - 2019/12/06 06:38:40.895340 [ERR] http: Request POST /v1/acl/logout, error: ACL not found from=127.0.0.1:41024
TestLogoutCommand - 2019/12/06 06:38:40.896366 [DEBUG] http: Request POST /v1/acl/logout (1.042692ms) from=127.0.0.1:41024
=== RUN   TestLogoutCommand/logout_of_deleted_token
TestLogoutCommand - 2019/12/06 06:38:40.902088 [ERR] http: Request POST /v1/acl/logout, error: ACL not found from=127.0.0.1:41028
TestLogoutCommand - 2019/12/06 06:38:40.903581 [DEBUG] http: Request POST /v1/acl/logout (1.654372ms) from=127.0.0.1:41028
TestLogoutCommand - 2019/12/06 06:38:41.067046 [DEBUG] http: Request PUT /v1/acl/token (157.430693ms) from=127.0.0.1:41030
TestLogoutCommand_k8s - 2019/12/06 06:38:41.067883 [DEBUG] http: Request PUT /v1/acl/token (170.348329ms) from=127.0.0.1:37160
=== RUN   TestLogoutCommand_k8s/logout_of_ordinary_token
TestLogoutCommand_k8s - 2019/12/06 06:38:41.076292 [ERR] http: Request POST /v1/acl/logout, error: Permission denied from=127.0.0.1:37166
TestLogoutCommand_k8s - 2019/12/06 06:38:41.077986 [DEBUG] http: Request POST /v1/acl/logout (1.856377ms) from=127.0.0.1:37166
=== RUN   TestLogoutCommand/logout_of_ordinary_token
TestLogoutCommand - 2019/12/06 06:38:41.098878 [ERR] http: Request POST /v1/acl/logout, error: Permission denied from=127.0.0.1:41034
TestLogoutCommand - 2019/12/06 06:38:41.122734 [DEBUG] http: Request POST /v1/acl/logout (20.503481ms) from=127.0.0.1:41034
TestLogoutCommand_k8s - 2019/12/06 06:38:41.377480 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
TestLogoutCommand_k8s - 2019/12/06 06:38:41.380099 [DEBUG] http: Request PUT /v1/acl/auth-method (285.855038ms) from=127.0.0.1:37160
TestLogoutCommand_k8s - 2019/12/06 06:38:41.393558 [DEBUG] acl: updating cached auth method validator for "k8s"
TestLogoutCommand - 2019/12/06 06:38:41.469162 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
TestLogoutCommand - 2019/12/06 06:38:41.472166 [DEBUG] http: Request PUT /v1/acl/auth-method (319.454493ms) from=127.0.0.1:41030
TestLogoutCommand - 2019/12/06 06:38:41.476146 [DEBUG] acl: updating cached auth method validator for "test"
TestLogoutCommand_k8s - 2019/12/06 06:38:41.712073 [DEBUG] http: Request PUT /v1/acl/binding-rule (326.879001ms) from=127.0.0.1:37160
TestLogoutCommand - 2019/12/06 06:38:41.769970 [DEBUG] http: Request PUT /v1/acl/binding-rule (294.812914ms) from=127.0.0.1:41030
TestLogoutCommand - 2019/12/06 06:38:42.037392 [DEBUG] http: Request POST /v1/acl/login (259.06941ms) from=127.0.0.1:41030
=== RUN   TestLogoutCommand/logout_of_login_token
TestLogoutCommand_k8s - 2019/12/06 06:38:42.120707 [DEBUG] http: Request POST /v1/acl/login (405.618847ms) from=127.0.0.1:37160
=== RUN   TestLogoutCommand_k8s/logout_of_login_token
TestLogoutCommand - 2019/12/06 06:38:42.353657 [DEBUG] http: Request POST /v1/acl/logout (310.334946ms) from=127.0.0.1:41038
TestLogoutCommand - 2019/12/06 06:38:42.355261 [INFO] agent: Requesting shutdown
TestLogoutCommand - 2019/12/06 06:38:42.355338 [INFO] consul: shutting down server
TestLogoutCommand - 2019/12/06 06:38:42.355385 [WARN] serf: Shutdown without a Leave
TestLogoutCommand_k8s - 2019/12/06 06:38:42.412036 [DEBUG] http: Request POST /v1/acl/logout (285.413361ms) from=127.0.0.1:37174
TestLogoutCommand_k8s - 2019/12/06 06:38:42.414167 [INFO] agent: Requesting shutdown
TestLogoutCommand_k8s - 2019/12/06 06:38:42.414312 [INFO] consul: shutting down server
TestLogoutCommand_k8s - 2019/12/06 06:38:42.414360 [WARN] serf: Shutdown without a Leave
TestLogoutCommand - 2019/12/06 06:38:42.468673 [WARN] serf: Shutdown without a Leave
TestLogoutCommand_k8s - 2019/12/06 06:38:42.535387 [WARN] serf: Shutdown without a Leave
TestLogoutCommand - 2019/12/06 06:38:42.538161 [INFO] manager: shutting down
TestLogoutCommand - 2019/12/06 06:38:42.538907 [INFO] agent: consul server down
TestLogoutCommand - 2019/12/06 06:38:42.538967 [INFO] agent: shutdown complete
TestLogoutCommand - 2019/12/06 06:38:42.539025 [INFO] agent: Stopping DNS server 127.0.0.1:14501 (tcp)
TestLogoutCommand - 2019/12/06 06:38:42.539155 [INFO] agent: Stopping DNS server 127.0.0.1:14501 (udp)
TestLogoutCommand - 2019/12/06 06:38:42.539373 [INFO] agent: Stopping HTTP server 127.0.0.1:14502 (tcp)
TestLogoutCommand - 2019/12/06 06:38:42.540356 [INFO] agent: Waiting for endpoints to shut down
TestLogoutCommand - 2019/12/06 06:38:42.540496 [INFO] agent: Endpoints down
--- PASS: TestLogoutCommand (8.27s)
    --- PASS: TestLogoutCommand/no_token_specified (0.05s)
    --- PASS: TestLogoutCommand/logout_of_deleted_token (0.01s)
    --- PASS: TestLogoutCommand/logout_of_ordinary_token (0.04s)
    --- PASS: TestLogoutCommand/logout_of_login_token (0.32s)
TestLogoutCommand_k8s - 2019/12/06 06:38:42.593832 [INFO] manager: shutting down
TestLogoutCommand_k8s - 2019/12/06 06:38:42.594611 [INFO] agent: consul server down
TestLogoutCommand_k8s - 2019/12/06 06:38:42.594672 [INFO] agent: shutdown complete
TestLogoutCommand_k8s - 2019/12/06 06:38:42.594754 [INFO] agent: Stopping DNS server 127.0.0.1:14507 (tcp)
TestLogoutCommand_k8s - 2019/12/06 06:38:42.594909 [INFO] agent: Stopping DNS server 127.0.0.1:14507 (udp)
TestLogoutCommand_k8s - 2019/12/06 06:38:42.595075 [INFO] agent: Stopping HTTP server 127.0.0.1:14508 (tcp)
TestLogoutCommand_k8s - 2019/12/06 06:38:42.596055 [INFO] agent: Waiting for endpoints to shut down
TestLogoutCommand_k8s - 2019/12/06 06:38:42.596221 [INFO] agent: Endpoints down
--- PASS: TestLogoutCommand_k8s (8.33s)
    --- PASS: TestLogoutCommand_k8s/no_token_specified (0.03s)
    --- PASS: TestLogoutCommand_k8s/logout_of_deleted_token (0.04s)
    --- PASS: TestLogoutCommand_k8s/logout_of_ordinary_token (0.01s)
    --- PASS: TestLogoutCommand_k8s/logout_of_login_token (0.29s)
PASS
ok  	github.com/hashicorp/consul/command/logout	8.783s
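For reference, the logout tests above drive the agent's POST /v1/acl/logout endpoint (visible in the request lines, with "ACL not found" and "Permission denied" errors for missing or non-login tokens). The Go sketch below is illustrative only, assuming a local agent on 127.0.0.1:8500 and the X-Consul-Token header carrying the token to log out; it is not taken from the test code.

    // Illustrative sketch of the POST /v1/acl/logout call exercised above.
    // Assumptions: agent address 127.0.0.1:8500, token passed via X-Consul-Token.
    package main

    import (
        "fmt"
        "net/http"
    )

    func main() {
        req, err := http.NewRequest("POST", "http://127.0.0.1:8500/v1/acl/logout", nil)
        if err != nil {
            panic(err)
        }
        req.Header.Set("X-Consul-Token", "<login-token>") // placeholder token, not from the log
        resp, err := http.DefaultClient.Do(req)
        if err != nil {
            panic(err)
        }
        defer resp.Body.Close()
        // The tests above expect errors such as "ACL not found" or "Permission denied"
        // unless the token was produced by /v1/acl/login.
        fmt.Println("status:", resp.Status)
    }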
=== RUN   TestMaintCommand_noTabs
=== PAUSE TestMaintCommand_noTabs
=== RUN   TestMaintCommand_ConflictingArgs
=== PAUSE TestMaintCommand_ConflictingArgs
=== RUN   TestMaintCommand_NoArgs
=== PAUSE TestMaintCommand_NoArgs
=== RUN   TestMaintCommand_EnableNodeMaintenance
=== PAUSE TestMaintCommand_EnableNodeMaintenance
=== RUN   TestMaintCommand_DisableNodeMaintenance
=== PAUSE TestMaintCommand_DisableNodeMaintenance
=== RUN   TestMaintCommand_EnableServiceMaintenance
=== PAUSE TestMaintCommand_EnableServiceMaintenance
=== RUN   TestMaintCommand_DisableServiceMaintenance
=== PAUSE TestMaintCommand_DisableServiceMaintenance
=== RUN   TestMaintCommand_ServiceMaintenance_NoService
=== PAUSE TestMaintCommand_ServiceMaintenance_NoService
=== CONT  TestMaintCommand_noTabs
=== CONT  TestMaintCommand_EnableServiceMaintenance
=== CONT  TestMaintCommand_ServiceMaintenance_NoService
=== CONT  TestMaintCommand_DisableServiceMaintenance
--- PASS: TestMaintCommand_noTabs (0.02s)
=== CONT  TestMaintCommand_DisableNodeMaintenance
WARNING: bootstrap = true: do not enable unless necessary
TestMaintCommand_EnableServiceMaintenance - 2019/12/06 06:38:48.593349 [WARN] agent: Node name "Node f0f760f2-40cd-d543-a1d3-87f1e29c9fff" will not be discoverable via DNS due to invalid characters. Valid characters include all alpha-numerics and dashes.
TestMaintCommand_EnableServiceMaintenance - 2019/12/06 06:38:48.594290 [DEBUG] tlsutil: Update with version 1
TestMaintCommand_EnableServiceMaintenance - 2019/12/06 06:38:48.614632 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
WARNING: bootstrap = true: do not enable unless necessary
TestMaintCommand_ServiceMaintenance_NoService - 2019/12/06 06:38:48.620590 [WARN] agent: Node name "Node 50e729ad-dc43-173d-5046-52dc87e2a681" will not be discoverable via DNS due to invalid characters. Valid characters include all alpha-numerics and dashes.
TestMaintCommand_ServiceMaintenance_NoService - 2019/12/06 06:38:48.621143 [DEBUG] tlsutil: Update with version 1
TestMaintCommand_ServiceMaintenance_NoService - 2019/12/06 06:38:48.629407 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
WARNING: bootstrap = true: do not enable unless necessary
TestMaintCommand_DisableNodeMaintenance - 2019/12/06 06:38:48.667661 [WARN] agent: Node name "Node 7c68d64d-f1cc-9e17-3114-c2fd3c9566b2" will not be discoverable via DNS due to invalid characters. Valid characters include all alpha-numerics and dashes.
TestMaintCommand_DisableNodeMaintenance - 2019/12/06 06:38:48.668219 [DEBUG] tlsutil: Update with version 1
WARNING: bootstrap = true: do not enable unless necessary
TestMaintCommand_DisableServiceMaintenance - 2019/12/06 06:38:48.680045 [WARN] agent: Node name "Node c40cfe27-156e-d215-e1f0-86a0ab650077" will not be discoverable via DNS due to invalid characters. Valid characters include all alpha-numerics and dashes.
TestMaintCommand_DisableServiceMaintenance - 2019/12/06 06:38:48.680695 [DEBUG] tlsutil: Update with version 1
TestMaintCommand_DisableNodeMaintenance - 2019/12/06 06:38:48.681782 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
TestMaintCommand_DisableServiceMaintenance - 2019/12/06 06:38:48.687087 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
2019/12/06 06:38:49 [INFO]  raft: Initial configuration (index=1): [{Suffrage:Voter ID:f0f760f2-40cd-d543-a1d3-87f1e29c9fff Address:127.0.0.1:25006}]
2019/12/06 06:38:49 [INFO]  raft: Node at 127.0.0.1:25006 [Follower] entering Follower state (Leader: "")
TestMaintCommand_EnableServiceMaintenance - 2019/12/06 06:38:49.773832 [INFO] serf: EventMemberJoin: Node f0f760f2-40cd-d543-a1d3-87f1e29c9fff.dc1 127.0.0.1
TestMaintCommand_EnableServiceMaintenance - 2019/12/06 06:38:49.777628 [INFO] serf: EventMemberJoin: Node f0f760f2-40cd-d543-a1d3-87f1e29c9fff 127.0.0.1
TestMaintCommand_EnableServiceMaintenance - 2019/12/06 06:38:49.779284 [INFO] consul: Adding LAN server Node f0f760f2-40cd-d543-a1d3-87f1e29c9fff (Addr: tcp/127.0.0.1:25006) (DC: dc1)
TestMaintCommand_EnableServiceMaintenance - 2019/12/06 06:38:49.779998 [INFO] consul: Handled member-join event for server "Node f0f760f2-40cd-d543-a1d3-87f1e29c9fff.dc1" in area "wan"
TestMaintCommand_EnableServiceMaintenance - 2019/12/06 06:38:49.780131 [INFO] agent: Started DNS server 127.0.0.1:25001 (udp)
TestMaintCommand_EnableServiceMaintenance - 2019/12/06 06:38:49.780527 [INFO] agent: Started DNS server 127.0.0.1:25001 (tcp)
TestMaintCommand_EnableServiceMaintenance - 2019/12/06 06:38:49.783355 [INFO] agent: Started HTTP server on 127.0.0.1:25002 (tcp)
TestMaintCommand_EnableServiceMaintenance - 2019/12/06 06:38:49.783495 [INFO] agent: started state syncer
2019/12/06 06:38:49 [WARN]  raft: Heartbeat timeout from "" reached, starting election
2019/12/06 06:38:49 [INFO]  raft: Node at 127.0.0.1:25006 [Candidate] entering Candidate state in term 2
2019/12/06 06:38:50 [INFO]  raft: Initial configuration (index=1): [{Suffrage:Voter ID:50e729ad-dc43-173d-5046-52dc87e2a681 Address:127.0.0.1:25012}]
2019/12/06 06:38:50 [INFO]  raft: Initial configuration (index=1): [{Suffrage:Voter ID:c40cfe27-156e-d215-e1f0-86a0ab650077 Address:127.0.0.1:25018}]
2019/12/06 06:38:50 [INFO]  raft: Initial configuration (index=1): [{Suffrage:Voter ID:7c68d64d-f1cc-9e17-3114-c2fd3c9566b2 Address:127.0.0.1:25024}]
TestMaintCommand_ServiceMaintenance_NoService - 2019/12/06 06:38:50.106237 [INFO] serf: EventMemberJoin: Node 50e729ad-dc43-173d-5046-52dc87e2a681.dc1 127.0.0.1
TestMaintCommand_DisableServiceMaintenance - 2019/12/06 06:38:50.107235 [INFO] serf: EventMemberJoin: Node c40cfe27-156e-d215-e1f0-86a0ab650077.dc1 127.0.0.1
2019/12/06 06:38:50 [INFO]  raft: Node at 127.0.0.1:25018 [Follower] entering Follower state (Leader: "")
2019/12/06 06:38:50 [INFO]  raft: Node at 127.0.0.1:25012 [Follower] entering Follower state (Leader: "")
2019/12/06 06:38:50 [INFO]  raft: Node at 127.0.0.1:25024 [Follower] entering Follower state (Leader: "")
TestMaintCommand_DisableNodeMaintenance - 2019/12/06 06:38:50.111869 [INFO] serf: EventMemberJoin: Node 7c68d64d-f1cc-9e17-3114-c2fd3c9566b2.dc1 127.0.0.1
TestMaintCommand_DisableServiceMaintenance - 2019/12/06 06:38:50.141927 [INFO] serf: EventMemberJoin: Node c40cfe27-156e-d215-e1f0-86a0ab650077 127.0.0.1
TestMaintCommand_ServiceMaintenance_NoService - 2019/12/06 06:38:50.142364 [INFO] serf: EventMemberJoin: Node 50e729ad-dc43-173d-5046-52dc87e2a681 127.0.0.1
TestMaintCommand_DisableNodeMaintenance - 2019/12/06 06:38:50.143203 [INFO] serf: EventMemberJoin: Node 7c68d64d-f1cc-9e17-3114-c2fd3c9566b2 127.0.0.1
TestMaintCommand_DisableServiceMaintenance - 2019/12/06 06:38:50.143312 [INFO] agent: Started DNS server 127.0.0.1:25013 (udp)
TestMaintCommand_ServiceMaintenance_NoService - 2019/12/06 06:38:50.143659 [INFO] consul: Adding LAN server Node 50e729ad-dc43-173d-5046-52dc87e2a681 (Addr: tcp/127.0.0.1:25012) (DC: dc1)
TestMaintCommand_DisableServiceMaintenance - 2019/12/06 06:38:50.143711 [INFO] consul: Adding LAN server Node c40cfe27-156e-d215-e1f0-86a0ab650077 (Addr: tcp/127.0.0.1:25018) (DC: dc1)
TestMaintCommand_ServiceMaintenance_NoService - 2019/12/06 06:38:50.143863 [INFO] consul: Handled member-join event for server "Node 50e729ad-dc43-173d-5046-52dc87e2a681.dc1" in area "wan"
TestMaintCommand_DisableServiceMaintenance - 2019/12/06 06:38:50.143880 [INFO] consul: Handled member-join event for server "Node c40cfe27-156e-d215-e1f0-86a0ab650077.dc1" in area "wan"
TestMaintCommand_DisableServiceMaintenance - 2019/12/06 06:38:50.145092 [INFO] agent: Started DNS server 127.0.0.1:25013 (tcp)
TestMaintCommand_DisableNodeMaintenance - 2019/12/06 06:38:50.145828 [INFO] consul: Adding LAN server Node 7c68d64d-f1cc-9e17-3114-c2fd3c9566b2 (Addr: tcp/127.0.0.1:25024) (DC: dc1)
TestMaintCommand_DisableNodeMaintenance - 2019/12/06 06:38:50.146030 [INFO] consul: Handled member-join event for server "Node 7c68d64d-f1cc-9e17-3114-c2fd3c9566b2.dc1" in area "wan"
TestMaintCommand_ServiceMaintenance_NoService - 2019/12/06 06:38:50.145197 [INFO] agent: Started DNS server 127.0.0.1:25007 (udp)
TestMaintCommand_ServiceMaintenance_NoService - 2019/12/06 06:38:50.149680 [INFO] agent: Started DNS server 127.0.0.1:25007 (tcp)
TestMaintCommand_ServiceMaintenance_NoService - 2019/12/06 06:38:50.157912 [INFO] agent: Started HTTP server on 127.0.0.1:25008 (tcp)
TestMaintCommand_ServiceMaintenance_NoService - 2019/12/06 06:38:50.158060 [INFO] agent: started state syncer
TestMaintCommand_DisableNodeMaintenance - 2019/12/06 06:38:50.161607 [INFO] agent: Started DNS server 127.0.0.1:25019 (tcp)
TestMaintCommand_DisableServiceMaintenance - 2019/12/06 06:38:50.162695 [INFO] agent: Started HTTP server on 127.0.0.1:25014 (tcp)
TestMaintCommand_DisableServiceMaintenance - 2019/12/06 06:38:50.162821 [INFO] agent: started state syncer
TestMaintCommand_DisableNodeMaintenance - 2019/12/06 06:38:50.165279 [INFO] agent: Started DNS server 127.0.0.1:25019 (udp)
2019/12/06 06:38:50 [WARN]  raft: Heartbeat timeout from "" reached, starting election
2019/12/06 06:38:50 [INFO]  raft: Node at 127.0.0.1:25018 [Candidate] entering Candidate state in term 2
TestMaintCommand_DisableNodeMaintenance - 2019/12/06 06:38:50.171224 [INFO] agent: Started HTTP server on 127.0.0.1:25020 (tcp)
TestMaintCommand_DisableNodeMaintenance - 2019/12/06 06:38:50.171466 [INFO] agent: started state syncer
2019/12/06 06:38:50 [WARN]  raft: Heartbeat timeout from "" reached, starting election
2019/12/06 06:38:50 [INFO]  raft: Node at 127.0.0.1:25012 [Candidate] entering Candidate state in term 2
2019/12/06 06:38:50 [WARN]  raft: Heartbeat timeout from "" reached, starting election
2019/12/06 06:38:50 [INFO]  raft: Node at 127.0.0.1:25024 [Candidate] entering Candidate state in term 2
2019/12/06 06:38:50 [INFO]  raft: Election won. Tally: 1
2019/12/06 06:38:50 [INFO]  raft: Node at 127.0.0.1:25006 [Leader] entering Leader state
TestMaintCommand_EnableServiceMaintenance - 2019/12/06 06:38:50.411042 [INFO] consul: cluster leadership acquired
TestMaintCommand_EnableServiceMaintenance - 2019/12/06 06:38:50.411611 [INFO] consul: New leader elected: Node f0f760f2-40cd-d543-a1d3-87f1e29c9fff
2019/12/06 06:38:50 [INFO]  raft: Election won. Tally: 1
2019/12/06 06:38:50 [INFO]  raft: Node at 127.0.0.1:25018 [Leader] entering Leader state
2019/12/06 06:38:50 [INFO]  raft: Election won. Tally: 1
2019/12/06 06:38:50 [INFO]  raft: Node at 127.0.0.1:25024 [Leader] entering Leader state
TestMaintCommand_DisableServiceMaintenance - 2019/12/06 06:38:50.735976 [INFO] consul: cluster leadership acquired
TestMaintCommand_DisableNodeMaintenance - 2019/12/06 06:38:50.736158 [INFO] consul: cluster leadership acquired
TestMaintCommand_DisableServiceMaintenance - 2019/12/06 06:38:50.736499 [INFO] consul: New leader elected: Node c40cfe27-156e-d215-e1f0-86a0ab650077
TestMaintCommand_DisableNodeMaintenance - 2019/12/06 06:38:50.736525 [INFO] consul: New leader elected: Node 7c68d64d-f1cc-9e17-3114-c2fd3c9566b2
TestMaintCommand_EnableServiceMaintenance - 2019/12/06 06:38:50.741724 [INFO] agent: Synced service "test"
TestMaintCommand_EnableServiceMaintenance - 2019/12/06 06:38:50.741801 [DEBUG] agent: Node info in sync
TestMaintCommand_EnableServiceMaintenance - 2019/12/06 06:38:50.747507 [DEBUG] http: Request GET /v1/agent/self (255.670663ms) from=127.0.0.1:59104
2019/12/06 06:38:50 [INFO]  raft: Election won. Tally: 1
2019/12/06 06:38:50 [INFO]  raft: Node at 127.0.0.1:25012 [Leader] entering Leader state
TestMaintCommand_ServiceMaintenance_NoService - 2019/12/06 06:38:50.754423 [INFO] consul: cluster leadership acquired
TestMaintCommand_ServiceMaintenance_NoService - 2019/12/06 06:38:50.754891 [INFO] consul: New leader elected: Node 50e729ad-dc43-173d-5046-52dc87e2a681
TestMaintCommand_EnableServiceMaintenance - 2019/12/06 06:38:50.903227 [INFO] agent: Service "test" entered maintenance mode
TestMaintCommand_EnableServiceMaintenance - 2019/12/06 06:38:50.903363 [DEBUG] agent: Service "test" in sync
TestMaintCommand_ServiceMaintenance_NoService - 2019/12/06 06:38:51.045046 [INFO] agent: Synced node info
TestMaintCommand_ServiceMaintenance_NoService - 2019/12/06 06:38:51.045223 [DEBUG] agent: Node info in sync
TestMaintCommand_DisableServiceMaintenance - 2019/12/06 06:38:51.049049 [INFO] agent: Synced node info
TestMaintCommand_DisableNodeMaintenance - 2019/12/06 06:38:51.045046 [INFO] agent: Synced node info
TestMaintCommand_ServiceMaintenance_NoService - 2019/12/06 06:38:51.058885 [DEBUG] http: Request GET /v1/agent/self (245.319754ms) from=127.0.0.1:54184
TestMaintCommand_DisableNodeMaintenance - 2019/12/06 06:38:51.060180 [DEBUG] http: Request GET /v1/agent/self (40.611952ms) from=127.0.0.1:46214
TestMaintCommand_ServiceMaintenance_NoService - 2019/12/06 06:38:51.077174 [DEBUG] http: Request PUT /v1/agent/service/maintenance/redis?enable=true&reason=broken (4.656109ms) from=127.0.0.1:54184
TestMaintCommand_DisableNodeMaintenance - 2019/12/06 06:38:51.084371 [DEBUG] agent: Node info in sync
TestMaintCommand_DisableNodeMaintenance - 2019/12/06 06:38:51.084473 [DEBUG] http: Request PUT /v1/agent/maintenance?enable=false (113.336µs) from=127.0.0.1:46214
TestMaintCommand_DisableNodeMaintenance - 2019/12/06 06:38:51.085084 [INFO] agent: Requesting shutdown
TestMaintCommand_DisableNodeMaintenance - 2019/12/06 06:38:51.085158 [INFO] consul: shutting down server
TestMaintCommand_DisableNodeMaintenance - 2019/12/06 06:38:51.085252 [WARN] serf: Shutdown without a Leave
TestMaintCommand_ServiceMaintenance_NoService - 2019/12/06 06:38:51.085590 [INFO] agent: Requesting shutdown
TestMaintCommand_ServiceMaintenance_NoService - 2019/12/06 06:38:51.085656 [INFO] consul: shutting down server
TestMaintCommand_ServiceMaintenance_NoService - 2019/12/06 06:38:51.085702 [WARN] serf: Shutdown without a Leave
TestMaintCommand_DisableNodeMaintenance - 2019/12/06 06:38:51.236514 [WARN] serf: Shutdown without a Leave
TestMaintCommand_ServiceMaintenance_NoService - 2019/12/06 06:38:51.237975 [WARN] serf: Shutdown without a Leave
TestMaintCommand_DisableNodeMaintenance - 2019/12/06 06:38:51.320908 [INFO] manager: shutting down
TestMaintCommand_ServiceMaintenance_NoService - 2019/12/06 06:38:51.320908 [INFO] manager: shutting down
TestMaintCommand_DisableNodeMaintenance - 2019/12/06 06:38:51.321397 [INFO] agent: consul server down
TestMaintCommand_DisableNodeMaintenance - 2019/12/06 06:38:51.321457 [INFO] agent: shutdown complete
TestMaintCommand_DisableNodeMaintenance - 2019/12/06 06:38:51.321516 [INFO] agent: Stopping DNS server 127.0.0.1:25019 (tcp)
TestMaintCommand_DisableNodeMaintenance - 2019/12/06 06:38:51.321664 [INFO] agent: Stopping DNS server 127.0.0.1:25019 (udp)
TestMaintCommand_DisableNodeMaintenance - 2019/12/06 06:38:51.321825 [INFO] agent: Stopping HTTP server 127.0.0.1:25020 (tcp)
TestMaintCommand_DisableNodeMaintenance - 2019/12/06 06:38:51.322431 [INFO] agent: Waiting for endpoints to shut down
TestMaintCommand_DisableNodeMaintenance - 2019/12/06 06:38:51.322591 [INFO] agent: Endpoints down
--- PASS: TestMaintCommand_DisableNodeMaintenance (2.93s)
TestMaintCommand_EnableServiceMaintenance - 2019/12/06 06:38:51.322688 [INFO] agent: Synced check "_service_maintenance:test"
=== CONT  TestMaintCommand_NoArgs
TestMaintCommand_EnableServiceMaintenance - 2019/12/06 06:38:51.322733 [DEBUG] agent: Node info in sync
TestMaintCommand_EnableServiceMaintenance - 2019/12/06 06:38:51.322793 [DEBUG] http: Request PUT /v1/agent/service/maintenance/test?enable=true&reason=broken (559.563124ms) from=127.0.0.1:59104
TestMaintCommand_EnableServiceMaintenance - 2019/12/06 06:38:51.324070 [INFO] agent: Requesting shutdown
TestMaintCommand_EnableServiceMaintenance - 2019/12/06 06:38:51.324160 [INFO] consul: shutting down server
TestMaintCommand_EnableServiceMaintenance - 2019/12/06 06:38:51.324286 [WARN] serf: Shutdown without a Leave
TestMaintCommand_ServiceMaintenance_NoService - 2019/12/06 06:38:51.326856 [INFO] agent: consul server down
TestMaintCommand_ServiceMaintenance_NoService - 2019/12/06 06:38:51.326920 [INFO] agent: shutdown complete
TestMaintCommand_ServiceMaintenance_NoService - 2019/12/06 06:38:51.326972 [INFO] agent: Stopping DNS server 127.0.0.1:25007 (tcp)
TestMaintCommand_ServiceMaintenance_NoService - 2019/12/06 06:38:51.327101 [INFO] agent: Stopping DNS server 127.0.0.1:25007 (udp)
TestMaintCommand_ServiceMaintenance_NoService - 2019/12/06 06:38:51.327252 [INFO] agent: Stopping HTTP server 127.0.0.1:25008 (tcp)
TestMaintCommand_ServiceMaintenance_NoService - 2019/12/06 06:38:51.327747 [INFO] agent: Waiting for endpoints to shut down
TestMaintCommand_ServiceMaintenance_NoService - 2019/12/06 06:38:51.327846 [ERR] autopilot: failed to initialize config: leadership lost while committing log
TestMaintCommand_ServiceMaintenance_NoService - 2019/12/06 06:38:51.328134 [ERR] consul: failed to establish leadership: raft is already shutdown
TestMaintCommand_ServiceMaintenance_NoService - 2019/12/06 06:38:51.328458 [INFO] agent: Endpoints down
--- PASS: TestMaintCommand_ServiceMaintenance_NoService (2.95s)
=== CONT  TestMaintCommand_EnableNodeMaintenance
TestMaintCommand_DisableServiceMaintenance - 2019/12/06 06:38:51.343979 [DEBUG] http: Request GET /v1/agent/self (285.316026ms) from=127.0.0.1:42630
WARNING: bootstrap = true: do not enable unless necessary
TestMaintCommand_NoArgs - 2019/12/06 06:38:51.426679 [WARN] agent: Node name "Node 317374dd-1ca0-b92d-7f12-f9e54b5d79ce" will not be discoverable via DNS due to invalid characters. Valid characters include all alpha-numerics and dashes.
TestMaintCommand_NoArgs - 2019/12/06 06:38:51.427398 [DEBUG] tlsutil: Update with version 1
TestMaintCommand_NoArgs - 2019/12/06 06:38:51.432137 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
WARNING: bootstrap = true: do not enable unless necessary
TestMaintCommand_EnableNodeMaintenance - 2019/12/06 06:38:51.461997 [WARN] agent: Node name "Node c5c6ff8f-2a55-d575-6fb7-a054ea5fb3f1" will not be discoverable via DNS due to invalid characters. Valid characters include all alpha-numerics and dashes.
TestMaintCommand_EnableNodeMaintenance - 2019/12/06 06:38:51.462750 [DEBUG] tlsutil: Update with version 1
TestMaintCommand_EnableNodeMaintenance - 2019/12/06 06:38:51.467767 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
TestMaintCommand_EnableServiceMaintenance - 2019/12/06 06:38:51.543933 [WARN] serf: Shutdown without a Leave
TestMaintCommand_EnableServiceMaintenance - 2019/12/06 06:38:51.710770 [INFO] manager: shutting down
TestMaintCommand_EnableServiceMaintenance - 2019/12/06 06:38:51.802422 [ERR] consul: failed to establish leadership: error generating CA root certificate: leadership lost while committing log
TestMaintCommand_EnableServiceMaintenance - 2019/12/06 06:38:51.802685 [ERR] consul: failed to transfer leadership attempt 0/3: raft is already shutdown
TestMaintCommand_EnableServiceMaintenance - 2019/12/06 06:38:51.802730 [INFO] agent: consul server down
TestMaintCommand_EnableServiceMaintenance - 2019/12/06 06:38:51.802754 [ERR] consul: failed to transfer leadership attempt 1/3: raft is already shutdown
TestMaintCommand_EnableServiceMaintenance - 2019/12/06 06:38:51.802829 [ERR] consul: failed to transfer leadership attempt 2/3: raft is already shutdown
TestMaintCommand_EnableServiceMaintenance - 2019/12/06 06:38:51.802883 [ERR] consul: failed to transfer leadership in 3 attempts
TestMaintCommand_EnableServiceMaintenance - 2019/12/06 06:38:51.802778 [INFO] agent: shutdown complete
TestMaintCommand_EnableServiceMaintenance - 2019/12/06 06:38:51.803010 [INFO] agent: Stopping DNS server 127.0.0.1:25001 (tcp)
TestMaintCommand_EnableServiceMaintenance - 2019/12/06 06:38:51.803252 [INFO] agent: Stopping DNS server 127.0.0.1:25001 (udp)
TestMaintCommand_EnableServiceMaintenance - 2019/12/06 06:38:51.803444 [INFO] agent: Stopping HTTP server 127.0.0.1:25002 (tcp)
TestMaintCommand_EnableServiceMaintenance - 2019/12/06 06:38:51.804250 [INFO] agent: Waiting for endpoints to shut down
TestMaintCommand_EnableServiceMaintenance - 2019/12/06 06:38:51.804354 [INFO] agent: Endpoints down
--- PASS: TestMaintCommand_EnableServiceMaintenance (3.44s)
=== CONT  TestMaintCommand_ConflictingArgs
--- PASS: TestMaintCommand_ConflictingArgs (0.00s)
TestMaintCommand_DisableServiceMaintenance - 2019/12/06 06:38:51.978864 [INFO] agent: Synced service "test"
TestMaintCommand_DisableServiceMaintenance - 2019/12/06 06:38:51.978953 [DEBUG] agent: Node info in sync
TestMaintCommand_DisableServiceMaintenance - 2019/12/06 06:38:51.979048 [DEBUG] http: Request PUT /v1/agent/service/maintenance/test?enable=false (624.203641ms) from=127.0.0.1:42630
TestMaintCommand_DisableServiceMaintenance - 2019/12/06 06:38:51.980034 [INFO] agent: Requesting shutdown
TestMaintCommand_DisableServiceMaintenance - 2019/12/06 06:38:51.980116 [INFO] consul: shutting down server
TestMaintCommand_DisableServiceMaintenance - 2019/12/06 06:38:51.980170 [WARN] serf: Shutdown without a Leave
TestMaintCommand_DisableServiceMaintenance - 2019/12/06 06:38:52.060610 [WARN] serf: Shutdown without a Leave
TestMaintCommand_DisableServiceMaintenance - 2019/12/06 06:38:52.135771 [INFO] manager: shutting down
TestMaintCommand_DisableServiceMaintenance - 2019/12/06 06:38:52.137671 [ERR] consul: failed to establish leadership: error configuring provider: leadership lost while committing log
TestMaintCommand_DisableServiceMaintenance - 2019/12/06 06:38:52.137888 [ERR] consul: failed to transfer leadership attempt 0/3: raft is already shutdown
TestMaintCommand_DisableServiceMaintenance - 2019/12/06 06:38:52.137992 [ERR] consul: failed to transfer leadership attempt 1/3: raft is already shutdown
TestMaintCommand_DisableServiceMaintenance - 2019/12/06 06:38:52.138076 [ERR] consul: failed to transfer leadership attempt 2/3: raft is already shutdown
TestMaintCommand_DisableServiceMaintenance - 2019/12/06 06:38:52.138128 [ERR] consul: failed to transfer leadership in 3 attempts
TestMaintCommand_DisableServiceMaintenance - 2019/12/06 06:38:52.137891 [INFO] agent: consul server down
TestMaintCommand_DisableServiceMaintenance - 2019/12/06 06:38:52.138252 [INFO] agent: shutdown complete
TestMaintCommand_DisableServiceMaintenance - 2019/12/06 06:38:52.138336 [INFO] agent: Stopping DNS server 127.0.0.1:25013 (tcp)
TestMaintCommand_DisableServiceMaintenance - 2019/12/06 06:38:52.138556 [INFO] agent: Stopping DNS server 127.0.0.1:25013 (udp)
TestMaintCommand_DisableServiceMaintenance - 2019/12/06 06:38:52.138747 [INFO] agent: Stopping HTTP server 127.0.0.1:25014 (tcp)
TestMaintCommand_DisableServiceMaintenance - 2019/12/06 06:38:52.139386 [INFO] agent: Waiting for endpoints to shut down
TestMaintCommand_DisableServiceMaintenance - 2019/12/06 06:38:52.139486 [INFO] agent: Endpoints down
--- PASS: TestMaintCommand_DisableServiceMaintenance (3.76s)
2019/12/06 06:38:52 [INFO]  raft: Initial configuration (index=1): [{Suffrage:Voter ID:c5c6ff8f-2a55-d575-6fb7-a054ea5fb3f1 Address:127.0.0.1:25036}]
2019/12/06 06:38:52 [INFO]  raft: Node at 127.0.0.1:25036 [Follower] entering Follower state (Leader: "")
TestMaintCommand_EnableNodeMaintenance - 2019/12/06 06:38:52.444309 [INFO] serf: EventMemberJoin: Node c5c6ff8f-2a55-d575-6fb7-a054ea5fb3f1.dc1 127.0.0.1
TestMaintCommand_EnableNodeMaintenance - 2019/12/06 06:38:52.448845 [INFO] serf: EventMemberJoin: Node c5c6ff8f-2a55-d575-6fb7-a054ea5fb3f1 127.0.0.1
TestMaintCommand_EnableNodeMaintenance - 2019/12/06 06:38:52.449826 [INFO] consul: Adding LAN server Node c5c6ff8f-2a55-d575-6fb7-a054ea5fb3f1 (Addr: tcp/127.0.0.1:25036) (DC: dc1)
TestMaintCommand_EnableNodeMaintenance - 2019/12/06 06:38:52.449990 [INFO] consul: Handled member-join event for server "Node c5c6ff8f-2a55-d575-6fb7-a054ea5fb3f1.dc1" in area "wan"
TestMaintCommand_EnableNodeMaintenance - 2019/12/06 06:38:52.450975 [INFO] agent: Started DNS server 127.0.0.1:25031 (tcp)
TestMaintCommand_EnableNodeMaintenance - 2019/12/06 06:38:52.451048 [INFO] agent: Started DNS server 127.0.0.1:25031 (udp)
TestMaintCommand_EnableNodeMaintenance - 2019/12/06 06:38:52.453743 [INFO] agent: Started HTTP server on 127.0.0.1:25032 (tcp)
TestMaintCommand_EnableNodeMaintenance - 2019/12/06 06:38:52.453854 [INFO] agent: started state syncer
2019/12/06 06:38:52 [WARN]  raft: Heartbeat timeout from "" reached, starting election
2019/12/06 06:38:52 [INFO]  raft: Node at 127.0.0.1:25036 [Candidate] entering Candidate state in term 2
2019/12/06 06:38:52 [INFO]  raft: Initial configuration (index=1): [{Suffrage:Voter ID:317374dd-1ca0-b92d-7f12-f9e54b5d79ce Address:127.0.0.1:25030}]
2019/12/06 06:38:52 [INFO]  raft: Node at 127.0.0.1:25030 [Follower] entering Follower state (Leader: "")
TestMaintCommand_NoArgs - 2019/12/06 06:38:52.498896 [INFO] serf: EventMemberJoin: Node 317374dd-1ca0-b92d-7f12-f9e54b5d79ce.dc1 127.0.0.1
TestMaintCommand_NoArgs - 2019/12/06 06:38:52.503401 [INFO] serf: EventMemberJoin: Node 317374dd-1ca0-b92d-7f12-f9e54b5d79ce 127.0.0.1
TestMaintCommand_NoArgs - 2019/12/06 06:38:52.504430 [INFO] consul: Adding LAN server Node 317374dd-1ca0-b92d-7f12-f9e54b5d79ce (Addr: tcp/127.0.0.1:25030) (DC: dc1)
TestMaintCommand_NoArgs - 2019/12/06 06:38:52.504660 [INFO] consul: Handled member-join event for server "Node 317374dd-1ca0-b92d-7f12-f9e54b5d79ce.dc1" in area "wan"
TestMaintCommand_NoArgs - 2019/12/06 06:38:52.505543 [INFO] agent: Started DNS server 127.0.0.1:25025 (tcp)
TestMaintCommand_NoArgs - 2019/12/06 06:38:52.505898 [INFO] agent: Started DNS server 127.0.0.1:25025 (udp)
TestMaintCommand_NoArgs - 2019/12/06 06:38:52.508608 [INFO] agent: Started HTTP server on 127.0.0.1:25026 (tcp)
TestMaintCommand_NoArgs - 2019/12/06 06:38:52.508731 [INFO] agent: started state syncer
2019/12/06 06:38:52 [WARN]  raft: Heartbeat timeout from "" reached, starting election
2019/12/06 06:38:52 [INFO]  raft: Node at 127.0.0.1:25030 [Candidate] entering Candidate state in term 2
2019/12/06 06:38:52 [INFO]  raft: Election won. Tally: 1
2019/12/06 06:38:52 [INFO]  raft: Node at 127.0.0.1:25036 [Leader] entering Leader state
TestMaintCommand_EnableNodeMaintenance - 2019/12/06 06:38:52.962389 [INFO] consul: cluster leadership acquired
TestMaintCommand_EnableNodeMaintenance - 2019/12/06 06:38:52.962874 [INFO] consul: New leader elected: Node c5c6ff8f-2a55-d575-6fb7-a054ea5fb3f1
2019/12/06 06:38:53 [INFO]  raft: Election won. Tally: 1
2019/12/06 06:38:53 [INFO]  raft: Node at 127.0.0.1:25030 [Leader] entering Leader state
TestMaintCommand_NoArgs - 2019/12/06 06:38:53.195749 [INFO] consul: cluster leadership acquired
TestMaintCommand_NoArgs - 2019/12/06 06:38:53.196317 [INFO] consul: New leader elected: Node 317374dd-1ca0-b92d-7f12-f9e54b5d79ce
TestMaintCommand_NoArgs - 2019/12/06 06:38:53.354889 [INFO] agent: Service "test" entered maintenance mode
TestMaintCommand_EnableNodeMaintenance - 2019/12/06 06:38:53.578514 [INFO] agent: Synced node info
TestMaintCommand_EnableNodeMaintenance - 2019/12/06 06:38:53.583207 [DEBUG] http: Request GET /v1/agent/self (480.337266ms) from=127.0.0.1:43040
TestMaintCommand_NoArgs - 2019/12/06 06:38:53.590000 [INFO] agent: Synced service "test"
TestMaintCommand_NoArgs - 2019/12/06 06:38:53.598274 [DEBUG] agent: Check "_service_maintenance:test" in sync
TestMaintCommand_NoArgs - 2019/12/06 06:38:53.598816 [DEBUG] agent: Node info in sync
TestMaintCommand_NoArgs - 2019/12/06 06:38:53.601886 [DEBUG] agent: Service "test" in sync
TestMaintCommand_NoArgs - 2019/12/06 06:38:53.602165 [DEBUG] agent: Check "_service_maintenance:test" in sync
TestMaintCommand_NoArgs - 2019/12/06 06:38:53.729507 [INFO] agent: Node entered maintenance mode
TestMaintCommand_EnableNodeMaintenance - 2019/12/06 06:38:53.738562 [INFO] agent: Node entered maintenance mode
TestMaintCommand_EnableNodeMaintenance - 2019/12/06 06:38:54.136937 [INFO] agent: Synced check "_node_maintenance"
TestMaintCommand_EnableNodeMaintenance - 2019/12/06 06:38:54.137023 [DEBUG] agent: Node info in sync
TestMaintCommand_EnableNodeMaintenance - 2019/12/06 06:38:54.137089 [DEBUG] http: Request PUT /v1/agent/maintenance?enable=true&reason=broken (534.473869ms) from=127.0.0.1:43040
TestMaintCommand_NoArgs - 2019/12/06 06:38:54.140439 [INFO] agent: Synced check "_node_maintenance"
TestMaintCommand_NoArgs - 2019/12/06 06:38:54.140596 [DEBUG] agent: Node info in sync
TestMaintCommand_EnableNodeMaintenance - 2019/12/06 06:38:54.142987 [INFO] agent: Requesting shutdown
TestMaintCommand_EnableNodeMaintenance - 2019/12/06 06:38:54.143060 [INFO] consul: shutting down server
TestMaintCommand_EnableNodeMaintenance - 2019/12/06 06:38:54.143103 [WARN] serf: Shutdown without a Leave
TestMaintCommand_NoArgs - 2019/12/06 06:38:54.146609 [DEBUG] http: Request GET /v1/agent/self (411.921662ms) from=127.0.0.1:60708
TestMaintCommand_NoArgs - 2019/12/06 06:38:54.193111 [DEBUG] http: Request GET /v1/agent/checks (721.35µs) from=127.0.0.1:60708
TestMaintCommand_NoArgs - 2019/12/06 06:38:54.196229 [INFO] agent: Requesting shutdown
TestMaintCommand_NoArgs - 2019/12/06 06:38:54.196352 [INFO] consul: shutting down server
TestMaintCommand_NoArgs - 2019/12/06 06:38:54.196401 [WARN] serf: Shutdown without a Leave
TestMaintCommand_EnableNodeMaintenance - 2019/12/06 06:38:54.227221 [WARN] serf: Shutdown without a Leave
TestMaintCommand_NoArgs - 2019/12/06 06:38:54.302287 [WARN] serf: Shutdown without a Leave
TestMaintCommand_EnableNodeMaintenance - 2019/12/06 06:38:54.305706 [INFO] manager: shutting down
TestMaintCommand_EnableNodeMaintenance - 2019/12/06 06:38:54.306108 [INFO] agent: consul server down
TestMaintCommand_EnableNodeMaintenance - 2019/12/06 06:38:54.306154 [INFO] agent: shutdown complete
TestMaintCommand_EnableNodeMaintenance - 2019/12/06 06:38:54.306200 [INFO] agent: Stopping DNS server 127.0.0.1:25031 (tcp)
TestMaintCommand_EnableNodeMaintenance - 2019/12/06 06:38:54.306322 [INFO] agent: Stopping DNS server 127.0.0.1:25031 (udp)
TestMaintCommand_EnableNodeMaintenance - 2019/12/06 06:38:54.306476 [INFO] agent: Stopping HTTP server 127.0.0.1:25032 (tcp)
TestMaintCommand_EnableNodeMaintenance - 2019/12/06 06:38:54.307014 [INFO] agent: Waiting for endpoints to shut down
TestMaintCommand_EnableNodeMaintenance - 2019/12/06 06:38:54.307098 [INFO] agent: Endpoints down
--- PASS: TestMaintCommand_EnableNodeMaintenance (2.98s)
TestMaintCommand_EnableNodeMaintenance - 2019/12/06 06:38:54.317922 [ERR] consul: failed to establish leadership: error generating CA root certificate: raft is already shutdown
TestMaintCommand_EnableNodeMaintenance - 2019/12/06 06:38:54.318165 [ERR] consul: failed to transfer leadership attempt 0/3: raft is already shutdown
TestMaintCommand_EnableNodeMaintenance - 2019/12/06 06:38:54.318233 [ERR] consul: failed to transfer leadership attempt 1/3: raft is already shutdown
TestMaintCommand_NoArgs - 2019/12/06 06:38:54.377341 [INFO] manager: shutting down
TestMaintCommand_NoArgs - 2019/12/06 06:38:54.518970 [ERR] consul: failed to establish leadership: error configuring provider: leadership lost while committing log
TestMaintCommand_NoArgs - 2019/12/06 06:38:54.519387 [INFO] agent: consul server down
TestMaintCommand_NoArgs - 2019/12/06 06:38:54.519445 [INFO] agent: shutdown complete
TestMaintCommand_NoArgs - 2019/12/06 06:38:54.519502 [INFO] agent: Stopping DNS server 127.0.0.1:25025 (tcp)
TestMaintCommand_NoArgs - 2019/12/06 06:38:54.519657 [INFO] agent: Stopping DNS server 127.0.0.1:25025 (udp)
TestMaintCommand_NoArgs - 2019/12/06 06:38:54.519816 [INFO] agent: Stopping HTTP server 127.0.0.1:25026 (tcp)
TestMaintCommand_NoArgs - 2019/12/06 06:38:54.520392 [INFO] agent: Waiting for endpoints to shut down
TestMaintCommand_NoArgs - 2019/12/06 06:38:54.520485 [INFO] agent: Endpoints down
--- PASS: TestMaintCommand_NoArgs (3.20s)
PASS
ok  	github.com/hashicorp/consul/command/maint	6.482s
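The maint tests above hit PUT /v1/agent/service/maintenance/<service>?enable=...&reason=... and PUT /v1/agent/maintenance?enable=..., as the request lines show. Below is a minimal, illustrative Go sketch of the node-level toggle, assuming a local agent on 127.0.0.1:8500; the query parameters are copied from the log, the helper itself is hypothetical.

    // Illustrative sketch of the node maintenance toggle seen in the requests above.
    package main

    import (
        "fmt"
        "net/http"
        "net/url"
    )

    // setNodeMaintenance is a hypothetical helper, not part of Consul's API client.
    func setNodeMaintenance(agent string, enable bool, reason string) error {
        u := fmt.Sprintf("http://%s/v1/agent/maintenance?enable=%t&reason=%s",
            agent, enable, url.QueryEscape(reason))
        req, err := http.NewRequest("PUT", u, nil)
        if err != nil {
            return err
        }
        resp, err := http.DefaultClient.Do(req)
        if err != nil {
            return err
        }
        resp.Body.Close()
        // Enabling registers the "_node_maintenance" check that the log shows
        // being synced; disabling removes it again.
        return nil
    }

    func main() {
        if err := setNodeMaintenance("127.0.0.1:8500", true, "broken"); err != nil {
            fmt.Println("error:", err)
        }
    }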
=== RUN   TestMembersCommand_noTabs
=== PAUSE TestMembersCommand_noTabs
=== RUN   TestMembersCommand
=== PAUSE TestMembersCommand
=== RUN   TestMembersCommand_WAN
=== PAUSE TestMembersCommand_WAN
=== RUN   TestMembersCommand_statusFilter
=== PAUSE TestMembersCommand_statusFilter
=== RUN   TestMembersCommand_statusFilter_failed
=== PAUSE TestMembersCommand_statusFilter_failed
=== CONT  TestMembersCommand_noTabs
=== CONT  TestMembersCommand_statusFilter
=== CONT  TestMembersCommand_statusFilter_failed
=== CONT  TestMembersCommand_WAN
--- PASS: TestMembersCommand_noTabs (0.01s)
=== CONT  TestMembersCommand
WARNING: bootstrap = true: do not enable unless necessary
TestMembersCommand_WAN - 2019/12/06 06:39:22.865399 [WARN] agent: Node name "Node b7eb8db2-5768-d150-1384-c5ee15140ba5" will not be discoverable via DNS due to invalid characters. Valid characters include all alpha-numerics and dashes.
TestMembersCommand_WAN - 2019/12/06 06:39:22.866175 [DEBUG] tlsutil: Update with version 1
WARNING: bootstrap = true: do not enable unless necessary
TestMembersCommand - 2019/12/06 06:39:22.888830 [WARN] agent: Node name "Node eb0d30d1-7880-0988-9eb9-b7a1b53ea95e" will not be discoverable via DNS due to invalid characters. Valid characters include all alpha-numerics and dashes.
TestMembersCommand - 2019/12/06 06:39:22.894364 [DEBUG] tlsutil: Update with version 1
TestMembersCommand_WAN - 2019/12/06 06:39:22.896494 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
TestMembersCommand - 2019/12/06 06:39:22.905063 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
WARNING: bootstrap = true: do not enable unless necessary
TestMembersCommand_statusFilter - 2019/12/06 06:39:22.969795 [WARN] agent: Node name "Node 2dc065d3-c742-e29d-d42f-54026eee7e2c" will not be discoverable via DNS due to invalid characters. Valid characters include all alpha-numerics and dashes.
WARNING: bootstrap = true: do not enable unless necessary
TestMembersCommand_statusFilter - 2019/12/06 06:39:22.974394 [DEBUG] tlsutil: Update with version 1
TestMembersCommand_statusFilter_failed - 2019/12/06 06:39:22.976114 [WARN] agent: Node name "Node c0008f21-2dee-48c8-9ba1-c3d5721a8e6e" will not be discoverable via DNS due to invalid characters. Valid characters include all alpha-numerics and dashes.
TestMembersCommand_statusFilter_failed - 2019/12/06 06:39:22.976713 [DEBUG] tlsutil: Update with version 1
TestMembersCommand_statusFilter_failed - 2019/12/06 06:39:22.986278 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
TestMembersCommand_statusFilter - 2019/12/06 06:39:22.986278 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
2019/12/06 06:39:23 [INFO]  raft: Initial configuration (index=1): [{Suffrage:Voter ID:b7eb8db2-5768-d150-1384-c5ee15140ba5 Address:127.0.0.1:20512}]
2019/12/06 06:39:23 [INFO]  raft: Node at 127.0.0.1:20512 [Follower] entering Follower state (Leader: "")
TestMembersCommand_WAN - 2019/12/06 06:39:23.899682 [INFO] serf: EventMemberJoin: Node b7eb8db2-5768-d150-1384-c5ee15140ba5.dc1 127.0.0.1
TestMembersCommand_WAN - 2019/12/06 06:39:23.904502 [INFO] serf: EventMemberJoin: Node b7eb8db2-5768-d150-1384-c5ee15140ba5 127.0.0.1
TestMembersCommand_WAN - 2019/12/06 06:39:23.906108 [INFO] consul: Adding LAN server Node b7eb8db2-5768-d150-1384-c5ee15140ba5 (Addr: tcp/127.0.0.1:20512) (DC: dc1)
TestMembersCommand_WAN - 2019/12/06 06:39:23.906577 [INFO] consul: Handled member-join event for server "Node b7eb8db2-5768-d150-1384-c5ee15140ba5.dc1" in area "wan"
TestMembersCommand_WAN - 2019/12/06 06:39:23.910011 [INFO] agent: Started DNS server 127.0.0.1:20507 (tcp)
TestMembersCommand_WAN - 2019/12/06 06:39:23.910972 [INFO] agent: Started DNS server 127.0.0.1:20507 (udp)
TestMembersCommand_WAN - 2019/12/06 06:39:23.914085 [INFO] agent: Started HTTP server on 127.0.0.1:20508 (tcp)
TestMembersCommand_WAN - 2019/12/06 06:39:23.914318 [INFO] agent: started state syncer
2019/12/06 06:39:23 [WARN]  raft: Heartbeat timeout from "" reached, starting election
2019/12/06 06:39:23 [INFO]  raft: Node at 127.0.0.1:20512 [Candidate] entering Candidate state in term 2
2019/12/06 06:39:24 [INFO]  raft: Initial configuration (index=1): [{Suffrage:Voter ID:2dc065d3-c742-e29d-d42f-54026eee7e2c Address:127.0.0.1:20524}]
2019/12/06 06:39:24 [INFO]  raft: Node at 127.0.0.1:20524 [Follower] entering Follower state (Leader: "")
2019/12/06 06:39:24 [INFO]  raft: Initial configuration (index=1): [{Suffrage:Voter ID:eb0d30d1-7880-0988-9eb9-b7a1b53ea95e Address:127.0.0.1:20518}]
2019/12/06 06:39:24 [INFO]  raft: Node at 127.0.0.1:20518 [Follower] entering Follower state (Leader: "")
TestMembersCommand_statusFilter - 2019/12/06 06:39:24.182198 [INFO] serf: EventMemberJoin: Node 2dc065d3-c742-e29d-d42f-54026eee7e2c.dc1 127.0.0.1
TestMembersCommand - 2019/12/06 06:39:24.183229 [INFO] serf: EventMemberJoin: Node eb0d30d1-7880-0988-9eb9-b7a1b53ea95e.dc1 127.0.0.1
TestMembersCommand_statusFilter - 2019/12/06 06:39:24.188787 [INFO] serf: EventMemberJoin: Node 2dc065d3-c742-e29d-d42f-54026eee7e2c 127.0.0.1
TestMembersCommand_statusFilter - 2019/12/06 06:39:24.192924 [INFO] consul: Adding LAN server Node 2dc065d3-c742-e29d-d42f-54026eee7e2c (Addr: tcp/127.0.0.1:20524) (DC: dc1)
TestMembersCommand_statusFilter - 2019/12/06 06:39:24.193126 [INFO] consul: Handled member-join event for server "Node 2dc065d3-c742-e29d-d42f-54026eee7e2c.dc1" in area "wan"
TestMembersCommand_statusFilter - 2019/12/06 06:39:24.210140 [INFO] agent: Started DNS server 127.0.0.1:20519 (tcp)
TestMembersCommand_statusFilter - 2019/12/06 06:39:24.210226 [INFO] agent: Started DNS server 127.0.0.1:20519 (udp)
TestMembersCommand - 2019/12/06 06:39:24.211968 [INFO] serf: EventMemberJoin: Node eb0d30d1-7880-0988-9eb9-b7a1b53ea95e 127.0.0.1
TestMembersCommand_statusFilter - 2019/12/06 06:39:24.212719 [INFO] agent: Started HTTP server on 127.0.0.1:20520 (tcp)
TestMembersCommand_statusFilter - 2019/12/06 06:39:24.212802 [INFO] agent: started state syncer
TestMembersCommand - 2019/12/06 06:39:24.213248 [INFO] agent: Started DNS server 127.0.0.1:20513 (udp)
TestMembersCommand - 2019/12/06 06:39:24.213429 [INFO] consul: Handled member-join event for server "Node eb0d30d1-7880-0988-9eb9-b7a1b53ea95e.dc1" in area "wan"
TestMembersCommand - 2019/12/06 06:39:24.213570 [INFO] consul: Adding LAN server Node eb0d30d1-7880-0988-9eb9-b7a1b53ea95e (Addr: tcp/127.0.0.1:20518) (DC: dc1)
TestMembersCommand - 2019/12/06 06:39:24.213876 [INFO] agent: Started DNS server 127.0.0.1:20513 (tcp)
2019/12/06 06:39:24 [INFO]  raft: Initial configuration (index=1): [{Suffrage:Voter ID:c0008f21-2dee-48c8-9ba1-c3d5721a8e6e Address:127.0.0.1:20506}]
2019/12/06 06:39:24 [INFO]  raft: Node at 127.0.0.1:20506 [Follower] entering Follower state (Leader: "")
TestMembersCommand_statusFilter_failed - 2019/12/06 06:39:24.221418 [INFO] serf: EventMemberJoin: Node c0008f21-2dee-48c8-9ba1-c3d5721a8e6e.dc1 127.0.0.1
2019/12/06 06:39:24 [WARN]  raft: Heartbeat timeout from "" reached, starting election
2019/12/06 06:39:24 [INFO]  raft: Node at 127.0.0.1:20524 [Candidate] entering Candidate state in term 2
2019/12/06 06:39:24 [WARN]  raft: Heartbeat timeout from "" reached, starting election
2019/12/06 06:39:24 [INFO]  raft: Node at 127.0.0.1:20518 [Candidate] entering Candidate state in term 2
TestMembersCommand - 2019/12/06 06:39:24.246895 [INFO] agent: Started HTTP server on 127.0.0.1:20514 (tcp)
TestMembersCommand - 2019/12/06 06:39:24.247003 [INFO] agent: started state syncer
TestMembersCommand_statusFilter_failed - 2019/12/06 06:39:24.249681 [INFO] serf: EventMemberJoin: Node c0008f21-2dee-48c8-9ba1-c3d5721a8e6e 127.0.0.1
TestMembersCommand_statusFilter_failed - 2019/12/06 06:39:24.251549 [INFO] consul: Handled member-join event for server "Node c0008f21-2dee-48c8-9ba1-c3d5721a8e6e.dc1" in area "wan"
TestMembersCommand_statusFilter_failed - 2019/12/06 06:39:24.252118 [INFO] agent: Started DNS server 127.0.0.1:20501 (udp)
TestMembersCommand_statusFilter_failed - 2019/12/06 06:39:24.252939 [INFO] consul: Adding LAN server Node c0008f21-2dee-48c8-9ba1-c3d5721a8e6e (Addr: tcp/127.0.0.1:20506) (DC: dc1)
TestMembersCommand_statusFilter_failed - 2019/12/06 06:39:24.253022 [INFO] agent: Started DNS server 127.0.0.1:20501 (tcp)
2019/12/06 06:39:24 [WARN]  raft: Heartbeat timeout from "" reached, starting election
2019/12/06 06:39:24 [INFO]  raft: Node at 127.0.0.1:20506 [Candidate] entering Candidate state in term 2
TestMembersCommand_statusFilter_failed - 2019/12/06 06:39:24.264965 [INFO] agent: Started HTTP server on 127.0.0.1:20502 (tcp)
TestMembersCommand_statusFilter_failed - 2019/12/06 06:39:24.265238 [INFO] agent: started state syncer
2019/12/06 06:39:24 [INFO]  raft: Election won. Tally: 1
2019/12/06 06:39:24 [INFO]  raft: Node at 127.0.0.1:20512 [Leader] entering Leader state
TestMembersCommand_WAN - 2019/12/06 06:39:24.586864 [INFO] consul: cluster leadership acquired
TestMembersCommand_WAN - 2019/12/06 06:39:24.587442 [INFO] consul: New leader elected: Node b7eb8db2-5768-d150-1384-c5ee15140ba5
TestMembersCommand_WAN - 2019/12/06 06:39:24.636229 [DEBUG] http: Request GET /v1/agent/members?segment=_all&wan=1 (1.605704ms) from=127.0.0.1:54984
TestMembersCommand_WAN - 2019/12/06 06:39:24.640188 [INFO] agent: Requesting shutdown
TestMembersCommand_WAN - 2019/12/06 06:39:24.640287 [INFO] consul: shutting down server
TestMembersCommand_WAN - 2019/12/06 06:39:24.640350 [WARN] serf: Shutdown without a Leave
TestMembersCommand_WAN - 2019/12/06 06:39:24.640725 [ERR] agent: failed to sync remote state: No cluster leader
TestMembersCommand_WAN - 2019/12/06 06:39:24.794400 [WARN] serf: Shutdown without a Leave
TestMembersCommand_WAN - 2019/12/06 06:39:24.869516 [INFO] manager: shutting down
TestMembersCommand_WAN - 2019/12/06 06:39:24.871851 [ERR] agent: failed to sync remote state: No cluster leader
2019/12/06 06:39:24 [INFO]  raft: Election won. Tally: 1
2019/12/06 06:39:24 [INFO]  raft: Node at 127.0.0.1:20524 [Leader] entering Leader state
2019/12/06 06:39:24 [INFO]  raft: Election won. Tally: 1
2019/12/06 06:39:24 [INFO]  raft: Node at 127.0.0.1:20518 [Leader] entering Leader state
TestMembersCommand - 2019/12/06 06:39:24.971543 [INFO] consul: cluster leadership acquired
TestMembersCommand_statusFilter - 2019/12/06 06:39:24.971551 [INFO] consul: cluster leadership acquired
2019/12/06 06:39:24 [INFO]  raft: Election won. Tally: 1
2019/12/06 06:39:24 [INFO]  raft: Node at 127.0.0.1:20506 [Leader] entering Leader state
TestMembersCommand - 2019/12/06 06:39:24.971948 [INFO] consul: New leader elected: Node eb0d30d1-7880-0988-9eb9-b7a1b53ea95e
TestMembersCommand_statusFilter - 2019/12/06 06:39:24.972278 [INFO] consul: New leader elected: Node 2dc065d3-c742-e29d-d42f-54026eee7e2c
TestMembersCommand_statusFilter_failed - 2019/12/06 06:39:24.972493 [INFO] consul: cluster leadership acquired
TestMembersCommand_statusFilter_failed - 2019/12/06 06:39:24.972854 [INFO] consul: New leader elected: Node c0008f21-2dee-48c8-9ba1-c3d5721a8e6e
TestMembersCommand - 2019/12/06 06:39:24.985589 [DEBUG] http: Request GET /v1/agent/members?segment=_all (887.354µs) from=127.0.0.1:56990
TestMembersCommand - 2019/12/06 06:39:24.988974 [INFO] agent: Requesting shutdown
TestMembersCommand - 2019/12/06 06:39:24.989083 [INFO] consul: shutting down server
TestMembersCommand - 2019/12/06 06:39:24.989130 [WARN] serf: Shutdown without a Leave
TestMembersCommand - 2019/12/06 06:39:24.989380 [ERR] agent: failed to sync remote state: No cluster leader
TestMembersCommand_statusFilter - 2019/12/06 06:39:25.010484 [DEBUG] http: Request GET /v1/agent/members?segment=_all (1.041358ms) from=127.0.0.1:49554
TestMembersCommand_statusFilter - 2019/12/06 06:39:25.012476 [INFO] agent: Requesting shutdown
TestMembersCommand_statusFilter - 2019/12/06 06:39:25.012561 [INFO] consul: shutting down server
TestMembersCommand_statusFilter - 2019/12/06 06:39:25.012607 [WARN] serf: Shutdown without a Leave
TestMembersCommand_statusFilter - 2019/12/06 06:39:25.012972 [ERR] agent: failed to sync remote state: No cluster leader
TestMembersCommand_WAN - 2019/12/06 06:39:25.069784 [INFO] agent: consul server down
TestMembersCommand_WAN - 2019/12/06 06:39:25.069873 [INFO] agent: shutdown complete
TestMembersCommand_WAN - 2019/12/06 06:39:25.069925 [ERR] consul: failed to wait for barrier: leadership lost while committing log
TestMembersCommand - 2019/12/06 06:39:25.069869 [WARN] serf: Shutdown without a Leave
TestMembersCommand_WAN - 2019/12/06 06:39:25.069932 [INFO] agent: Stopping DNS server 127.0.0.1:20507 (tcp)
TestMembersCommand_WAN - 2019/12/06 06:39:25.070151 [INFO] agent: Stopping DNS server 127.0.0.1:20507 (udp)
TestMembersCommand_WAN - 2019/12/06 06:39:25.070381 [INFO] agent: Stopping HTTP server 127.0.0.1:20508 (tcp)
TestMembersCommand_WAN - 2019/12/06 06:39:25.070945 [INFO] agent: Waiting for endpoints to shut down
TestMembersCommand_WAN - 2019/12/06 06:39:25.071033 [INFO] agent: Endpoints down
--- PASS: TestMembersCommand_WAN (2.41s)
TestMembersCommand_statusFilter - 2019/12/06 06:39:25.161017 [WARN] serf: Shutdown without a Leave
TestMembersCommand_statusFilter_failed - 2019/12/06 06:39:25.186421 [DEBUG] http: Request GET /v1/agent/members?segment=_all (1.063359ms) from=127.0.0.1:48712
TestMembersCommand_statusFilter_failed - 2019/12/06 06:39:25.188399 [INFO] agent: Requesting shutdown
TestMembersCommand_statusFilter_failed - 2019/12/06 06:39:25.188493 [INFO] consul: shutting down server
TestMembersCommand_statusFilter_failed - 2019/12/06 06:39:25.188546 [WARN] serf: Shutdown without a Leave
TestMembersCommand_statusFilter_failed - 2019/12/06 06:39:25.188919 [ERR] agent: failed to sync remote state: No cluster leader
TestMembersCommand - 2019/12/06 06:39:25.236194 [INFO] manager: shutting down
TestMembersCommand - 2019/12/06 06:39:25.236398 [ERR] consul: failed to wait for barrier: raft is already shutdown
TestMembersCommand - 2019/12/06 06:39:25.237053 [INFO] agent: consul server down
TestMembersCommand - 2019/12/06 06:39:25.237110 [INFO] agent: shutdown complete
TestMembersCommand - 2019/12/06 06:39:25.237170 [INFO] agent: Stopping DNS server 127.0.0.1:20513 (tcp)
TestMembersCommand - 2019/12/06 06:39:25.237333 [INFO] agent: Stopping DNS server 127.0.0.1:20513 (udp)
TestMembersCommand - 2019/12/06 06:39:25.237500 [INFO] agent: Stopping HTTP server 127.0.0.1:20514 (tcp)
TestMembersCommand - 2019/12/06 06:39:25.238141 [INFO] agent: Waiting for endpoints to shut down
TestMembersCommand - 2019/12/06 06:39:25.238372 [INFO] agent: Endpoints down
--- PASS: TestMembersCommand (2.56s)
TestMembersCommand_statusFilter - 2019/12/06 06:39:25.238885 [INFO] manager: shutting down
TestMembersCommand_statusFilter_failed - 2019/12/06 06:39:25.327672 [WARN] serf: Shutdown without a Leave
TestMembersCommand_statusFilter_failed - 2019/12/06 06:39:25.395428 [INFO] manager: shutting down
TestMembersCommand_statusFilter_failed - 2019/12/06 06:39:25.453580 [ERR] agent: failed to sync remote state: No cluster leader
TestMembersCommand_statusFilter - 2019/12/06 06:39:25.469477 [ERR] consul: failed to wait for barrier: leadership lost while committing log
TestMembersCommand_statusFilter_failed - 2019/12/06 06:39:25.469604 [ERR] consul: failed to wait for barrier: leadership lost while committing log
TestMembersCommand_statusFilter_failed - 2019/12/06 06:39:25.469780 [INFO] agent: consul server down
TestMembersCommand_statusFilter_failed - 2019/12/06 06:39:25.469833 [INFO] agent: shutdown complete
TestMembersCommand_statusFilter - 2019/12/06 06:39:25.469864 [INFO] agent: consul server down
TestMembersCommand_statusFilter_failed - 2019/12/06 06:39:25.469894 [INFO] agent: Stopping DNS server 127.0.0.1:20501 (tcp)
TestMembersCommand_statusFilter - 2019/12/06 06:39:25.469905 [INFO] agent: shutdown complete
TestMembersCommand_statusFilter - 2019/12/06 06:39:25.469959 [INFO] agent: Stopping DNS server 127.0.0.1:20519 (tcp)
TestMembersCommand_statusFilter_failed - 2019/12/06 06:39:25.470022 [INFO] agent: Stopping DNS server 127.0.0.1:20501 (udp)
TestMembersCommand_statusFilter - 2019/12/06 06:39:25.470081 [INFO] agent: Stopping DNS server 127.0.0.1:20519 (udp)
TestMembersCommand_statusFilter_failed - 2019/12/06 06:39:25.470188 [INFO] agent: Stopping HTTP server 127.0.0.1:20502 (tcp)
TestMembersCommand_statusFilter - 2019/12/06 06:39:25.470260 [INFO] agent: Stopping HTTP server 127.0.0.1:20520 (tcp)
TestMembersCommand_statusFilter - 2019/12/06 06:39:25.470740 [INFO] agent: Waiting for endpoints to shut down
TestMembersCommand_statusFilter_failed - 2019/12/06 06:39:25.470739 [INFO] agent: Waiting for endpoints to shut down
TestMembersCommand_statusFilter_failed - 2019/12/06 06:39:25.470889 [INFO] agent: Endpoints down
--- PASS: TestMembersCommand_statusFilter_failed (2.81s)
TestMembersCommand_statusFilter - 2019/12/06 06:39:25.470889 [INFO] agent: Endpoints down
--- PASS: TestMembersCommand_statusFilter (2.81s)
PASS
ok  	github.com/hashicorp/consul/command/members	3.301s
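The members tests above issue GET /v1/agent/members?segment=_all (plus &wan=1 in the WAN case), per the request lines. The sketch below is illustrative only, assuming a local agent on 127.0.0.1:8500; the response is decoded generically because the exact field names are not shown in the log.

    // Illustrative sketch of the GET /v1/agent/members call filtered by the tests above.
    package main

    import (
        "encoding/json"
        "fmt"
        "net/http"
    )

    func main() {
        // Appending wan=1 switches to the WAN pool, as in TestMembersCommand_WAN.
        resp, err := http.Get("http://127.0.0.1:8500/v1/agent/members?segment=_all")
        if err != nil {
            panic(err)
        }
        defer resp.Body.Close()

        var members []map[string]interface{}
        if err := json.NewDecoder(resp.Body).Decode(&members); err != nil {
            panic(err)
        }
        for _, m := range members {
            fmt.Println(m["Name"], m["Status"]) // field names assumed, not taken from the log
        }
    }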
=== RUN   TestMonitorCommand_exitsOnSignalBeforeLinesArrive
=== PAUSE TestMonitorCommand_exitsOnSignalBeforeLinesArrive
=== CONT  TestMonitorCommand_exitsOnSignalBeforeLinesArrive
WARNING: bootstrap = true: do not enable unless necessary
TestMonitorCommand_exitsOnSignalBeforeLinesArrive - 2019/12/06 06:39:24.052178 [WARN] agent: Node name "Node 286cbede-46bb-4336-889e-9cac52008b06" will not be discoverable via DNS due to invalid characters. Valid characters include all alpha-numerics and dashes.
TestMonitorCommand_exitsOnSignalBeforeLinesArrive - 2019/12/06 06:39:24.053426 [DEBUG] tlsutil: Update with version 1
TestMonitorCommand_exitsOnSignalBeforeLinesArrive - 2019/12/06 06:39:24.061407 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
2019/12/06 06:39:25 [INFO]  raft: Initial configuration (index=1): [{Suffrage:Voter ID:286cbede-46bb-4336-889e-9cac52008b06 Address:127.0.0.1:23506}]
2019/12/06 06:39:25 [INFO]  raft: Node at 127.0.0.1:23506 [Follower] entering Follower state (Leader: "")
TestMonitorCommand_exitsOnSignalBeforeLinesArrive - 2019/12/06 06:39:25.243910 [INFO] serf: EventMemberJoin: Node 286cbede-46bb-4336-889e-9cac52008b06.dc1 127.0.0.1
TestMonitorCommand_exitsOnSignalBeforeLinesArrive - 2019/12/06 06:39:25.257898 [INFO] serf: EventMemberJoin: Node 286cbede-46bb-4336-889e-9cac52008b06 127.0.0.1
TestMonitorCommand_exitsOnSignalBeforeLinesArrive - 2019/12/06 06:39:25.262649 [INFO] agent: Started DNS server 127.0.0.1:23501 (udp)
TestMonitorCommand_exitsOnSignalBeforeLinesArrive - 2019/12/06 06:39:25.281408 [INFO] agent: Started DNS server 127.0.0.1:23501 (tcp)
2019/12/06 06:39:25 [WARN]  raft: Heartbeat timeout from "" reached, starting election
2019/12/06 06:39:25 [INFO]  raft: Node at 127.0.0.1:23506 [Candidate] entering Candidate state in term 2
TestMonitorCommand_exitsOnSignalBeforeLinesArrive - 2019/12/06 06:39:25.262952 [INFO] consul: Adding LAN server Node 286cbede-46bb-4336-889e-9cac52008b06 (Addr: tcp/127.0.0.1:23506) (DC: dc1)
TestMonitorCommand_exitsOnSignalBeforeLinesArrive - 2019/12/06 06:39:25.263366 [INFO] consul: Handled member-join event for server "Node 286cbede-46bb-4336-889e-9cac52008b06.dc1" in area "wan"
TestMonitorCommand_exitsOnSignalBeforeLinesArrive - 2019/12/06 06:39:25.285881 [INFO] agent: Started HTTP server on 127.0.0.1:23502 (tcp)
TestMonitorCommand_exitsOnSignalBeforeLinesArrive - 2019/12/06 06:39:25.286236 [INFO] agent: started state syncer
2019/12/06 06:39:25 [INFO]  raft: Election won. Tally: 1
2019/12/06 06:39:25 [INFO]  raft: Node at 127.0.0.1:23506 [Leader] entering Leader state
TestMonitorCommand_exitsOnSignalBeforeLinesArrive - 2019/12/06 06:39:25.822040 [INFO] consul: cluster leadership acquired
TestMonitorCommand_exitsOnSignalBeforeLinesArrive - 2019/12/06 06:39:25.824954 [INFO] consul: New leader elected: Node 286cbede-46bb-4336-889e-9cac52008b06
TestMonitorCommand_exitsOnSignalBeforeLinesArrive - 2019/12/06 06:39:26.337132 [INFO] agent: Synced node info
TestMonitorCommand_exitsOnSignalBeforeLinesArrive - 2019/12/06 06:39:26.337313 [DEBUG] agent: Node info in sync
TestMonitorCommand_exitsOnSignalBeforeLinesArrive - 2019/12/06 06:39:27.070944 [INFO] agent: Requesting shutdown
TestMonitorCommand_exitsOnSignalBeforeLinesArrive - 2019/12/06 06:39:27.071049 [INFO] consul: shutting down server
TestMonitorCommand_exitsOnSignalBeforeLinesArrive - 2019/12/06 06:39:27.071101 [WARN] serf: Shutdown without a Leave
TestMonitorCommand_exitsOnSignalBeforeLinesArrive - 2019/12/06 06:39:27.177937 [WARN] serf: Shutdown without a Leave
TestMonitorCommand_exitsOnSignalBeforeLinesArrive - 2019/12/06 06:39:27.237029 [INFO] connect: initialized primary datacenter CA with provider "consul"
TestMonitorCommand_exitsOnSignalBeforeLinesArrive - 2019/12/06 06:39:27.237552 [DEBUG] consul: Skipping self join check for "Node 286cbede-46bb-4336-889e-9cac52008b06" since the cluster is too small
TestMonitorCommand_exitsOnSignalBeforeLinesArrive - 2019/12/06 06:39:27.237720 [INFO] consul: member 'Node 286cbede-46bb-4336-889e-9cac52008b06' joined, marking health alive
TestMonitorCommand_exitsOnSignalBeforeLinesArrive - 2019/12/06 06:39:27.311168 [INFO] manager: shutting down
TestMonitorCommand_exitsOnSignalBeforeLinesArrive - 2019/12/06 06:39:27.444710 [INFO] agent: consul server down
TestMonitorCommand_exitsOnSignalBeforeLinesArrive - 2019/12/06 06:39:27.444801 [INFO] agent: shutdown complete
TestMonitorCommand_exitsOnSignalBeforeLinesArrive - 2019/12/06 06:39:27.444869 [INFO] agent: Stopping DNS server 127.0.0.1:23501 (tcp)
TestMonitorCommand_exitsOnSignalBeforeLinesArrive - 2019/12/06 06:39:27.445030 [INFO] agent: Stopping DNS server 127.0.0.1:23501 (udp)
TestMonitorCommand_exitsOnSignalBeforeLinesArrive - 2019/12/06 06:39:27.445200 [INFO] agent: Stopping HTTP server 127.0.0.1:23502 (tcp)
TestMonitorCommand_exitsOnSignalBeforeLinesArrive - 2019/12/06 06:39:27.445812 [ERR] consul: failed to reconcile member: {Node 286cbede-46bb-4336-889e-9cac52008b06 127.0.0.1 23504 map[acls:0 bootstrap:1 build:1.5.2: dc:dc1 id:286cbede-46bb-4336-889e-9cac52008b06 port:23506 raft_vsn:3 role:consul segment: vsn:2 vsn_max:3 vsn_min:2 wan_join_port:23505] alive 1 5 2 2 5 4}: leadership lost while committing log
TestMonitorCommand_exitsOnSignalBeforeLinesArrive - 2019/12/06 06:39:28.445645 [WARN] agent: Timeout stopping HTTP server 127.0.0.1:23502 (tcp)
TestMonitorCommand_exitsOnSignalBeforeLinesArrive - 2019/12/06 06:39:28.445819 [INFO] agent: Waiting for endpoints to shut down
TestMonitorCommand_exitsOnSignalBeforeLinesArrive - 2019/12/06 06:39:28.445872 [INFO] agent: Endpoints down
--- FAIL: TestMonitorCommand_exitsOnSignalBeforeLinesArrive (4.57s)
    monitor_test.go:70: timed out waiting for exit
FAIL
FAIL	github.com/hashicorp/consul/command/monitor	5.021s
=== RUN   TestOperatorCommand_noTabs
=== PAUSE TestOperatorCommand_noTabs
=== CONT  TestOperatorCommand_noTabs
--- PASS: TestOperatorCommand_noTabs (0.00s)
PASS
ok  	github.com/hashicorp/consul/command/operator	0.037s
=== RUN   TestOperatorAutopilotCommand_noTabs
=== PAUSE TestOperatorAutopilotCommand_noTabs
=== CONT  TestOperatorAutopilotCommand_noTabs
--- PASS: TestOperatorAutopilotCommand_noTabs (0.00s)
PASS
ok  	github.com/hashicorp/consul/command/operator/autopilot	0.150s
=== RUN   TestOperatorAutopilotGetConfigCommand_noTabs
=== PAUSE TestOperatorAutopilotGetConfigCommand_noTabs
=== RUN   TestOperatorAutopilotGetConfigCommand
=== PAUSE TestOperatorAutopilotGetConfigCommand
=== CONT  TestOperatorAutopilotGetConfigCommand_noTabs
--- PASS: TestOperatorAutopilotGetConfigCommand_noTabs (0.00s)
=== CONT  TestOperatorAutopilotGetConfigCommand
WARNING: bootstrap = true: do not enable unless necessary
TestOperatorAutopilotGetConfigCommand - 2019/12/06 06:39:44.088945 [WARN] agent: Node name "Node 5c9353ff-2366-1d5b-b513-a48bcbe3eb24" will not be discoverable via DNS due to invalid characters. Valid characters include all alpha-numerics and dashes.
TestOperatorAutopilotGetConfigCommand - 2019/12/06 06:39:44.090055 [DEBUG] tlsutil: Update with version 1
TestOperatorAutopilotGetConfigCommand - 2019/12/06 06:39:44.097026 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
2019/12/06 06:39:44 [INFO]  raft: Initial configuration (index=1): [{Suffrage:Voter ID:5c9353ff-2366-1d5b-b513-a48bcbe3eb24 Address:127.0.0.1:40006}]
2019/12/06 06:39:44 [INFO]  raft: Node at 127.0.0.1:40006 [Follower] entering Follower state (Leader: "")
TestOperatorAutopilotGetConfigCommand - 2019/12/06 06:39:44.949918 [INFO] serf: EventMemberJoin: Node 5c9353ff-2366-1d5b-b513-a48bcbe3eb24.dc1 127.0.0.1
TestOperatorAutopilotGetConfigCommand - 2019/12/06 06:39:44.953148 [INFO] serf: EventMemberJoin: Node 5c9353ff-2366-1d5b-b513-a48bcbe3eb24 127.0.0.1
TestOperatorAutopilotGetConfigCommand - 2019/12/06 06:39:44.955407 [INFO] agent: Started DNS server 127.0.0.1:40001 (udp)
TestOperatorAutopilotGetConfigCommand - 2019/12/06 06:39:44.955574 [INFO] consul: Handled member-join event for server "Node 5c9353ff-2366-1d5b-b513-a48bcbe3eb24.dc1" in area "wan"
TestOperatorAutopilotGetConfigCommand - 2019/12/06 06:39:44.955875 [INFO] agent: Started DNS server 127.0.0.1:40001 (tcp)
TestOperatorAutopilotGetConfigCommand - 2019/12/06 06:39:44.956124 [INFO] consul: Adding LAN server Node 5c9353ff-2366-1d5b-b513-a48bcbe3eb24 (Addr: tcp/127.0.0.1:40006) (DC: dc1)
TestOperatorAutopilotGetConfigCommand - 2019/12/06 06:39:44.958738 [INFO] agent: Started HTTP server on 127.0.0.1:40002 (tcp)
TestOperatorAutopilotGetConfigCommand - 2019/12/06 06:39:44.958944 [INFO] agent: started state syncer
2019/12/06 06:39:45 [WARN]  raft: Heartbeat timeout from "" reached, starting election
2019/12/06 06:39:45 [INFO]  raft: Node at 127.0.0.1:40006 [Candidate] entering Candidate state in term 2
2019/12/06 06:39:45 [INFO]  raft: Election won. Tally: 1
2019/12/06 06:39:45 [INFO]  raft: Node at 127.0.0.1:40006 [Leader] entering Leader state
TestOperatorAutopilotGetConfigCommand - 2019/12/06 06:39:45.453728 [INFO] consul: cluster leadership acquired
TestOperatorAutopilotGetConfigCommand - 2019/12/06 06:39:45.454259 [INFO] consul: New leader elected: Node 5c9353ff-2366-1d5b-b513-a48bcbe3eb24
TestOperatorAutopilotGetConfigCommand - 2019/12/06 06:39:45.762411 [INFO] agent: Synced node info
TestOperatorAutopilotGetConfigCommand - 2019/12/06 06:39:45.762543 [DEBUG] agent: Node info in sync
TestOperatorAutopilotGetConfigCommand - 2019/12/06 06:39:45.791042 [DEBUG] agent: Node info in sync
TestOperatorAutopilotGetConfigCommand - 2019/12/06 06:39:46.628939 [INFO] connect: initialized primary datacenter CA with provider "consul"
TestOperatorAutopilotGetConfigCommand - 2019/12/06 06:39:46.629578 [DEBUG] consul: Skipping self join check for "Node 5c9353ff-2366-1d5b-b513-a48bcbe3eb24" since the cluster is too small
TestOperatorAutopilotGetConfigCommand - 2019/12/06 06:39:46.629799 [INFO] consul: member 'Node 5c9353ff-2366-1d5b-b513-a48bcbe3eb24' joined, marking health alive
TestOperatorAutopilotGetConfigCommand - 2019/12/06 06:39:46.833388 [DEBUG] http: Request GET /v1/operator/autopilot/configuration (6.974496ms) from=127.0.0.1:57324
TestOperatorAutopilotGetConfigCommand - 2019/12/06 06:39:46.844429 [INFO] agent: Requesting shutdown
TestOperatorAutopilotGetConfigCommand - 2019/12/06 06:39:46.844553 [INFO] consul: shutting down server
TestOperatorAutopilotGetConfigCommand - 2019/12/06 06:39:46.844650 [WARN] serf: Shutdown without a Leave
TestOperatorAutopilotGetConfigCommand - 2019/12/06 06:39:46.919655 [WARN] serf: Shutdown without a Leave
TestOperatorAutopilotGetConfigCommand - 2019/12/06 06:39:46.978114 [INFO] manager: shutting down
TestOperatorAutopilotGetConfigCommand - 2019/12/06 06:39:46.978575 [INFO] agent: consul server down
TestOperatorAutopilotGetConfigCommand - 2019/12/06 06:39:46.978623 [INFO] agent: shutdown complete
TestOperatorAutopilotGetConfigCommand - 2019/12/06 06:39:46.978674 [INFO] agent: Stopping DNS server 127.0.0.1:40001 (tcp)
TestOperatorAutopilotGetConfigCommand - 2019/12/06 06:39:46.978802 [INFO] agent: Stopping DNS server 127.0.0.1:40001 (udp)
TestOperatorAutopilotGetConfigCommand - 2019/12/06 06:39:46.978950 [INFO] agent: Stopping HTTP server 127.0.0.1:40002 (tcp)
TestOperatorAutopilotGetConfigCommand - 2019/12/06 06:39:46.979555 [INFO] agent: Waiting for endpoints to shut down
TestOperatorAutopilotGetConfigCommand - 2019/12/06 06:39:46.979613 [INFO] agent: Endpoints down
--- PASS: TestOperatorAutopilotGetConfigCommand (2.96s)
PASS
ok  	github.com/hashicorp/consul/command/operator/autopilot/get	3.229s
=== RUN   TestOperatorAutopilotSetConfigCommand_noTabs
=== PAUSE TestOperatorAutopilotSetConfigCommand_noTabs
=== RUN   TestOperatorAutopilotSetConfigCommand
=== PAUSE TestOperatorAutopilotSetConfigCommand
=== CONT  TestOperatorAutopilotSetConfigCommand_noTabs
=== CONT  TestOperatorAutopilotSetConfigCommand
--- PASS: TestOperatorAutopilotSetConfigCommand_noTabs (0.01s)
WARNING: bootstrap = true: do not enable unless necessary
TestOperatorAutopilotSetConfigCommand - 2019/12/06 06:39:47.036557 [WARN] agent: Node name "Node 2a0b825d-395e-5ad3-76e7-6c677310034a" will not be discoverable via DNS due to invalid characters. Valid characters include all alpha-numerics and dashes.
TestOperatorAutopilotSetConfigCommand - 2019/12/06 06:39:47.037475 [DEBUG] tlsutil: Update with version 1
TestOperatorAutopilotSetConfigCommand - 2019/12/06 06:39:47.058018 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
2019/12/06 06:39:47 [INFO]  raft: Initial configuration (index=1): [{Suffrage:Voter ID:2a0b825d-395e-5ad3-76e7-6c677310034a Address:127.0.0.1:43006}]
2019/12/06 06:39:47 [INFO]  raft: Node at 127.0.0.1:43006 [Follower] entering Follower state (Leader: "")
TestOperatorAutopilotSetConfigCommand - 2019/12/06 06:39:47.916595 [INFO] serf: EventMemberJoin: Node 2a0b825d-395e-5ad3-76e7-6c677310034a.dc1 127.0.0.1
TestOperatorAutopilotSetConfigCommand - 2019/12/06 06:39:47.920370 [INFO] serf: EventMemberJoin: Node 2a0b825d-395e-5ad3-76e7-6c677310034a 127.0.0.1
TestOperatorAutopilotSetConfigCommand - 2019/12/06 06:39:47.921884 [INFO] consul: Handled member-join event for server "Node 2a0b825d-395e-5ad3-76e7-6c677310034a.dc1" in area "wan"
TestOperatorAutopilotSetConfigCommand - 2019/12/06 06:39:47.922393 [INFO] consul: Adding LAN server Node 2a0b825d-395e-5ad3-76e7-6c677310034a (Addr: tcp/127.0.0.1:43006) (DC: dc1)
TestOperatorAutopilotSetConfigCommand - 2019/12/06 06:39:47.922874 [INFO] agent: Started DNS server 127.0.0.1:43001 (udp)
TestOperatorAutopilotSetConfigCommand - 2019/12/06 06:39:47.923091 [INFO] agent: Started DNS server 127.0.0.1:43001 (tcp)
TestOperatorAutopilotSetConfigCommand - 2019/12/06 06:39:47.926175 [INFO] agent: Started HTTP server on 127.0.0.1:43002 (tcp)
TestOperatorAutopilotSetConfigCommand - 2019/12/06 06:39:47.926371 [INFO] agent: started state syncer
2019/12/06 06:39:47 [WARN]  raft: Heartbeat timeout from "" reached, starting election
2019/12/06 06:39:47 [INFO]  raft: Node at 127.0.0.1:43006 [Candidate] entering Candidate state in term 2
2019/12/06 06:39:48 [INFO]  raft: Election won. Tally: 1
2019/12/06 06:39:48 [INFO]  raft: Node at 127.0.0.1:43006 [Leader] entering Leader state
TestOperatorAutopilotSetConfigCommand - 2019/12/06 06:39:48.428482 [INFO] consul: cluster leadership acquired
TestOperatorAutopilotSetConfigCommand - 2019/12/06 06:39:48.429058 [INFO] consul: New leader elected: Node 2a0b825d-395e-5ad3-76e7-6c677310034a
TestOperatorAutopilotSetConfigCommand - 2019/12/06 06:39:48.787690 [INFO] agent: Synced node info
TestOperatorAutopilotSetConfigCommand - 2019/12/06 06:39:48.787843 [DEBUG] agent: Node info in sync
TestOperatorAutopilotSetConfigCommand - 2019/12/06 06:39:48.811056 [DEBUG] agent: Node info in sync
TestOperatorAutopilotSetConfigCommand - 2019/12/06 06:39:49.637040 [INFO] connect: initialized primary datacenter CA with provider "consul"
TestOperatorAutopilotSetConfigCommand - 2019/12/06 06:39:49.637510 [DEBUG] consul: Skipping self join check for "Node 2a0b825d-395e-5ad3-76e7-6c677310034a" since the cluster is too small
TestOperatorAutopilotSetConfigCommand - 2019/12/06 06:39:49.637665 [INFO] consul: member 'Node 2a0b825d-395e-5ad3-76e7-6c677310034a' joined, marking health alive
TestOperatorAutopilotSetConfigCommand - 2019/12/06 06:39:49.830241 [DEBUG] http: Request GET /v1/operator/autopilot/configuration (4.42777ms) from=127.0.0.1:60642
TestOperatorAutopilotSetConfigCommand - 2019/12/06 06:39:50.029387 [DEBUG] http: Request PUT /v1/operator/autopilot/configuration?cas=5 (189.818119ms) from=127.0.0.1:60642
TestOperatorAutopilotSetConfigCommand - 2019/12/06 06:39:50.030779 [INFO] agent: Requesting shutdown
TestOperatorAutopilotSetConfigCommand - 2019/12/06 06:39:50.030867 [INFO] consul: shutting down server
TestOperatorAutopilotSetConfigCommand - 2019/12/06 06:39:50.030931 [WARN] serf: Shutdown without a Leave
TestOperatorAutopilotSetConfigCommand - 2019/12/06 06:39:50.094776 [WARN] serf: Shutdown without a Leave
TestOperatorAutopilotSetConfigCommand - 2019/12/06 06:39:50.153176 [INFO] manager: shutting down
TestOperatorAutopilotSetConfigCommand - 2019/12/06 06:39:50.153605 [INFO] agent: consul server down
TestOperatorAutopilotSetConfigCommand - 2019/12/06 06:39:50.153663 [INFO] agent: shutdown complete
TestOperatorAutopilotSetConfigCommand - 2019/12/06 06:39:50.153720 [INFO] agent: Stopping DNS server 127.0.0.1:43001 (tcp)
TestOperatorAutopilotSetConfigCommand - 2019/12/06 06:39:50.153860 [INFO] agent: Stopping DNS server 127.0.0.1:43001 (udp)
TestOperatorAutopilotSetConfigCommand - 2019/12/06 06:39:50.154019 [INFO] agent: Stopping HTTP server 127.0.0.1:43002 (tcp)
TestOperatorAutopilotSetConfigCommand - 2019/12/06 06:39:50.154632 [INFO] agent: Waiting for endpoints to shut down
TestOperatorAutopilotSetConfigCommand - 2019/12/06 06:39:50.154730 [INFO] agent: Endpoints down
--- PASS: TestOperatorAutopilotSetConfigCommand (3.26s)
PASS
ok  	github.com/hashicorp/consul/command/operator/autopilot/set	3.619s
=== RUN   TestOperatorRaftCommand_noTabs
=== PAUSE TestOperatorRaftCommand_noTabs
=== CONT  TestOperatorRaftCommand_noTabs
--- PASS: TestOperatorRaftCommand_noTabs (0.00s)
PASS
ok  	github.com/hashicorp/consul/command/operator/raft	0.102s
=== RUN   TestOperatorRaftListPeersCommand_noTabs
=== PAUSE TestOperatorRaftListPeersCommand_noTabs
=== RUN   TestOperatorRaftListPeersCommand
=== PAUSE TestOperatorRaftListPeersCommand
=== CONT  TestOperatorRaftListPeersCommand_noTabs
=== CONT  TestOperatorRaftListPeersCommand
--- PASS: TestOperatorRaftListPeersCommand_noTabs (0.01s)
WARNING: bootstrap = true: do not enable unless necessary
TestOperatorRaftListPeersCommand - 2019/12/06 06:40:16.431651 [WARN] agent: Node name "Node ffe43c7f-3dcb-bd48-125d-304c3ded3040" will not be discoverable via DNS due to invalid characters. Valid characters include all alpha-numerics and dashes.
TestOperatorRaftListPeersCommand - 2019/12/06 06:40:16.432636 [DEBUG] tlsutil: Update with version 1
TestOperatorRaftListPeersCommand - 2019/12/06 06:40:16.445050 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
2019/12/06 06:40:17 [INFO]  raft: Initial configuration (index=1): [{Suffrage:Voter ID:ffe43c7f-3dcb-bd48-125d-304c3ded3040 Address:127.0.0.1:37006}]
2019/12/06 06:40:17 [INFO]  raft: Node at 127.0.0.1:37006 [Follower] entering Follower state (Leader: "")
TestOperatorRaftListPeersCommand - 2019/12/06 06:40:17.283939 [INFO] serf: EventMemberJoin: Node ffe43c7f-3dcb-bd48-125d-304c3ded3040.dc1 127.0.0.1
TestOperatorRaftListPeersCommand - 2019/12/06 06:40:17.291951 [INFO] serf: EventMemberJoin: Node ffe43c7f-3dcb-bd48-125d-304c3ded3040 127.0.0.1
TestOperatorRaftListPeersCommand - 2019/12/06 06:40:17.293028 [INFO] consul: Adding LAN server Node ffe43c7f-3dcb-bd48-125d-304c3ded3040 (Addr: tcp/127.0.0.1:37006) (DC: dc1)
TestOperatorRaftListPeersCommand - 2019/12/06 06:40:17.293184 [INFO] consul: Handled member-join event for server "Node ffe43c7f-3dcb-bd48-125d-304c3ded3040.dc1" in area "wan"
TestOperatorRaftListPeersCommand - 2019/12/06 06:40:17.293787 [INFO] agent: Started DNS server 127.0.0.1:37001 (tcp)
TestOperatorRaftListPeersCommand - 2019/12/06 06:40:17.294300 [INFO] agent: Started DNS server 127.0.0.1:37001 (udp)
TestOperatorRaftListPeersCommand - 2019/12/06 06:40:17.297388 [INFO] agent: Started HTTP server on 127.0.0.1:37002 (tcp)
TestOperatorRaftListPeersCommand - 2019/12/06 06:40:17.297536 [INFO] agent: started state syncer
2019/12/06 06:40:17 [WARN]  raft: Heartbeat timeout from "" reached, starting election
2019/12/06 06:40:17 [INFO]  raft: Node at 127.0.0.1:37006 [Candidate] entering Candidate state in term 2
2019/12/06 06:40:17 [INFO]  raft: Election won. Tally: 1
2019/12/06 06:40:17 [INFO]  raft: Node at 127.0.0.1:37006 [Leader] entering Leader state
TestOperatorRaftListPeersCommand - 2019/12/06 06:40:17.796610 [INFO] consul: cluster leadership acquired
TestOperatorRaftListPeersCommand - 2019/12/06 06:40:17.797193 [INFO] consul: New leader elected: Node ffe43c7f-3dcb-bd48-125d-304c3ded3040
TestOperatorRaftListPeersCommand - 2019/12/06 06:40:18.206977 [DEBUG] http: Request GET /v1/operator/raft/configuration (65.503203ms) from=127.0.0.1:33996
TestOperatorRaftListPeersCommand - 2019/12/06 06:40:18.226148 [INFO] agent: Requesting shutdown
TestOperatorRaftListPeersCommand - 2019/12/06 06:40:18.226248 [INFO] consul: shutting down server
TestOperatorRaftListPeersCommand - 2019/12/06 06:40:18.226308 [WARN] serf: Shutdown without a Leave
TestOperatorRaftListPeersCommand - 2019/12/06 06:40:18.361954 [WARN] serf: Shutdown without a Leave
TestOperatorRaftListPeersCommand - 2019/12/06 06:40:18.421324 [INFO] agent: Synced node info
TestOperatorRaftListPeersCommand - 2019/12/06 06:40:18.421453 [DEBUG] agent: Node info in sync
TestOperatorRaftListPeersCommand - 2019/12/06 06:40:18.495259 [INFO] manager: shutting down
TestOperatorRaftListPeersCommand - 2019/12/06 06:40:18.628663 [ERR] autopilot: failed to initialize config: leadership lost while committing log
TestOperatorRaftListPeersCommand - 2019/12/06 06:40:18.628935 [INFO] agent: consul server down
TestOperatorRaftListPeersCommand - 2019/12/06 06:40:18.628986 [INFO] agent: shutdown complete
TestOperatorRaftListPeersCommand - 2019/12/06 06:40:18.629040 [INFO] agent: Stopping DNS server 127.0.0.1:37001 (tcp)
TestOperatorRaftListPeersCommand - 2019/12/06 06:40:18.629165 [INFO] agent: Stopping DNS server 127.0.0.1:37001 (udp)
TestOperatorRaftListPeersCommand - 2019/12/06 06:40:18.629288 [ERR] consul: failed to establish leadership: raft is already shutdown
TestOperatorRaftListPeersCommand - 2019/12/06 06:40:18.629581 [INFO] agent: Stopping HTTP server 127.0.0.1:37002 (tcp)
TestOperatorRaftListPeersCommand - 2019/12/06 06:40:18.630181 [INFO] agent: Waiting for endpoints to shut down
TestOperatorRaftListPeersCommand - 2019/12/06 06:40:18.630343 [ERR] consul: failed to transfer leadership attempt 0/3: raft is already shutdown
TestOperatorRaftListPeersCommand - 2019/12/06 06:40:18.630406 [ERR] consul: failed to transfer leadership attempt 1/3: raft is already shutdown
TestOperatorRaftListPeersCommand - 2019/12/06 06:40:18.630429 [INFO] agent: Endpoints down
--- PASS: TestOperatorRaftListPeersCommand (2.35s)
PASS
ok  	github.com/hashicorp/consul/command/operator/raft/listpeers	2.724s
=== RUN   TestOperatorRaftRemovePeerCommand_noTabs
=== PAUSE TestOperatorRaftRemovePeerCommand_noTabs
=== RUN   TestOperatorRaftRemovePeerCommand
=== PAUSE TestOperatorRaftRemovePeerCommand
=== CONT  TestOperatorRaftRemovePeerCommand_noTabs
=== CONT  TestOperatorRaftRemovePeerCommand
--- PASS: TestOperatorRaftRemovePeerCommand_noTabs (0.00s)
WARNING: bootstrap = true: do not enable unless necessary
TestOperatorRaftRemovePeerCommand - 2019/12/06 06:40:24.540989 [WARN] agent: Node name "Node c27b2e3f-e1ed-219d-a4e3-1d7c208a33e5" will not be discoverable via DNS due to invalid characters. Valid characters include all alpha-numerics and dashes.
TestOperatorRaftRemovePeerCommand - 2019/12/06 06:40:24.542150 [DEBUG] tlsutil: Update with version 1
TestOperatorRaftRemovePeerCommand - 2019/12/06 06:40:24.549760 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
2019/12/06 06:40:25 [INFO]  raft: Initial configuration (index=1): [{Suffrage:Voter ID:c27b2e3f-e1ed-219d-a4e3-1d7c208a33e5 Address:127.0.0.1:13006}]
2019/12/06 06:40:25 [INFO]  raft: Node at 127.0.0.1:13006 [Follower] entering Follower state (Leader: "")
TestOperatorRaftRemovePeerCommand - 2019/12/06 06:40:25.725592 [INFO] serf: EventMemberJoin: Node c27b2e3f-e1ed-219d-a4e3-1d7c208a33e5.dc1 127.0.0.1
TestOperatorRaftRemovePeerCommand - 2019/12/06 06:40:25.730323 [INFO] serf: EventMemberJoin: Node c27b2e3f-e1ed-219d-a4e3-1d7c208a33e5 127.0.0.1
TestOperatorRaftRemovePeerCommand - 2019/12/06 06:40:25.731487 [INFO] consul: Adding LAN server Node c27b2e3f-e1ed-219d-a4e3-1d7c208a33e5 (Addr: tcp/127.0.0.1:13006) (DC: dc1)
TestOperatorRaftRemovePeerCommand - 2019/12/06 06:40:25.732272 [INFO] consul: Handled member-join event for server "Node c27b2e3f-e1ed-219d-a4e3-1d7c208a33e5.dc1" in area "wan"
TestOperatorRaftRemovePeerCommand - 2019/12/06 06:40:25.733817 [INFO] agent: Started DNS server 127.0.0.1:13001 (tcp)
TestOperatorRaftRemovePeerCommand - 2019/12/06 06:40:25.734242 [INFO] agent: Started DNS server 127.0.0.1:13001 (udp)
TestOperatorRaftRemovePeerCommand - 2019/12/06 06:40:25.737188 [INFO] agent: Started HTTP server on 127.0.0.1:13002 (tcp)
TestOperatorRaftRemovePeerCommand - 2019/12/06 06:40:25.737365 [INFO] agent: started state syncer
2019/12/06 06:40:25 [WARN]  raft: Heartbeat timeout from "" reached, starting election
2019/12/06 06:40:25 [INFO]  raft: Node at 127.0.0.1:13006 [Candidate] entering Candidate state in term 2
2019/12/06 06:40:26 [INFO]  raft: Election won. Tally: 1
2019/12/06 06:40:26 [INFO]  raft: Node at 127.0.0.1:13006 [Leader] entering Leader state
TestOperatorRaftRemovePeerCommand - 2019/12/06 06:40:26.421999 [INFO] consul: cluster leadership acquired
TestOperatorRaftRemovePeerCommand - 2019/12/06 06:40:26.422539 [INFO] consul: New leader elected: Node c27b2e3f-e1ed-219d-a4e3-1d7c208a33e5
=== RUN   TestOperatorRaftRemovePeerCommand/Test_the_remove-peer_subcommand_directly
TestOperatorRaftRemovePeerCommand - 2019/12/06 06:40:26.720848 [ERR] http: Request DELETE /v1/operator/raft/peer?address=nope, error: address "nope" was not found in the Raft configuration from=127.0.0.1:59150
TestOperatorRaftRemovePeerCommand - 2019/12/06 06:40:26.739434 [DEBUG] http: Request DELETE /v1/operator/raft/peer?address=nope (209.554582ms) from=127.0.0.1:59150
=== RUN   TestOperatorRaftRemovePeerCommand/Test_the_remove-peer_subcommand_with_-id
TestOperatorRaftRemovePeerCommand - 2019/12/06 06:40:26.862544 [ERR] http: Request DELETE /v1/operator/raft/peer?id=nope, error: id "nope" was not found in the Raft configuration from=127.0.0.1:59152
TestOperatorRaftRemovePeerCommand - 2019/12/06 06:40:26.863183 [INFO] agent: Synced node info
TestOperatorRaftRemovePeerCommand - 2019/12/06 06:40:26.863692 [DEBUG] http: Request DELETE /v1/operator/raft/peer?id=nope (115.728715ms) from=127.0.0.1:59152
TestOperatorRaftRemovePeerCommand - 2019/12/06 06:40:26.865882 [INFO] agent: Requesting shutdown
TestOperatorRaftRemovePeerCommand - 2019/12/06 06:40:26.865976 [INFO] consul: shutting down server
TestOperatorRaftRemovePeerCommand - 2019/12/06 06:40:26.866021 [WARN] serf: Shutdown without a Leave
TestOperatorRaftRemovePeerCommand - 2019/12/06 06:40:27.020539 [WARN] serf: Shutdown without a Leave
TestOperatorRaftRemovePeerCommand - 2019/12/06 06:40:27.178874 [INFO] manager: shutting down
TestOperatorRaftRemovePeerCommand - 2019/12/06 06:40:27.383693 [ERR] consul: failed to establish leadership: leadership lost while committing log
TestOperatorRaftRemovePeerCommand - 2019/12/06 06:40:27.384018 [INFO] agent: consul server down
TestOperatorRaftRemovePeerCommand - 2019/12/06 06:40:27.384074 [INFO] agent: shutdown complete
TestOperatorRaftRemovePeerCommand - 2019/12/06 06:40:27.384144 [INFO] agent: Stopping DNS server 127.0.0.1:13001 (tcp)
TestOperatorRaftRemovePeerCommand - 2019/12/06 06:40:27.385186 [INFO] agent: Stopping DNS server 127.0.0.1:13001 (udp)
TestOperatorRaftRemovePeerCommand - 2019/12/06 06:40:27.385378 [INFO] agent: Stopping HTTP server 127.0.0.1:13002 (tcp)
TestOperatorRaftRemovePeerCommand - 2019/12/06 06:40:27.386176 [INFO] agent: Waiting for endpoints to shut down
TestOperatorRaftRemovePeerCommand - 2019/12/06 06:40:27.386276 [INFO] agent: Endpoints down
--- PASS: TestOperatorRaftRemovePeerCommand (3.00s)
    --- PASS: TestOperatorRaftRemovePeerCommand/Test_the_remove-peer_subcommand_directly (0.22s)
    --- PASS: TestOperatorRaftRemovePeerCommand/Test_the_remove-peer_subcommand_with_-id (0.12s)
PASS
ok  	github.com/hashicorp/consul/command/operator/raft/removepeer	3.380s
=== RUN   TestReloadCommand_noTabs
=== PAUSE TestReloadCommand_noTabs
=== RUN   TestReloadCommand
=== PAUSE TestReloadCommand
=== CONT  TestReloadCommand_noTabs
=== CONT  TestReloadCommand
--- PASS: TestReloadCommand_noTabs (0.01s)
WARNING: bootstrap = true: do not enable unless necessary
TestReloadCommand - 2019/12/06 06:40:39.326993 [WARN] agent: Node name "Node c1e7c956-00e2-ec01-545a-9a5529277df6" will not be discoverable via DNS due to invalid characters. Valid characters include all alpha-numerics and dashes.
TestReloadCommand - 2019/12/06 06:40:39.327890 [DEBUG] tlsutil: Update with version 1
TestReloadCommand - 2019/12/06 06:40:39.334387 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
2019/12/06 06:40:40 [INFO]  raft: Initial configuration (index=1): [{Suffrage:Voter ID:c1e7c956-00e2-ec01-545a-9a5529277df6 Address:127.0.0.1:52006}]
2019/12/06 06:40:40 [INFO]  raft: Node at 127.0.0.1:52006 [Follower] entering Follower state (Leader: "")
TestReloadCommand - 2019/12/06 06:40:40.391994 [INFO] serf: EventMemberJoin: Node c1e7c956-00e2-ec01-545a-9a5529277df6.dc1 127.0.0.1
TestReloadCommand - 2019/12/06 06:40:40.395923 [INFO] serf: EventMemberJoin: Node c1e7c956-00e2-ec01-545a-9a5529277df6 127.0.0.1
TestReloadCommand - 2019/12/06 06:40:40.396986 [INFO] consul: Handled member-join event for server "Node c1e7c956-00e2-ec01-545a-9a5529277df6.dc1" in area "wan"
TestReloadCommand - 2019/12/06 06:40:40.397077 [INFO] consul: Adding LAN server Node c1e7c956-00e2-ec01-545a-9a5529277df6 (Addr: tcp/127.0.0.1:52006) (DC: dc1)
TestReloadCommand - 2019/12/06 06:40:40.410740 [INFO] agent: Started DNS server 127.0.0.1:52001 (udp)
TestReloadCommand - 2019/12/06 06:40:40.411108 [INFO] agent: Started DNS server 127.0.0.1:52001 (tcp)
TestReloadCommand - 2019/12/06 06:40:40.413931 [INFO] agent: Started HTTP server on 127.0.0.1:52002 (tcp)
TestReloadCommand - 2019/12/06 06:40:40.414108 [INFO] agent: started state syncer
2019/12/06 06:40:40 [WARN]  raft: Heartbeat timeout from "" reached, starting election
2019/12/06 06:40:40 [INFO]  raft: Node at 127.0.0.1:52006 [Candidate] entering Candidate state in term 2
2019/12/06 06:40:40 [INFO]  raft: Election won. Tally: 1
2019/12/06 06:40:40 [INFO]  raft: Node at 127.0.0.1:52006 [Leader] entering Leader state
TestReloadCommand - 2019/12/06 06:40:40.904419 [INFO] consul: cluster leadership acquired
TestReloadCommand - 2019/12/06 06:40:40.904935 [INFO] consul: New leader elected: Node c1e7c956-00e2-ec01-545a-9a5529277df6
TestReloadCommand - 2019/12/06 06:40:41.229606 [DEBUG] http: Request PUT /v1/agent/reload (56.335µs) from=127.0.0.1:37372
TestReloadCommand - 2019/12/06 06:40:41.230244 [INFO] agent: Requesting shutdown
TestReloadCommand - 2019/12/06 06:40:41.230317 [INFO] consul: shutting down server
TestReloadCommand - 2019/12/06 06:40:41.230361 [WARN] serf: Shutdown without a Leave
TestReloadCommand - 2019/12/06 06:40:41.335645 [INFO] agent: Synced node info
TestReloadCommand - 2019/12/06 06:40:41.488701 [WARN] serf: Shutdown without a Leave
TestReloadCommand - 2019/12/06 06:40:41.595590 [INFO] manager: shutting down
TestReloadCommand - 2019/12/06 06:40:41.662336 [ERR] autopilot: failed to initialize config: leadership lost while committing log
TestReloadCommand - 2019/12/06 06:40:41.662553 [INFO] agent: consul server down
TestReloadCommand - 2019/12/06 06:40:41.662605 [INFO] agent: shutdown complete
TestReloadCommand - 2019/12/06 06:40:41.662656 [INFO] agent: Stopping DNS server 127.0.0.1:52001 (tcp)
TestReloadCommand - 2019/12/06 06:40:41.662789 [INFO] agent: Stopping DNS server 127.0.0.1:52001 (udp)
TestReloadCommand - 2019/12/06 06:40:41.662881 [ERR] consul: failed to establish leadership: raft is already shutdown
TestReloadCommand - 2019/12/06 06:40:41.662930 [INFO] agent: Stopping HTTP server 127.0.0.1:52002 (tcp)
TestReloadCommand - 2019/12/06 06:40:41.663452 [INFO] agent: Waiting for endpoints to shut down
TestReloadCommand - 2019/12/06 06:40:41.663543 [INFO] agent: Endpoints down
--- PASS: TestReloadCommand (2.41s)
PASS
ok  	github.com/hashicorp/consul/command/reload	2.874s
=== RUN   TestRTTCommand_noTabs
=== PAUSE TestRTTCommand_noTabs
=== RUN   TestRTTCommand_BadArgs
=== PAUSE TestRTTCommand_BadArgs
=== RUN   TestRTTCommand_LAN
=== PAUSE TestRTTCommand_LAN
=== RUN   TestRTTCommand_WAN
=== PAUSE TestRTTCommand_WAN
=== CONT  TestRTTCommand_noTabs
=== CONT  TestRTTCommand_WAN
=== CONT  TestRTTCommand_LAN
--- PASS: TestRTTCommand_noTabs (0.02s)
=== CONT  TestRTTCommand_BadArgs
=== RUN   TestRTTCommand_BadArgs/#00
=== RUN   TestRTTCommand_BadArgs/node1_node2_node3
=== RUN   TestRTTCommand_BadArgs/-wan_node1_node2
=== RUN   TestRTTCommand_BadArgs/-wan_node1.dc1_node2
=== RUN   TestRTTCommand_BadArgs/-wan_node1_node2.dc1
--- PASS: TestRTTCommand_BadArgs (0.03s)
    --- PASS: TestRTTCommand_BadArgs/#00 (0.01s)
    --- PASS: TestRTTCommand_BadArgs/node1_node2_node3 (0.01s)
    --- PASS: TestRTTCommand_BadArgs/-wan_node1_node2 (0.00s)
    --- PASS: TestRTTCommand_BadArgs/-wan_node1.dc1_node2 (0.00s)
    --- PASS: TestRTTCommand_BadArgs/-wan_node1_node2.dc1 (0.00s)
WARNING: bootstrap = true: do not enable unless necessary
TestRTTCommand_LAN - 2019/12/06 06:40:42.725363 [WARN] agent: Node name "Node 1a47dc71-7e66-516b-6066-8d2770d87285" will not be discoverable via DNS due to invalid characters. Valid characters include all alpha-numerics and dashes.
TestRTTCommand_LAN - 2019/12/06 06:40:42.726803 [DEBUG] tlsutil: Update with version 1
WARNING: bootstrap = true: do not enable unless necessary
TestRTTCommand_LAN - 2019/12/06 06:40:42.744706 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
TestRTTCommand_WAN - 2019/12/06 06:40:42.744731 [WARN] agent: Node name "Node 870333f7-45d0-28ef-ca4b-ab0277714beb" will not be discoverable via DNS due to invalid characters. Valid characters include all alpha-numerics and dashes.
TestRTTCommand_WAN - 2019/12/06 06:40:42.745132 [DEBUG] tlsutil: Update with version 1
TestRTTCommand_WAN - 2019/12/06 06:40:42.752048 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
2019/12/06 06:40:44 [INFO]  raft: Initial configuration (index=1): [{Suffrage:Voter ID:870333f7-45d0-28ef-ca4b-ab0277714beb Address:127.0.0.1:17506}]
2019/12/06 06:40:44 [INFO]  raft: Node at 127.0.0.1:17506 [Follower] entering Follower state (Leader: "")
2019/12/06 06:40:44 [INFO]  raft: Initial configuration (index=1): [{Suffrage:Voter ID:1a47dc71-7e66-516b-6066-8d2770d87285 Address:127.0.0.1:17512}]
2019/12/06 06:40:44 [INFO]  raft: Node at 127.0.0.1:17512 [Follower] entering Follower state (Leader: "")
TestRTTCommand_LAN - 2019/12/06 06:40:44.328296 [INFO] serf: EventMemberJoin: Node 1a47dc71-7e66-516b-6066-8d2770d87285.dc1 127.0.0.1
TestRTTCommand_WAN - 2019/12/06 06:40:44.330235 [INFO] serf: EventMemberJoin: Node 870333f7-45d0-28ef-ca4b-ab0277714beb.dc1 127.0.0.1
TestRTTCommand_LAN - 2019/12/06 06:40:44.342240 [INFO] serf: EventMemberJoin: Node 1a47dc71-7e66-516b-6066-8d2770d87285 127.0.0.1
TestRTTCommand_WAN - 2019/12/06 06:40:44.344474 [INFO] serf: EventMemberJoin: Node 870333f7-45d0-28ef-ca4b-ab0277714beb 127.0.0.1
TestRTTCommand_WAN - 2019/12/06 06:40:44.345490 [INFO] consul: Adding LAN server Node 870333f7-45d0-28ef-ca4b-ab0277714beb (Addr: tcp/127.0.0.1:17506) (DC: dc1)
TestRTTCommand_WAN - 2019/12/06 06:40:44.345901 [INFO] consul: Handled member-join event for server "Node 870333f7-45d0-28ef-ca4b-ab0277714beb.dc1" in area "wan"
TestRTTCommand_LAN - 2019/12/06 06:40:44.346550 [INFO] consul: Adding LAN server Node 1a47dc71-7e66-516b-6066-8d2770d87285 (Addr: tcp/127.0.0.1:17512) (DC: dc1)
TestRTTCommand_LAN - 2019/12/06 06:40:44.347014 [INFO] consul: Handled member-join event for server "Node 1a47dc71-7e66-516b-6066-8d2770d87285.dc1" in area "wan"
TestRTTCommand_LAN - 2019/12/06 06:40:44.350091 [INFO] agent: Started DNS server 127.0.0.1:17507 (tcp)
TestRTTCommand_WAN - 2019/12/06 06:40:44.350358 [INFO] agent: Started DNS server 127.0.0.1:17501 (tcp)
TestRTTCommand_WAN - 2019/12/06 06:40:44.350413 [INFO] agent: Started DNS server 127.0.0.1:17501 (udp)
TestRTTCommand_LAN - 2019/12/06 06:40:44.351005 [INFO] agent: Started DNS server 127.0.0.1:17507 (udp)
TestRTTCommand_WAN - 2019/12/06 06:40:44.372673 [INFO] agent: Started HTTP server on 127.0.0.1:17502 (tcp)
TestRTTCommand_WAN - 2019/12/06 06:40:44.372839 [INFO] agent: started state syncer
2019/12/06 06:40:44 [WARN]  raft: Heartbeat timeout from "" reached, starting election
2019/12/06 06:40:44 [INFO]  raft: Node at 127.0.0.1:17512 [Candidate] entering Candidate state in term 2
TestRTTCommand_LAN - 2019/12/06 06:40:44.403073 [INFO] agent: Started HTTP server on 127.0.0.1:17508 (tcp)
TestRTTCommand_LAN - 2019/12/06 06:40:44.403284 [INFO] agent: started state syncer
2019/12/06 06:40:44 [WARN]  raft: Heartbeat timeout from "" reached, starting election
2019/12/06 06:40:44 [INFO]  raft: Node at 127.0.0.1:17506 [Candidate] entering Candidate state in term 2
2019/12/06 06:40:45 [INFO]  raft: Election won. Tally: 1
2019/12/06 06:40:45 [INFO]  raft: Node at 127.0.0.1:17512 [Leader] entering Leader state
TestRTTCommand_LAN - 2019/12/06 06:40:45.254681 [INFO] consul: cluster leadership acquired
TestRTTCommand_LAN - 2019/12/06 06:40:45.255572 [INFO] consul: New leader elected: Node 1a47dc71-7e66-516b-6066-8d2770d87285
2019/12/06 06:40:45 [INFO]  raft: Election won. Tally: 1
2019/12/06 06:40:45 [INFO]  raft: Node at 127.0.0.1:17506 [Leader] entering Leader state
TestRTTCommand_WAN - 2019/12/06 06:40:45.330885 [INFO] consul: cluster leadership acquired
TestRTTCommand_WAN - 2019/12/06 06:40:45.331363 [INFO] consul: New leader elected: Node 870333f7-45d0-28ef-ca4b-ab0277714beb
TestRTTCommand_LAN - 2019/12/06 06:40:45.855238 [INFO] agent: Synced node info
TestRTTCommand_LAN - 2019/12/06 06:40:45.855376 [DEBUG] agent: Node info in sync
TestRTTCommand_WAN - 2019/12/06 06:40:45.929920 [INFO] agent: Synced node info
TestRTTCommand_WAN - 2019/12/06 06:40:45.945026 [DEBUG] http: Request GET /v1/coordinate/datacenters (1.877377ms) from=127.0.0.1:43696
TestRTTCommand_WAN - 2019/12/06 06:40:46.130240 [DEBUG] http: Request GET /v1/agent/self (176.863481ms) from=127.0.0.1:43698
TestRTTCommand_WAN - 2019/12/06 06:40:46.163507 [DEBUG] http: Request GET /v1/coordinate/datacenters (1.154027ms) from=127.0.0.1:43698
TestRTTCommand_WAN - 2019/12/06 06:40:46.191284 [DEBUG] http: Request GET /v1/coordinate/datacenters (1.175027ms) from=127.0.0.1:43700
TestRTTCommand_WAN - 2019/12/06 06:40:46.193091 [INFO] agent: Requesting shutdown
TestRTTCommand_WAN - 2019/12/06 06:40:46.193195 [INFO] consul: shutting down server
TestRTTCommand_WAN - 2019/12/06 06:40:46.193247 [WARN] serf: Shutdown without a Leave
TestRTTCommand_WAN - 2019/12/06 06:40:46.273329 [WARN] serf: Shutdown without a Leave
TestRTTCommand_WAN - 2019/12/06 06:40:46.414081 [INFO] manager: shutting down
TestRTTCommand_WAN - 2019/12/06 06:40:46.562679 [INFO] agent: consul server down
TestRTTCommand_WAN - 2019/12/06 06:40:46.562761 [INFO] agent: shutdown complete
TestRTTCommand_WAN - 2019/12/06 06:40:46.562821 [INFO] agent: Stopping DNS server 127.0.0.1:17501 (tcp)
TestRTTCommand_WAN - 2019/12/06 06:40:46.562962 [INFO] agent: Stopping DNS server 127.0.0.1:17501 (udp)
TestRTTCommand_WAN - 2019/12/06 06:40:46.563120 [INFO] agent: Stopping HTTP server 127.0.0.1:17502 (tcp)
TestRTTCommand_WAN - 2019/12/06 06:40:46.564048 [INFO] agent: Waiting for endpoints to shut down
TestRTTCommand_WAN - 2019/12/06 06:40:46.564264 [ERR] consul: failed to establish leadership: error configuring provider: leadership lost while committing log
TestRTTCommand_WAN - 2019/12/06 06:40:46.564465 [INFO] agent: Endpoints down
--- PASS: TestRTTCommand_WAN (4.08s)
TestRTTCommand_WAN - 2019/12/06 06:40:46.564691 [ERR] consul: failed to transfer leadership attempt 0/3: raft is already shutdown
TestRTTCommand_WAN - 2019/12/06 06:40:46.564754 [ERR] consul: failed to transfer leadership attempt 1/3: raft is already shutdown
TestRTTCommand_LAN - 2019/12/06 06:40:46.571519 [DEBUG] http: Request GET /v1/coordinate/nodes (932.022µs) from=127.0.0.1:34224
TestRTTCommand_LAN - 2019/12/06 06:40:46.602884 [DEBUG] http: Request GET /v1/coordinate/nodes (1.25303ms) from=127.0.0.1:34226
TestRTTCommand_LAN - 2019/12/06 06:40:46.633944 [DEBUG] http: Request GET /v1/coordinate/nodes (969.357µs) from=127.0.0.1:34228
TestRTTCommand_LAN - 2019/12/06 06:40:46.664864 [DEBUG] http: Request GET /v1/coordinate/nodes (1.486035ms) from=127.0.0.1:34230
TestRTTCommand_LAN - 2019/12/06 06:40:46.695361 [DEBUG] http: Request GET /v1/coordinate/nodes (1.065692ms) from=127.0.0.1:34232
TestRTTCommand_LAN - 2019/12/06 06:40:46.726845 [DEBUG] http: Request GET /v1/coordinate/nodes (1.500369ms) from=127.0.0.1:34234
TestRTTCommand_LAN - 2019/12/06 06:40:46.757345 [DEBUG] http: Request GET /v1/coordinate/nodes (850.353µs) from=127.0.0.1:34236
TestRTTCommand_LAN - 2019/12/06 06:40:46.788366 [DEBUG] http: Request GET /v1/coordinate/nodes (1.431367ms) from=127.0.0.1:34238
TestRTTCommand_LAN - 2019/12/06 06:40:46.818625 [DEBUG] http: Request GET /v1/coordinate/nodes (907.688µs) from=127.0.0.1:34240
TestRTTCommand_LAN - 2019/12/06 06:40:46.852307 [DEBUG] http: Request GET /v1/coordinate/nodes (1.00569ms) from=127.0.0.1:34242
TestRTTCommand_LAN - 2019/12/06 06:40:46.883647 [DEBUG] http: Request GET /v1/coordinate/nodes (1.031357ms) from=127.0.0.1:34244
TestRTTCommand_LAN - 2019/12/06 06:40:46.913850 [DEBUG] http: Request GET /v1/coordinate/nodes (841.02µs) from=127.0.0.1:34246
TestRTTCommand_LAN - 2019/12/06 06:40:46.944105 [DEBUG] http: Request GET /v1/coordinate/nodes (990.69µs) from=127.0.0.1:34248
TestRTTCommand_LAN - 2019/12/06 06:40:46.974808 [DEBUG] http: Request GET /v1/coordinate/nodes (1.293364ms) from=127.0.0.1:34250
TestRTTCommand_LAN - 2019/12/06 06:40:47.006760 [DEBUG] http: Request GET /v1/coordinate/nodes (894.687µs) from=127.0.0.1:34252
TestRTTCommand_LAN - 2019/12/06 06:40:47.037194 [DEBUG] http: Request GET /v1/coordinate/nodes (925.688µs) from=127.0.0.1:34254
TestRTTCommand_LAN - 2019/12/06 06:40:47.067858 [DEBUG] http: Request GET /v1/coordinate/nodes (869.687µs) from=127.0.0.1:34256
TestRTTCommand_LAN - 2019/12/06 06:40:47.230456 [INFO] connect: initialized primary datacenter CA with provider "consul"
TestRTTCommand_LAN - 2019/12/06 06:40:47.235273 [DEBUG] consul: Skipping self join check for "Node 1a47dc71-7e66-516b-6066-8d2770d87285" since the cluster is too small
TestRTTCommand_LAN - 2019/12/06 06:40:47.235564 [INFO] consul: member 'Node 1a47dc71-7e66-516b-6066-8d2770d87285' joined, marking health alive
TestRTTCommand_LAN - 2019/12/06 06:40:47.242198 [DEBUG] http: Request GET /v1/agent/self (168.132276ms) from=127.0.0.1:34258
TestRTTCommand_LAN - 2019/12/06 06:40:47.254874 [DEBUG] http: Request GET /v1/coordinate/nodes (895.354µs) from=127.0.0.1:34258
TestRTTCommand_LAN - 2019/12/06 06:40:47.261458 [DEBUG] http: Request GET /v1/coordinate/nodes (951.355µs) from=127.0.0.1:34260
TestRTTCommand_LAN - 2019/12/06 06:40:47.263196 [INFO] agent: Requesting shutdown
TestRTTCommand_LAN - 2019/12/06 06:40:47.263289 [INFO] consul: shutting down server
TestRTTCommand_LAN - 2019/12/06 06:40:47.263351 [WARN] serf: Shutdown without a Leave
TestRTTCommand_LAN - 2019/12/06 06:40:47.291701 [DEBUG] agent: Node info in sync
TestRTTCommand_LAN - 2019/12/06 06:40:47.429010 [WARN] serf: Shutdown without a Leave
TestRTTCommand_LAN - 2019/12/06 06:40:47.546204 [INFO] agent: consul server down
TestRTTCommand_LAN - 2019/12/06 06:40:47.546288 [INFO] agent: shutdown complete
TestRTTCommand_LAN - 2019/12/06 06:40:47.546373 [INFO] agent: Stopping DNS server 127.0.0.1:17507 (tcp)
TestRTTCommand_LAN - 2019/12/06 06:40:47.546546 [INFO] agent: Stopping DNS server 127.0.0.1:17507 (udp)
TestRTTCommand_LAN - 2019/12/06 06:40:47.546720 [INFO] agent: Stopping HTTP server 127.0.0.1:17508 (tcp)
TestRTTCommand_LAN - 2019/12/06 06:40:47.554432 [INFO] manager: shutting down
TestRTTCommand_LAN - 2019/12/06 06:40:47.556304 [INFO] agent: Waiting for endpoints to shut down
TestRTTCommand_LAN - 2019/12/06 06:40:47.556541 [INFO] agent: Endpoints down
--- PASS: TestRTTCommand_LAN (5.06s)
PASS
ok  	github.com/hashicorp/consul/command/rtt	5.344s
=== RUN   TestDevModeHasNoServices
=== PAUSE TestDevModeHasNoServices
=== RUN   TestStructsToAgentService
=== PAUSE TestStructsToAgentService
=== RUN   TestCommand_noTabs
=== PAUSE TestCommand_noTabs
=== CONT  TestDevModeHasNoServices
=== CONT  TestStructsToAgentService
=== RUN   TestStructsToAgentService/Basic_service_with_port
=== PAUSE TestStructsToAgentService/Basic_service_with_port
=== RUN   TestStructsToAgentService/Service_with_a_check
=== PAUSE TestStructsToAgentService/Service_with_a_check
=== RUN   TestStructsToAgentService/Service_with_checks
=== PAUSE TestStructsToAgentService/Service_with_checks
=== RUN   TestStructsToAgentService/Proxy_service
=== PAUSE TestStructsToAgentService/Proxy_service
=== CONT  TestStructsToAgentService/Basic_service_with_port
=== CONT  TestStructsToAgentService/Proxy_service
=== CONT  TestStructsToAgentService/Service_with_checks
=== CONT  TestStructsToAgentService/Service_with_a_check
=== CONT  TestCommand_noTabs
--- PASS: TestCommand_noTabs (0.00s)
--- PASS: TestStructsToAgentService (0.00s)
    --- PASS: TestStructsToAgentService/Basic_service_with_port (0.00s)
    --- PASS: TestStructsToAgentService/Proxy_service (0.00s)
    --- PASS: TestStructsToAgentService/Service_with_checks (0.00s)
    --- PASS: TestStructsToAgentService/Service_with_a_check (0.00s)
--- PASS: TestDevModeHasNoServices (0.07s)
PASS
ok  	github.com/hashicorp/consul/command/services	0.231s
=== RUN   TestCommand_noTabs
=== PAUSE TestCommand_noTabs
=== RUN   TestCommand_Validation
=== PAUSE TestCommand_Validation
=== RUN   TestCommand_File_id
=== PAUSE TestCommand_File_id
=== RUN   TestCommand_File_nameOnly
=== PAUSE TestCommand_File_nameOnly
=== RUN   TestCommand_Flag
=== PAUSE TestCommand_Flag
=== CONT  TestCommand_noTabs
=== CONT  TestCommand_File_nameOnly
=== CONT  TestCommand_Flag
=== CONT  TestCommand_File_id
=== CONT  TestCommand_Validation
--- PASS: TestCommand_noTabs (0.00s)
=== RUN   TestCommand_Validation/no_args_or_id
=== RUN   TestCommand_Validation/args_and_-id
--- PASS: TestCommand_Validation (0.03s)
    --- PASS: TestCommand_Validation/no_args_or_id (0.00s)
    --- PASS: TestCommand_Validation/args_and_-id (0.00s)
WARNING: bootstrap = true: do not enable unless necessary
TestCommand_File_id - 2019/12/06 06:41:14.993156 [WARN] agent: Node name "Node 64527932-0bb8-390f-2ec6-755c0e10efe3" will not be discoverable via DNS due to invalid characters. Valid characters include all alpha-numerics and dashes.
TestCommand_File_id - 2019/12/06 06:41:14.993957 [DEBUG] tlsutil: Update with version 1
TestCommand_File_id - 2019/12/06 06:41:15.002473 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
WARNING: bootstrap = true: do not enable unless necessary
WARNING: bootstrap = true: do not enable unless necessary
TestCommand_File_nameOnly - 2019/12/06 06:41:15.018092 [WARN] agent: Node name "Node 9063a73f-318d-0b76-6fa4-da7886a749e3" will not be discoverable via DNS due to invalid characters. Valid characters include all alpha-numerics and dashes.
TestCommand_File_nameOnly - 2019/12/06 06:41:15.018589 [DEBUG] tlsutil: Update with version 1
TestCommand_Flag - 2019/12/06 06:41:15.018613 [WARN] agent: Node name "Node 7196e516-c0d5-5287-cbcd-58c5f643f068" will not be discoverable via DNS due to invalid characters. Valid characters include all alpha-numerics and dashes.
TestCommand_Flag - 2019/12/06 06:41:15.019373 [DEBUG] tlsutil: Update with version 1
TestCommand_File_nameOnly - 2019/12/06 06:41:15.020959 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
TestCommand_Flag - 2019/12/06 06:41:15.021805 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
2019/12/06 06:41:16 [INFO]  raft: Initial configuration (index=1): [{Suffrage:Voter ID:7196e516-c0d5-5287-cbcd-58c5f643f068 Address:127.0.0.1:44518}]
2019/12/06 06:41:16 [INFO]  raft: Node at 127.0.0.1:44518 [Follower] entering Follower state (Leader: "")
TestCommand_Flag - 2019/12/06 06:41:16.061043 [INFO] serf: EventMemberJoin: Node 7196e516-c0d5-5287-cbcd-58c5f643f068.dc1 127.0.0.1
TestCommand_Flag - 2019/12/06 06:41:16.071914 [INFO] serf: EventMemberJoin: Node 7196e516-c0d5-5287-cbcd-58c5f643f068 127.0.0.1
TestCommand_Flag - 2019/12/06 06:41:16.073444 [INFO] consul: Adding LAN server Node 7196e516-c0d5-5287-cbcd-58c5f643f068 (Addr: tcp/127.0.0.1:44518) (DC: dc1)
TestCommand_Flag - 2019/12/06 06:41:16.074003 [INFO] consul: Handled member-join event for server "Node 7196e516-c0d5-5287-cbcd-58c5f643f068.dc1" in area "wan"
TestCommand_Flag - 2019/12/06 06:41:16.074919 [INFO] agent: Started DNS server 127.0.0.1:44513 (udp)
TestCommand_Flag - 2019/12/06 06:41:16.076973 [INFO] agent: Started DNS server 127.0.0.1:44513 (tcp)
TestCommand_Flag - 2019/12/06 06:41:16.085567 [INFO] agent: Started HTTP server on 127.0.0.1:44514 (tcp)
TestCommand_Flag - 2019/12/06 06:41:16.085918 [INFO] agent: started state syncer
2019/12/06 06:41:16 [WARN]  raft: Heartbeat timeout from "" reached, starting election
2019/12/06 06:41:16 [INFO]  raft: Node at 127.0.0.1:44518 [Candidate] entering Candidate state in term 2
2019/12/06 06:41:16 [INFO]  raft: Initial configuration (index=1): [{Suffrage:Voter ID:64527932-0bb8-390f-2ec6-755c0e10efe3 Address:127.0.0.1:44512}]
2019/12/06 06:41:16 [INFO]  raft: Node at 127.0.0.1:44512 [Follower] entering Follower state (Leader: "")
2019/12/06 06:41:16 [INFO]  raft: Initial configuration (index=1): [{Suffrage:Voter ID:9063a73f-318d-0b76-6fa4-da7886a749e3 Address:127.0.0.1:44506}]
2019/12/06 06:41:16 [INFO]  raft: Node at 127.0.0.1:44506 [Follower] entering Follower state (Leader: "")
TestCommand_File_nameOnly - 2019/12/06 06:41:16.325769 [INFO] serf: EventMemberJoin: Node 9063a73f-318d-0b76-6fa4-da7886a749e3.dc1 127.0.0.1
TestCommand_File_id - 2019/12/06 06:41:16.325770 [INFO] serf: EventMemberJoin: Node 64527932-0bb8-390f-2ec6-755c0e10efe3.dc1 127.0.0.1
TestCommand_File_nameOnly - 2019/12/06 06:41:16.329651 [INFO] serf: EventMemberJoin: Node 9063a73f-318d-0b76-6fa4-da7886a749e3 127.0.0.1
TestCommand_File_id - 2019/12/06 06:41:16.329651 [INFO] serf: EventMemberJoin: Node 64527932-0bb8-390f-2ec6-755c0e10efe3 127.0.0.1
TestCommand_File_nameOnly - 2019/12/06 06:41:16.331425 [INFO] consul: Adding LAN server Node 9063a73f-318d-0b76-6fa4-da7886a749e3 (Addr: tcp/127.0.0.1:44506) (DC: dc1)
TestCommand_File_nameOnly - 2019/12/06 06:41:16.331657 [INFO] consul: Handled member-join event for server "Node 9063a73f-318d-0b76-6fa4-da7886a749e3.dc1" in area "wan"
TestCommand_File_id - 2019/12/06 06:41:16.332836 [INFO] consul: Adding LAN server Node 64527932-0bb8-390f-2ec6-755c0e10efe3 (Addr: tcp/127.0.0.1:44512) (DC: dc1)
TestCommand_File_id - 2019/12/06 06:41:16.333112 [INFO] consul: Handled member-join event for server "Node 64527932-0bb8-390f-2ec6-755c0e10efe3.dc1" in area "wan"
TestCommand_File_nameOnly - 2019/12/06 06:41:16.370605 [INFO] agent: Started DNS server 127.0.0.1:44501 (udp)
TestCommand_File_nameOnly - 2019/12/06 06:41:16.371035 [INFO] agent: Started DNS server 127.0.0.1:44501 (tcp)
TestCommand_File_id - 2019/12/06 06:41:16.371234 [INFO] agent: Started DNS server 127.0.0.1:44507 (udp)
TestCommand_File_id - 2019/12/06 06:41:16.371300 [INFO] agent: Started DNS server 127.0.0.1:44507 (tcp)
TestCommand_File_nameOnly - 2019/12/06 06:41:16.373364 [INFO] agent: Started HTTP server on 127.0.0.1:44502 (tcp)
TestCommand_File_id - 2019/12/06 06:41:16.373478 [INFO] agent: Started HTTP server on 127.0.0.1:44508 (tcp)
TestCommand_File_id - 2019/12/06 06:41:16.373560 [INFO] agent: started state syncer
2019/12/06 06:41:16 [WARN]  raft: Heartbeat timeout from "" reached, starting election
2019/12/06 06:41:16 [INFO]  raft: Node at 127.0.0.1:44512 [Candidate] entering Candidate state in term 2
2019/12/06 06:41:16 [WARN]  raft: Heartbeat timeout from "" reached, starting election
2019/12/06 06:41:16 [INFO]  raft: Node at 127.0.0.1:44506 [Candidate] entering Candidate state in term 2
TestCommand_File_nameOnly - 2019/12/06 06:41:16.373489 [INFO] agent: started state syncer
2019/12/06 06:41:16 [INFO]  raft: Election won. Tally: 1
2019/12/06 06:41:16 [INFO]  raft: Node at 127.0.0.1:44518 [Leader] entering Leader state
TestCommand_Flag - 2019/12/06 06:41:16.680050 [INFO] consul: cluster leadership acquired
TestCommand_Flag - 2019/12/06 06:41:16.680766 [INFO] consul: New leader elected: Node 7196e516-c0d5-5287-cbcd-58c5f643f068
2019/12/06 06:41:16 [INFO]  raft: Election won. Tally: 1
2019/12/06 06:41:16 [INFO]  raft: Node at 127.0.0.1:44506 [Leader] entering Leader state
TestCommand_File_nameOnly - 2019/12/06 06:41:16.944074 [INFO] consul: cluster leadership acquired
TestCommand_File_nameOnly - 2019/12/06 06:41:16.944898 [INFO] consul: New leader elected: Node 9063a73f-318d-0b76-6fa4-da7886a749e3
2019/12/06 06:41:17 [INFO]  raft: Election won. Tally: 1
2019/12/06 06:41:17 [INFO]  raft: Node at 127.0.0.1:44512 [Leader] entering Leader state
TestCommand_File_id - 2019/12/06 06:41:17.227275 [INFO] consul: cluster leadership acquired
TestCommand_File_id - 2019/12/06 06:41:17.227994 [INFO] consul: New leader elected: Node 64527932-0bb8-390f-2ec6-755c0e10efe3
TestCommand_Flag - 2019/12/06 06:41:17.422948 [INFO] agent: Synced service "web"
TestCommand_Flag - 2019/12/06 06:41:17.423060 [DEBUG] agent: Node info in sync
TestCommand_Flag - 2019/12/06 06:41:17.426912 [DEBUG] http: Request PUT /v1/agent/service/register (667.436653ms) from=127.0.0.1:57728
TestCommand_File_nameOnly - 2019/12/06 06:41:17.722564 [INFO] agent: Synced node info
TestCommand_File_id - 2019/12/06 06:41:17.797700 [INFO] agent: Synced service "web"
TestCommand_File_id - 2019/12/06 06:41:17.797792 [DEBUG] agent: Node info in sync
TestCommand_File_id - 2019/12/06 06:41:17.798063 [DEBUG] agent: Service "web" in sync
TestCommand_File_id - 2019/12/06 06:41:17.798115 [DEBUG] agent: Node info in sync
TestCommand_File_id - 2019/12/06 06:41:17.798230 [DEBUG] http: Request PUT /v1/agent/service/register (517.146129ms) from=127.0.0.1:50340
TestCommand_File_id - 2019/12/06 06:41:17.879023 [DEBUG] agent: Service "web" in sync
TestCommand_File_nameOnly - 2019/12/06 06:41:18.292314 [INFO] agent: Synced service "web"
TestCommand_File_nameOnly - 2019/12/06 06:41:18.292403 [DEBUG] agent: Node info in sync
TestCommand_File_nameOnly - 2019/12/06 06:41:18.292498 [DEBUG] http: Request PUT /v1/agent/service/register (966.85201ms) from=127.0.0.1:37970
TestCommand_File_nameOnly - 2019/12/06 06:41:18.409241 [DEBUG] agent: Service "web" in sync
TestCommand_Flag - 2019/12/06 06:41:18.409668 [INFO] agent: Synced service "web"
TestCommand_Flag - 2019/12/06 06:41:18.409734 [DEBUG] agent: Node info in sync
TestCommand_Flag - 2019/12/06 06:41:18.882151 [DEBUG] agent: Service "web" in sync
TestCommand_File_id - 2019/12/06 06:41:18.885180 [INFO] agent: Synced service "db"
TestCommand_File_id - 2019/12/06 06:41:18.885264 [DEBUG] agent: Node info in sync
TestCommand_File_id - 2019/12/06 06:41:18.885346 [DEBUG] http: Request PUT /v1/agent/service/register (1.081832706s) from=127.0.0.1:50340
TestCommand_File_id - 2019/12/06 06:41:18.941975 [DEBUG] agent: removed service "web"
TestCommand_File_nameOnly - 2019/12/06 06:41:19.198433 [INFO] agent: Synced service "db"
TestCommand_File_nameOnly - 2019/12/06 06:41:19.198533 [DEBUG] agent: Node info in sync
TestCommand_File_nameOnly - 2019/12/06 06:41:19.198623 [DEBUG] http: Request PUT /v1/agent/service/register (903.910199ms) from=127.0.0.1:37970
TestCommand_File_nameOnly - 2019/12/06 06:41:19.273819 [DEBUG] agent: removed service "web"
TestCommand_File_id - 2019/12/06 06:41:19.457569 [INFO] agent: Deregistered service "web"
TestCommand_File_id - 2019/12/06 06:41:19.457655 [DEBUG] agent: Service "db" in sync
TestCommand_File_id - 2019/12/06 06:41:19.457689 [DEBUG] agent: Node info in sync
TestCommand_File_id - 2019/12/06 06:41:19.457757 [DEBUG] http: Request PUT /v1/agent/service/deregister/web (516.659117ms) from=127.0.0.1:50344
TestCommand_Flag - 2019/12/06 06:41:19.540699 [INFO] agent: Synced service "db"
TestCommand_Flag - 2019/12/06 06:41:19.540878 [DEBUG] agent: Node info in sync
TestCommand_Flag - 2019/12/06 06:41:19.541025 [DEBUG] http: Request PUT /v1/agent/service/register (2.111474855s) from=127.0.0.1:57728
TestCommand_Flag - 2019/12/06 06:41:19.550249 [DEBUG] agent: removed service "web"
TestCommand_File_nameOnly - 2019/12/06 06:41:19.625244 [INFO] agent: Deregistered service "web"
TestCommand_File_nameOnly - 2019/12/06 06:41:19.625334 [DEBUG] agent: Service "db" in sync
TestCommand_File_nameOnly - 2019/12/06 06:41:19.625372 [DEBUG] agent: Node info in sync
TestCommand_File_nameOnly - 2019/12/06 06:41:19.625443 [DEBUG] http: Request PUT /v1/agent/service/deregister/web (352.322929ms) from=127.0.0.1:37974
TestCommand_File_nameOnly - 2019/12/06 06:41:19.639273 [DEBUG] http: Request GET /v1/agent/services (9.158548ms) from=127.0.0.1:37970
TestCommand_File_nameOnly - 2019/12/06 06:41:19.658596 [INFO] agent: Requesting shutdown
TestCommand_File_nameOnly - 2019/12/06 06:41:19.658732 [INFO] consul: shutting down server
TestCommand_File_nameOnly - 2019/12/06 06:41:19.658790 [WARN] serf: Shutdown without a Leave
TestCommand_File_nameOnly - 2019/12/06 06:41:19.822053 [WARN] serf: Shutdown without a Leave
TestCommand_File_id - 2019/12/06 06:41:19.823163 [INFO] agent: Deregistered service "web"
TestCommand_File_id - 2019/12/06 06:41:19.823255 [DEBUG] agent: Service "db" in sync
TestCommand_File_id - 2019/12/06 06:41:19.823295 [DEBUG] agent: Node info in sync
TestCommand_File_id - 2019/12/06 06:41:19.823462 [DEBUG] agent: Service "db" in sync
TestCommand_File_id - 2019/12/06 06:41:19.823545 [DEBUG] agent: Node info in sync
TestCommand_File_id - 2019/12/06 06:41:19.825130 [DEBUG] http: Request GET /v1/agent/services (365.447905ms) from=127.0.0.1:50340
TestCommand_File_id - 2019/12/06 06:41:19.831785 [INFO] agent: Requesting shutdown
TestCommand_File_id - 2019/12/06 06:41:19.831906 [INFO] consul: shutting down server
TestCommand_File_id - 2019/12/06 06:41:19.831955 [WARN] serf: Shutdown without a Leave
TestCommand_File_nameOnly - 2019/12/06 06:41:19.937887 [INFO] manager: shutting down
TestCommand_File_nameOnly - 2019/12/06 06:41:19.986004 [ERR] agent: failed to sync remote state: No cluster leader
TestCommand_Flag - 2019/12/06 06:41:20.005722 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
TestCommand_Flag - 2019/12/06 06:41:20.007342 [INFO] connect: initialized primary datacenter CA with provider "consul"
TestCommand_Flag - 2019/12/06 06:41:20.007936 [DEBUG] consul: Skipping self join check for "Node 7196e516-c0d5-5287-cbcd-58c5f643f068" since the cluster is too small
TestCommand_Flag - 2019/12/06 06:41:20.008123 [INFO] consul: member 'Node 7196e516-c0d5-5287-cbcd-58c5f643f068' joined, marking health alive
TestCommand_File_id - 2019/12/06 06:41:20.008737 [WARN] serf: Shutdown without a Leave
TestCommand_Flag - 2019/12/06 06:41:20.013974 [INFO] agent: Deregistered service "web"
TestCommand_Flag - 2019/12/06 06:41:20.014071 [DEBUG] agent: Service "db" in sync
TestCommand_Flag - 2019/12/06 06:41:20.014112 [DEBUG] agent: Node info in sync
TestCommand_Flag - 2019/12/06 06:41:20.014276 [DEBUG] http: Request PUT /v1/agent/service/deregister/web (464.698899ms) from=127.0.0.1:57738
TestCommand_File_id - 2019/12/06 06:41:20.137977 [INFO] manager: shutting down
TestCommand_File_nameOnly - 2019/12/06 06:41:20.138013 [ERR] connect: Apply failed leadership lost while committing log
TestCommand_File_nameOnly - 2019/12/06 06:41:20.138125 [ERR] consul: failed to establish leadership: leadership lost while committing log
TestCommand_File_nameOnly - 2019/12/06 06:41:20.138187 [INFO] agent: consul server down
TestCommand_File_nameOnly - 2019/12/06 06:41:20.138234 [INFO] agent: shutdown complete
TestCommand_File_nameOnly - 2019/12/06 06:41:20.138288 [INFO] agent: Stopping DNS server 127.0.0.1:44501 (tcp)
TestCommand_File_nameOnly - 2019/12/06 06:41:20.138443 [INFO] agent: Stopping DNS server 127.0.0.1:44501 (udp)
TestCommand_File_nameOnly - 2019/12/06 06:41:20.138618 [INFO] agent: Stopping HTTP server 127.0.0.1:44502 (tcp)
TestCommand_File_id - 2019/12/06 06:41:20.138832 [INFO] connect: initialized primary datacenter CA with provider "consul"
TestCommand_File_id - 2019/12/06 06:41:20.139288 [ERR] consul: failed to get raft configuration: raft is already shutdown
TestCommand_File_nameOnly - 2019/12/06 06:41:20.139483 [INFO] agent: Waiting for endpoints to shut down
TestCommand_File_nameOnly - 2019/12/06 06:41:20.139564 [INFO] agent: Endpoints down
TestCommand_File_id - 2019/12/06 06:41:20.139576 [ERR] consul: failed to reconcile member: {Node 64527932-0bb8-390f-2ec6-755c0e10efe3 127.0.0.1 44510 map[acls:0 bootstrap:1 build:1.5.2: dc:dc1 id:64527932-0bb8-390f-2ec6-755c0e10efe3 port:44512 raft_vsn:3 role:consul segment: vsn:2 vsn_max:3 vsn_min:2 wan_join_port:44511] alive 1 5 2 2 5 4}: raft is already shutdown
--- PASS: TestCommand_File_nameOnly (5.37s)
TestCommand_File_id - 2019/12/06 06:41:20.140139 [INFO] agent: consul server down
TestCommand_File_id - 2019/12/06 06:41:20.140195 [INFO] agent: shutdown complete
TestCommand_File_id - 2019/12/06 06:41:20.140256 [INFO] agent: Stopping DNS server 127.0.0.1:44507 (tcp)
TestCommand_File_id - 2019/12/06 06:41:20.140427 [INFO] agent: Stopping DNS server 127.0.0.1:44507 (udp)
TestCommand_File_id - 2019/12/06 06:41:20.140625 [INFO] agent: Stopping HTTP server 127.0.0.1:44508 (tcp)
TestCommand_File_id - 2019/12/06 06:41:20.141494 [INFO] agent: Waiting for endpoints to shut down
TestCommand_File_id - 2019/12/06 06:41:20.141633 [INFO] agent: Endpoints down
--- PASS: TestCommand_File_id (5.37s)
TestCommand_Flag - 2019/12/06 06:41:20.388171 [INFO] agent: Deregistered service "web"
TestCommand_Flag - 2019/12/06 06:41:20.388253 [DEBUG] agent: Service "db" in sync
TestCommand_Flag - 2019/12/06 06:41:20.388288 [DEBUG] agent: Node info in sync
TestCommand_Flag - 2019/12/06 06:41:20.388398 [DEBUG] agent: Service "db" in sync
TestCommand_Flag - 2019/12/06 06:41:20.388440 [DEBUG] agent: Node info in sync
TestCommand_Flag - 2019/12/06 06:41:20.389472 [DEBUG] http: Request GET /v1/agent/services (372.861412ms) from=127.0.0.1:57728
TestCommand_Flag - 2019/12/06 06:41:20.391811 [INFO] agent: Requesting shutdown
TestCommand_Flag - 2019/12/06 06:41:20.391913 [INFO] consul: shutting down server
TestCommand_Flag - 2019/12/06 06:41:20.391963 [WARN] serf: Shutdown without a Leave
TestCommand_Flag - 2019/12/06 06:41:20.454478 [WARN] serf: Shutdown without a Leave
TestCommand_Flag - 2019/12/06 06:41:20.512928 [INFO] manager: shutting down
TestCommand_Flag - 2019/12/06 06:41:20.513613 [INFO] agent: consul server down
TestCommand_Flag - 2019/12/06 06:41:20.513677 [INFO] agent: shutdown complete
TestCommand_Flag - 2019/12/06 06:41:20.513736 [INFO] agent: Stopping DNS server 127.0.0.1:44513 (tcp)
TestCommand_Flag - 2019/12/06 06:41:20.513871 [INFO] agent: Stopping DNS server 127.0.0.1:44513 (udp)
TestCommand_Flag - 2019/12/06 06:41:20.514030 [INFO] agent: Stopping HTTP server 127.0.0.1:44514 (tcp)
TestCommand_Flag - 2019/12/06 06:41:20.514762 [INFO] agent: Waiting for endpoints to shut down
TestCommand_Flag - 2019/12/06 06:41:20.514890 [INFO] agent: Endpoints down
--- PASS: TestCommand_Flag (5.74s)
PASS
ok  	github.com/hashicorp/consul/command/services/deregister	6.286s
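Note: the deregister suite above drives the agent HTTP endpoints visible in the request log (GET /v1/agent/services and PUT /v1/agent/service/deregister/web). A minimal sketch of the same calls through the public Go client, github.com/hashicorp/consul/api; the "web" and "db" service IDs simply mirror the log, the rest is illustrative:

package main

import (
	"fmt"
	"log"

	"github.com/hashicorp/consul/api"
)

func main() {
	// Talk to the local agent, the same HTTP API the tests exercise.
	client, err := api.NewClient(api.DefaultConfig())
	if err != nil {
		log.Fatal(err)
	}

	// PUT /v1/agent/service/deregister/web
	if err := client.Agent().ServiceDeregister("web"); err != nil {
		log.Fatal(err)
	}

	// GET /v1/agent/services: "web" should be gone, "db" should remain.
	services, err := client.Agent().Services()
	if err != nil {
		log.Fatal(err)
	}
	for id := range services {
		fmt.Println("still registered:", id)
	}
}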
=== RUN   TestCommand_noTabs
=== PAUSE TestCommand_noTabs
=== RUN   TestCommand_Validation
=== PAUSE TestCommand_Validation
=== RUN   TestCommand_File
=== PAUSE TestCommand_File
=== RUN   TestCommand_Flags
=== PAUSE TestCommand_Flags
=== CONT  TestCommand_noTabs
=== CONT  TestCommand_Flags
=== CONT  TestCommand_File
--- PASS: TestCommand_noTabs (0.01s)
=== CONT  TestCommand_Validation
=== RUN   TestCommand_Validation/no_args_or_id
=== RUN   TestCommand_Validation/args_and_-name
--- PASS: TestCommand_Validation (0.03s)
    --- PASS: TestCommand_Validation/no_args_or_id (0.02s)
    --- PASS: TestCommand_Validation/args_and_-name (0.01s)
WARNING: bootstrap = true: do not enable unless necessary
TestCommand_File - 2019/12/06 06:41:30.085169 [WARN] agent: Node name "Node c49ab86e-4a4b-9c53-3817-2744bffae8d8" will not be discoverable via DNS due to invalid characters. Valid characters include all alpha-numerics and dashes.
TestCommand_File - 2019/12/06 06:41:30.085939 [DEBUG] tlsutil: Update with version 1
WARNING: bootstrap = true: do not enable unless necessary
TestCommand_Flags - 2019/12/06 06:41:30.087678 [WARN] agent: Node name "Node baf55f0a-0f12-850f-02c4-ed6f90ee5340" will not be discoverable via DNS due to invalid characters. Valid characters include all alpha-numerics and dashes.
TestCommand_Flags - 2019/12/06 06:41:30.088473 [DEBUG] tlsutil: Update with version 1
TestCommand_File - 2019/12/06 06:41:30.106521 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
TestCommand_Flags - 2019/12/06 06:41:30.110984 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
2019/12/06 06:41:31 [INFO]  raft: Initial configuration (index=1): [{Suffrage:Voter ID:baf55f0a-0f12-850f-02c4-ed6f90ee5340 Address:127.0.0.1:37012}]
2019/12/06 06:41:31 [INFO]  raft: Initial configuration (index=1): [{Suffrage:Voter ID:c49ab86e-4a4b-9c53-3817-2744bffae8d8 Address:127.0.0.1:37006}]
2019/12/06 06:41:31 [INFO]  raft: Node at 127.0.0.1:37012 [Follower] entering Follower state (Leader: "")
2019/12/06 06:41:31 [INFO]  raft: Node at 127.0.0.1:37006 [Follower] entering Follower state (Leader: "")
TestCommand_Flags - 2019/12/06 06:41:31.045146 [INFO] serf: EventMemberJoin: Node baf55f0a-0f12-850f-02c4-ed6f90ee5340.dc1 127.0.0.1
TestCommand_Flags - 2019/12/06 06:41:31.051407 [INFO] serf: EventMemberJoin: Node baf55f0a-0f12-850f-02c4-ed6f90ee5340 127.0.0.1
TestCommand_File - 2019/12/06 06:41:31.064114 [INFO] serf: EventMemberJoin: Node c49ab86e-4a4b-9c53-3817-2744bffae8d8.dc1 127.0.0.1
TestCommand_File - 2019/12/06 06:41:31.073130 [INFO] serf: EventMemberJoin: Node c49ab86e-4a4b-9c53-3817-2744bffae8d8 127.0.0.1
2019/12/06 06:41:31 [WARN]  raft: Heartbeat timeout from "" reached, starting election
2019/12/06 06:41:31 [INFO]  raft: Node at 127.0.0.1:37012 [Candidate] entering Candidate state in term 2
TestCommand_Flags - 2019/12/06 06:41:31.085145 [INFO] agent: Started DNS server 127.0.0.1:37007 (udp)
TestCommand_File - 2019/12/06 06:41:31.086471 [INFO] agent: Started DNS server 127.0.0.1:37001 (udp)
TestCommand_File - 2019/12/06 06:41:31.102970 [INFO] agent: Started DNS server 127.0.0.1:37001 (tcp)
TestCommand_File - 2019/12/06 06:41:31.104097 [INFO] consul: Handled member-join event for server "Node c49ab86e-4a4b-9c53-3817-2744bffae8d8.dc1" in area "wan"
TestCommand_File - 2019/12/06 06:41:31.104955 [INFO] consul: Adding LAN server Node c49ab86e-4a4b-9c53-3817-2744bffae8d8 (Addr: tcp/127.0.0.1:37006) (DC: dc1)
TestCommand_Flags - 2019/12/06 06:41:31.105639 [INFO] consul: Handled member-join event for server "Node baf55f0a-0f12-850f-02c4-ed6f90ee5340.dc1" in area "wan"
TestCommand_Flags - 2019/12/06 06:41:31.106344 [INFO] agent: Started DNS server 127.0.0.1:37007 (tcp)
2019/12/06 06:41:31 [WARN]  raft: Heartbeat timeout from "" reached, starting election
2019/12/06 06:41:31 [INFO]  raft: Node at 127.0.0.1:37006 [Candidate] entering Candidate state in term 2
TestCommand_Flags - 2019/12/06 06:41:31.109805 [INFO] agent: Started HTTP server on 127.0.0.1:37008 (tcp)
TestCommand_Flags - 2019/12/06 06:41:31.110310 [INFO] agent: started state syncer
TestCommand_File - 2019/12/06 06:41:31.110094 [INFO] agent: Started HTTP server on 127.0.0.1:37002 (tcp)
TestCommand_File - 2019/12/06 06:41:31.111426 [INFO] agent: started state syncer
TestCommand_Flags - 2019/12/06 06:41:31.110353 [INFO] consul: Adding LAN server Node baf55f0a-0f12-850f-02c4-ed6f90ee5340 (Addr: tcp/127.0.0.1:37012) (DC: dc1)
2019/12/06 06:41:32 [INFO]  raft: Election won. Tally: 1
2019/12/06 06:41:32 [INFO]  raft: Node at 127.0.0.1:37006 [Leader] entering Leader state
TestCommand_File - 2019/12/06 06:41:32.281261 [INFO] consul: cluster leadership acquired
TestCommand_File - 2019/12/06 06:41:32.281751 [INFO] consul: New leader elected: Node c49ab86e-4a4b-9c53-3817-2744bffae8d8
2019/12/06 06:41:32 [INFO]  raft: Election won. Tally: 1
2019/12/06 06:41:32 [INFO]  raft: Node at 127.0.0.1:37012 [Leader] entering Leader state
TestCommand_Flags - 2019/12/06 06:41:32.283439 [INFO] consul: cluster leadership acquired
TestCommand_Flags - 2019/12/06 06:41:32.283818 [INFO] consul: New leader elected: Node baf55f0a-0f12-850f-02c4-ed6f90ee5340
TestCommand_File - 2019/12/06 06:41:32.705934 [INFO] agent: Synced node info
TestCommand_Flags - 2019/12/06 06:41:32.866133 [INFO] agent: Synced service "web"
TestCommand_Flags - 2019/12/06 06:41:32.866207 [DEBUG] agent: Node info in sync
TestCommand_Flags - 2019/12/06 06:41:32.866319 [DEBUG] http: Request PUT /v1/agent/service/register (388.236772ms) from=127.0.0.1:54406
TestCommand_Flags - 2019/12/06 06:41:33.197202 [INFO] agent: Synced service "web"
TestCommand_Flags - 2019/12/06 06:41:33.197279 [DEBUG] agent: Node info in sync
TestCommand_File - 2019/12/06 06:41:33.198826 [INFO] agent: Synced service "web"
TestCommand_File - 2019/12/06 06:41:33.198892 [DEBUG] agent: Node info in sync
TestCommand_File - 2019/12/06 06:41:33.198966 [DEBUG] http: Request PUT /v1/agent/service/register (643.771432ms) from=127.0.0.1:34066
TestCommand_Flags - 2019/12/06 06:41:33.207263 [DEBUG] http: Request GET /v1/agent/services (337.698253ms) from=127.0.0.1:54410
TestCommand_Flags - 2019/12/06 06:41:33.212681 [INFO] agent: Requesting shutdown
TestCommand_Flags - 2019/12/06 06:41:33.212848 [INFO] consul: shutting down server
TestCommand_Flags - 2019/12/06 06:41:33.231119 [WARN] serf: Shutdown without a Leave
TestCommand_File - 2019/12/06 06:41:33.212859 [DEBUG] http: Request GET /v1/agent/services (1.156693ms) from=127.0.0.1:34070
TestCommand_File - 2019/12/06 06:41:33.238050 [INFO] agent: Requesting shutdown
TestCommand_File - 2019/12/06 06:41:33.238146 [INFO] consul: shutting down server
TestCommand_File - 2019/12/06 06:41:33.238195 [WARN] serf: Shutdown without a Leave
TestCommand_File - 2019/12/06 06:41:33.705426 [WARN] serf: Shutdown without a Leave
TestCommand_Flags - 2019/12/06 06:41:33.707477 [WARN] serf: Shutdown without a Leave
TestCommand_File - 2019/12/06 06:41:33.854788 [INFO] manager: shutting down
TestCommand_Flags - 2019/12/06 06:41:33.855071 [INFO] manager: shutting down
TestCommand_File - 2019/12/06 06:41:33.855184 [ERR] consul: failed to establish leadership: error configuring provider: leadership lost while committing log
TestCommand_File - 2019/12/06 06:41:33.856056 [INFO] agent: consul server down
TestCommand_File - 2019/12/06 06:41:33.856120 [INFO] agent: shutdown complete
TestCommand_File - 2019/12/06 06:41:33.856178 [INFO] agent: Stopping DNS server 127.0.0.1:37001 (tcp)
TestCommand_File - 2019/12/06 06:41:33.856315 [INFO] agent: Stopping DNS server 127.0.0.1:37001 (udp)
TestCommand_File - 2019/12/06 06:41:33.856481 [INFO] agent: Stopping HTTP server 127.0.0.1:37002 (tcp)
TestCommand_File - 2019/12/06 06:41:33.857380 [INFO] agent: Waiting for endpoints to shut down
TestCommand_File - 2019/12/06 06:41:33.857477 [INFO] agent: Endpoints down
--- PASS: TestCommand_File (3.94s)
TestCommand_Flags - 2019/12/06 06:41:33.938216 [ERR] consul: failed to establish leadership: error configuring provider: leadership lost while committing log
TestCommand_Flags - 2019/12/06 06:41:33.938503 [INFO] agent: consul server down
TestCommand_Flags - 2019/12/06 06:41:33.938558 [INFO] agent: shutdown complete
TestCommand_Flags - 2019/12/06 06:41:33.938612 [INFO] agent: Stopping DNS server 127.0.0.1:37007 (tcp)
TestCommand_Flags - 2019/12/06 06:41:33.938761 [INFO] agent: Stopping DNS server 127.0.0.1:37007 (udp)
TestCommand_Flags - 2019/12/06 06:41:33.938938 [INFO] agent: Stopping HTTP server 127.0.0.1:37008 (tcp)
TestCommand_Flags - 2019/12/06 06:41:33.939807 [INFO] agent: Waiting for endpoints to shut down
TestCommand_Flags - 2019/12/06 06:41:33.939913 [INFO] agent: Endpoints down
--- PASS: TestCommand_Flags (4.02s)
PASS
ok  	github.com/hashicorp/consul/command/services/register	4.598s
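Note: the register suite above issues PUT /v1/agent/service/register and then reads the result back with GET /v1/agent/services. A rough equivalent with the Go client (the service name follows the log; the port is a placeholder, not taken from the build):

package main

import (
	"log"

	"github.com/hashicorp/consul/api"
)

func main() {
	client, err := api.NewClient(api.DefaultConfig())
	if err != nil {
		log.Fatal(err)
	}

	// PUT /v1/agent/service/register
	reg := &api.AgentServiceRegistration{
		Name: "web", // mirrors the "Synced service \"web\"" lines above
		Port: 8080,  // illustrative port
	}
	if err := client.Agent().ServiceRegister(reg); err != nil {
		log.Fatal(err)
	}
	log.Println("service registered; verify with GET /v1/agent/services")
}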
=== RUN   TestSnapshotCommand_noTabs
=== PAUSE TestSnapshotCommand_noTabs
=== CONT  TestSnapshotCommand_noTabs
--- PASS: TestSnapshotCommand_noTabs (0.00s)
PASS
ok  	github.com/hashicorp/consul/command/snapshot	0.040s
=== RUN   TestSnapshotInspectCommand_noTabs
=== PAUSE TestSnapshotInspectCommand_noTabs
=== RUN   TestSnapshotInspectCommand_Validation
=== PAUSE TestSnapshotInspectCommand_Validation
=== RUN   TestSnapshotInspectCommand
=== PAUSE TestSnapshotInspectCommand
=== CONT  TestSnapshotInspectCommand_noTabs
--- PASS: TestSnapshotInspectCommand_noTabs (0.00s)
=== CONT  TestSnapshotInspectCommand
=== CONT  TestSnapshotInspectCommand_Validation
--- PASS: TestSnapshotInspectCommand_Validation (0.00s)
WARNING: bootstrap = true: do not enable unless necessary
TestSnapshotInspectCommand - 2019/12/06 06:41:37.081163 [WARN] agent: Node name "Node 831a10ca-d404-f470-a9fc-6d02e7ab1a2c" will not be discoverable via DNS due to invalid characters. Valid characters include all alpha-numerics and dashes.
TestSnapshotInspectCommand - 2019/12/06 06:41:37.082262 [DEBUG] tlsutil: Update with version 1
TestSnapshotInspectCommand - 2019/12/06 06:41:37.089729 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
2019/12/06 06:41:37 [INFO]  raft: Initial configuration (index=1): [{Suffrage:Voter ID:831a10ca-d404-f470-a9fc-6d02e7ab1a2c Address:127.0.0.1:10006}]
TestSnapshotInspectCommand - 2019/12/06 06:41:37.925069 [INFO] serf: EventMemberJoin: Node 831a10ca-d404-f470-a9fc-6d02e7ab1a2c.dc1 127.0.0.1
2019/12/06 06:41:37 [INFO]  raft: Node at 127.0.0.1:10006 [Follower] entering Follower state (Leader: "")
TestSnapshotInspectCommand - 2019/12/06 06:41:37.938979 [INFO] serf: EventMemberJoin: Node 831a10ca-d404-f470-a9fc-6d02e7ab1a2c 127.0.0.1
TestSnapshotInspectCommand - 2019/12/06 06:41:37.941241 [INFO] agent: Started DNS server 127.0.0.1:10001 (udp)
TestSnapshotInspectCommand - 2019/12/06 06:41:37.941525 [INFO] consul: Handled member-join event for server "Node 831a10ca-d404-f470-a9fc-6d02e7ab1a2c.dc1" in area "wan"
TestSnapshotInspectCommand - 2019/12/06 06:41:37.951208 [INFO] consul: Adding LAN server Node 831a10ca-d404-f470-a9fc-6d02e7ab1a2c (Addr: tcp/127.0.0.1:10006) (DC: dc1)
TestSnapshotInspectCommand - 2019/12/06 06:41:37.952012 [INFO] agent: Started DNS server 127.0.0.1:10001 (tcp)
TestSnapshotInspectCommand - 2019/12/06 06:41:37.955093 [INFO] agent: Started HTTP server on 127.0.0.1:10002 (tcp)
TestSnapshotInspectCommand - 2019/12/06 06:41:37.955279 [INFO] agent: started state syncer
2019/12/06 06:41:37 [WARN]  raft: Heartbeat timeout from "" reached, starting election
2019/12/06 06:41:37 [INFO]  raft: Node at 127.0.0.1:10006 [Candidate] entering Candidate state in term 2
2019/12/06 06:41:38 [INFO]  raft: Election won. Tally: 1
2019/12/06 06:41:38 [INFO]  raft: Node at 127.0.0.1:10006 [Leader] entering Leader state
TestSnapshotInspectCommand - 2019/12/06 06:41:38.471924 [INFO] consul: cluster leadership acquired
TestSnapshotInspectCommand - 2019/12/06 06:41:38.472578 [INFO] consul: New leader elected: Node 831a10ca-d404-f470-a9fc-6d02e7ab1a2c
TestSnapshotInspectCommand - 2019/12/06 06:41:38.806109 [INFO] agent: Synced node info
TestSnapshotInspectCommand - 2019/12/06 06:41:38.806261 [DEBUG] agent: Node info in sync
TestSnapshotInspectCommand - 2019/12/06 06:41:39.713971 [INFO] connect: initialized primary datacenter CA with provider "consul"
TestSnapshotInspectCommand - 2019/12/06 06:41:39.714746 [DEBUG] consul: Skipping self join check for "Node 831a10ca-d404-f470-a9fc-6d02e7ab1a2c" since the cluster is too small
TestSnapshotInspectCommand - 2019/12/06 06:41:39.714999 [INFO] consul: member 'Node 831a10ca-d404-f470-a9fc-6d02e7ab1a2c' joined, marking health alive
2019/12/06 06:41:39 [INFO] consul.fsm: snapshot created in 252.005µs
TestSnapshotInspectCommand - 2019/12/06 06:41:39.835524 [DEBUG] agent: Node info in sync
2019/12/06 06:41:39 [INFO]  raft: Starting snapshot up to 9
2019/12/06 06:41:39 [INFO] snapshot: Creating new snapshot at /tmp/TestSnapshotInspectCommand-agent688414439/raft/snapshots/2-9-1575614499863.tmp
2019/12/06 06:41:40 [INFO]  raft: Snapshot to 9 complete
TestSnapshotInspectCommand - 2019/12/06 06:41:40.214928 [DEBUG] http: Request GET /v1/snapshot (1.351336359s) from=127.0.0.1:46768
TestSnapshotInspectCommand - 2019/12/06 06:41:40.223061 [INFO] agent: Requesting shutdown
TestSnapshotInspectCommand - 2019/12/06 06:41:40.223155 [INFO] consul: shutting down server
TestSnapshotInspectCommand - 2019/12/06 06:41:40.223202 [WARN] serf: Shutdown without a Leave
TestSnapshotInspectCommand - 2019/12/06 06:41:40.279810 [WARN] serf: Shutdown without a Leave
TestSnapshotInspectCommand - 2019/12/06 06:41:40.346509 [INFO] manager: shutting down
TestSnapshotInspectCommand - 2019/12/06 06:41:40.346927 [INFO] agent: consul server down
TestSnapshotInspectCommand - 2019/12/06 06:41:40.346977 [INFO] agent: shutdown complete
TestSnapshotInspectCommand - 2019/12/06 06:41:40.347031 [INFO] agent: Stopping DNS server 127.0.0.1:10001 (tcp)
TestSnapshotInspectCommand - 2019/12/06 06:41:40.347170 [INFO] agent: Stopping DNS server 127.0.0.1:10001 (udp)
TestSnapshotInspectCommand - 2019/12/06 06:41:40.347341 [INFO] agent: Stopping HTTP server 127.0.0.1:10002 (tcp)
TestSnapshotInspectCommand - 2019/12/06 06:41:40.348028 [INFO] agent: Waiting for endpoints to shut down
TestSnapshotInspectCommand - 2019/12/06 06:41:40.348125 [INFO] agent: Endpoints down
--- PASS: TestSnapshotInspectCommand (3.34s)
PASS
ok  	github.com/hashicorp/consul/command/snapshot/inspect	3.671s
=== RUN   TestSnapshotRestoreCommand_noTabs
=== PAUSE TestSnapshotRestoreCommand_noTabs
=== RUN   TestSnapshotRestoreCommand_Validation
=== PAUSE TestSnapshotRestoreCommand_Validation
=== RUN   TestSnapshotRestoreCommand
=== PAUSE TestSnapshotRestoreCommand
=== CONT  TestSnapshotRestoreCommand_noTabs
=== CONT  TestSnapshotRestoreCommand_Validation
--- PASS: TestSnapshotRestoreCommand_Validation (0.00s)
=== CONT  TestSnapshotRestoreCommand
--- PASS: TestSnapshotRestoreCommand_noTabs (0.01s)
WARNING: bootstrap = true: do not enable unless necessary
TestSnapshotRestoreCommand - 2019/12/06 06:41:43.840110 [WARN] agent: Node name "Node fcb931c2-ebdb-33d0-7c07-cd453c96f1c1" will not be discoverable via DNS due to invalid characters. Valid characters include all alpha-numerics and dashes.
TestSnapshotRestoreCommand - 2019/12/06 06:41:43.840846 [DEBUG] tlsutil: Update with version 1
TestSnapshotRestoreCommand - 2019/12/06 06:41:43.866646 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
2019/12/06 06:41:44 [INFO]  raft: Initial configuration (index=1): [{Suffrage:Voter ID:fcb931c2-ebdb-33d0-7c07-cd453c96f1c1 Address:127.0.0.1:35506}]
2019/12/06 06:41:44 [INFO]  raft: Node at 127.0.0.1:35506 [Follower] entering Follower state (Leader: "")
TestSnapshotRestoreCommand - 2019/12/06 06:41:44.627258 [INFO] serf: EventMemberJoin: Node fcb931c2-ebdb-33d0-7c07-cd453c96f1c1.dc1 127.0.0.1
TestSnapshotRestoreCommand - 2019/12/06 06:41:44.633502 [INFO] serf: EventMemberJoin: Node fcb931c2-ebdb-33d0-7c07-cd453c96f1c1 127.0.0.1
TestSnapshotRestoreCommand - 2019/12/06 06:41:44.635110 [INFO] consul: Adding LAN server Node fcb931c2-ebdb-33d0-7c07-cd453c96f1c1 (Addr: tcp/127.0.0.1:35506) (DC: dc1)
TestSnapshotRestoreCommand - 2019/12/06 06:41:44.638690 [INFO] consul: Handled member-join event for server "Node fcb931c2-ebdb-33d0-7c07-cd453c96f1c1.dc1" in area "wan"
TestSnapshotRestoreCommand - 2019/12/06 06:41:44.639691 [INFO] agent: Started DNS server 127.0.0.1:35501 (tcp)
TestSnapshotRestoreCommand - 2019/12/06 06:41:44.640121 [INFO] agent: Started DNS server 127.0.0.1:35501 (udp)
TestSnapshotRestoreCommand - 2019/12/06 06:41:44.643035 [INFO] agent: Started HTTP server on 127.0.0.1:35502 (tcp)
TestSnapshotRestoreCommand - 2019/12/06 06:41:44.643214 [INFO] agent: started state syncer
2019/12/06 06:41:44 [WARN]  raft: Heartbeat timeout from "" reached, starting election
2019/12/06 06:41:44 [INFO]  raft: Node at 127.0.0.1:35506 [Candidate] entering Candidate state in term 2
2019/12/06 06:41:45 [INFO]  raft: Election won. Tally: 1
2019/12/06 06:41:45 [INFO]  raft: Node at 127.0.0.1:35506 [Leader] entering Leader state
TestSnapshotRestoreCommand - 2019/12/06 06:41:45.106580 [INFO] consul: cluster leadership acquired
TestSnapshotRestoreCommand - 2019/12/06 06:41:45.107407 [INFO] consul: New leader elected: Node fcb931c2-ebdb-33d0-7c07-cd453c96f1c1
TestSnapshotRestoreCommand - 2019/12/06 06:41:45.389311 [INFO] agent: Synced node info
TestSnapshotRestoreCommand - 2019/12/06 06:41:45.389436 [DEBUG] agent: Node info in sync
TestSnapshotRestoreCommand - 2019/12/06 06:41:46.182588 [INFO] connect: initialized primary datacenter CA with provider "consul"
TestSnapshotRestoreCommand - 2019/12/06 06:41:46.183231 [DEBUG] consul: Skipping self join check for "Node fcb931c2-ebdb-33d0-7c07-cd453c96f1c1" since the cluster is too small
TestSnapshotRestoreCommand - 2019/12/06 06:41:46.183388 [INFO] consul: member 'Node fcb931c2-ebdb-33d0-7c07-cd453c96f1c1' joined, marking health alive
2019/12/06 06:41:46 [INFO] consul.fsm: snapshot created in 268.006µs
2019/12/06 06:41:46 [INFO]  raft: Starting snapshot up to 9
2019/12/06 06:41:46 [INFO] snapshot: Creating new snapshot at /tmp/TestSnapshotRestoreCommand-agent452894237/raft/snapshots/2-9-1575614506342.tmp
2019/12/06 06:41:46 [INFO]  raft: Snapshot to 9 complete
TestSnapshotRestoreCommand - 2019/12/06 06:41:46.616179 [DEBUG] http: Request GET /v1/snapshot (1.421707677s) from=127.0.0.1:55780
2019/12/06 06:41:46 [INFO] snapshot: Creating new snapshot at /tmp/TestSnapshotRestoreCommand-agent452894237/raft/snapshots/2-11-1575614506721.tmp
2019/12/06 06:41:46 [INFO]  raft: Copied 3506 bytes to local snapshot
2019/12/06 06:41:46 [INFO]  raft: Restored user snapshot (index 11)
TestSnapshotRestoreCommand - 2019/12/06 06:41:47.307354 [DEBUG] http: Request PUT /v1/snapshot (682.776347ms) from=127.0.0.1:55782
TestSnapshotRestoreCommand - 2019/12/06 06:41:47.309631 [INFO] agent: Requesting shutdown
TestSnapshotRestoreCommand - 2019/12/06 06:41:47.309738 [INFO] consul: shutting down server
TestSnapshotRestoreCommand - 2019/12/06 06:41:47.309785 [WARN] serf: Shutdown without a Leave
TestSnapshotRestoreCommand - 2019/12/06 06:41:47.363223 [WARN] serf: Shutdown without a Leave
TestSnapshotRestoreCommand - 2019/12/06 06:41:47.373225 [DEBUG] agent: Node info in sync
TestSnapshotRestoreCommand - 2019/12/06 06:41:47.404961 [INFO] manager: shutting down
TestSnapshotRestoreCommand - 2019/12/06 06:41:47.405445 [INFO] agent: consul server down
TestSnapshotRestoreCommand - 2019/12/06 06:41:47.405508 [INFO] agent: shutdown complete
TestSnapshotRestoreCommand - 2019/12/06 06:41:47.405564 [INFO] agent: Stopping DNS server 127.0.0.1:35501 (tcp)
TestSnapshotRestoreCommand - 2019/12/06 06:41:47.405714 [INFO] agent: Stopping DNS server 127.0.0.1:35501 (udp)
TestSnapshotRestoreCommand - 2019/12/06 06:41:47.405868 [INFO] agent: Stopping HTTP server 127.0.0.1:35502 (tcp)
TestSnapshotRestoreCommand - 2019/12/06 06:41:47.406629 [INFO] agent: Waiting for endpoints to shut down
TestSnapshotRestoreCommand - 2019/12/06 06:41:47.406862 [INFO] agent: Endpoints down
--- PASS: TestSnapshotRestoreCommand (3.70s)
PASS
ok  	github.com/hashicorp/consul/command/snapshot/restore	4.081s
=== RUN   TestSnapshotSaveCommand_noTabs
=== PAUSE TestSnapshotSaveCommand_noTabs
=== RUN   TestSnapshotSaveCommand_Validation
=== PAUSE TestSnapshotSaveCommand_Validation
=== RUN   TestSnapshotSaveCommand
=== PAUSE TestSnapshotSaveCommand
=== CONT  TestSnapshotSaveCommand_noTabs
=== CONT  TestSnapshotSaveCommand
--- PASS: TestSnapshotSaveCommand_noTabs (0.00s)
=== CONT  TestSnapshotSaveCommand_Validation
--- PASS: TestSnapshotSaveCommand_Validation (0.00s)
WARNING: bootstrap = true: do not enable unless necessary
TestSnapshotSaveCommand - 2019/12/06 06:42:07.661832 [WARN] agent: Node name "Node b6039ae8-e4c4-fd72-b39f-0d7b98c5c33b" will not be discoverable via DNS due to invalid characters. Valid characters include all alpha-numerics and dashes.
TestSnapshotSaveCommand - 2019/12/06 06:42:07.663193 [DEBUG] tlsutil: Update with version 1
TestSnapshotSaveCommand - 2019/12/06 06:42:07.670509 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
2019/12/06 06:42:08 [INFO]  raft: Initial configuration (index=1): [{Suffrage:Voter ID:b6039ae8-e4c4-fd72-b39f-0d7b98c5c33b Address:127.0.0.1:25006}]
TestSnapshotSaveCommand - 2019/12/06 06:42:08.403203 [INFO] serf: EventMemberJoin: Node b6039ae8-e4c4-fd72-b39f-0d7b98c5c33b.dc1 127.0.0.1
2019/12/06 06:42:08 [INFO]  raft: Node at 127.0.0.1:25006 [Follower] entering Follower state (Leader: "")
TestSnapshotSaveCommand - 2019/12/06 06:42:08.419385 [INFO] serf: EventMemberJoin: Node b6039ae8-e4c4-fd72-b39f-0d7b98c5c33b 127.0.0.1
TestSnapshotSaveCommand - 2019/12/06 06:42:08.421039 [INFO] consul: Adding LAN server Node b6039ae8-e4c4-fd72-b39f-0d7b98c5c33b (Addr: tcp/127.0.0.1:25006) (DC: dc1)
TestSnapshotSaveCommand - 2019/12/06 06:42:08.421531 [INFO] consul: Handled member-join event for server "Node b6039ae8-e4c4-fd72-b39f-0d7b98c5c33b.dc1" in area "wan"
TestSnapshotSaveCommand - 2019/12/06 06:42:08.425760 [INFO] agent: Started DNS server 127.0.0.1:25001 (tcp)
TestSnapshotSaveCommand - 2019/12/06 06:42:08.426052 [INFO] agent: Started DNS server 127.0.0.1:25001 (udp)
TestSnapshotSaveCommand - 2019/12/06 06:42:08.431123 [INFO] agent: Started HTTP server on 127.0.0.1:25002 (tcp)
TestSnapshotSaveCommand - 2019/12/06 06:42:08.431898 [INFO] agent: started state syncer
2019/12/06 06:42:08 [WARN]  raft: Heartbeat timeout from "" reached, starting election
2019/12/06 06:42:08 [INFO]  raft: Node at 127.0.0.1:25006 [Candidate] entering Candidate state in term 2
2019/12/06 06:42:09 [INFO]  raft: Election won. Tally: 1
2019/12/06 06:42:09 [INFO]  raft: Node at 127.0.0.1:25006 [Leader] entering Leader state
TestSnapshotSaveCommand - 2019/12/06 06:42:09.205775 [INFO] consul: cluster leadership acquired
TestSnapshotSaveCommand - 2019/12/06 06:42:09.206315 [INFO] consul: New leader elected: Node b6039ae8-e4c4-fd72-b39f-0d7b98c5c33b
TestSnapshotSaveCommand - 2019/12/06 06:42:09.716943 [INFO] agent: Synced node info
TestSnapshotSaveCommand - 2019/12/06 06:42:09.717063 [DEBUG] agent: Node info in sync
TestSnapshotSaveCommand - 2019/12/06 06:42:10.622921 [INFO] connect: initialized primary datacenter CA with provider "consul"
TestSnapshotSaveCommand - 2019/12/06 06:42:10.623430 [DEBUG] consul: Skipping self join check for "Node b6039ae8-e4c4-fd72-b39f-0d7b98c5c33b" since the cluster is too small
TestSnapshotSaveCommand - 2019/12/06 06:42:10.623595 [INFO] consul: member 'Node b6039ae8-e4c4-fd72-b39f-0d7b98c5c33b' joined, marking health alive
2019/12/06 06:42:10 [INFO] consul.fsm: snapshot created in 212.338µs
2019/12/06 06:42:10 [INFO]  raft: Starting snapshot up to 10
2019/12/06 06:42:10 [INFO] snapshot: Creating new snapshot at /tmp/TestSnapshotSaveCommand-agent828574018/raft/snapshots/2-10-1575614530959.tmp
2019/12/06 06:42:11 [INFO]  raft: Snapshot to 10 complete
TestSnapshotSaveCommand - 2019/12/06 06:42:11.299632 [DEBUG] http: Request GET /v1/snapshot (2.062406703s) from=127.0.0.1:59216
2019/12/06 06:42:11 [INFO] snapshot: Creating new snapshot at /tmp/TestSnapshotSaveCommand-agent828574018/raft/snapshots/2-11-1575614531388.tmp
2019/12/06 06:42:11 [INFO]  raft: Copied 4832 bytes to local snapshot
2019/12/06 06:42:11 [INFO]  raft: Restored user snapshot (index 11)
TestSnapshotSaveCommand - 2019/12/06 06:42:12.022791 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
TestSnapshotSaveCommand - 2019/12/06 06:42:12.029255 [DEBUG] http: Request PUT /v1/snapshot (719.777547ms) from=127.0.0.1:59218
TestSnapshotSaveCommand - 2019/12/06 06:42:12.031760 [INFO] agent: Requesting shutdown
TestSnapshotSaveCommand - 2019/12/06 06:42:12.031872 [INFO] consul: shutting down server
TestSnapshotSaveCommand - 2019/12/06 06:42:12.031947 [WARN] serf: Shutdown without a Leave
TestSnapshotSaveCommand - 2019/12/06 06:42:12.080347 [WARN] serf: Shutdown without a Leave
TestSnapshotSaveCommand - 2019/12/06 06:42:12.122082 [INFO] manager: shutting down
TestSnapshotSaveCommand - 2019/12/06 06:42:12.122973 [INFO] agent: consul server down
TestSnapshotSaveCommand - 2019/12/06 06:42:12.123043 [INFO] agent: shutdown complete
TestSnapshotSaveCommand - 2019/12/06 06:42:12.123146 [INFO] agent: Stopping DNS server 127.0.0.1:25001 (tcp)
TestSnapshotSaveCommand - 2019/12/06 06:42:12.123309 [INFO] agent: Stopping DNS server 127.0.0.1:25001 (udp)
TestSnapshotSaveCommand - 2019/12/06 06:42:12.123460 [INFO] agent: Stopping HTTP server 127.0.0.1:25002 (tcp)
TestSnapshotSaveCommand - 2019/12/06 06:42:12.124139 [INFO] agent: Waiting for endpoints to shut down
TestSnapshotSaveCommand - 2019/12/06 06:42:12.124315 [INFO] agent: Endpoints down
--- PASS: TestSnapshotSaveCommand (4.54s)
PASS
ok  	github.com/hashicorp/consul/command/snapshot/save	4.840s
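Note: the three snapshot suites above (inspect, restore, save) all stream through GET /v1/snapshot and PUT /v1/snapshot. A sketch of the same round trip with the Go client's Snapshot endpoint; the backup file name is illustrative:

package main

import (
	"io"
	"log"
	"os"

	"github.com/hashicorp/consul/api"
)

func main() {
	client, err := api.NewClient(api.DefaultConfig())
	if err != nil {
		log.Fatal(err)
	}

	// GET /v1/snapshot: stream a point-in-time snapshot of server state.
	snap, _, err := client.Snapshot().Save(nil)
	if err != nil {
		log.Fatal(err)
	}
	defer snap.Close()

	out, err := os.Create("backup.snap") // illustrative output path
	if err != nil {
		log.Fatal(err)
	}
	if _, err := io.Copy(out, snap); err != nil {
		log.Fatal(err)
	}
	out.Close()

	// PUT /v1/snapshot: restore the saved state back into the cluster.
	in, err := os.Open("backup.snap")
	if err != nil {
		log.Fatal(err)
	}
	defer in.Close()
	if err := client.Snapshot().Restore(nil, in); err != nil {
		log.Fatal(err)
	}
}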
=== RUN   TestValidateCommand_noTabs
=== PAUSE TestValidateCommand_noTabs
=== RUN   TestValidateCommand_FailOnEmptyFile
=== PAUSE TestValidateCommand_FailOnEmptyFile
=== RUN   TestValidateCommand_SucceedOnMinimalConfigFile
=== PAUSE TestValidateCommand_SucceedOnMinimalConfigFile
=== RUN   TestValidateCommand_SucceedWithMinimalJSONConfigFormat
=== PAUSE TestValidateCommand_SucceedWithMinimalJSONConfigFormat
=== RUN   TestValidateCommand_SucceedWithMinimalHCLConfigFormat
=== PAUSE TestValidateCommand_SucceedWithMinimalHCLConfigFormat
=== RUN   TestValidateCommand_SucceedWithJSONAsHCL
=== PAUSE TestValidateCommand_SucceedWithJSONAsHCL
=== RUN   TestValidateCommand_SucceedOnMinimalConfigDir
=== PAUSE TestValidateCommand_SucceedOnMinimalConfigDir
=== RUN   TestValidateCommand_FailForInvalidJSONConfigFormat
=== PAUSE TestValidateCommand_FailForInvalidJSONConfigFormat
=== RUN   TestValidateCommand_Quiet
=== PAUSE TestValidateCommand_Quiet
=== CONT  TestValidateCommand_noTabs
--- PASS: TestValidateCommand_noTabs (0.00s)
=== CONT  TestValidateCommand_Quiet
=== CONT  TestValidateCommand_SucceedWithMinimalHCLConfigFormat
=== CONT  TestValidateCommand_SucceedOnMinimalConfigFile
=== CONT  TestValidateCommand_FailForInvalidJSONConfigFormat
--- PASS: TestValidateCommand_FailForInvalidJSONConfigFormat (0.09s)
=== CONT  TestValidateCommand_SucceedWithMinimalJSONConfigFormat
--- PASS: TestValidateCommand_SucceedWithMinimalHCLConfigFormat (0.11s)
=== CONT  TestValidateCommand_FailOnEmptyFile
--- PASS: TestValidateCommand_Quiet (0.17s)
=== CONT  TestValidateCommand_SucceedOnMinimalConfigDir
--- PASS: TestValidateCommand_FailOnEmptyFile (0.05s)
--- PASS: TestValidateCommand_SucceedOnMinimalConfigFile (0.16s)
=== CONT  TestValidateCommand_SucceedWithJSONAsHCL
--- PASS: TestValidateCommand_SucceedWithMinimalJSONConfigFormat (0.09s)
--- PASS: TestValidateCommand_SucceedOnMinimalConfigDir (0.08s)
--- PASS: TestValidateCommand_SucceedWithJSONAsHCL (0.08s)
PASS
ok  	github.com/hashicorp/consul/command/validate	0.423s
=== RUN   TestVersionCommand_noTabs
=== PAUSE TestVersionCommand_noTabs
=== CONT  TestVersionCommand_noTabs
--- PASS: TestVersionCommand_noTabs (0.00s)
PASS
ok  	github.com/hashicorp/consul/command/version	0.170s
=== RUN   TestWatchCommand_noTabs
=== PAUSE TestWatchCommand_noTabs
=== RUN   TestWatchCommand
=== PAUSE TestWatchCommand
=== RUN   TestWatchCommand_loadToken
WARNING: bootstrap = true: do not enable unless necessary
TestWatchCommand_loadToken - 2019/12/06 06:42:34.117348 [WARN] agent: Node name "Node 4212a362-d66f-5abd-e5d3-367f19ae5008" will not be discoverable via DNS due to invalid characters. Valid characters include all alpha-numerics and dashes.
TestWatchCommand_loadToken - 2019/12/06 06:42:34.118665 [DEBUG] tlsutil: Update with version 1
TestWatchCommand_loadToken - 2019/12/06 06:42:34.130796 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
2019/12/06 06:42:34 [INFO]  raft: Initial configuration (index=1): [{Suffrage:Voter ID:4212a362-d66f-5abd-e5d3-367f19ae5008 Address:127.0.0.1:52006}]
TestWatchCommand_loadToken - 2019/12/06 06:42:34.911459 [INFO] serf: EventMemberJoin: Node 4212a362-d66f-5abd-e5d3-367f19ae5008.dc1 127.0.0.1
2019/12/06 06:42:34 [INFO]  raft: Node at 127.0.0.1:52006 [Follower] entering Follower state (Leader: "")
TestWatchCommand_loadToken - 2019/12/06 06:42:34.918401 [INFO] serf: EventMemberJoin: Node 4212a362-d66f-5abd-e5d3-367f19ae5008 127.0.0.1
TestWatchCommand_loadToken - 2019/12/06 06:42:34.921559 [INFO] consul: Adding LAN server Node 4212a362-d66f-5abd-e5d3-367f19ae5008 (Addr: tcp/127.0.0.1:52006) (DC: dc1)
TestWatchCommand_loadToken - 2019/12/06 06:42:34.922620 [INFO] consul: Handled member-join event for server "Node 4212a362-d66f-5abd-e5d3-367f19ae5008.dc1" in area "wan"
TestWatchCommand_loadToken - 2019/12/06 06:42:34.927720 [INFO] agent: Started DNS server 127.0.0.1:52001 (tcp)
TestWatchCommand_loadToken - 2019/12/06 06:42:34.927954 [INFO] agent: Started DNS server 127.0.0.1:52001 (udp)
TestWatchCommand_loadToken - 2019/12/06 06:42:34.931349 [INFO] agent: Started HTTP server on 127.0.0.1:52002 (tcp)
TestWatchCommand_loadToken - 2019/12/06 06:42:34.931570 [INFO] agent: started state syncer
2019/12/06 06:42:34 [WARN]  raft: Heartbeat timeout from "" reached, starting election
2019/12/06 06:42:34 [INFO]  raft: Node at 127.0.0.1:52006 [Candidate] entering Candidate state in term 2
2019/12/06 06:42:35 [INFO]  raft: Election won. Tally: 1
2019/12/06 06:42:35 [INFO]  raft: Node at 127.0.0.1:52006 [Leader] entering Leader state
TestWatchCommand_loadToken - 2019/12/06 06:42:35.423119 [INFO] consul: cluster leadership acquired
TestWatchCommand_loadToken - 2019/12/06 06:42:35.423705 [INFO] consul: New leader elected: Node 4212a362-d66f-5abd-e5d3-367f19ae5008
TestWatchCommand_loadToken - 2019/12/06 06:42:35.708615 [INFO] agent: Synced node info
TestWatchCommand_loadToken - 2019/12/06 06:42:36.434833 [DEBUG] agent: Node info in sync
TestWatchCommand_loadToken - 2019/12/06 06:42:36.434950 [DEBUG] agent: Node info in sync
TestWatchCommand_loadToken - 2019/12/06 06:42:36.498059 [INFO] connect: initialized primary datacenter CA with provider "consul"
TestWatchCommand_loadToken - 2019/12/06 06:42:36.498527 [DEBUG] consul: Skipping self join check for "Node 4212a362-d66f-5abd-e5d3-367f19ae5008" since the cluster is too small
TestWatchCommand_loadToken - 2019/12/06 06:42:36.498760 [INFO] consul: member 'Node 4212a362-d66f-5abd-e5d3-367f19ae5008' joined, marking health alive
=== RUN   TestWatchCommand_loadToken/token_arg
=== RUN   TestWatchCommand_loadToken/token_env
=== RUN   TestWatchCommand_loadToken/token_file_arg
=== RUN   TestWatchCommand_loadToken/token_file_env
TestWatchCommand_loadToken - 2019/12/06 06:42:36.690736 [INFO] agent: Requesting shutdown
TestWatchCommand_loadToken - 2019/12/06 06:42:36.690844 [INFO] consul: shutting down server
TestWatchCommand_loadToken - 2019/12/06 06:42:36.690901 [WARN] serf: Shutdown without a Leave
TestWatchCommand_loadToken - 2019/12/06 06:42:36.739049 [WARN] serf: Shutdown without a Leave
TestWatchCommand_loadToken - 2019/12/06 06:42:36.780723 [INFO] manager: shutting down
TestWatchCommand_loadToken - 2019/12/06 06:42:36.781120 [INFO] agent: consul server down
TestWatchCommand_loadToken - 2019/12/06 06:42:36.781171 [INFO] agent: shutdown complete
TestWatchCommand_loadToken - 2019/12/06 06:42:36.781229 [INFO] agent: Stopping DNS server 127.0.0.1:52001 (tcp)
TestWatchCommand_loadToken - 2019/12/06 06:42:36.781361 [INFO] agent: Stopping DNS server 127.0.0.1:52001 (udp)
TestWatchCommand_loadToken - 2019/12/06 06:42:36.781525 [INFO] agent: Stopping HTTP server 127.0.0.1:52002 (tcp)
TestWatchCommand_loadToken - 2019/12/06 06:42:36.781743 [INFO] agent: Waiting for endpoints to shut down
TestWatchCommand_loadToken - 2019/12/06 06:42:36.781801 [INFO] agent: Endpoints down
--- PASS: TestWatchCommand_loadToken (2.78s)
    --- PASS: TestWatchCommand_loadToken/token_arg (0.00s)
    --- PASS: TestWatchCommand_loadToken/token_env (0.02s)
    --- PASS: TestWatchCommand_loadToken/token_file_arg (0.01s)
    --- PASS: TestWatchCommand_loadToken/token_file_env (0.00s)
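Note: the loadToken subtests cover the ways the watch command picks up an ACL token (flag, environment, token file flag, token file environment). With the Go client the rough equivalent is setting Token on the client config, or exporting CONSUL_HTTP_TOKEN before the default config is built; the token value below is a placeholder:

package main

import (
	"log"

	"github.com/hashicorp/consul/api"
)

func main() {
	cfg := api.DefaultConfig() // also honours CONSUL_HTTP_TOKEN if exported
	cfg.Token = "00000000-0000-0000-0000-000000000000" // placeholder token
	client, err := api.NewClient(cfg)
	if err != nil {
		log.Fatal(err)
	}

	// Any request made with this client now carries the token, e.g. the
	// GET /v1/agent/self call the watch command performs at startup.
	if _, err := client.Agent().Self(); err != nil {
		log.Fatal(err)
	}
}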
=== RUN   TestWatchCommandNoConnect
=== PAUSE TestWatchCommandNoConnect
=== RUN   TestWatchCommandNoAgentService
=== PAUSE TestWatchCommandNoAgentService
=== CONT  TestWatchCommand
=== CONT  TestWatchCommandNoAgentService
=== CONT  TestWatchCommand_noTabs
=== CONT  TestWatchCommandNoConnect
--- PASS: TestWatchCommand_noTabs (0.00s)
WARNING: bootstrap = true: do not enable unless necessary
TestWatchCommandNoConnect - 2019/12/06 06:42:36.896740 [WARN] agent: Node name "Node 83e30bd5-feaf-e08e-664d-676236d3a750" will not be discoverable via DNS due to invalid characters. Valid characters include all alpha-numerics and dashes.
TestWatchCommandNoConnect - 2019/12/06 06:42:36.897230 [DEBUG] tlsutil: Update with version 1
TestWatchCommandNoConnect - 2019/12/06 06:42:36.899521 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
WARNING: bootstrap = true: do not enable unless necessary
WARNING: bootstrap = true: do not enable unless necessary
TestWatchCommand - 2019/12/06 06:42:36.984064 [WARN] agent: Node name "Node 4e0c8313-f314-c08e-5bd3-593917254c80" will not be discoverable via DNS due to invalid characters. Valid characters include all alpha-numerics and dashes.
TestWatchCommand - 2019/12/06 06:42:36.984565 [DEBUG] tlsutil: Update with version 1
TestWatchCommandNoAgentService - 2019/12/06 06:42:36.985494 [WARN] agent: Node name "Node bd12f1e1-2c3b-cdf1-b252-ac98223fdf75" will not be discoverable via DNS due to invalid characters. Valid characters include all alpha-numerics and dashes.
TestWatchCommand - 2019/12/06 06:42:36.986902 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
TestWatchCommandNoAgentService - 2019/12/06 06:42:36.987607 [DEBUG] tlsutil: Update with version 1
TestWatchCommandNoAgentService - 2019/12/06 06:42:36.995603 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
2019/12/06 06:42:37 [INFO]  raft: Initial configuration (index=1): [{Suffrage:Voter ID:83e30bd5-feaf-e08e-664d-676236d3a750 Address:127.0.0.1:52024}]
2019/12/06 06:42:37 [INFO]  raft: Node at 127.0.0.1:52024 [Follower] entering Follower state (Leader: "")
TestWatchCommandNoConnect - 2019/12/06 06:42:37.902382 [INFO] serf: EventMemberJoin: Node 83e30bd5-feaf-e08e-664d-676236d3a750.dc1 127.0.0.1
TestWatchCommandNoConnect - 2019/12/06 06:42:37.908621 [INFO] serf: EventMemberJoin: Node 83e30bd5-feaf-e08e-664d-676236d3a750 127.0.0.1
TestWatchCommandNoConnect - 2019/12/06 06:42:37.909915 [INFO] consul: Adding LAN server Node 83e30bd5-feaf-e08e-664d-676236d3a750 (Addr: tcp/127.0.0.1:52024) (DC: dc1)
TestWatchCommandNoConnect - 2019/12/06 06:42:37.910773 [INFO] consul: Handled member-join event for server "Node 83e30bd5-feaf-e08e-664d-676236d3a750.dc1" in area "wan"
TestWatchCommandNoConnect - 2019/12/06 06:42:37.913594 [INFO] agent: Started DNS server 127.0.0.1:52019 (tcp)
TestWatchCommandNoConnect - 2019/12/06 06:42:37.913826 [INFO] agent: Started DNS server 127.0.0.1:52019 (udp)
TestWatchCommandNoConnect - 2019/12/06 06:42:37.924739 [INFO] agent: Started HTTP server on 127.0.0.1:52020 (tcp)
TestWatchCommandNoConnect - 2019/12/06 06:42:37.926980 [INFO] agent: started state syncer
2019/12/06 06:42:37 [WARN]  raft: Heartbeat timeout from "" reached, starting election
2019/12/06 06:42:37 [INFO]  raft: Node at 127.0.0.1:52024 [Candidate] entering Candidate state in term 2
2019/12/06 06:42:38 [INFO]  raft: Initial configuration (index=1): [{Suffrage:Voter ID:bd12f1e1-2c3b-cdf1-b252-ac98223fdf75 Address:127.0.0.1:52018}]
2019/12/06 06:42:38 [INFO]  raft: Node at 127.0.0.1:52018 [Follower] entering Follower state (Leader: "")
2019/12/06 06:42:38 [WARN]  raft: Heartbeat timeout from "" reached, starting election
2019/12/06 06:42:38 [INFO]  raft: Node at 127.0.0.1:52018 [Candidate] entering Candidate state in term 2
2019/12/06 06:42:38 [INFO]  raft: Initial configuration (index=1): [{Suffrage:Voter ID:4e0c8313-f314-c08e-5bd3-593917254c80 Address:127.0.0.1:52012}]
2019/12/06 06:42:38 [INFO]  raft: Node at 127.0.0.1:52012 [Follower] entering Follower state (Leader: "")
TestWatchCommand - 2019/12/06 06:42:38.383982 [INFO] serf: EventMemberJoin: Node 4e0c8313-f314-c08e-5bd3-593917254c80.dc1 127.0.0.1
2019/12/06 06:42:38 [WARN]  raft: Heartbeat timeout from "" reached, starting election
2019/12/06 06:42:38 [INFO]  raft: Node at 127.0.0.1:52012 [Candidate] entering Candidate state in term 2
TestWatchCommand - 2019/12/06 06:42:38.405412 [INFO] serf: EventMemberJoin: Node 4e0c8313-f314-c08e-5bd3-593917254c80 127.0.0.1
TestWatchCommand - 2019/12/06 06:42:38.409044 [INFO] agent: Started DNS server 127.0.0.1:52007 (udp)
TestWatchCommand - 2019/12/06 06:42:38.415292 [INFO] agent: Started DNS server 127.0.0.1:52007 (tcp)
TestWatchCommand - 2019/12/06 06:42:38.412580 [INFO] consul: Handled member-join event for server "Node 4e0c8313-f314-c08e-5bd3-593917254c80.dc1" in area "wan"
TestWatchCommand - 2019/12/06 06:42:38.413850 [INFO] consul: Adding LAN server Node 4e0c8313-f314-c08e-5bd3-593917254c80 (Addr: tcp/127.0.0.1:52012) (DC: dc1)
TestWatchCommand - 2019/12/06 06:42:38.420756 [INFO] agent: Started HTTP server on 127.0.0.1:52008 (tcp)
TestWatchCommand - 2019/12/06 06:42:38.421997 [INFO] agent: started state syncer
TestWatchCommandNoAgentService - 2019/12/06 06:42:38.434096 [INFO] serf: EventMemberJoin: Node bd12f1e1-2c3b-cdf1-b252-ac98223fdf75.dc1 127.0.0.1
TestWatchCommandNoAgentService - 2019/12/06 06:42:38.457539 [INFO] serf: EventMemberJoin: Node bd12f1e1-2c3b-cdf1-b252-ac98223fdf75 127.0.0.1
TestWatchCommandNoAgentService - 2019/12/06 06:42:38.458382 [INFO] consul: Adding LAN server Node bd12f1e1-2c3b-cdf1-b252-ac98223fdf75 (Addr: tcp/127.0.0.1:52018) (DC: dc1)
TestWatchCommandNoAgentService - 2019/12/06 06:42:38.458441 [INFO] consul: Handled member-join event for server "Node bd12f1e1-2c3b-cdf1-b252-ac98223fdf75.dc1" in area "wan"
TestWatchCommandNoAgentService - 2019/12/06 06:42:38.459246 [INFO] agent: Started DNS server 127.0.0.1:52013 (tcp)
TestWatchCommandNoAgentService - 2019/12/06 06:42:38.459327 [INFO] agent: Started DNS server 127.0.0.1:52013 (udp)
TestWatchCommandNoAgentService - 2019/12/06 06:42:38.461536 [INFO] agent: Started HTTP server on 127.0.0.1:52014 (tcp)
TestWatchCommandNoAgentService - 2019/12/06 06:42:38.461827 [INFO] agent: started state syncer
2019/12/06 06:42:38 [INFO]  raft: Election won. Tally: 1
2019/12/06 06:42:38 [INFO]  raft: Node at 127.0.0.1:52024 [Leader] entering Leader state
TestWatchCommandNoConnect - 2019/12/06 06:42:38.749059 [INFO] consul: cluster leadership acquired
TestWatchCommandNoConnect - 2019/12/06 06:42:38.749558 [INFO] consul: New leader elected: Node 83e30bd5-feaf-e08e-664d-676236d3a750
2019/12/06 06:42:39 [INFO]  raft: Election won. Tally: 1
2019/12/06 06:42:39 [INFO]  raft: Node at 127.0.0.1:52018 [Leader] entering Leader state
TestWatchCommandNoAgentService - 2019/12/06 06:42:39.144884 [INFO] consul: cluster leadership acquired
TestWatchCommandNoAgentService - 2019/12/06 06:42:39.145424 [INFO] consul: New leader elected: Node bd12f1e1-2c3b-cdf1-b252-ac98223fdf75
2019/12/06 06:42:39 [INFO]  raft: Election won. Tally: 1
2019/12/06 06:42:39 [INFO]  raft: Node at 127.0.0.1:52012 [Leader] entering Leader state
TestWatchCommand - 2019/12/06 06:42:39.146646 [INFO] consul: cluster leadership acquired
TestWatchCommand - 2019/12/06 06:42:39.147049 [INFO] consul: New leader elected: Node 4e0c8313-f314-c08e-5bd3-593917254c80
TestWatchCommandNoAgentService - 2019/12/06 06:42:39.252443 [INFO] agent: Requesting shutdown
TestWatchCommandNoAgentService - 2019/12/06 06:42:39.252579 [INFO] consul: shutting down server
TestWatchCommandNoAgentService - 2019/12/06 06:42:39.252635 [WARN] serf: Shutdown without a Leave
TestWatchCommandNoAgentService - 2019/12/06 06:42:39.253100 [ERR] agent: failed to sync remote state: No cluster leader
TestWatchCommandNoConnect - 2019/12/06 06:42:39.281775 [INFO] agent: Synced node info
TestWatchCommandNoAgentService - 2019/12/06 06:42:39.372437 [WARN] serf: Shutdown without a Leave
TestWatchCommandNoAgentService - 2019/12/06 06:42:39.447408 [INFO] manager: shutting down
TestWatchCommandNoAgentService - 2019/12/06 06:42:39.447588 [ERR] consul: failed to wait for barrier: raft is already shutdown
TestWatchCommandNoAgentService - 2019/12/06 06:42:39.450630 [INFO] agent: consul server down
TestWatchCommandNoAgentService - 2019/12/06 06:42:39.450708 [INFO] agent: shutdown complete
TestWatchCommandNoAgentService - 2019/12/06 06:42:39.450769 [INFO] agent: Stopping DNS server 127.0.0.1:52013 (tcp)
TestWatchCommandNoAgentService - 2019/12/06 06:42:39.450916 [INFO] agent: Stopping DNS server 127.0.0.1:52013 (udp)
TestWatchCommandNoAgentService - 2019/12/06 06:42:39.451082 [INFO] agent: Stopping HTTP server 127.0.0.1:52014 (tcp)
TestWatchCommandNoAgentService - 2019/12/06 06:42:39.451313 [INFO] agent: Waiting for endpoints to shut down
TestWatchCommandNoAgentService - 2019/12/06 06:42:39.451387 [INFO] agent: Endpoints down
--- PASS: TestWatchCommandNoAgentService (2.67s)
TestWatchCommand - 2019/12/06 06:42:40.565069 [INFO] agent: Synced node info
TestWatchCommand - 2019/12/06 06:42:40.565203 [DEBUG] agent: Node info in sync
TestWatchCommandNoConnect - 2019/12/06 06:42:40.914839 [DEBUG] agent: Node info in sync
TestWatchCommandNoConnect - 2019/12/06 06:42:40.914975 [DEBUG] agent: Node info in sync
TestWatchCommandNoConnect - 2019/12/06 06:42:41.489933 [INFO] connect: initialized primary datacenter CA with provider "consul"
TestWatchCommandNoConnect - 2019/12/06 06:42:41.490422 [DEBUG] consul: Skipping self join check for "Node 83e30bd5-feaf-e08e-664d-676236d3a750" since the cluster is too small
TestWatchCommandNoConnect - 2019/12/06 06:42:41.490592 [INFO] consul: member 'Node 83e30bd5-feaf-e08e-664d-676236d3a750' joined, marking health alive
TestWatchCommandNoConnect - 2019/12/06 06:42:41.764651 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
TestWatchCommandNoConnect - 2019/12/06 06:42:41.779906 [INFO] agent: Requesting shutdown
TestWatchCommandNoConnect - 2019/12/06 06:42:41.780377 [INFO] consul: shutting down server
TestWatchCommandNoConnect - 2019/12/06 06:42:41.780808 [WARN] serf: Shutdown without a Leave
TestWatchCommandNoConnect - 2019/12/06 06:42:41.880741 [WARN] serf: Shutdown without a Leave
TestWatchCommandNoConnect - 2019/12/06 06:42:41.947501 [INFO] manager: shutting down
TestWatchCommandNoConnect - 2019/12/06 06:42:41.948338 [INFO] agent: consul server down
TestWatchCommandNoConnect - 2019/12/06 06:42:41.948405 [INFO] agent: shutdown complete
TestWatchCommandNoConnect - 2019/12/06 06:42:41.948469 [INFO] agent: Stopping DNS server 127.0.0.1:52019 (tcp)
TestWatchCommandNoConnect - 2019/12/06 06:42:41.948609 [INFO] agent: Stopping DNS server 127.0.0.1:52019 (udp)
TestWatchCommandNoConnect - 2019/12/06 06:42:41.948776 [INFO] agent: Stopping HTTP server 127.0.0.1:52020 (tcp)
TestWatchCommandNoConnect - 2019/12/06 06:42:41.949034 [INFO] agent: Waiting for endpoints to shut down
TestWatchCommandNoConnect - 2019/12/06 06:42:41.949124 [INFO] agent: Endpoints down
--- PASS: TestWatchCommandNoConnect (5.16s)
TestWatchCommand - 2019/12/06 06:42:42.215155 [INFO] connect: initialized primary datacenter CA with provider "consul"
TestWatchCommand - 2019/12/06 06:42:42.215691 [DEBUG] consul: Skipping self join check for "Node 4e0c8313-f314-c08e-5bd3-593917254c80" since the cluster is too small
TestWatchCommand - 2019/12/06 06:42:42.215886 [INFO] consul: member 'Node 4e0c8313-f314-c08e-5bd3-593917254c80' joined, marking health alive
TestWatchCommand - 2019/12/06 06:42:42.419380 [DEBUG] http: Request GET /v1/agent/self (17.865086ms) from=127.0.0.1:53102
TestWatchCommand - 2019/12/06 06:42:42.452907 [DEBUG] http: Request GET /v1/catalog/nodes (1.631039ms) from=127.0.0.1:53104
TestWatchCommand - 2019/12/06 06:42:42.455435 [INFO] agent: Requesting shutdown
TestWatchCommand - 2019/12/06 06:42:42.455556 [INFO] consul: shutting down server
TestWatchCommand - 2019/12/06 06:42:42.455622 [WARN] serf: Shutdown without a Leave
TestWatchCommand - 2019/12/06 06:42:42.514116 [WARN] serf: Shutdown without a Leave
TestWatchCommand - 2019/12/06 06:42:42.564324 [INFO] manager: shutting down
TestWatchCommand - 2019/12/06 06:42:42.564774 [INFO] agent: consul server down
TestWatchCommand - 2019/12/06 06:42:42.564831 [INFO] agent: shutdown complete
TestWatchCommand - 2019/12/06 06:42:42.564889 [INFO] agent: Stopping DNS server 127.0.0.1:52007 (tcp)
TestWatchCommand - 2019/12/06 06:42:42.565036 [INFO] agent: Stopping DNS server 127.0.0.1:52007 (udp)
TestWatchCommand - 2019/12/06 06:42:42.565207 [INFO] agent: Stopping HTTP server 127.0.0.1:52008 (tcp)
TestWatchCommand - 2019/12/06 06:42:42.565977 [INFO] agent: Waiting for endpoints to shut down
TestWatchCommand - 2019/12/06 06:42:42.566069 [INFO] agent: Endpoints down
--- PASS: TestWatchCommand (5.78s)
PASS
ok  	github.com/hashicorp/consul/command/watch	8.875s
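Note: TestWatchCommand above runs a catalog watch against a throwaway agent (the GET /v1/catalog/nodes request in the log). A watch is built on blocking queries; a minimal sketch of that mechanism with the Go client, shown as an endless loop for illustration only:

package main

import (
	"log"
	"time"

	"github.com/hashicorp/consul/api"
)

func main() {
	client, err := api.NewClient(api.DefaultConfig())
	if err != nil {
		log.Fatal(err)
	}

	var lastIndex uint64
	for {
		// GET /v1/catalog/nodes as a blocking query: the call returns
		// only when the Raft index moves past lastIndex or the wait expires.
		nodes, meta, err := client.Catalog().Nodes(&api.QueryOptions{
			WaitIndex: lastIndex,
			WaitTime:  time.Minute,
		})
		if err != nil {
			log.Println("watch error:", err)
			time.Sleep(time.Second)
			continue
		}
		lastIndex = meta.LastIndex
		log.Printf("node list changed: %d nodes", len(nodes))
	}
}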
=== RUN   TestStaticResolver_Resolve
=== RUN   TestStaticResolver_Resolve/simples
--- PASS: TestStaticResolver_Resolve (0.00s)
    --- PASS: TestStaticResolver_Resolve/simples (0.00s)
=== RUN   TestConsulResolver_Resolve
WARNING: bootstrap = true: do not enable unless necessary
test-consul - 2019/12/06 06:42:51.594460 [WARN] agent: Node name "Node 1002d028-4730-b8d7-f698-78361e82028f" will not be discoverable via DNS due to invalid characters. Valid characters include all alpha-numerics and dashes.
test-consul - 2019/12/06 06:42:51.595271 [DEBUG] tlsutil: Update with version 1
test-consul - 2019/12/06 06:42:51.601566 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
2019/12/06 06:42:52 [INFO]  raft: Initial configuration (index=1): [{Suffrage:Voter ID:1002d028-4730-b8d7-f698-78361e82028f Address:127.0.0.1:49006}]
2019/12/06 06:42:52 [INFO]  raft: Node at 127.0.0.1:49006 [Follower] entering Follower state (Leader: "")
test-consul - 2019/12/06 06:42:52.736124 [INFO] serf: EventMemberJoin: Node 1002d028-4730-b8d7-f698-78361e82028f.dc1 127.0.0.1
test-consul - 2019/12/06 06:42:52.740819 [INFO] serf: EventMemberJoin: Node 1002d028-4730-b8d7-f698-78361e82028f 127.0.0.1
test-consul - 2019/12/06 06:42:52.743604 [INFO] agent: Started DNS server 127.0.0.1:49001 (udp)
test-consul - 2019/12/06 06:42:52.743848 [INFO] consul: Handled member-join event for server "Node 1002d028-4730-b8d7-f698-78361e82028f.dc1" in area "wan"
test-consul - 2019/12/06 06:42:52.744337 [INFO] agent: Started DNS server 127.0.0.1:49001 (tcp)
test-consul - 2019/12/06 06:42:52.746606 [INFO] agent: Started HTTP server on 127.0.0.1:49002 (tcp)
test-consul - 2019/12/06 06:42:52.746726 [INFO] agent: started state syncer
test-consul - 2019/12/06 06:42:52.747814 [INFO] consul: Adding LAN server Node 1002d028-4730-b8d7-f698-78361e82028f (Addr: tcp/127.0.0.1:49006) (DC: dc1)
2019/12/06 06:42:52 [WARN]  raft: Heartbeat timeout from "" reached, starting election
2019/12/06 06:42:52 [INFO]  raft: Node at 127.0.0.1:49006 [Candidate] entering Candidate state in term 2
2019/12/06 06:42:53 [INFO]  raft: Election won. Tally: 1
2019/12/06 06:42:53 [INFO]  raft: Node at 127.0.0.1:49006 [Leader] entering Leader state
test-consul - 2019/12/06 06:42:53.247953 [INFO] consul: cluster leadership acquired
test-consul - 2019/12/06 06:42:53.248488 [INFO] consul: New leader elected: Node 1002d028-4730-b8d7-f698-78361e82028f
test-consul - 2019/12/06 06:42:53.590263 [INFO] agent: Synced node info
test-consul - 2019/12/06 06:42:53.927893 [INFO] agent: Synced service "web"
test-consul - 2019/12/06 06:42:53.927976 [DEBUG] agent: Node info in sync
test-consul - 2019/12/06 06:42:53.928069 [DEBUG] http: Request PUT /v1/agent/service/register (530.001763ms) from=127.0.0.1:54628
test-consul - 2019/12/06 06:42:53.953363 [ERR] leaf watch error: invalid type for leaf response: <nil>
test-consul - 2019/12/06 06:42:54.049385 [DEBUG] agent: Service "web" in sync
test-consul - 2019/12/06 06:42:54.449411 [INFO] agent: Synced service "web-proxy"
test-consul - 2019/12/06 06:42:54.449489 [DEBUG] agent: Node info in sync
test-consul - 2019/12/06 06:42:54.449558 [DEBUG] http: Request PUT /v1/agent/service/register (507.095893ms) from=127.0.0.1:54628
test-consul - 2019/12/06 06:42:54.452977 [ERR] leaf watch error: invalid type for leaf response: <nil>
test-consul - 2019/12/06 06:42:54.566290 [DEBUG] agent: Service "web" in sync
test-consul - 2019/12/06 06:42:54.566364 [DEBUG] agent: Service "web-proxy" in sync
test-consul - 2019/12/06 06:42:55.124415 [INFO] connect: initialized primary datacenter CA with provider "consul"
test-consul - 2019/12/06 06:42:55.124955 [DEBUG] consul: Skipping self join check for "Node 1002d028-4730-b8d7-f698-78361e82028f" since the cluster is too small
test-consul - 2019/12/06 06:42:55.125109 [INFO] consul: member 'Node 1002d028-4730-b8d7-f698-78361e82028f' joined, marking health alive
test-consul - 2019/12/06 06:42:55.126294 [INFO] agent: Synced service "web-proxy-2"
test-consul - 2019/12/06 06:42:55.126343 [DEBUG] agent: Node info in sync
test-consul - 2019/12/06 06:42:55.126403 [DEBUG] http: Request PUT /v1/agent/service/register (675.420507ms) from=127.0.0.1:54628
test-consul - 2019/12/06 06:42:55.199225 [DEBUG] agent: Service "web" in sync
test-consul - 2019/12/06 06:42:55.199322 [DEBUG] agent: Service "web-proxy" in sync
test-consul - 2019/12/06 06:42:55.199366 [DEBUG] agent: Service "web-proxy-2" in sync
test-consul - 2019/12/06 06:42:55.419817 [INFO] agent: Synced service "db"
test-consul - 2019/12/06 06:42:55.419891 [DEBUG] agent: Node info in sync
test-consul - 2019/12/06 06:42:55.419956 [DEBUG] http: Request PUT /v1/agent/service/register (289.660127ms) from=127.0.0.1:54628
test-consul - 2019/12/06 06:42:55.591976 [DEBUG] http: Request POST /v1/query (169.376639ms) from=127.0.0.1:54628
=== RUN   TestConsulResolver_Resolve/basic_service_discovery
test-consul - 2019/12/06 06:42:55.598660 [DEBUG] http: Request GET /v1/health/connect/web?connect=true&passing=1&stale= (3.374079ms) from=127.0.0.1:54628
=== RUN   TestConsulResolver_Resolve/basic_service_with_native_service
test-consul - 2019/12/06 06:42:55.606106 [DEBUG] http: Request GET /v1/health/connect/db?connect=true&passing=1&stale= (1.236029ms) from=127.0.0.1:54628
=== RUN   TestConsulResolver_Resolve/Bad_Type_errors
=== RUN   TestConsulResolver_Resolve/Non-existent_service_errors
test-consul - 2019/12/06 06:42:55.632187 [DEBUG] http: Request GET /v1/health/connect/foo?connect=true&passing=1&stale= (21.340834ms) from=127.0.0.1:54628
=== RUN   TestConsulResolver_Resolve/timeout_errors
=== RUN   TestConsulResolver_Resolve/prepared_query_by_id
test-consul - 2019/12/06 06:42:55.640110 [DEBUG] agent: Skipping remote check "serfHealth" since it is managed automatically
test-consul - 2019/12/06 06:42:55.640367 [DEBUG] agent: Service "web-proxy" in sync
test-consul - 2019/12/06 06:42:55.640505 [DEBUG] agent: Service "web-proxy-2" in sync
test-consul - 2019/12/06 06:42:55.640622 [DEBUG] agent: Service "db" in sync
test-consul - 2019/12/06 06:42:55.640779 [DEBUG] agent: Service "web" in sync
test-consul - 2019/12/06 06:42:55.640899 [DEBUG] agent: Node info in sync
test-consul - 2019/12/06 06:42:55.641073 [DEBUG] agent: Service "web" in sync
test-consul - 2019/12/06 06:42:55.641215 [DEBUG] agent: Service "web-proxy" in sync
test-consul - 2019/12/06 06:42:55.641346 [DEBUG] agent: Service "web-proxy-2" in sync
test-consul - 2019/12/06 06:42:55.641473 [DEBUG] agent: Service "db" in sync
test-consul - 2019/12/06 06:42:55.641589 [DEBUG] agent: Node info in sync
test-consul - 2019/12/06 06:42:55.645229 [DEBUG] http: Request GET /v1/query/f93e2edf-83c1-a537-4b34-2a3740f0d28b/execute?connect=true&stale= (9.25155ms) from=127.0.0.1:54628
=== RUN   TestConsulResolver_Resolve/prepared_query_by_name
test-consul - 2019/12/06 06:42:55.656459 [DEBUG] http: Request GET /v1/query/test-query/execute?connect=true&stale= (3.008404ms) from=127.0.0.1:54628
test-consul - 2019/12/06 06:42:55.661861 [INFO] agent: Requesting shutdown
test-consul - 2019/12/06 06:42:55.661985 [INFO] consul: shutting down server
test-consul - 2019/12/06 06:42:55.662034 [WARN] serf: Shutdown without a Leave
test-consul - 2019/12/06 06:42:55.746115 [WARN] serf: Shutdown without a Leave
test-consul - 2019/12/06 06:42:55.814549 [INFO] manager: shutting down
test-consul - 2019/12/06 06:42:55.815083 [INFO] agent: consul server down
test-consul - 2019/12/06 06:42:55.815140 [INFO] agent: shutdown complete
test-consul - 2019/12/06 06:42:55.815208 [INFO] agent: Stopping DNS server 127.0.0.1:49001 (tcp)
test-consul - 2019/12/06 06:42:55.815340 [INFO] agent: Stopping DNS server 127.0.0.1:49001 (udp)
test-consul - 2019/12/06 06:42:55.815487 [INFO] agent: Stopping HTTP server 127.0.0.1:49002 (tcp)
test-consul - 2019/12/06 06:42:55.816036 [INFO] agent: Waiting for endpoints to shut down
test-consul - 2019/12/06 06:42:55.816105 [INFO] agent: Endpoints down
--- PASS: TestConsulResolver_Resolve (4.28s)
    --- PASS: TestConsulResolver_Resolve/basic_service_discovery (0.01s)
    --- PASS: TestConsulResolver_Resolve/basic_service_with_native_service (0.01s)
    --- PASS: TestConsulResolver_Resolve/Bad_Type_errors (0.00s)
    --- PASS: TestConsulResolver_Resolve/Non-existent_service_errors (0.02s)
    --- PASS: TestConsulResolver_Resolve/timeout_errors (0.00s)
    --- PASS: TestConsulResolver_Resolve/prepared_query_by_id (0.02s)
    --- PASS: TestConsulResolver_Resolve/prepared_query_by_name (0.01s)
=== RUN   TestConsulResolverFromAddrFunc
=== RUN   TestConsulResolverFromAddrFunc/service
=== RUN   TestConsulResolverFromAddrFunc/query
=== RUN   TestConsulResolverFromAddrFunc/service_with_dc
=== RUN   TestConsulResolverFromAddrFunc/query_with_dc
=== RUN   TestConsulResolverFromAddrFunc/invalid_host:port
=== RUN   TestConsulResolverFromAddrFunc/custom_domain
=== RUN   TestConsulResolverFromAddrFunc/unsupported_query_type
=== RUN   TestConsulResolverFromAddrFunc/unsupported_query_type_and_datacenter
=== RUN   TestConsulResolverFromAddrFunc/unsupported_query_type_and_datacenter#01
=== RUN   TestConsulResolverFromAddrFunc/unsupported_tag_filter
=== RUN   TestConsulResolverFromAddrFunc/unsupported_tag_filter_with_DC
--- PASS: TestConsulResolverFromAddrFunc (0.01s)
    --- PASS: TestConsulResolverFromAddrFunc/service (0.00s)
    --- PASS: TestConsulResolverFromAddrFunc/query (0.00s)
    --- PASS: TestConsulResolverFromAddrFunc/service_with_dc (0.00s)
    --- PASS: TestConsulResolverFromAddrFunc/query_with_dc (0.00s)
    --- PASS: TestConsulResolverFromAddrFunc/invalid_host:port (0.00s)
    --- PASS: TestConsulResolverFromAddrFunc/custom_domain (0.00s)
    --- PASS: TestConsulResolverFromAddrFunc/unsupported_query_type (0.00s)
    --- PASS: TestConsulResolverFromAddrFunc/unsupported_query_type_and_datacenter (0.00s)
    --- PASS: TestConsulResolverFromAddrFunc/unsupported_query_type_and_datacenter#01 (0.00s)
    --- PASS: TestConsulResolverFromAddrFunc/unsupported_tag_filter (0.00s)
    --- PASS: TestConsulResolverFromAddrFunc/unsupported_tag_filter_with_DC (0.00s)
=== RUN   TestService_Name
--- PASS: TestService_Name (0.09s)
=== RUN   TestService_Dial
--- SKIP: TestService_Dial (0.00s)
    service_test.go:36: DM-skipped
=== RUN   TestService_ServerTLSConfig
--- SKIP: TestService_ServerTLSConfig (0.00s)
    service_test.go:129: DM-skipped
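
The "DM-skipped" entries above (and the ones that follow in later packages) are tests that have been disabled for this Debian build rather than genuine failures. A minimal sketch of the pattern, using a hypothetical test name and only the skip reason taken from the log, not the actual Debian patch:

    package example

    import "testing"

    // Calling t.Skip marks the test as skipped; `go test -v` then prints a
    // "--- SKIP" line with this reason, exactly as the entries above show.
    func TestSomethingFlaky(t *testing.T) {
        t.Skip("DM-skipped")
    }
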
=== RUN   TestService_HTTPClient
2019/12/06 06:42:56 starting test connect HTTPS server on 127.0.0.1:49007
2019/12/06 06:42:56 test connect service listening on 127.0.0.1:49007
2019/12/06 06:42:56 [DEBUG] resolved service instance: 127.0.0.1:49007 (spiffe://11111111-2222-3333-4444-555555555555.consul/ns/default/dc/dc1/svc/backend)
2019/12/06 06:42:56 [DEBUG] successfully connected to 127.0.0.1:49007 (spiffe://11111111-2222-3333-4444-555555555555.consul/ns/default/dc/dc1/svc/backend)
--- PASS: TestService_HTTPClient (0.32s)
=== RUN   TestService_HasDefaultHTTPResolverFromAddr
--- PASS: TestService_HasDefaultHTTPResolverFromAddr (0.00s)
=== RUN   Test_verifyServerCertMatchesURI
2019/12/06 06:42:56 [ERR] consul.watch: Watch (type: connect_roots) errored: Get http://127.0.0.1:8500/v1/agent/connect/ca/roots: dial tcp 127.0.0.1:8500: connect: connection refused, retry in 5s
2019/12/06 06:42:56 [ERR] consul.watch: Watch (type: connect_leaf) errored: Get http://127.0.0.1:8500/v1/agent/connect/ca/leaf/foo: dial tcp 127.0.0.1:8500: connect: connection refused, retry in 5s
=== RUN   Test_verifyServerCertMatchesURI/simple_match
=== RUN   Test_verifyServerCertMatchesURI/different_trust-domain_allowed
=== RUN   Test_verifyServerCertMatchesURI/mismatch
=== RUN   Test_verifyServerCertMatchesURI/no_certs
=== RUN   Test_verifyServerCertMatchesURI/nil_certs
--- PASS: Test_verifyServerCertMatchesURI (0.18s)
    --- PASS: Test_verifyServerCertMatchesURI/simple_match (0.00s)
    --- PASS: Test_verifyServerCertMatchesURI/different_trust-domain_allowed (0.00s)
    --- PASS: Test_verifyServerCertMatchesURI/mismatch (0.00s)
    --- PASS: Test_verifyServerCertMatchesURI/no_certs (0.00s)
    --- PASS: Test_verifyServerCertMatchesURI/nil_certs (0.00s)
=== RUN   TestClientSideVerifier
=== RUN   TestClientSideVerifier/ok_service_ca1
=== RUN   TestClientSideVerifier/untrusted_CA
=== RUN   TestClientSideVerifier/cross_signed_intermediate
=== RUN   TestClientSideVerifier/cross_signed_without_intermediate
--- PASS: TestClientSideVerifier (0.20s)
    --- PASS: TestClientSideVerifier/ok_service_ca1 (0.01s)
    --- PASS: TestClientSideVerifier/untrusted_CA (0.00s)
    --- PASS: TestClientSideVerifier/cross_signed_intermediate (0.03s)
    --- PASS: TestClientSideVerifier/cross_signed_without_intermediate (0.00s)
=== RUN   TestServerSideVerifier
WARNING: bootstrap = true: do not enable unless necessary
test-consul - 2019/12/06 06:42:56.771594 [WARN] agent: Node name "Node 4f97e3e0-c4f2-2789-fef5-fe4dd0672f50" will not be discoverable via DNS due to invalid characters. Valid characters include all alpha-numerics and dashes.
test-consul - 2019/12/06 06:42:56.772103 [DEBUG] tlsutil: Update with version 1
test-consul - 2019/12/06 06:42:56.774395 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
2019/12/06 06:42:57 [INFO]  raft: Initial configuration (index=1): [{Suffrage:Voter ID:4f97e3e0-c4f2-2789-fef5-fe4dd0672f50 Address:127.0.0.1:49013}]
2019/12/06 06:42:57 [INFO]  raft: Node at 127.0.0.1:49013 [Follower] entering Follower state (Leader: "")
test-consul - 2019/12/06 06:42:57.543287 [INFO] serf: EventMemberJoin: Node 4f97e3e0-c4f2-2789-fef5-fe4dd0672f50.dc1 127.0.0.1
test-consul - 2019/12/06 06:42:57.547485 [INFO] serf: EventMemberJoin: Node 4f97e3e0-c4f2-2789-fef5-fe4dd0672f50 127.0.0.1
test-consul - 2019/12/06 06:42:57.548328 [INFO] consul: Adding LAN server Node 4f97e3e0-c4f2-2789-fef5-fe4dd0672f50 (Addr: tcp/127.0.0.1:49013) (DC: dc1)
test-consul - 2019/12/06 06:42:57.548783 [INFO] consul: Handled member-join event for server "Node 4f97e3e0-c4f2-2789-fef5-fe4dd0672f50.dc1" in area "wan"
test-consul - 2019/12/06 06:42:57.550170 [INFO] agent: Started DNS server 127.0.0.1:49008 (tcp)
test-consul - 2019/12/06 06:42:57.550276 [INFO] agent: Started DNS server 127.0.0.1:49008 (udp)
test-consul - 2019/12/06 06:42:57.552724 [INFO] agent: Started HTTP server on 127.0.0.1:49009 (tcp)
test-consul - 2019/12/06 06:42:57.552812 [INFO] agent: started state syncer
2019/12/06 06:42:57 [WARN]  raft: Heartbeat timeout from "" reached, starting election
2019/12/06 06:42:57 [INFO]  raft: Node at 127.0.0.1:49013 [Candidate] entering Candidate state in term 2
2019/12/06 06:42:57 [INFO]  raft: Election won. Tally: 1
2019/12/06 06:42:57 [INFO]  raft: Node at 127.0.0.1:49013 [Leader] entering Leader state
test-consul - 2019/12/06 06:42:57.989775 [INFO] consul: cluster leadership acquired
test-consul - 2019/12/06 06:42:57.990179 [INFO] consul: New leader elected: Node 4f97e3e0-c4f2-2789-fef5-fe4dd0672f50
test-consul - 2019/12/06 06:42:58.265256 [INFO] agent: Synced node info
test-consul - 2019/12/06 06:42:58.265387 [DEBUG] agent: Node info in sync
test-consul - 2019/12/06 06:42:58.461883 [DEBUG] agent: Node info in sync
test-consul - 2019/12/06 06:42:59.165410 [INFO] connect: initialized primary datacenter CA with provider "consul"
test-consul - 2019/12/06 06:42:59.165929 [DEBUG] consul: Skipping self join check for "Node 4f97e3e0-c4f2-2789-fef5-fe4dd0672f50" since the cluster is too small
test-consul - 2019/12/06 06:42:59.166098 [INFO] consul: member 'Node 4f97e3e0-c4f2-2789-fef5-fe4dd0672f50' joined, marking health alive
test-consul - 2019/12/06 06:42:59.593457 [DEBUG] http: Request POST /v1/connect/intentions (230.162731ms) from=127.0.0.1:39940
test-consul - 2019/12/06 06:42:59.835186 [DEBUG] http: Request POST /v1/connect/intentions (237.830911ms) from=127.0.0.1:39940
=== RUN   TestServerSideVerifier/ok_service_ca1,_allow
test-consul - 2019/12/06 06:43:00.040473 [DEBUG] http: Request POST /v1/agent/connect/authorize (3.060739ms) from=127.0.0.1:39940
=== RUN   TestServerSideVerifier/untrusted_CA
2019/12/06 06:43:00 connect: failed TLS verification: x509: certificate signed by unknown authority
=== RUN   TestServerSideVerifier/cross_signed_intermediate,_allow
test-consul - 2019/12/06 06:43:00.109946 [DEBUG] http: Request POST /v1/agent/connect/authorize (1.72204ms) from=127.0.0.1:39940
=== RUN   TestServerSideVerifier/cross_signed_without_intermediate
2019/12/06 06:43:00 connect: failed TLS verification: x509: certificate signed by unknown authority
=== RUN   TestServerSideVerifier/ok_service_ca1,_deny
test-consul - 2019/12/06 06:43:00.135728 [DEBUG] http: Request POST /v1/agent/connect/authorize (1.004023ms) from=127.0.0.1:39940
2019/12/06 06:43:00 connect: authz call denied: Matched intention: DENY default/* => default/db (ID: efa27511-61d1-26ce-375c-ebfc0ba1b20d, Precedence: 8)
=== RUN   TestServerSideVerifier/cross_signed_intermediate,_deny
test-consul - 2019/12/06 06:43:00.193732 [DEBUG] http: Request POST /v1/agent/connect/authorize (2.133384ms) from=127.0.0.1:39940
2019/12/06 06:43:00 connect: authz call denied: Matched intention: DENY default/* => default/db (ID: efa27511-61d1-26ce-375c-ebfc0ba1b20d, Precedence: 8)
test-consul - 2019/12/06 06:43:00.195735 [INFO] agent: Requesting shutdown
test-consul - 2019/12/06 06:43:00.195890 [INFO] consul: shutting down server
test-consul - 2019/12/06 06:43:00.196009 [WARN] serf: Shutdown without a Leave
test-consul - 2019/12/06 06:43:00.314465 [WARN] serf: Shutdown without a Leave
test-consul - 2019/12/06 06:43:00.405399 [INFO] manager: shutting down
test-consul - 2019/12/06 06:43:00.406003 [INFO] agent: consul server down
test-consul - 2019/12/06 06:43:00.406220 [INFO] agent: shutdown complete
test-consul - 2019/12/06 06:43:00.406419 [INFO] agent: Stopping DNS server 127.0.0.1:49008 (tcp)
test-consul - 2019/12/06 06:43:00.406737 [INFO] agent: Stopping DNS server 127.0.0.1:49008 (udp)
test-consul - 2019/12/06 06:43:00.407096 [INFO] agent: Stopping HTTP server 127.0.0.1:49009 (tcp)
test-consul - 2019/12/06 06:43:00.407741 [INFO] agent: Waiting for endpoints to shut down
test-consul - 2019/12/06 06:43:00.407931 [INFO] agent: Endpoints down
--- PASS: TestServerSideVerifier (3.79s)
    --- PASS: TestServerSideVerifier/ok_service_ca1,_allow (0.02s)
    --- PASS: TestServerSideVerifier/untrusted_CA (0.00s)
    --- PASS: TestServerSideVerifier/cross_signed_intermediate,_allow (0.07s)
    --- PASS: TestServerSideVerifier/cross_signed_without_intermediate (0.00s)
    --- PASS: TestServerSideVerifier/ok_service_ca1,_deny (0.02s)
    --- PASS: TestServerSideVerifier/cross_signed_intermediate,_deny (0.06s)
=== RUN   TestDynamicTLSConfig
--- PASS: TestDynamicTLSConfig (0.11s)
=== RUN   TestDynamicTLSConfig_Ready
--- PASS: TestDynamicTLSConfig_Ready (0.11s)
PASS
ok  	github.com/hashicorp/consul/connect	9.314s
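
The resolver subtests above turn into health queries such as GET /v1/health/connect/web?connect=true&passing=1 against the local agent. A minimal sketch of issuing the same kind of query through the standard github.com/hashicorp/consul/api client; the service name "web" and the default agent address are illustrative assumptions, not values from the test code:

    package main

    import (
        "fmt"
        "log"

        "github.com/hashicorp/consul/api"
    )

    func main() {
        // DefaultConfig talks to the local agent (127.0.0.1:8500) unless overridden.
        client, err := api.NewClient(api.DefaultConfig())
        if err != nil {
            log.Fatal(err)
        }
        // Roughly equivalent to the GET /v1/health/connect/web?passing=1 requests in the log.
        entries, _, err := client.Health().Connect("web", "", true, nil)
        if err != nil {
            log.Fatal(err)
        }
        for _, e := range entries {
            fmt.Printf("%s:%d\n", e.Service.Address, e.Service.Port)
        }
    }

Each returned entry carries the address and port a Connect-aware client would dial for that upstream.
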
?   	github.com/hashicorp/consul/connect/certgen	[no test files]
=== RUN   TestUpstreamResolverFuncFromClient
=== PAUSE TestUpstreamResolverFuncFromClient
=== RUN   TestAgentConfigWatcherManagedProxy
=== PAUSE TestAgentConfigWatcherManagedProxy
=== RUN   TestAgentConfigWatcherSidecarProxy
=== PAUSE TestAgentConfigWatcherSidecarProxy
=== RUN   TestConn
--- SKIP: TestConn (0.00s)
    conn_test.go:67: DM-skipped
=== RUN   TestConnSrcClosing
=== PAUSE TestConnSrcClosing
=== RUN   TestConnDstClosing
=== PAUSE TestConnDstClosing
=== RUN   TestPublicListener
2019/12/06 06:42:58 test tcp server listening on localhost:19002
2019/12/06 06:42:58 [DEBUG] resolved service instance: localhost:19001 (spiffe://11111111-2222-3333-4444-555555555555.consul/ns/default/dc/dc1/svc/db)
2019/12/06 06:42:58 [DEBUG] successfully connected to localhost:19001 (spiffe://11111111-2222-3333-4444-555555555555.consul/ns/default/dc/dc1/svc/db)
2019/12/06 06:42:58 connect: nil client
2019/12/06 06:42:58 test tcp echo server 127.0.0.1:19002 stopped
--- PASS: TestPublicListener (0.26s)
=== RUN   TestUpstreamListener
--- SKIP: TestUpstreamListener (0.00s)
    listener_test.go:162: DM-skipped
=== RUN   TestProxy_public
--- SKIP: TestProxy_public (0.00s)
    proxy_test.go:22: DM-skipped
=== CONT  TestUpstreamResolverFuncFromClient
=== CONT  TestConnSrcClosing
=== CONT  TestConnDstClosing
=== RUN   TestUpstreamResolverFuncFromClient/service
=== CONT  TestAgentConfigWatcherSidecarProxy
=== RUN   TestUpstreamResolverFuncFromClient/prepared_query
=== RUN   TestUpstreamResolverFuncFromClient/unknown_behaves_like_service
--- PASS: TestUpstreamResolverFuncFromClient (0.00s)
    --- PASS: TestUpstreamResolverFuncFromClient/service (0.00s)
    --- PASS: TestUpstreamResolverFuncFromClient/prepared_query (0.00s)
    --- PASS: TestUpstreamResolverFuncFromClient/unknown_behaves_like_service (0.00s)
=== CONT  TestAgentConfigWatcherManagedProxy
--- PASS: TestConnSrcClosing (0.01s)
--- PASS: TestConnDstClosing (0.01s)
WARNING: bootstrap = true: do not enable unless necessary
agent_smith - 2019/12/06 06:42:58.993136 [WARN] agent: Node name "Node d86ab7f2-2859-6658-fbdb-c45b6f7a4590" will not be discoverable via DNS due to invalid characters. Valid characters include all alpha-numerics and dashes.
agent_smith - 2019/12/06 06:42:58.994715 [DEBUG] tlsutil: Update with version 1
agent_smith - 2019/12/06 06:42:59.006397 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
WARNING: bootstrap = true: do not enable unless necessary
agent_smith - 2019/12/06 06:42:59.015162 [WARN] agent: Node name "Node b11b1764-063b-4f2a-80e8-140563d426e4" will not be discoverable via DNS due to invalid characters. Valid characters include all alpha-numerics and dashes.
agent_smith - 2019/12/06 06:42:59.015873 [DEBUG] tlsutil: Update with version 1
agent_smith - 2019/12/06 06:42:59.019313 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
2019/12/06 06:43:00 [INFO]  raft: Initial configuration (index=1): [{Suffrage:Voter ID:b11b1764-063b-4f2a-80e8-140563d426e4 Address:127.0.0.1:19014}]
2019/12/06 06:43:00 [INFO]  raft: Node at 127.0.0.1:19014 [Follower] entering Follower state (Leader: "")
2019/12/06 06:43:00 [INFO]  raft: Initial configuration (index=1): [{Suffrage:Voter ID:d86ab7f2-2859-6658-fbdb-c45b6f7a4590 Address:127.0.0.1:19008}]
2019/12/06 06:43:00 [INFO]  raft: Node at 127.0.0.1:19008 [Follower] entering Follower state (Leader: "")
agent_smith - 2019/12/06 06:43:00.291228 [INFO] serf: EventMemberJoin: Node b11b1764-063b-4f2a-80e8-140563d426e4.dc1 127.0.0.1
agent_smith - 2019/12/06 06:43:00.297645 [INFO] serf: EventMemberJoin: Node d86ab7f2-2859-6658-fbdb-c45b6f7a4590.dc1 127.0.0.1
agent_smith - 2019/12/06 06:43:00.303867 [INFO] serf: EventMemberJoin: Node b11b1764-063b-4f2a-80e8-140563d426e4 127.0.0.1
agent_smith - 2019/12/06 06:43:00.304970 [INFO] consul: Handled member-join event for server "Node b11b1764-063b-4f2a-80e8-140563d426e4.dc1" in area "wan"
agent_smith - 2019/12/06 06:43:00.305390 [INFO] consul: Adding LAN server Node b11b1764-063b-4f2a-80e8-140563d426e4 (Addr: tcp/127.0.0.1:19014) (DC: dc1)
agent_smith - 2019/12/06 06:43:00.306158 [INFO] agent: Started DNS server 127.0.0.1:19009 (tcp)
agent_smith - 2019/12/06 06:43:00.306249 [INFO] agent: Started DNS server 127.0.0.1:19009 (udp)
agent_smith - 2019/12/06 06:43:00.310193 [INFO] agent: Started HTTP server on 127.0.0.1:19010 (tcp)
agent_smith - 2019/12/06 06:43:00.310730 [INFO] agent: started state syncer
agent_smith - 2019/12/06 06:43:00.311278 [INFO] serf: EventMemberJoin: Node d86ab7f2-2859-6658-fbdb-c45b6f7a4590 127.0.0.1
agent_smith - 2019/12/06 06:43:00.312955 [INFO] agent: Started DNS server 127.0.0.1:19003 (udp)
agent_smith - 2019/12/06 06:43:00.313424 [INFO] consul: Adding LAN server Node d86ab7f2-2859-6658-fbdb-c45b6f7a4590 (Addr: tcp/127.0.0.1:19008) (DC: dc1)
agent_smith - 2019/12/06 06:43:00.313644 [INFO] consul: Handled member-join event for server "Node d86ab7f2-2859-6658-fbdb-c45b6f7a4590.dc1" in area "wan"
agent_smith - 2019/12/06 06:43:00.313459 [INFO] agent: Started DNS server 127.0.0.1:19003 (tcp)
2019/12/06 06:43:00 [WARN]  raft: Heartbeat timeout from "" reached, starting election
2019/12/06 06:43:00 [INFO]  raft: Node at 127.0.0.1:19008 [Candidate] entering Candidate state in term 2
2019/12/06 06:43:00 [WARN]  raft: Heartbeat timeout from "" reached, starting election
agent_smith - 2019/12/06 06:43:00.321527 [INFO] agent: Started HTTP server on 127.0.0.1:19004 (tcp)
agent_smith - 2019/12/06 06:43:00.321639 [INFO] agent: started state syncer
2019/12/06 06:43:00 [INFO]  raft: Node at 127.0.0.1:19014 [Candidate] entering Candidate state in term 2
2019/12/06 06:43:01 [INFO]  raft: Election won. Tally: 1
2019/12/06 06:43:01 [INFO]  raft: Node at 127.0.0.1:19008 [Leader] entering Leader state
agent_smith - 2019/12/06 06:43:01.158653 [INFO] consul: cluster leadership acquired
agent_smith - 2019/12/06 06:43:01.159409 [INFO] consul: New leader elected: Node d86ab7f2-2859-6658-fbdb-c45b6f7a4590
2019/12/06 06:43:01 [INFO]  raft: Election won. Tally: 1
2019/12/06 06:43:01 [INFO]  raft: Node at 127.0.0.1:19014 [Leader] entering Leader state
agent_smith - 2019/12/06 06:43:01.357099 [INFO] consul: cluster leadership acquired
agent_smith - 2019/12/06 06:43:01.357645 [INFO] consul: New leader elected: Node b11b1764-063b-4f2a-80e8-140563d426e4
agent_smith - 2019/12/06 06:43:02.095169 [INFO] agent: Synced node info
agent_smith - 2019/12/06 06:43:02.095455 [DEBUG] agent: Node info in sync
agent_smith - 2019/12/06 06:43:02.100017 [INFO] agent: Synced service "web"
agent_smith - 2019/12/06 06:43:02.100111 [DEBUG] agent: Node info in sync
agent_smith - 2019/12/06 06:43:02.100248 [DEBUG] agent: Service "web" in sync
agent_smith - 2019/12/06 06:43:02.100294 [DEBUG] agent: Node info in sync
agent_smith - 2019/12/06 06:43:02.105387 [ERR] leaf watch error: invalid type for leaf response: <nil>
agent_smith - 2019/12/06 06:43:02.260990 [ERR] leaf watch error: invalid type for leaf response: <nil>
agent_smith - 2019/12/06 06:43:02.387969 [DEBUG] agent: Service "web" in sync
agent_smith - 2019/12/06 06:43:02.768833 [INFO] agent: Synced service "web"
agent_smith - 2019/12/06 06:43:02.917290 [INFO] agent: Synced service "web-proxy"
agent_smith - 2019/12/06 06:43:02.917398 [DEBUG] agent: Check "service:web-proxy" in sync
agent_smith - 2019/12/06 06:43:02.917434 [DEBUG] agent: Node info in sync
agent_smith - 2019/12/06 06:43:02.917516 [DEBUG] http: Request PUT /v1/agent/service/register (1.485852513s) from=127.0.0.1:39326
agent_smith - 2019/12/06 06:43:02.921456 [DEBUG] http: Request GET /v1/agent/service/web-proxy (1.733374ms) from=127.0.0.1:39326
agent_smith - 2019/12/06 06:43:03.267969 [INFO] agent: Synced service "web-sidecar-proxy"
agent_smith - 2019/12/06 06:43:03.273041 [DEBUG] agent: Check "service:web-sidecar-proxy:1" in sync
agent_smith - 2019/12/06 06:43:03.273424 [DEBUG] agent: Check "service:web-sidecar-proxy:2" in sync
agent_smith - 2019/12/06 06:43:03.273667 [DEBUG] agent: Node info in sync
agent_smith - 2019/12/06 06:43:03.273944 [DEBUG] http: Request PUT /v1/agent/service/register (1.841500854s) from=127.0.0.1:54838
agent_smith - 2019/12/06 06:43:03.289679 [DEBUG] http: Request GET /v1/agent/service/web-sidecar-proxy (12.898969ms) from=127.0.0.1:54838
agent_smith - 2019/12/06 06:43:03.342394 [DEBUG] http: Request GET /v1/agent/service/web-proxy?hash=e9b438f338ba1e43 (400.860068ms) from=127.0.0.1:39326
agent_smith - 2019/12/06 06:43:03.345541 [INFO] agent: Requesting shutdown
agent_smith - 2019/12/06 06:43:03.409291 [INFO] consul: shutting down server
agent_smith - 2019/12/06 06:43:03.409376 [WARN] serf: Shutdown without a Leave
agent_smith - 2019/12/06 06:43:03.410522 [INFO] connect: initialized primary datacenter CA with provider "consul"
agent_smith - 2019/12/06 06:43:03.415694 [ERR] leaf watch error: invalid type for leaf response: <nil>
agent_smith - 2019/12/06 06:43:03.497657 [WARN] serf: Shutdown without a Leave
agent_smith - 2019/12/06 06:43:03.498847 [INFO] connect: initialized primary datacenter CA with provider "consul"
agent_smith - 2019/12/06 06:43:03.499503 [DEBUG] consul: Skipping self join check for "Node d86ab7f2-2859-6658-fbdb-c45b6f7a4590" since the cluster is too small
agent_smith - 2019/12/06 06:43:03.499880 [INFO] consul: member 'Node d86ab7f2-2859-6658-fbdb-c45b6f7a4590' joined, marking health alive
agent_smith - 2019/12/06 06:43:03.589470 [INFO] manager: shutting down
agent_smith - 2019/12/06 06:43:03.648625 [WARN] agent: Syncing service "web" failed. leadership lost while committing log
agent_smith - 2019/12/06 06:43:03.648699 [ERR] agent: failed to sync changes: leadership lost while committing log
agent_smith - 2019/12/06 06:43:03.648778 [DEBUG] http: Request PUT /v1/agent/service/register (685.445742ms) from=127.0.0.1:39328
agent_smith - 2019/12/06 06:43:03.648993 [INFO] agent: consul server down
agent_smith - 2019/12/06 06:43:03.649143 [INFO] agent: shutdown complete
agent_smith - 2019/12/06 06:43:03.649348 [INFO] agent: Stopping DNS server 127.0.0.1:19009 (tcp)
agent_smith - 2019/12/06 06:43:03.649582 [INFO] agent: Stopping DNS server 127.0.0.1:19009 (udp)
agent_smith - 2019/12/06 06:43:03.649821 [INFO] agent: Stopping HTTP server 127.0.0.1:19010 (tcp)
agent_smith - 2019/12/06 06:43:03.740196 [DEBUG] http: Request GET /v1/agent/service/web-sidecar-proxy?hash=4a87c9bd1a9bd791 (440.47433ms) from=127.0.0.1:54838
agent_smith - 2019/12/06 06:43:03.743227 [INFO] agent: Requesting shutdown
agent_smith - 2019/12/06 06:43:03.890252 [INFO] agent: Synced service "web"
agent_smith - 2019/12/06 06:43:04.042622 [INFO] agent: Synced service "web-sidecar-proxy"
agent_smith - 2019/12/06 06:43:04.042708 [DEBUG] agent: Check "service:web-sidecar-proxy:1" in sync
agent_smith - 2019/12/06 06:43:04.042753 [DEBUG] agent: Check "service:web-sidecar-proxy:2" in sync
agent_smith - 2019/12/06 06:43:04.042783 [DEBUG] agent: Node info in sync
agent_smith - 2019/12/06 06:43:04.042850 [DEBUG] http: Request PUT /v1/agent/service/register (724.045313ms) from=127.0.0.1:54846
agent_smith - 2019/12/06 06:43:04.044636 [INFO] consul: shutting down server
agent_smith - 2019/12/06 06:43:04.044730 [WARN] serf: Shutdown without a Leave
agent_smith - 2019/12/06 06:43:04.089364 [WARN] serf: Shutdown without a Leave
agent_smith - 2019/12/06 06:43:04.172797 [INFO] manager: shutting down
agent_smith - 2019/12/06 06:43:04.173224 [INFO] agent: consul server down
agent_smith - 2019/12/06 06:43:04.173280 [INFO] agent: shutdown complete
agent_smith - 2019/12/06 06:43:04.173332 [INFO] agent: Stopping DNS server 127.0.0.1:19003 (tcp)
agent_smith - 2019/12/06 06:43:04.173464 [INFO] agent: Stopping DNS server 127.0.0.1:19003 (udp)
agent_smith - 2019/12/06 06:43:04.173625 [INFO] agent: Stopping HTTP server 127.0.0.1:19004 (tcp)
agent_smith - 2019/12/06 06:43:04.650151 [WARN] agent: Timeout stopping HTTP server 127.0.0.1:19010 (tcp)
agent_smith - 2019/12/06 06:43:04.650219 [INFO] agent: Waiting for endpoints to shut down
agent_smith - 2019/12/06 06:43:04.650257 [INFO] agent: Endpoints down
--- PASS: TestAgentConfigWatcherManagedProxy (5.76s)
agent_smith - 2019/12/06 06:43:05.173943 [WARN] agent: Timeout stopping HTTP server 127.0.0.1:19004 (tcp)
agent_smith - 2019/12/06 06:43:05.174022 [INFO] agent: Waiting for endpoints to shut down
agent_smith - 2019/12/06 06:43:05.174062 [INFO] agent: Endpoints down
--- PASS: TestAgentConfigWatcherSidecarProxy (6.28s)
PASS
ok  	github.com/hashicorp/consul/connect/proxy	6.833s
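
TestPublicListener above proxies traffic to a plain TCP echo backend ("test tcp server listening on localhost:19002"). A sketch of such an echo backend, with only the port taken from the log and everything else assumed:

    package main

    import (
        "io"
        "log"
        "net"
    )

    func main() {
        l, err := net.Listen("tcp", "localhost:19002")
        if err != nil {
            log.Fatal(err)
        }
        log.Printf("test tcp server listening on %s", l.Addr())
        for {
            conn, err := l.Accept()
            if err != nil {
                log.Fatal(err)
            }
            // Echo everything back to the client until it closes the connection.
            go func(c net.Conn) {
                defer c.Close()
                io.Copy(c, c)
            }(conn)
        }
    }
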
=== RUN   TestIsPrivateIP
=== RUN   TestIsPrivateIP/10.0.0.1
=== RUN   TestIsPrivateIP/100.64.0.1
=== RUN   TestIsPrivateIP/172.16.0.1
=== RUN   TestIsPrivateIP/192.168.0.1
=== RUN   TestIsPrivateIP/192.0.0.1
=== RUN   TestIsPrivateIP/192.0.2.1
=== RUN   TestIsPrivateIP/127.0.0.1
=== RUN   TestIsPrivateIP/169.254.0.1
=== RUN   TestIsPrivateIP/1.2.3.4
=== RUN   TestIsPrivateIP/::1
=== RUN   TestIsPrivateIP/fe80::1
=== RUN   TestIsPrivateIP/fc00::1
=== RUN   TestIsPrivateIP/fec0::1
=== RUN   TestIsPrivateIP/2001:db8::1
=== RUN   TestIsPrivateIP/2004:db6::1
--- PASS: TestIsPrivateIP (0.03s)
    --- PASS: TestIsPrivateIP/10.0.0.1 (0.00s)
    --- PASS: TestIsPrivateIP/100.64.0.1 (0.00s)
    --- PASS: TestIsPrivateIP/172.16.0.1 (0.00s)
    --- PASS: TestIsPrivateIP/192.168.0.1 (0.00s)
    --- PASS: TestIsPrivateIP/192.0.0.1 (0.00s)
    --- PASS: TestIsPrivateIP/192.0.2.1 (0.00s)
    --- PASS: TestIsPrivateIP/127.0.0.1 (0.00s)
    --- PASS: TestIsPrivateIP/169.254.0.1 (0.00s)
    --- PASS: TestIsPrivateIP/1.2.3.4 (0.00s)
    --- PASS: TestIsPrivateIP/::1 (0.00s)
    --- PASS: TestIsPrivateIP/fe80::1 (0.00s)
    --- PASS: TestIsPrivateIP/fc00::1 (0.00s)
    --- PASS: TestIsPrivateIP/fec0::1 (0.00s)
    --- PASS: TestIsPrivateIP/2001:db8::1 (0.00s)
    --- PASS: TestIsPrivateIP/2004:db6::1 (0.00s)
=== RUN   TestIsIPv6
=== RUN   TestIsIPv6/10.0.0.1
=== RUN   TestIsIPv6/100.64.0.1
=== RUN   TestIsIPv6/172.16.0.1
=== RUN   TestIsIPv6/192.168.0.1
=== RUN   TestIsIPv6/192.0.0.1
=== RUN   TestIsIPv6/192.0.2.1
=== RUN   TestIsIPv6/127.0.0.1
=== RUN   TestIsIPv6/169.254.0.1
=== RUN   TestIsIPv6/1.2.3.4
=== RUN   TestIsIPv6/::1
=== RUN   TestIsIPv6/fe80::1
=== RUN   TestIsIPv6/fc00::1
=== RUN   TestIsIPv6/fec0::1
=== RUN   TestIsIPv6/2001:db8::1
=== RUN   TestIsIPv6/2004:db6::1
=== RUN   TestIsIPv6/example.com
=== RUN   TestIsIPv6/localhost
=== RUN   TestIsIPv6/1.257.0.1
--- PASS: TestIsIPv6 (0.02s)
    --- PASS: TestIsIPv6/10.0.0.1 (0.00s)
    --- PASS: TestIsIPv6/100.64.0.1 (0.00s)
    --- PASS: TestIsIPv6/172.16.0.1 (0.00s)
    --- PASS: TestIsIPv6/192.168.0.1 (0.00s)
    --- PASS: TestIsIPv6/192.0.0.1 (0.00s)
    --- PASS: TestIsIPv6/192.0.2.1 (0.00s)
    --- PASS: TestIsIPv6/127.0.0.1 (0.00s)
    --- PASS: TestIsIPv6/169.254.0.1 (0.00s)
    --- PASS: TestIsIPv6/1.2.3.4 (0.00s)
    --- PASS: TestIsIPv6/::1 (0.00s)
    --- PASS: TestIsIPv6/fe80::1 (0.00s)
    --- PASS: TestIsIPv6/fc00::1 (0.00s)
    --- PASS: TestIsIPv6/fec0::1 (0.00s)
    --- PASS: TestIsIPv6/2001:db8::1 (0.00s)
    --- PASS: TestIsIPv6/2004:db6::1 (0.00s)
    --- PASS: TestIsIPv6/example.com (0.00s)
    --- PASS: TestIsIPv6/localhost (0.00s)
    --- PASS: TestIsIPv6/1.257.0.1 (0.00s)
PASS
ok  	github.com/hashicorp/consul/ipaddr	0.116s
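
The ipaddr table tests above classify addresses such as 10.0.0.1, 100.64.0.1 and fe80::1. A sketch of the underlying idea: membership in a list of CIDR blocks for the "private" check, and absence of an IPv4 form for the IPv6 check. The block list below is illustrative and may not match the one the package actually uses:

    package main

    import (
        "fmt"
        "net"
    )

    var privateBlocks []*net.IPNet

    func init() {
        for _, cidr := range []string{
            "10.0.0.0/8", "172.16.0.0/12", "192.168.0.0/16", // RFC 1918
            "100.64.0.0/10",                // RFC 6598 shared address space
            "127.0.0.0/8", "::1/128",       // loopback
            "169.254.0.0/16", "fe80::/10",  // link-local
            "fc00::/7",                     // unique local IPv6
        } {
            _, block, _ := net.ParseCIDR(cidr)
            privateBlocks = append(privateBlocks, block)
        }
    }

    func isPrivate(addr string) bool {
        ip := net.ParseIP(addr)
        if ip == nil {
            return false
        }
        for _, block := range privateBlocks {
            if block.Contains(ip) {
                return true
            }
        }
        return false
    }

    // isIPv6 mirrors the TestIsIPv6 cases: a parseable address with no IPv4 form.
    func isIPv6(addr string) bool {
        ip := net.ParseIP(addr)
        return ip != nil && ip.To4() == nil
    }

    func main() {
        for _, a := range []string{"10.0.0.1", "1.2.3.4", "fe80::1", "example.com"} {
            fmt.Println(a, isPrivate(a), isIPv6(a))
        }
    }
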
=== RUN   TestDurationMinusBuffer
--- PASS: TestDurationMinusBuffer (0.00s)
=== RUN   TestDurationMinusBufferDomain
--- PASS: TestDurationMinusBufferDomain (0.00s)
=== RUN   TestRandomStagger
--- PASS: TestRandomStagger (0.00s)
=== RUN   TestRateScaledInterval
--- PASS: TestRateScaledInterval (0.00s)
=== RUN   TestMapWalk
--- SKIP: TestMapWalk (0.00s)
    map_walker_test.go:10: DM-skipped
=== RUN   TestJitterRandomStagger
=== PAUSE TestJitterRandomStagger
=== RUN   TestRetryWaiter_calculateWait
=== PAUSE TestRetryWaiter_calculateWait
=== RUN   TestRetryWaiter_WaitChans
=== PAUSE TestRetryWaiter_WaitChans
=== RUN   TestRTT_ComputeDistance
=== RUN   TestRTT_ComputeDistance/10_ms
=== RUN   TestRTT_ComputeDistance/0_ms
=== RUN   TestRTT_ComputeDistance/2_ms
=== RUN   TestRTT_ComputeDistance/2_ms_reversed
=== RUN   TestRTT_ComputeDistance/a_nil
=== RUN   TestRTT_ComputeDistance/b_nil
=== RUN   TestRTT_ComputeDistance/both_nil
--- PASS: TestRTT_ComputeDistance (0.00s)
    --- PASS: TestRTT_ComputeDistance/10_ms (0.00s)
    --- PASS: TestRTT_ComputeDistance/0_ms (0.00s)
    --- PASS: TestRTT_ComputeDistance/2_ms (0.00s)
    --- PASS: TestRTT_ComputeDistance/2_ms_reversed (0.00s)
    --- PASS: TestRTT_ComputeDistance/a_nil (0.00s)
    --- PASS: TestRTT_ComputeDistance/b_nil (0.00s)
    --- PASS: TestRTT_ComputeDistance/both_nil (0.00s)
=== RUN   TestRTT_Intersect
=== RUN   TestRTT_Intersect/nil_maps
=== RUN   TestRTT_Intersect/two_servers
=== RUN   TestRTT_Intersect/two_clients
=== RUN   TestRTT_Intersect/server1_and_client_alpha
=== RUN   TestRTT_Intersect/server1_and_client_beta_1
=== RUN   TestRTT_Intersect/server1_and_client_alpha_reversed
=== RUN   TestRTT_Intersect/server1_and_client_beta_1_reversed
=== RUN   TestRTT_Intersect/nothing_in_common
=== RUN   TestRTT_Intersect/nothing_in_common_reversed
--- PASS: TestRTT_Intersect (0.00s)
    --- PASS: TestRTT_Intersect/nil_maps (0.00s)
    --- PASS: TestRTT_Intersect/two_servers (0.00s)
    --- PASS: TestRTT_Intersect/two_clients (0.00s)
    --- PASS: TestRTT_Intersect/server1_and_client_alpha (0.00s)
    --- PASS: TestRTT_Intersect/server1_and_client_beta_1 (0.00s)
    --- PASS: TestRTT_Intersect/server1_and_client_alpha_reversed (0.00s)
    --- PASS: TestRTT_Intersect/server1_and_client_beta_1_reversed (0.00s)
    --- PASS: TestRTT_Intersect/nothing_in_common (0.00s)
    --- PASS: TestRTT_Intersect/nothing_in_common_reversed (0.00s)
=== RUN   TestStrContains
--- PASS: TestStrContains (0.00s)
=== RUN   TestTelemetryConfig_MergeDefaults
=== RUN   TestTelemetryConfig_MergeDefaults/basic_merge
=== RUN   TestTelemetryConfig_MergeDefaults/exhaustive
--- PASS: TestTelemetryConfig_MergeDefaults (0.00s)
    --- PASS: TestTelemetryConfig_MergeDefaults/basic_merge (0.00s)
    --- PASS: TestTelemetryConfig_MergeDefaults/exhaustive (0.00s)
=== RUN   TestTranslateKeys
=== RUN   TestTranslateKeys/x->y
=== RUN   TestTranslateKeys/discard_x
=== RUN   TestTranslateKeys/b.x->b.y
=== RUN   TestTranslateKeys/json:_x->y
=== RUN   TestTranslateKeys/json:_X->y
=== RUN   TestTranslateKeys/json:_discard_x
=== RUN   TestTranslateKeys/json:_b.x->b.y
=== RUN   TestTranslateKeys/json:_b[0].x->b[0].y
--- PASS: TestTranslateKeys (0.01s)
    --- PASS: TestTranslateKeys/x->y (0.00s)
    --- PASS: TestTranslateKeys/discard_x (0.00s)
    --- PASS: TestTranslateKeys/b.x->b.y (0.00s)
    --- PASS: TestTranslateKeys/json:_x->y (0.00s)
    --- PASS: TestTranslateKeys/json:_X->y (0.00s)
    --- PASS: TestTranslateKeys/json:_discard_x (0.00s)
    --- PASS: TestTranslateKeys/json:_b.x->b.y (0.00s)
    --- PASS: TestTranslateKeys/json:_b[0].x->b[0].y (0.00s)
=== RUN   TestUserAgent
--- PASS: TestUserAgent (0.00s)
=== RUN   TestMathAbsInt
--- PASS: TestMathAbsInt (0.00s)
=== RUN   TestMathMaxInt
--- PASS: TestMathMaxInt (0.00s)
=== RUN   TestMathMinInt
--- PASS: TestMathMinInt (0.00s)
=== CONT  TestJitterRandomStagger
=== RUN   TestJitterRandomStagger/0_percent
=== PAUSE TestJitterRandomStagger/0_percent
=== RUN   TestJitterRandomStagger/10_percent
=== PAUSE TestJitterRandomStagger/10_percent
=== RUN   TestJitterRandomStagger/100_percent
=== PAUSE TestJitterRandomStagger/100_percent
=== CONT  TestJitterRandomStagger/0_percent
=== CONT  TestRetryWaiter_WaitChans
=== RUN   TestRetryWaiter_WaitChans/Minimum_Wait_-_Success
=== PAUSE TestRetryWaiter_WaitChans/Minimum_Wait_-_Success
=== RUN   TestRetryWaiter_WaitChans/Minimum_Wait_-_WaitIf
=== PAUSE TestRetryWaiter_WaitChans/Minimum_Wait_-_WaitIf
=== RUN   TestRetryWaiter_WaitChans/Minimum_Wait_-_WaitIfErr
=== PAUSE TestRetryWaiter_WaitChans/Minimum_Wait_-_WaitIfErr
=== RUN   TestRetryWaiter_WaitChans/Maximum_Wait_-_Failed
=== PAUSE TestRetryWaiter_WaitChans/Maximum_Wait_-_Failed
=== RUN   TestRetryWaiter_WaitChans/Maximum_Wait_-_WaitIf
=== PAUSE TestRetryWaiter_WaitChans/Maximum_Wait_-_WaitIf
=== RUN   TestRetryWaiter_WaitChans/Maximum_Wait_-_WaitIfErr
=== PAUSE TestRetryWaiter_WaitChans/Maximum_Wait_-_WaitIfErr
=== CONT  TestRetryWaiter_WaitChans/Minimum_Wait_-_Success
=== CONT  TestJitterRandomStagger/100_percent
=== CONT  TestJitterRandomStagger/10_percent
--- PASS: TestJitterRandomStagger (0.00s)
    --- PASS: TestJitterRandomStagger/0_percent (0.00s)
    --- PASS: TestJitterRandomStagger/100_percent (0.00s)
    --- PASS: TestJitterRandomStagger/10_percent (0.00s)
=== CONT  TestRetryWaiter_calculateWait
=== RUN   TestRetryWaiter_calculateWait/Defaults
=== PAUSE TestRetryWaiter_calculateWait/Defaults
=== RUN   TestRetryWaiter_calculateWait/Minimum_Wait
=== PAUSE TestRetryWaiter_calculateWait/Minimum_Wait
=== RUN   TestRetryWaiter_calculateWait/Minimum_Failures
=== PAUSE TestRetryWaiter_calculateWait/Minimum_Failures
=== RUN   TestRetryWaiter_calculateWait/Maximum_Wait
=== PAUSE TestRetryWaiter_calculateWait/Maximum_Wait
=== CONT  TestRetryWaiter_calculateWait/Defaults
=== CONT  TestRetryWaiter_WaitChans/Minimum_Wait_-_WaitIf
=== CONT  TestRetryWaiter_calculateWait/Maximum_Wait
=== CONT  TestRetryWaiter_calculateWait/Minimum_Wait
=== CONT  TestRetryWaiter_calculateWait/Minimum_Failures
=== CONT  TestRetryWaiter_WaitChans/Minimum_Wait_-_WaitIfErr
=== CONT  TestRetryWaiter_WaitChans/Maximum_Wait_-_Failed
--- PASS: TestRetryWaiter_calculateWait (0.00s)
    --- PASS: TestRetryWaiter_calculateWait/Defaults (0.00s)
    --- PASS: TestRetryWaiter_calculateWait/Maximum_Wait (0.00s)
    --- PASS: TestRetryWaiter_calculateWait/Minimum_Wait (0.00s)
    --- PASS: TestRetryWaiter_calculateWait/Minimum_Failures (0.00s)
=== CONT  TestRetryWaiter_WaitChans/Maximum_Wait_-_WaitIfErr
=== CONT  TestRetryWaiter_WaitChans/Maximum_Wait_-_WaitIf
--- PASS: TestRetryWaiter_WaitChans (0.00s)
    --- PASS: TestRetryWaiter_WaitChans/Minimum_Wait_-_Success (0.20s)
    --- PASS: TestRetryWaiter_WaitChans/Minimum_Wait_-_WaitIf (0.21s)
    --- PASS: TestRetryWaiter_WaitChans/Minimum_Wait_-_WaitIfErr (0.20s)
    --- PASS: TestRetryWaiter_WaitChans/Maximum_Wait_-_Failed (0.25s)
    --- PASS: TestRetryWaiter_WaitChans/Maximum_Wait_-_WaitIfErr (0.25s)
    --- PASS: TestRetryWaiter_WaitChans/Maximum_Wait_-_WaitIf (0.25s)
PASS
ok  	github.com/hashicorp/consul/lib	0.581s
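
TestRandomStagger and TestJitterRandomStagger above cover helpers that spread retries out by adding a bounded random delay. A sketch written under the assumption that the helpers behave roughly like this; exact bounds and rounding in consul/lib may differ:

    package main

    import (
        "fmt"
        "math/rand"
        "time"
    )

    // randomStagger returns a random duration in [0, intv).
    func randomStagger(intv time.Duration) time.Duration {
        if intv == 0 {
            return 0
        }
        return time.Duration(rand.Int63n(int64(intv)))
    }

    // addJitter adds up to `percent` percent of extra random delay, matching
    // the 0/10/100 percent subtests seen in the log.
    func addJitter(d time.Duration, percent int64) time.Duration {
        if percent == 0 {
            return d
        }
        return d + randomStagger(time.Duration(int64(d)*percent/100))
    }

    func main() {
        base := 250 * time.Millisecond
        fmt.Println(addJitter(base, 0), addJitter(base, 10), addJitter(base, 100))
    }
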
=== RUN   TestWriteAtomic
--- PASS: TestWriteAtomic (0.20s)
PASS
ok  	github.com/hashicorp/consul/lib/file	0.239s
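
TestWriteAtomic above exercises an atomic file write. The usual pattern is to write to a temporary file in the same directory, sync it, and rename it over the target; a sketch follows, with details such as permissions taken as assumptions rather than from lib/file:

    package main

    import (
        "io/ioutil"
        "log"
        "os"
        "path/filepath"
    )

    func writeAtomic(path string, contents []byte) error {
        tmp, err := ioutil.TempFile(filepath.Dir(path), ".tmp-")
        if err != nil {
            return err
        }
        defer os.Remove(tmp.Name()) // no-op once the rename has succeeded
        if _, err := tmp.Write(contents); err != nil {
            tmp.Close()
            return err
        }
        if err := tmp.Sync(); err != nil { // flush to disk before the rename
            tmp.Close()
            return err
        }
        if err := tmp.Close(); err != nil {
            return err
        }
        // Rename is atomic on POSIX filesystems, so readers see either the old
        // file or the complete new one, never a partial write.
        return os.Rename(tmp.Name(), path)
    }

    func main() {
        if err := writeAtomic("/tmp/example.json", []byte(`{"ok":true}`)); err != nil {
            log.Fatal(err)
        }
    }
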
=== RUN   TestDynamic
=== PAUSE TestDynamic
=== RUN   TestDynamicPanic
=== PAUSE TestDynamicPanic
=== RUN   TestDynamicAcquire
=== PAUSE TestDynamicAcquire
=== CONT  TestDynamic
=== CONT  TestDynamicAcquire
=== CONT  TestDynamicPanic
--- PASS: TestDynamicPanic (0.00s)
--- PASS: TestDynamicAcquire (0.05s)
--- PASS: TestDynamic (1.76s)
PASS
ok  	github.com/hashicorp/consul/lib/semaphore	1.801s
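
The lib/semaphore tests above (TestDynamic, TestDynamicAcquire, TestDynamicPanic) cover a counting semaphore whose size can change at runtime. A simplified, fixed-size sketch using a buffered channel; resizing and the panic behaviour of the real package are not reproduced here:

    package main

    import (
        "context"
        "fmt"
        "time"
    )

    type semaphore struct{ slots chan struct{} }

    func newSemaphore(size int) *semaphore {
        return &semaphore{slots: make(chan struct{}, size)}
    }

    // Acquire blocks until a slot is free or the context is cancelled.
    func (s *semaphore) Acquire(ctx context.Context) error {
        select {
        case s.slots <- struct{}{}:
            return nil
        case <-ctx.Done():
            return ctx.Err()
        }
    }

    func (s *semaphore) Release() { <-s.slots }

    func main() {
        sem := newSemaphore(2)
        ctx, cancel := context.WithTimeout(context.Background(), time.Second)
        defer cancel()
        for i := 0; i < 2; i++ {
            if err := sem.Acquire(ctx); err != nil {
                fmt.Println("acquire failed:", err)
                return
            }
        }
        sem.Release()
        fmt.Println("acquired two slots, released one")
    }
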
=== RUN   TestGatedWriter_impl
--- PASS: TestGatedWriter_impl (0.00s)
=== RUN   TestGatedWriter
--- PASS: TestGatedWriter (0.00s)
=== RUN   TestGRPCLogger
--- PASS: TestGRPCLogger (0.00s)
=== RUN   TestGRPCLogger_V
=== RUN   TestGRPCLogger_V/ERR,-1
=== RUN   TestGRPCLogger_V/ERR,0
=== RUN   TestGRPCLogger_V/ERR,1
=== RUN   TestGRPCLogger_V/ERR,2
=== RUN   TestGRPCLogger_V/ERR,3
=== RUN   TestGRPCLogger_V/WARN,-1
=== RUN   TestGRPCLogger_V/WARN,0
=== RUN   TestGRPCLogger_V/WARN,1
=== RUN   TestGRPCLogger_V/WARN,2
=== RUN   TestGRPCLogger_V/WARN,3
=== RUN   TestGRPCLogger_V/INFO,-1
=== RUN   TestGRPCLogger_V/INFO,0
=== RUN   TestGRPCLogger_V/INFO,1
=== RUN   TestGRPCLogger_V/INFO,2
=== RUN   TestGRPCLogger_V/INFO,3
=== RUN   TestGRPCLogger_V/DEBUG,-1
=== RUN   TestGRPCLogger_V/DEBUG,0
=== RUN   TestGRPCLogger_V/DEBUG,1
=== RUN   TestGRPCLogger_V/DEBUG,2
=== RUN   TestGRPCLogger_V/DEBUG,3
=== RUN   TestGRPCLogger_V/TRACE,-1
=== RUN   TestGRPCLogger_V/TRACE,0
=== RUN   TestGRPCLogger_V/TRACE,1
=== RUN   TestGRPCLogger_V/TRACE,2
=== RUN   TestGRPCLogger_V/TRACE,3
--- PASS: TestGRPCLogger_V (0.01s)
    --- PASS: TestGRPCLogger_V/ERR,-1 (0.00s)
    --- PASS: TestGRPCLogger_V/ERR,0 (0.00s)
    --- PASS: TestGRPCLogger_V/ERR,1 (0.00s)
    --- PASS: TestGRPCLogger_V/ERR,2 (0.00s)
    --- PASS: TestGRPCLogger_V/ERR,3 (0.00s)
    --- PASS: TestGRPCLogger_V/WARN,-1 (0.00s)
    --- PASS: TestGRPCLogger_V/WARN,0 (0.00s)
    --- PASS: TestGRPCLogger_V/WARN,1 (0.00s)
    --- PASS: TestGRPCLogger_V/WARN,2 (0.00s)
    --- PASS: TestGRPCLogger_V/WARN,3 (0.00s)
    --- PASS: TestGRPCLogger_V/INFO,-1 (0.00s)
    --- PASS: TestGRPCLogger_V/INFO,0 (0.00s)
    --- PASS: TestGRPCLogger_V/INFO,1 (0.00s)
    --- PASS: TestGRPCLogger_V/INFO,2 (0.00s)
    --- PASS: TestGRPCLogger_V/INFO,3 (0.00s)
    --- PASS: TestGRPCLogger_V/DEBUG,-1 (0.00s)
    --- PASS: TestGRPCLogger_V/DEBUG,0 (0.00s)
    --- PASS: TestGRPCLogger_V/DEBUG,1 (0.00s)
    --- PASS: TestGRPCLogger_V/DEBUG,2 (0.00s)
    --- PASS: TestGRPCLogger_V/DEBUG,3 (0.00s)
    --- PASS: TestGRPCLogger_V/TRACE,-1 (0.00s)
    --- PASS: TestGRPCLogger_V/TRACE,0 (0.00s)
    --- PASS: TestGRPCLogger_V/TRACE,1 (0.00s)
    --- PASS: TestGRPCLogger_V/TRACE,2 (0.00s)
    --- PASS: TestGRPCLogger_V/TRACE,3 (0.00s)
=== RUN   TestLogWriter
--- PASS: TestLogWriter (0.00s)
=== RUN   TestLogFile_timeRotation
=== PAUSE TestLogFile_timeRotation
=== RUN   TestLogFile_openNew
=== PAUSE TestLogFile_openNew
=== RUN   TestLogFile_byteRotation
=== PAUSE TestLogFile_byteRotation
=== RUN   TestLogFile_logLevelFiltering
=== PAUSE TestLogFile_logLevelFiltering
=== CONT  TestLogFile_timeRotation
=== CONT  TestLogFile_logLevelFiltering
=== CONT  TestLogFile_byteRotation
=== CONT  TestLogFile_openNew
--- PASS: TestLogFile_openNew (0.00s)
--- PASS: TestLogFile_logLevelFiltering (0.00s)
--- PASS: TestLogFile_byteRotation (0.00s)
--- PASS: TestLogFile_timeRotation (2.00s)
PASS
ok  	github.com/hashicorp/consul/logger	2.054s
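
TestLogFile_byteRotation and TestLogFile_timeRotation above cover log files that are rotated once they grow past a byte limit or an age limit. A sketch of the byte-based variant only; the file-naming scheme here is invented and the duration-based rotation is omitted:

    package main

    import (
        "fmt"
        "log"
        "os"
        "path/filepath"
        "time"
    )

    type rotatingFile struct {
        dir      string
        maxBytes int64
        written  int64
        f        *os.File
    }

    // Write opens a fresh file whenever the next write would exceed maxBytes.
    func (r *rotatingFile) Write(p []byte) (int, error) {
        if r.f == nil || r.written+int64(len(p)) > r.maxBytes {
            if r.f != nil {
                r.f.Close()
            }
            name := filepath.Join(r.dir, fmt.Sprintf("consul-%d.log", time.Now().UnixNano()))
            f, err := os.Create(name)
            if err != nil {
                return 0, err
            }
            r.f, r.written = f, 0
        }
        n, err := r.f.Write(p)
        r.written += int64(n)
        return n, err
    }

    func main() {
        w := &rotatingFile{dir: os.TempDir(), maxBytes: 64}
        logger := log.New(w, "", log.LstdFlags)
        for i := 0; i < 5; i++ {
            logger.Printf("entry %d", i)
        }
    }
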
?   	github.com/hashicorp/consul/sdk/freeport	[no test files]
?   	github.com/hashicorp/consul/sdk/testutil	[no test files]
=== RUN   TestRetryer
=== RUN   TestRetryer/counter
=== RUN   TestRetryer/timer
--- PASS: TestRetryer (0.40s)
    --- PASS: TestRetryer/counter (0.20s)
    --- PASS: TestRetryer/timer (0.20s)
PASS
ok  	github.com/hashicorp/consul/sdk/testutil/retry	0.422s
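
The retry helpers tested above come in a counter flavour (a fixed number of attempts) and a timer flavour (attempts until a deadline). A simplified sketch of both; the real sdk/testutil/retry API is built around *testing.T and differs in shape:

    package main

    import (
        "errors"
        "fmt"
        "time"
    )

    // retryCounter retries fn up to `attempts` times, sleeping between tries.
    func retryCounter(attempts int, wait time.Duration, fn func() error) error {
        var err error
        for i := 0; i < attempts; i++ {
            if err = fn(); err == nil {
                return nil
            }
            time.Sleep(wait)
        }
        return err
    }

    // retryTimer retries fn until the timeout elapses.
    func retryTimer(timeout, wait time.Duration, fn func() error) error {
        deadline := time.Now().Add(timeout)
        var err error
        for time.Now().Before(deadline) {
            if err = fn(); err == nil {
                return nil
            }
            time.Sleep(wait)
        }
        return err
    }

    func main() {
        calls := 0
        err := retryCounter(5, 10*time.Millisecond, func() error {
            calls++
            if calls < 3 {
                return errors.New("not ready yet")
            }
            return nil
        })
        fmt.Println("counter: succeeded after", calls, "calls, err =", err)
        fmt.Println("timer:", retryTimer(50*time.Millisecond, 10*time.Millisecond, func() error { return nil }))
    }
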
?   	github.com/hashicorp/consul/sentinel	[no test files]
?   	github.com/hashicorp/consul/service_os	[no test files]
=== RUN   TestArchive
--- PASS: TestArchive (0.00s)
=== RUN   TestArchive_GoodData
--- PASS: TestArchive_GoodData (0.11s)
=== RUN   TestArchive_BadData
--- PASS: TestArchive_BadData (0.02s)
=== RUN   TestArchive_hashList
--- PASS: TestArchive_hashList (0.01s)
=== RUN   TestSnapshot
2019-12-06T06:43:02.971Z [INFO]  raft: Initial configuration (index=1): [{Suffrage:Voter ID:server-0ef6027c-1109-2cb1-cb56-3ba574d7e679 Address:0ef6027c-1109-2cb1-cb56-3ba574d7e679}]
2019-12-06T06:43:02.973Z [INFO]  raft: Node at 0ef6027c-1109-2cb1-cb56-3ba574d7e679 [Follower] entering Follower state (Leader: "")
2019-12-06T06:43:04.800Z [WARN]  raft: Heartbeat timeout from "" reached, starting election
2019-12-06T06:43:04.801Z [INFO]  raft: Node at 0ef6027c-1109-2cb1-cb56-3ba574d7e679 [Candidate] entering Candidate state in term 2
2019-12-06T06:43:04.801Z [DEBUG] raft: Votes needed: 1
2019-12-06T06:43:04.801Z [DEBUG] raft: Vote granted from server-0ef6027c-1109-2cb1-cb56-3ba574d7e679 in term 2. Tally: 1
2019-12-06T06:43:04.801Z [INFO]  raft: Election won. Tally: 1
2019-12-06T06:43:04.801Z [INFO]  raft: Node at 0ef6027c-1109-2cb1-cb56-3ba574d7e679 [Leader] entering Leader state
2019-12-06T06:43:16.601Z [INFO]  raft: Starting snapshot up to 65538
2019/12/06 06:43:16 [INFO] snapshot: Creating new snapshot at /tmp/consul-test/TestSnapshot-snapshot909265606/before/snapshots/2-65538-1575614596601.tmp
2019-12-06T06:43:17.723Z [INFO]  raft: Compacting logs from 1 to 55298
2019-12-06T06:43:17.755Z [INFO]  raft: Snapshot to 65538 complete
2019-12-06T06:43:30.871Z [INFO]  raft: Initial configuration (index=1): [{Suffrage:Voter ID:server-1664a65d-a94b-9a05-6ea4-5cac200c8fbc Address:1664a65d-a94b-9a05-6ea4-5cac200c8fbc}]
2019-12-06T06:43:30.872Z [INFO]  raft: Node at 1664a65d-a94b-9a05-6ea4-5cac200c8fbc [Follower] entering Follower state (Leader: "")
2019-12-06T06:43:32.537Z [WARN]  raft: Heartbeat timeout from "" reached, starting election
2019-12-06T06:43:32.537Z [INFO]  raft: Node at 1664a65d-a94b-9a05-6ea4-5cac200c8fbc [Candidate] entering Candidate state in term 2
2019-12-06T06:43:32.537Z [DEBUG] raft: Votes needed: 1
2019-12-06T06:43:32.537Z [DEBUG] raft: Vote granted from server-1664a65d-a94b-9a05-6ea4-5cac200c8fbc in term 2. Tally: 1
2019-12-06T06:43:32.537Z [INFO]  raft: Election won. Tally: 1
2019-12-06T06:43:32.537Z [INFO]  raft: Node at 1664a65d-a94b-9a05-6ea4-5cac200c8fbc [Leader] entering Leader state
2019/12/06 06:43:34 [INFO] snapshot: Creating new snapshot at /tmp/consul-test/TestSnapshot-snapshot909265606/after/snapshots/2-65539-1575614614956.tmp
2019-12-06T06:43:35.765Z [INFO]  raft: Copied 16973829 bytes to local snapshot
2019-12-06T06:43:36.556Z [INFO]  raft: Restored user snapshot (index 65539)
--- PASS: TestSnapshot (33.76s)
=== RUN   TestSnapshot_Nil
--- PASS: TestSnapshot_Nil (0.00s)
=== RUN   TestSnapshot_BadVerify
--- PASS: TestSnapshot_BadVerify (0.00s)
=== RUN   TestSnapshot_BadRestore
2019-12-06T06:43:36.716Z [INFO]  raft: Initial configuration (index=1): [{Suffrage:Voter ID:server-23c6395b-4f00-ca3e-1133-556f467ea984 Address:23c6395b-4f00-ca3e-1133-556f467ea984}]
2019-12-06T06:43:36.717Z [INFO]  raft: Node at 23c6395b-4f00-ca3e-1133-556f467ea984 [Follower] entering Follower state (Leader: "")
2019-12-06T06:43:38.236Z [WARN]  raft: Heartbeat timeout from "" reached, starting election
2019-12-06T06:43:38.236Z [INFO]  raft: Node at 23c6395b-4f00-ca3e-1133-556f467ea984 [Candidate] entering Candidate state in term 2
2019-12-06T06:43:38.236Z [DEBUG] raft: Votes needed: 1
2019-12-06T06:43:38.237Z [DEBUG] raft: Vote granted from server-23c6395b-4f00-ca3e-1133-556f467ea984 in term 2. Tally: 1
2019-12-06T06:43:38.237Z [INFO]  raft: Election won. Tally: 1
2019-12-06T06:43:38.237Z [INFO]  raft: Node at 23c6395b-4f00-ca3e-1133-556f467ea984 [Leader] entering Leader state
2019-12-06T06:43:41.102Z [INFO]  raft: Starting snapshot up to 16386
2019/12/06 06:43:41 [INFO] snapshot: Creating new snapshot at /tmp/consul-test/TestSnapshot_BadRestore-snapshot090631719/before/snapshots/2-16386-1575614621102.tmp
2019-12-06T06:43:41.615Z [INFO]  raft: Compacting logs from 1 to 6146
2019-12-06T06:43:41.618Z [INFO]  raft: Snapshot to 16386 complete
2019-12-06T06:43:44.417Z [INFO]  raft: Initial configuration (index=1): [{Suffrage:Voter ID:server-a148d9c7-4969-1bde-b846-3a9e7b2d08e4 Address:a148d9c7-4969-1bde-b846-3a9e7b2d08e4}]
2019-12-06T06:43:44.417Z [INFO]  raft: Node at a148d9c7-4969-1bde-b846-3a9e7b2d08e4 [Follower] entering Follower state (Leader: "")
2019-12-06T06:43:45.544Z [WARN]  raft: Heartbeat timeout from "" reached, starting election
2019-12-06T06:43:45.544Z [INFO]  raft: Node at a148d9c7-4969-1bde-b846-3a9e7b2d08e4 [Candidate] entering Candidate state in term 2
2019-12-06T06:43:45.544Z [DEBUG] raft: Votes needed: 1
2019-12-06T06:43:45.544Z [DEBUG] raft: Vote granted from server-a148d9c7-4969-1bde-b846-3a9e7b2d08e4 in term 2. Tally: 1
2019-12-06T06:43:45.545Z [INFO]  raft: Election won. Tally: 1
2019-12-06T06:43:45.545Z [INFO]  raft: Node at a148d9c7-4969-1bde-b846-3a9e7b2d08e4 [Leader] entering Leader state
[ERR] snapshot: Failed to close snapshot decompressor: unexpected EOF
--- PASS: TestSnapshot_BadRestore (8.84s)
PASS
ok  	github.com/hashicorp/consul/snapshot	42.859s
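
TestArchive_BadData and TestSnapshot_BadVerify above exercise rejection of corrupt snapshot archives. A sketch of the general idea, verifying a SHA-256 digest of the payload; the real snapshot package wraps the Raft state in a gzipped tar archive with its own metadata, which is not reproduced here:

    package main

    import (
        "bytes"
        "crypto/sha256"
        "encoding/hex"
        "errors"
        "fmt"
        "io"
    )

    func digest(r io.Reader) (string, error) {
        h := sha256.New()
        if _, err := io.Copy(h, r); err != nil {
            return "", err
        }
        return hex.EncodeToString(h.Sum(nil)), nil
    }

    // verify recomputes the digest and rejects data that does not match.
    func verify(data []byte, want string) error {
        got, err := digest(bytes.NewReader(data))
        if err != nil {
            return err
        }
        if got != want {
            return errors.New("snapshot digest mismatch: archive is corrupt or truncated")
        }
        return nil
    }

    func main() {
        payload := []byte("raft state")
        sum, _ := digest(bytes.NewReader(payload))
        fmt.Println(verify(payload, sum))                  // <nil>
        fmt.Println(verify([]byte("tampered state"), sum)) // digest mismatch
    }
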
?   	github.com/hashicorp/consul/testrpc	[no test files]
=== RUN   TestConfigurator_outgoingWrapper_OK
--- PASS: TestConfigurator_outgoingWrapper_OK (0.26s)
=== RUN   TestConfigurator_outgoingWrapper_noverify_OK
--- PASS: TestConfigurator_outgoingWrapper_noverify_OK (0.16s)
=== RUN   TestConfigurator_outgoingWrapper_BadDC
--- PASS: TestConfigurator_outgoingWrapper_BadDC (0.14s)
=== RUN   TestConfigurator_outgoingWrapper_BadCert
--- PASS: TestConfigurator_outgoingWrapper_BadCert (0.22s)
=== RUN   TestConfigurator_wrapTLS_OK
--- PASS: TestConfigurator_wrapTLS_OK (0.17s)
=== RUN   TestConfigurator_wrapTLS_BadCert
--- PASS: TestConfigurator_wrapTLS_BadCert (0.18s)
=== RUN   TestConfig_ParseCiphers
--- PASS: TestConfig_ParseCiphers (0.00s)
=== RUN   TestConfigurator_loadKeyPair
--- PASS: TestConfigurator_loadKeyPair (0.01s)
=== RUN   TestConfig_SpecifyDC
--- PASS: TestConfig_SpecifyDC (0.00s)
=== RUN   TestConfigurator_NewConfigurator
--- PASS: TestConfigurator_NewConfigurator (0.00s)
=== RUN   TestConfigurator_ErrorPropagation
--- PASS: TestConfigurator_ErrorPropagation (0.06s)
=== RUN   TestConfigurator_CommonTLSConfigServerNameNodeName
--- PASS: TestConfigurator_CommonTLSConfigServerNameNodeName (0.00s)
=== RUN   TestConfigurator_loadCAs
--- PASS: TestConfigurator_loadCAs (0.01s)
=== RUN   TestConfigurator_CommonTLSConfigInsecureSkipVerify
--- PASS: TestConfigurator_CommonTLSConfigInsecureSkipVerify (0.00s)
=== RUN   TestConfigurator_CommonTLSConfigPreferServerCipherSuites
--- PASS: TestConfigurator_CommonTLSConfigPreferServerCipherSuites (0.00s)
=== RUN   TestConfigurator_CommonTLSConfigCipherSuites
--- PASS: TestConfigurator_CommonTLSConfigCipherSuites (0.00s)
=== RUN   TestConfigurator_CommonTLSConfigGetClientCertificate
--- PASS: TestConfigurator_CommonTLSConfigGetClientCertificate (0.00s)
=== RUN   TestConfigurator_CommonTLSConfigCAs
--- PASS: TestConfigurator_CommonTLSConfigCAs (0.00s)
=== RUN   TestConfigurator_CommonTLSConfigTLSMinVersion
--- PASS: TestConfigurator_CommonTLSConfigTLSMinVersion (0.00s)
=== RUN   TestConfigurator_CommonTLSConfigVerifyIncoming
--- PASS: TestConfigurator_CommonTLSConfigVerifyIncoming (0.00s)
=== RUN   TestConfigurator_OutgoingRPCTLSDisabled
--- PASS: TestConfigurator_OutgoingRPCTLSDisabled (0.00s)
=== RUN   TestConfigurator_VerifyIncomingRPC
--- PASS: TestConfigurator_VerifyIncomingRPC (0.00s)
=== RUN   TestConfigurator_VerifyIncomingHTTPS
--- PASS: TestConfigurator_VerifyIncomingHTTPS (0.00s)
=== RUN   TestConfigurator_EnableAgentTLSForChecks
--- PASS: TestConfigurator_EnableAgentTLSForChecks (0.00s)
=== RUN   TestConfigurator_IncomingRPCConfig
--- PASS: TestConfigurator_IncomingRPCConfig (0.00s)
=== RUN   TestConfigurator_IncomingHTTPSConfig
--- PASS: TestConfigurator_IncomingHTTPSConfig (0.00s)
=== RUN   TestConfigurator_OutgoingTLSConfigForChecks
--- PASS: TestConfigurator_OutgoingTLSConfigForChecks (0.00s)
=== RUN   TestConfigurator_OutgoingRPCConfig
--- PASS: TestConfigurator_OutgoingRPCConfig (0.00s)
=== RUN   TestConfigurator_OutgoingRPCWrapper
--- PASS: TestConfigurator_OutgoingRPCWrapper (0.00s)
    config_test.go:699: TODO: actually call wrap here eventually
=== RUN   TestConfigurator_UpdateChecks
--- PASS: TestConfigurator_UpdateChecks (0.00s)
=== RUN   TestConfigurator_UpdateSetsStuff
--- PASS: TestConfigurator_UpdateSetsStuff (0.01s)
=== RUN   TestConfigurator_ServerNameOrNodeName
--- PASS: TestConfigurator_ServerNameOrNodeName (0.00s)
=== RUN   TestConfigurator_VerifyOutgoing
--- PASS: TestConfigurator_VerifyOutgoing (0.00s)
=== RUN   TestConfigurator_Domain
--- PASS: TestConfigurator_Domain (0.00s)
=== RUN   TestConfigurator_VerifyServerHostname
--- PASS: TestConfigurator_VerifyServerHostname (0.00s)
=== RUN   TestConfigurator_AutoEncrytCertExpired
--- PASS: TestConfigurator_AutoEncrytCertExpired (0.04s)
=== RUN   TestSerialNumber
--- PASS: TestSerialNumber (0.00s)
=== RUN   TestGeneratePrivateKey
=== PAUSE TestGeneratePrivateKey
=== RUN   TestGenerateCA
=== PAUSE TestGenerateCA
=== RUN   TestGenerateCert
--- SKIP: TestGenerateCert (0.00s)
    generate_test.go:102: DM-skipped
=== CONT  TestGeneratePrivateKey
=== CONT  TestGenerateCA
--- PASS: TestGeneratePrivateKey (0.01s)
--- PASS: TestGenerateCA (0.01s)
PASS
ok  	github.com/hashicorp/consul/tlsutil	1.328s
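
The tlsutil tests above configure TLS with a minimum protocol version, a CA pool and optional verification of incoming client certificates. A sketch of that kind of tls.Config construction; the file names are placeholders and the real Configurator exposes a richer API:

    package main

    import (
        "crypto/tls"
        "crypto/x509"
        "io/ioutil"
        "log"
    )

    func serverTLSConfig(certFile, keyFile, caFile string, verifyIncoming bool) (*tls.Config, error) {
        cert, err := tls.LoadX509KeyPair(certFile, keyFile)
        if err != nil {
            return nil, err
        }
        caPEM, err := ioutil.ReadFile(caFile)
        if err != nil {
            return nil, err
        }
        pool := x509.NewCertPool()
        pool.AppendCertsFromPEM(caPEM)

        cfg := &tls.Config{
            MinVersion:   tls.VersionTLS12,
            Certificates: []tls.Certificate{cert},
            ClientCAs:    pool,
        }
        if verifyIncoming {
            // Require and verify client certificates, as the VerifyIncoming* tests do.
            cfg.ClientAuth = tls.RequireAndVerifyClientCert
        }
        return cfg, nil
    }

    func main() {
        if _, err := serverTLSConfig("agent.pem", "agent-key.pem", "ca.pem", true); err != nil {
            log.Println("expected to fail without real certificate files:", err)
        }
    }
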
?   	github.com/hashicorp/consul/types	[no test files]
?   	github.com/hashicorp/consul/version	[no test files]
FAIL
dh_auto_test: cd _build && go test -vet=off -v -p 4 -short -failfast -timeout 7m github.com/hashicorp/consul github.com/hashicorp/consul/acl github.com/hashicorp/consul/agent github.com/hashicorp/consul/agent/ae github.com/hashicorp/consul/agent/config github.com/hashicorp/consul/agent/debug github.com/hashicorp/consul/agent/exec github.com/hashicorp/consul/agent/local github.com/hashicorp/consul/agent/metadata github.com/hashicorp/consul/agent/mock github.com/hashicorp/consul/agent/pool github.com/hashicorp/consul/agent/proxycfg github.com/hashicorp/consul/agent/proxyprocess github.com/hashicorp/consul/agent/router github.com/hashicorp/consul/agent/structs github.com/hashicorp/consul/agent/systemd github.com/hashicorp/consul/agent/token github.com/hashicorp/consul/agent/xds github.com/hashicorp/consul/command github.com/hashicorp/consul/command/acl github.com/hashicorp/consul/command/acl/agenttokens github.com/hashicorp/consul/command/acl/authmethod github.com/hashicorp/consul/command/acl/authmethod/create github.com/hashicorp/consul/command/acl/authmethod/delete github.com/hashicorp/consul/command/acl/authmethod/list github.com/hashicorp/consul/command/acl/authmethod/read github.com/hashicorp/consul/command/acl/authmethod/update github.com/hashicorp/consul/command/acl/bindingrule github.com/hashicorp/consul/command/acl/bindingrule/create github.com/hashicorp/consul/command/acl/bindingrule/delete github.com/hashicorp/consul/command/acl/bindingrule/list github.com/hashicorp/consul/command/acl/bindingrule/read github.com/hashicorp/consul/command/acl/bindingrule/update github.com/hashicorp/consul/command/acl/bootstrap github.com/hashicorp/consul/command/acl/policy github.com/hashicorp/consul/command/acl/policy/create github.com/hashicorp/consul/command/acl/policy/delete github.com/hashicorp/consul/command/acl/policy/list github.com/hashicorp/consul/command/acl/policy/read github.com/hashicorp/consul/command/acl/policy/update github.com/hashicorp/consul/command/acl/role github.com/hashicorp/consul/command/acl/role/create github.com/hashicorp/consul/command/acl/role/delete github.com/hashicorp/consul/command/acl/role/list github.com/hashicorp/consul/command/acl/role/read github.com/hashicorp/consul/command/acl/role/update github.com/hashicorp/consul/command/acl/rules github.com/hashicorp/consul/command/acl/token github.com/hashicorp/consul/command/acl/token/clone github.com/hashicorp/consul/command/acl/token/create github.com/hashicorp/consul/command/acl/token/delete github.com/hashicorp/consul/command/acl/token/list github.com/hashicorp/consul/command/acl/token/read github.com/hashicorp/consul/command/acl/token/update github.com/hashicorp/consul/command/agent github.com/hashicorp/consul/command/catalog github.com/hashicorp/consul/command/catalog/list/dc github.com/hashicorp/consul/command/catalog/list/nodes github.com/hashicorp/consul/command/catalog/list/services github.com/hashicorp/consul/command/config github.com/hashicorp/consul/command/config/delete github.com/hashicorp/consul/command/config/list github.com/hashicorp/consul/command/config/read github.com/hashicorp/consul/command/config/write github.com/hashicorp/consul/command/connect github.com/hashicorp/consul/command/connect/ca github.com/hashicorp/consul/command/connect/ca/get github.com/hashicorp/consul/command/connect/ca/set github.com/hashicorp/consul/command/connect/envoy github.com/hashicorp/consul/command/connect/envoy/pipe-bootstrap github.com/hashicorp/consul/command/connect/proxy github.com/hashicorp/consul/command/debug 
github.com/hashicorp/consul/command/event github.com/hashicorp/consul/command/exec github.com/hashicorp/consul/command/flags github.com/hashicorp/consul/command/forceleave github.com/hashicorp/consul/command/helpers github.com/hashicorp/consul/command/info github.com/hashicorp/consul/command/intention github.com/hashicorp/consul/command/intention/check github.com/hashicorp/consul/command/intention/create github.com/hashicorp/consul/command/intention/delete github.com/hashicorp/consul/command/intention/finder github.com/hashicorp/consul/command/intention/get github.com/hashicorp/consul/command/intention/match github.com/hashicorp/consul/command/join github.com/hashicorp/consul/command/keygen github.com/hashicorp/consul/command/keyring github.com/hashicorp/consul/command/kv github.com/hashicorp/consul/command/kv/del github.com/hashicorp/consul/command/kv/exp github.com/hashicorp/consul/command/kv/get github.com/hashicorp/consul/command/kv/imp github.com/hashicorp/consul/command/kv/impexp github.com/hashicorp/consul/command/kv/put github.com/hashicorp/consul/command/leave github.com/hashicorp/consul/command/lock github.com/hashicorp/consul/command/login github.com/hashicorp/consul/command/logout github.com/hashicorp/consul/command/maint github.com/hashicorp/consul/command/members github.com/hashicorp/consul/command/monitor github.com/hashicorp/consul/command/operator github.com/hashicorp/consul/command/operator/autopilot github.com/hashicorp/consul/command/operator/autopilot/get github.com/hashicorp/consul/command/operator/autopilot/set github.com/hashicorp/consul/command/operator/raft github.com/hashicorp/consul/command/operator/raft/listpeers github.com/hashicorp/consul/command/operator/raft/removepeer github.com/hashicorp/consul/command/reload github.com/hashicorp/consul/command/rtt github.com/hashicorp/consul/command/services github.com/hashicorp/consul/command/services/deregister github.com/hashicorp/consul/command/services/register github.com/hashicorp/consul/command/snapshot github.com/hashicorp/consul/command/snapshot/inspect github.com/hashicorp/consul/command/snapshot/restore github.com/hashicorp/consul/command/snapshot/save github.com/hashicorp/consul/command/validate github.com/hashicorp/consul/command/version github.com/hashicorp/consul/command/watch github.com/hashicorp/consul/connect github.com/hashicorp/consul/connect/certgen github.com/hashicorp/consul/connect/proxy github.com/hashicorp/consul/ipaddr github.com/hashicorp/consul/lib github.com/hashicorp/consul/lib/file github.com/hashicorp/consul/lib/semaphore github.com/hashicorp/consul/logger github.com/hashicorp/consul/sdk/freeport github.com/hashicorp/consul/sdk/testutil github.com/hashicorp/consul/sdk/testutil/retry github.com/hashicorp/consul/sentinel github.com/hashicorp/consul/service_os github.com/hashicorp/consul/snapshot github.com/hashicorp/consul/testrpc github.com/hashicorp/consul/tlsutil github.com/hashicorp/consul/types github.com/hashicorp/consul/version returned exit code 1
make[1]: *** [debian/rules:51: override_dh_auto_test] Error 255
make[1]: Leaving directory '/<<BUILDDIR>>/consul-1.5.2+dfsg1'
make: *** [debian/rules:13: build-arch] Error 2
dpkg-buildpackage: error: debian/rules build-arch subprocess returned exit status 2
--------------------------------------------------------------------------------
Build finished at 2019-12-06T06:43:53Z

Finished
--------


+------------------------------------------------------------------------------+
| Cleanup                                                                      |
+------------------------------------------------------------------------------+

Purging /<<BUILDDIR>>
Not cleaning session: cloned chroot in use
E: Build failure (dpkg-buildpackage died)

+------------------------------------------------------------------------------+
| Summary                                                                      |
+------------------------------------------------------------------------------+

Build Architecture: armhf
Build-Space: 0
Build-Time: 3668
Distribution: bullseye-staging
Fail-Stage: build
Host Architecture: armhf
Install-Time: 1947
Job: consul_1.5.2+dfsg1-6
Machine Architecture: armhf
Package: consul
Package-Time: 5675
Source-Version: 1.5.2+dfsg1-6
Space: 0
Status: failed
Version: 1.5.2+dfsg1-6
--------------------------------------------------------------------------------
Finished at 2019-12-06T06:43:53Z
Build needed 00:00:00, 0k disc space