@mbukatov, last active August 15, 2019 17:10
ocs-ci 499
============================= test session starts ==============================
platform linux -- Python 3.7.4, pytest-5.0.1, py-1.8.0, pluggy-0.12.0
rootdir: /home/ocsqe/projects/ocs-ci-deploy, inifile: pytest.ini, testpaths: tests
plugins: reportportal-1.0.5, logger-0.5.1, metadata-1.8.0, html-1.22.0, marker-bugzilla-0.9.1.dev2
collected 92 items / 91 deselected / 1 selected
tests/ecosystem/deployment/test_ocs_basic_install.py::test_cluster_is_running
-------------------------------- live log setup --------------------------------
16:55:20 - MainThread - tests.conftest - INFO - All logs located at /tmp/ocs-ci-logs-1565880920
16:55:20 - MainThread - ocs_ci.deployment.factory - INFO - Deployment key = aws_ipi
16:55:20 - MainThread - ocs_ci.deployment.factory - INFO - Current deployment platform: AWS, deployment type: ipi
16:55:21 - MainThread - ocs_ci.utility.utils - INFO - Downloading openshift client (4.2.0-0.nightly-2019-08-15-073735).
16:55:24 - MainThread - ocs_ci.utility.utils - INFO - Executing command: tar xzvf openshift-client.tar.gz oc kubectl
16:55:25 - MainThread - ocs_ci.utility.utils - INFO - Executing command: ./bin/oc version
16:55:25 - MainThread - ocs_ci.utility.utils - INFO - OpenShift Client version: Client Version: version.Info{Major:"", Minor:"", GitVersion:"v4.2.0-alpha.0-7-g9df6b7ff", GitCommit:"9df6b7ffc43530158edd582701b6c5d511e8c495", GitTreeState:"clean", BuildDate:"2019-08-14T02:34:10Z", GoVersion:"go1.12.6", Compiler:"gc", Platform:"linux/amd64"}
Server Version: version.Info{Major:"1", Minor:"11+", GitVersion:"v1.11.0+d4cacc0", GitCommit:"d4cacc0", GitTreeState:"clean", BuildDate:"2019-05-02T11:52:09Z", GoVersion:"go1.10.8", Compiler:"gc", Platform:"linux/amd64"}
16:55:25 - MainThread - ocs_ci.ocs.openshift_ops - INFO - Testing access to cluster with /home/ocsqe/data/cluster-2019-08-15.3/auth/kubeconfig
16:55:25 - MainThread - ocs_ci.ocs.openshift_ops - WARNING - The kubeconfig file /home/ocsqe/data/cluster-2019-08-15.3/auth/kubeconfig doesn't exist!
16:55:26 - MainThread - ocs_ci.utility.utils - INFO - Downloading openshift installer (4.2.0-0.nightly-2019-08-15-073735).
16:55:59 - MainThread - ocs_ci.utility.utils - INFO - Executing command: tar xzvf openshift-install.tar.gz openshift-install
16:56:01 - MainThread - ocs_ci.utility.utils - INFO - Executing command: ./bin/openshift-install version
16:56:01 - MainThread - ocs_ci.utility.utils - INFO - OpenShift Installer version: ./bin/openshift-install v4.2.0-201908141819-dirty
built from commit 6463cd57aa458032fa73aab811f2e6de6fa3c6c5
release image registry.svc.ci.openshift.org/ocp/release@sha256:aef2fb3071accdb07f8eb7d7fa90d1814892b73902ee11136023dfd77037398a
16:56:01 - MainThread - ocs_ci.deployment.ocp - INFO - A testing cluster will be deployed and cluster information stored at: /home/ocsqe/data/cluster-2019-08-15.3
16:56:01 - MainThread - ocs_ci.deployment.ocp - INFO - Generating install-config
16:56:01 - MainThread - ocs_ci.deployment.ocp - INFO - Install config:
apiVersion: v1
baseDomain: qe.rh-ocs.com
compute:
- name: worker
  platform:
    aws:
      type: m4.large
  replicas: 3
controlPlane:
  name: master
  platform:
    aws:
      type: m4.xlarge
  replicas: 3
metadata:
  creationTimestamp: null
  name: 'mbukatov-ocsqe'
networking:
  clusterNetwork:
  - cidr: 10.128.0.0/14
    hostPrefix: 23
  machineCIDR: 10.0.0.0/16
  networkType: OpenShiftSDN
  serviceNetwork:
  - 172.30.0.0/16
platform:
  aws:
    region: us-east-2
pullSecret: ''
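
Annotation: openshift-install consumes this install-config.yaml from the target directory. A minimal sketch of rendering such a config and launching the installer from Python follows; it assumes PyYAML and the downloaded openshift-install binary, trims metadata/networking for brevity, and is illustrative rather than ocs-ci's actual code.

    import subprocess
    import yaml

    # Illustrative sketch: write an install-config.yaml and run the
    # installer, mirroring the command logged below. A real config also
    # needs the metadata, networking and pullSecret fields shown above.
    install_config = {
        "apiVersion": "v1",
        "baseDomain": "qe.rh-ocs.com",
        "compute": [
            {"name": "worker",
             "platform": {"aws": {"type": "m4.large"}},
             "replicas": 3},
        ],
        "controlPlane": {
            "name": "master",
            "platform": {"aws": {"type": "m4.xlarge"}},
            "replicas": 3,
        },
        "platform": {"aws": {"region": "us-east-2"}},
    }

    cluster_dir = "/home/ocsqe/data/cluster-2019-08-15.3"
    with open(f"{cluster_dir}/install-config.yaml", "w") as f:
        yaml.safe_dump(install_config, f)

    subprocess.run(
        ["./bin/openshift-install", "create", "cluster",
         "--dir", cluster_dir, "--log-level", "INFO"],
        check=True,
    )
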
16:56:01 - MainThread - ocs_ci.deployment.aws - INFO - Deploying OCP cluster
16:56:01 - MainThread - ocs_ci.deployment.aws - INFO - Openshift-installer will be using loglevel: INFO
16:56:01 - MainThread - ocs_ci.utility.utils - INFO - Executing command: ./bin/openshift-install create cluster --dir /home/ocsqe/data/cluster-2019-08-15.3 --log-level INFO
17:23:09 - MainThread - ocs_ci.utility.utils - WARNING - Command warning: level=info msg="Consuming \"Install Config\" from target directory"
level=info msg="Creating infrastructure resources..."
level=info msg="Waiting up to 30m0s for the Kubernetes API at https://api.mbukatov-ocsqe.qe.rh-ocs.com:6443..."
level=info msg="API v1.14.0+ab184f9 up"
level=info msg="Waiting up to 30m0s for bootstrapping to complete..."
level=info msg="Destroying the bootstrap resources..."
level=info msg="Waiting up to 30m0s for the cluster at https://api.mbukatov-ocsqe.qe.rh-ocs.com:6443 to initialize..."
level=info msg="Waiting up to 10m0s for the openshift-console route to be created..."
level=info msg="Install complete!"
level=info msg="To access the cluster as the system:admin user when using 'oc', run 'export KUBECONFIG=/home/ocsqe/data/cluster-2019-08-15.3/auth/kubeconfig'"
level=info msg="Access the OpenShift web-console here: https://console-openshift-console.apps.mbukatov-ocsqe.qe.rh-ocs.com"
level=info msg="Login to the console with user: kubeadmin, password: STmy2-UYAtu-Vuyyc-9G8gP"
17:23:09 - MainThread - ocs_ci.ocs.openshift_ops - INFO - Testing access to cluster with /home/ocsqe/data/cluster-2019-08-15.3/auth/kubeconfig
17:23:09 - MainThread - ocs_ci.utility.utils - INFO - Executing command: oc cluster-info
17:23:11 - MainThread - ocs_ci.ocs.openshift_ops - INFO - Access to cluster is OK!
17:23:11 - MainThread - ocs_ci.deployment.aws - INFO - Worker pattern: mbukatov-ocsqe-r6rmq-worker*
17:23:11 - MainThread - botocore.credentials - INFO - Found credentials in shared credentials file: ~/.aws/credentials
17:23:11 - MainThread - ocs_ci.deployment.aws - INFO - Creating and attaching 100 GB volume to mbukatov-ocsqe-r6rmq-worker-us-east-2b-t2w5s
17:23:11 - MainThread - ocs_ci.deployment.aws - INFO - Creating and attaching 100 GB volume to mbukatov-ocsqe-r6rmq-worker-us-east-2c-tkc6m
17:23:11 - MainThread - ocs_ci.deployment.aws - INFO - Creating and attaching 100 GB volume to mbukatov-ocsqe-r6rmq-worker-us-east-2a-64427
17:23:17 - MainThread - /home/ocsqe/projects/ocs-ci-deploy/ocs_ci/utility/aws.py - INFO - Attaching volume: vol-012600a054211fd0b Instance: i-02e84cd5a4092e6fa
17:23:24 - MainThread - /home/ocsqe/projects/ocs-ci-deploy/ocs_ci/utility/aws.py - INFO - Attaching volume: vol-0f4102894b04ac7da Instance: i-0aef51d99d2add3df
17:23:31 - MainThread - /home/ocsqe/projects/ocs-ci-deploy/ocs_ci/utility/aws.py - INFO - Attaching volume: vol-00847ee134f01a655 Instance: i-086a8dd0bda9379dd
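Annotation: each "Creating and attaching 100 GB volume" step above maps onto a couple of boto3 EC2 calls. A minimal sketch for one volume follows, assuming credentials from ~/.aws/credentials as in the log; the availability zone and device name are illustrative assumptions.

    import boto3

    # Illustrative sketch of one "create and attach 100 GB volume" step.
    ec2 = boto3.client("ec2", region_name="us-east-2")

    vol = ec2.create_volume(Size=100, AvailabilityZone="us-east-2b")
    # Wait until the new volume leaves the "creating" state.
    ec2.get_waiter("volume_available").wait(VolumeIds=[vol["VolumeId"]])
    ec2.attach_volume(
        VolumeId=vol["VolumeId"],
        InstanceId="i-02e84cd5a4092e6fa",  # worker instance from the log
        Device="/dev/sdx",                 # assumed device name
    )
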
17:23:31 - MainThread - ocs_ci.utility.utils - INFO - Executing command: oc -n openshift-storage --kubeconfig /home/ocsqe/data/cluster-2019-08-15.3/auth/kubeconfig get CephCluster -o yaml
17:23:32 - MainThread - ocs_ci.deployment.deployment - INFO - Running OCS basic installation
17:23:32 - MainThread - ocs_ci.ocs.utils - INFO - Creating rook resource from common.yaml
17:23:32 - MainThread - ocs_ci.utility.utils - INFO - Executing command: oc --kubeconfig /home/ocsqe/data/cluster-2019-08-15.3/auth/kubeconfig create -f /home/ocsqe/data/cluster-2019-08-15.3/common.yaml -o yaml
17:23:42 - MainThread - ocs_ci.utility.utils - INFO - Executing command: oc label namespace openshift-storage "openshift.io/cluster-monitoring=true"
17:23:42 - MainThread - ocs_ci.utility.utils - INFO - Executing command: oc policy add-role-to-user view system:serviceaccount:openshift-monitoring:prometheus-k8s -n openshift-storage
17:23:44 - MainThread - ocs_ci.ocs.utils - INFO - Applying rook resource from rbac.yaml
17:23:44 - MainThread - ocs_ci.utility.utils - INFO - Executing command: oc --kubeconfig /home/ocsqe/data/cluster-2019-08-15.3/auth/kubeconfig apply -f /home/ocsqe/data/cluster-2019-08-15.3/rbac.yaml
17:23:47 - MainThread - ocs_ci.deployment.deployment - INFO - Waiting 15 seconds...
17:24:02 - MainThread - ocs_ci.ocs.utils - INFO - Creating rook resource from operator-openshift.yaml
17:24:02 - MainThread - ocs_ci.utility.utils - INFO - Executing command: oc --kubeconfig /home/ocsqe/data/cluster-2019-08-15.3/auth/kubeconfig create -f /home/ocsqe/data/cluster-2019-08-15.3/operator-openshift.yaml -o yaml
17:24:03 - MainThread - ocs_ci.deployment.deployment - INFO - Waiting 15 seconds...
17:24:18 - MainThread - ocs_ci.utility.utils - INFO - Executing command: oc wait --for condition=ready pod -l app=rook-ceph-operator -n openshift-storage --timeout=120s
17:24:50 - MainThread - ocs_ci.utility.utils - INFO - Executing command: oc wait --for condition=ready pod -l app=rook-discover -n openshift-storage --timeout=120s
17:25:43 - MainThread - ocs_ci.ocs.utils - INFO - Creating rook resource from cluster.yaml
17:25:43 - MainThread - ocs_ci.utility.utils - INFO - Executing command: oc --kubeconfig /home/ocsqe/data/cluster-2019-08-15.3/auth/kubeconfig create -f /home/ocsqe/data/cluster-2019-08-15.3/cluster.yaml -o yaml
17:25:44 - MainThread - ocs_ci.utility.utils - INFO - Executing command: oc -n openshift-storage --kubeconfig /home/ocsqe/data/cluster-2019-08-15.3/auth/kubeconfig get Pod --selector=app=rook-ceph-mon -o yaml
17:25:48 - MainThread - ocs_ci.utility.utils - INFO - Executing command: oc -n openshift-storage --kubeconfig /home/ocsqe/data/cluster-2019-08-15.3/auth/kubeconfig get Pod --selector=app=rook-ceph-mon -o yaml
17:25:51 - MainThread - ocs_ci.utility.utils - INFO - Executing command: oc -n openshift-storage --kubeconfig /home/ocsqe/data/cluster-2019-08-15.3/auth/kubeconfig get Pod --selector=app=rook-ceph-mon -o yaml
17:25:55 - MainThread - ocs_ci.utility.utils - INFO - Executing command: oc -n openshift-storage --kubeconfig /home/ocsqe/data/cluster-2019-08-15.3/auth/kubeconfig get Pod --selector=app=rook-ceph-mon -o yaml
17:25:59 - MainThread - ocs_ci.utility.utils - INFO - Executing command: oc -n openshift-storage --kubeconfig /home/ocsqe/data/cluster-2019-08-15.3/auth/kubeconfig get Pod --selector=app=rook-ceph-mon -o yaml
17:26:02 - MainThread - ocs_ci.utility.utils - INFO - Executing command: oc -n openshift-storage --kubeconfig /home/ocsqe/data/cluster-2019-08-15.3/auth/kubeconfig get Pod --selector=app=rook-ceph-mon -o yaml
17:26:06 - MainThread - ocs_ci.utility.utils - INFO - Executing command: oc -n openshift-storage --kubeconfig /home/ocsqe/data/cluster-2019-08-15.3/auth/kubeconfig get Pod --selector=app=rook-ceph-mon -o yaml
17:26:10 - MainThread - ocs_ci.utility.utils - INFO - Executing command: oc -n openshift-storage --kubeconfig /home/ocsqe/data/cluster-2019-08-15.3/auth/kubeconfig get Pod --selector=app=rook-ceph-mon -o yaml
17:26:14 - MainThread - ocs_ci.utility.utils - INFO - Executing command: oc -n openshift-storage --kubeconfig /home/ocsqe/data/cluster-2019-08-15.3/auth/kubeconfig get Pod --selector=app=rook-ceph-mon -o yaml
17:26:17 - MainThread - ocs_ci.utility.utils - INFO - Executing command: oc -n openshift-storage --kubeconfig /home/ocsqe/data/cluster-2019-08-15.3/auth/kubeconfig get Pod --selector=app=rook-ceph-mon -o yaml
17:26:21 - MainThread - ocs_ci.utility.utils - INFO - Executing command: oc -n openshift-storage --kubeconfig /home/ocsqe/data/cluster-2019-08-15.3/auth/kubeconfig get Pod --selector=app=rook-ceph-mon -o yaml
17:26:25 - MainThread - ocs_ci.utility.utils - INFO - Executing command: oc -n openshift-storage --kubeconfig /home/ocsqe/data/cluster-2019-08-15.3/auth/kubeconfig get Pod --selector=app=rook-ceph-mon -o yaml
17:26:28 - MainThread - ocs_ci.utility.utils - INFO - Executing command: oc -n openshift-storage --kubeconfig /home/ocsqe/data/cluster-2019-08-15.3/auth/kubeconfig get Pod --selector=app=rook-ceph-mon -o yaml
17:26:32 - MainThread - ocs_ci.utility.utils - INFO - Executing command: oc -n openshift-storage --kubeconfig /home/ocsqe/data/cluster-2019-08-15.3/auth/kubeconfig get Pod --selector=app=rook-ceph-mon -o yaml
17:26:36 - MainThread - ocs_ci.utility.utils - INFO - Executing command: oc -n openshift-storage --kubeconfig /home/ocsqe/data/cluster-2019-08-15.3/auth/kubeconfig get Pod --selector=app=rook-ceph-mon -o yaml
17:26:39 - MainThread - ocs_ci.utility.utils - INFO - Executing command: oc -n openshift-storage --kubeconfig /home/ocsqe/data/cluster-2019-08-15.3/auth/kubeconfig get Pod --selector=app=rook-ceph-mon -o yaml
17:26:43 - MainThread - ocs_ci.utility.utils - INFO - Executing command: oc -n openshift-storage --kubeconfig /home/ocsqe/data/cluster-2019-08-15.3/auth/kubeconfig get Pod --selector=app=rook-ceph-mon -o yaml
17:26:47 - MainThread - ocs_ci.utility.utils - INFO - Executing command: oc -n openshift-storage --kubeconfig /home/ocsqe/data/cluster-2019-08-15.3/auth/kubeconfig get Pod --selector=app=rook-ceph-mon -o yaml
17:26:50 - MainThread - ocs_ci.utility.utils - INFO - Executing command: oc -n openshift-storage --kubeconfig /home/ocsqe/data/cluster-2019-08-15.3/auth/kubeconfig get Pod --selector=app=rook-ceph-mon -o yaml
17:26:54 - MainThread - ocs_ci.utility.utils - INFO - Executing command: oc -n openshift-storage --kubeconfig /home/ocsqe/data/cluster-2019-08-15.3/auth/kubeconfig get Pod --selector=app=rook-ceph-mon -o yaml
17:26:58 - MainThread - ocs_ci.utility.utils - INFO - Executing command: oc -n openshift-storage --kubeconfig /home/ocsqe/data/cluster-2019-08-15.3/auth/kubeconfig get Pod --selector=app=rook-ceph-mon -o yaml
17:27:01 - MainThread - ocs_ci.utility.utils - INFO - Executing command: oc -n openshift-storage --kubeconfig /home/ocsqe/data/cluster-2019-08-15.3/auth/kubeconfig get Pod --selector=app=rook-ceph-mon -o yaml
17:27:05 - MainThread - ocs_ci.utility.utils - INFO - Executing command: oc -n openshift-storage --kubeconfig /home/ocsqe/data/cluster-2019-08-15.3/auth/kubeconfig get Pod --selector=app=rook-ceph-mon -o yaml
17:27:09 - MainThread - ocs_ci.utility.utils - INFO - Executing command: oc -n openshift-storage --kubeconfig /home/ocsqe/data/cluster-2019-08-15.3/auth/kubeconfig get Pod --selector=app=rook-ceph-mon -o yaml
17:27:12 - MainThread - ocs_ci.utility.utils - INFO - Executing command: oc -n openshift-storage --kubeconfig /home/ocsqe/data/cluster-2019-08-15.3/auth/kubeconfig get Pod --selector=app=rook-ceph-mon -o yaml
17:27:16 - MainThread - ocs_ci.utility.utils - INFO - Executing command: oc -n openshift-storage --kubeconfig /home/ocsqe/data/cluster-2019-08-15.3/auth/kubeconfig get Pod --selector=app=rook-ceph-mon -o yaml
17:27:20 - MainThread - ocs_ci.utility.utils - INFO - Executing command: oc -n openshift-storage --kubeconfig /home/ocsqe/data/cluster-2019-08-15.3/auth/kubeconfig get Pod --selector=app=rook-ceph-mon -o yaml
17:27:24 - MainThread - ocs_ci.utility.utils - INFO - Executing command: oc -n openshift-storage --kubeconfig /home/ocsqe/data/cluster-2019-08-15.3/auth/kubeconfig get Pod --selector=app=rook-ceph-mon -o yaml
17:27:28 - MainThread - ocs_ci.utility.utils - INFO - Executing command: oc -n openshift-storage --kubeconfig /home/ocsqe/data/cluster-2019-08-15.3/auth/kubeconfig get Pod --selector=app=rook-ceph-mon -o yaml
17:27:31 - MainThread - ocs_ci.utility.utils - INFO - Executing command: oc -n openshift-storage --kubeconfig /home/ocsqe/data/cluster-2019-08-15.3/auth/kubeconfig get Pod --selector=app=rook-ceph-mon -o yaml
17:27:35 - MainThread - ocs_ci.utility.utils - INFO - Executing command: oc -n openshift-storage --kubeconfig /home/ocsqe/data/cluster-2019-08-15.3/auth/kubeconfig get Pod --selector=app=rook-ceph-mon -o yaml
17:27:39 - MainThread - ocs_ci.utility.utils - INFO - Executing command: oc -n openshift-storage --kubeconfig /home/ocsqe/data/cluster-2019-08-15.3/auth/kubeconfig get Pod --selector=app=rook-ceph-mon -o yaml
17:27:43 - MainThread - ocs_ci.utility.utils - INFO - Executing command: oc -n openshift-storage --kubeconfig /home/ocsqe/data/cluster-2019-08-15.3/auth/kubeconfig get Pod --selector=app=rook-ceph-mon -o yaml
17:27:47 - MainThread - ocs_ci.utility.utils - INFO - Executing command: oc -n openshift-storage --kubeconfig /home/ocsqe/data/cluster-2019-08-15.3/auth/kubeconfig get Pod --selector=app=rook-ceph-mon -o yaml
17:27:51 - MainThread - ocs_ci.utility.utils - INFO - Executing command: oc -n openshift-storage --kubeconfig /home/ocsqe/data/cluster-2019-08-15.3/auth/kubeconfig get Pod --selector=app=rook-ceph-mon -o yaml
17:27:55 - MainThread - ocs_ci.utility.utils - INFO - Executing command: oc -n openshift-storage --kubeconfig /home/ocsqe/data/cluster-2019-08-15.3/auth/kubeconfig get Pod --selector=app=rook-ceph-mon -o yaml
17:27:59 - MainThread - ocs_ci.utility.utils - INFO - Executing command: oc -n openshift-storage --kubeconfig /home/ocsqe/data/cluster-2019-08-15.3/auth/kubeconfig get Pod --selector=app=rook-ceph-mon -o yaml
17:28:02 - MainThread - ocs_ci.utility.utils - INFO - Executing command: oc -n openshift-storage --kubeconfig /home/ocsqe/data/cluster-2019-08-15.3/auth/kubeconfig get Pod --selector=app=rook-ceph-mon -o yaml
17:28:06 - MainThread - ocs_ci.utility.utils - INFO - Executing command: oc -n openshift-storage --kubeconfig /home/ocsqe/data/cluster-2019-08-15.3/auth/kubeconfig get Pod --selector=app=rook-ceph-mon -o yaml
17:28:10 - MainThread - ocs_ci.utility.utils - INFO - Executing command: oc -n openshift-storage --kubeconfig /home/ocsqe/data/cluster-2019-08-15.3/auth/kubeconfig get Pod --selector=app=rook-ceph-mon -o yaml
17:28:14 - MainThread - ocs_ci.utility.utils - INFO - Executing command: oc -n openshift-storage --kubeconfig /home/ocsqe/data/cluster-2019-08-15.3/auth/kubeconfig get Pod --selector=app=rook-ceph-mon -o yaml
17:28:18 - MainThread - ocs_ci.utility.utils - INFO - Executing command: oc -n openshift-storage --kubeconfig /home/ocsqe/data/cluster-2019-08-15.3/auth/kubeconfig get Pod --selector=app=rook-ceph-mon -o yaml
17:28:22 - MainThread - ocs_ci.utility.utils - INFO - Executing command: oc -n openshift-storage --kubeconfig /home/ocsqe/data/cluster-2019-08-15.3/auth/kubeconfig get Pod --selector=app=rook-ceph-mon -o yaml
17:28:26 - MainThread - ocs_ci.utility.utils - INFO - Executing command: oc -n openshift-storage --kubeconfig /home/ocsqe/data/cluster-2019-08-15.3/auth/kubeconfig get Pod --selector=app=rook-ceph-mon -o yaml
17:28:30 - MainThread - ocs_ci.utility.utils - INFO - Executing command: oc -n openshift-storage --kubeconfig /home/ocsqe/data/cluster-2019-08-15.3/auth/kubeconfig get Pod --selector=app=rook-ceph-mon -o yaml
17:28:31 - MainThread - ocs_ci.utility.utils - INFO - Executing command: oc -n openshift-storage --kubeconfig /home/ocsqe/data/cluster-2019-08-15.3/auth/kubeconfig get Pod --selector=app=rook-ceph-mgr -o yaml
17:28:35 - MainThread - ocs_ci.utility.utils - INFO - Executing command: oc -n openshift-storage --kubeconfig /home/ocsqe/data/cluster-2019-08-15.3/auth/kubeconfig get Pod --selector=app=rook-ceph-mgr -o yaml
17:28:38 - MainThread - ocs_ci.utility.utils - INFO - Executing command: oc -n openshift-storage --kubeconfig /home/ocsqe/data/cluster-2019-08-15.3/auth/kubeconfig get Pod --selector=app=rook-ceph-mgr -o yaml
17:28:42 - MainThread - ocs_ci.utility.utils - INFO - Executing command: oc -n openshift-storage --kubeconfig /home/ocsqe/data/cluster-2019-08-15.3/auth/kubeconfig get Pod --selector=app=rook-ceph-mgr -o yaml
17:28:46 - MainThread - ocs_ci.utility.utils - INFO - Executing command: oc -n openshift-storage --kubeconfig /home/ocsqe/data/cluster-2019-08-15.3/auth/kubeconfig get Pod --selector=app=rook-ceph-mgr -o yaml
17:28:49 - MainThread - ocs_ci.utility.utils - INFO - Executing command: oc -n openshift-storage --kubeconfig /home/ocsqe/data/cluster-2019-08-15.3/auth/kubeconfig get Pod --selector=app=rook-ceph-mgr -o yaml
17:28:50 - MainThread - ocs_ci.utility.utils - INFO - Executing command: oc -n openshift-storage --kubeconfig /home/ocsqe/data/cluster-2019-08-15.3/auth/kubeconfig get Pod --selector=app=rook-ceph-osd -o yaml
17:28:54 - MainThread - ocs_ci.utility.utils - INFO - Executing command: oc -n openshift-storage --kubeconfig /home/ocsqe/data/cluster-2019-08-15.3/auth/kubeconfig get Pod --selector=app=rook-ceph-osd -o yaml
17:28:57 - MainThread - ocs_ci.utility.utils - INFO - Executing command: oc -n openshift-storage --kubeconfig /home/ocsqe/data/cluster-2019-08-15.3/auth/kubeconfig get Pod --selector=app=rook-ceph-osd -o yaml
17:29:01 - MainThread - ocs_ci.utility.utils - INFO - Executing command: oc -n openshift-storage --kubeconfig /home/ocsqe/data/cluster-2019-08-15.3/auth/kubeconfig get Pod --selector=app=rook-ceph-osd -o yaml
17:29:05 - MainThread - ocs_ci.utility.utils - INFO - Executing command: oc -n openshift-storage --kubeconfig /home/ocsqe/data/cluster-2019-08-15.3/auth/kubeconfig get Pod --selector=app=rook-ceph-osd -o yaml
17:29:08 - MainThread - ocs_ci.utility.utils - INFO - Executing command: oc -n openshift-storage --kubeconfig /home/ocsqe/data/cluster-2019-08-15.3/auth/kubeconfig get Pod --selector=app=rook-ceph-osd -o yaml
17:29:12 - MainThread - ocs_ci.utility.utils - INFO - Executing command: oc -n openshift-storage --kubeconfig /home/ocsqe/data/cluster-2019-08-15.3/auth/kubeconfig get Pod --selector=app=rook-ceph-osd -o yaml
17:29:16 - MainThread - ocs_ci.utility.utils - INFO - Executing command: oc -n openshift-storage --kubeconfig /home/ocsqe/data/cluster-2019-08-15.3/auth/kubeconfig get Pod --selector=app=rook-ceph-osd -o yaml
17:29:19 - MainThread - ocs_ci.utility.utils - INFO - Executing command: oc -n openshift-storage --kubeconfig /home/ocsqe/data/cluster-2019-08-15.3/auth/kubeconfig get Pod --selector=app=rook-ceph-osd -o yaml
17:29:23 - MainThread - ocs_ci.utility.utils - INFO - Executing command: oc -n openshift-storage --kubeconfig /home/ocsqe/data/cluster-2019-08-15.3/auth/kubeconfig get Pod --selector=app=rook-ceph-osd -o yaml
17:29:27 - MainThread - ocs_ci.utility.utils - INFO - Executing command: oc -n openshift-storage --kubeconfig /home/ocsqe/data/cluster-2019-08-15.3/auth/kubeconfig get Pod --selector=app=rook-ceph-osd -o yaml
17:29:30 - MainThread - ocs_ci.utility.utils - INFO - Executing command: oc -n openshift-storage --kubeconfig /home/ocsqe/data/cluster-2019-08-15.3/auth/kubeconfig get Pod --selector=app=rook-ceph-osd -o yaml
17:29:34 - MainThread - ocs_ci.utility.utils - INFO - Executing command: oc -n openshift-storage --kubeconfig /home/ocsqe/data/cluster-2019-08-15.3/auth/kubeconfig get Pod --selector=app=rook-ceph-osd -o yaml
17:29:38 - MainThread - ocs_ci.utility.utils - INFO - Executing command: oc -n openshift-storage --kubeconfig /home/ocsqe/data/cluster-2019-08-15.3/auth/kubeconfig get Pod --selector=app=rook-ceph-osd -o yaml
17:29:41 - MainThread - ocs_ci.utility.utils - INFO - Executing command: oc -n openshift-storage --kubeconfig /home/ocsqe/data/cluster-2019-08-15.3/auth/kubeconfig get Pod --selector=app=rook-ceph-osd -o yaml
17:29:45 - MainThread - ocs_ci.utility.utils - INFO - Executing command: oc -n openshift-storage --kubeconfig /home/ocsqe/data/cluster-2019-08-15.3/auth/kubeconfig get Pod --selector=app=rook-ceph-osd -o yaml
17:29:49 - MainThread - ocs_ci.utility.utils - INFO - Executing command: oc -n openshift-storage --kubeconfig /home/ocsqe/data/cluster-2019-08-15.3/auth/kubeconfig get Pod --selector=app=rook-ceph-osd -o yaml
17:29:53 - MainThread - ocs_ci.utility.utils - INFO - Executing command: oc -n openshift-storage --kubeconfig /home/ocsqe/data/cluster-2019-08-15.3/auth/kubeconfig get Pod --selector=app=rook-ceph-osd -o yaml
17:29:57 - MainThread - ocs_ci.utility.utils - INFO - Executing command: oc -n openshift-storage --kubeconfig /home/ocsqe/data/cluster-2019-08-15.3/auth/kubeconfig get Pod --selector=app=rook-ceph-osd -o yaml
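Annotation: the repeated "get Pod --selector=app=rook-ceph-..." commands above are a polling loop waiting for the mon, mgr and osd pods to reach the Running phase. A simplified sketch of such a loop follows; the timeout and interval values are illustrative, not ocs-ci's actual defaults.

    import subprocess
    import time
    import yaml

    KUBECONFIG = "/home/ocsqe/data/cluster-2019-08-15.3/auth/kubeconfig"

    def wait_for_pods(app, namespace="openshift-storage",
                      timeout=600, interval=3):
        """Poll `oc get Pod` until all pods with the given app label run."""
        deadline = time.time() + timeout
        while time.time() < deadline:
            out = subprocess.check_output(
                ["oc", "-n", namespace, "--kubeconfig", KUBECONFIG,
                 "get", "Pod", f"--selector=app={app}", "-o", "yaml"])
            pods = yaml.safe_load(out)["items"]
            if pods and all(p["status"].get("phase") == "Running"
                            for p in pods):
                return pods
            time.sleep(interval)
        raise TimeoutError(f"pods with app={app} not Running in {timeout}s")

    for app in ("rook-ceph-mon", "rook-ceph-mgr", "rook-ceph-osd"):
        wait_for_pods(app)
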
17:29:58 - MainThread - ocs_ci.ocs.utils - INFO - Creating rook resource from toolbox.yaml
17:29:58 - MainThread - ocs_ci.utility.utils - INFO - Executing command: oc --kubeconfig /home/ocsqe/data/cluster-2019-08-15.3/auth/kubeconfig create -f /home/ocsqe/data/cluster-2019-08-15.3/toolbox.yaml -o yaml
17:29:59 - MainThread - ocs_ci.deployment.deployment - INFO - Waiting 15 seconds...
17:30:14 - MainThread - ocs_ci.deployment.deployment - INFO - Waiting 15 seconds...
17:30:29 - MainThread - ocs_ci.ocs.resources.ocs - INFO - Adding CephFilesystem with name ocsci-cephfs
17:30:29 - MainThread - ocs_ci.utility.utils - INFO - Executing command: oc -n openshift-storage --kubeconfig /home/ocsqe/data/cluster-2019-08-15.3/auth/kubeconfig create -f /tmp/CephFilesystem_s3rh1nj -o yaml
17:30:30 - MainThread - ocs_ci.utility.utils - INFO - Executing command: oc -n openshift-storage --kubeconfig /home/ocsqe/data/cluster-2019-08-15.3/auth/kubeconfig get CephFilesystem ocsci-cephfs -o yaml
17:30:30 - MainThread - ocs_ci.utility.utils - INFO - Executing command: oc -n openshift-storage --kubeconfig /home/ocsqe/data/cluster-2019-08-15.3/auth/kubeconfig get Pod --selector=app=rook-ceph-mds -o yaml
17:30:34 - MainThread - ocs_ci.utility.utils - INFO - Executing command: oc -n openshift-storage --kubeconfig /home/ocsqe/data/cluster-2019-08-15.3/auth/kubeconfig get Pod --selector=app=rook-ceph-mds -o yaml
17:30:38 - MainThread - ocs_ci.utility.utils - INFO - Executing command: oc -n openshift-storage --kubeconfig /home/ocsqe/data/cluster-2019-08-15.3/auth/kubeconfig get Pod --selector=app=rook-ceph-mds -o yaml
17:30:41 - MainThread - ocs_ci.utility.utils - INFO - Executing command: oc -n openshift-storage --kubeconfig /home/ocsqe/data/cluster-2019-08-15.3/auth/kubeconfig get Pod --selector=app=rook-ceph-mds -o yaml
17:30:45 - MainThread - ocs_ci.utility.utils - INFO - Executing command: oc -n openshift-storage --kubeconfig /home/ocsqe/data/cluster-2019-08-15.3/auth/kubeconfig get Pod --selector=app=rook-ceph-mds -o yaml
17:30:49 - MainThread - ocs_ci.utility.utils - INFO - Executing command: oc -n openshift-storage --kubeconfig /home/ocsqe/data/cluster-2019-08-15.3/auth/kubeconfig get Pod --selector=app=rook-ceph-mds -o yaml
17:30:50 - MainThread - ocs_ci.utility.utils - INFO - Executing command: oc -n openshift-storage --kubeconfig /home/ocsqe/data/cluster-2019-08-15.3/auth/kubeconfig get CephFileSystem -o yaml
17:30:50 - MainThread - ocs_ci.utility.utils - INFO - Executing command: oc -n openshift-storage --kubeconfig /home/ocsqe/data/cluster-2019-08-15.3/auth/kubeconfig get Pod --selector=app=rook-ceph-tools -o yaml
17:30:51 - MainThread - ocs_ci.utility.utils - INFO - Executing command: oc -n openshift-storage --kubeconfig /home/ocsqe/data/cluster-2019-08-15.3/auth/kubeconfig get CephFileSystem ocsci-cephfs -o yaml
17:30:52 - MainThread - tests.helpers - INFO - Filesystem ocsci-cephfs got created from Openshift Side
17:30:52 - MainThread - ocs_ci.utility.utils - INFO - Executing command: oc -n openshift-storage --kubeconfig /home/ocsqe/data/cluster-2019-08-15.3/auth/kubeconfig rsh rook-ceph-tools-7654d675f7-gnhwv ceph fs ls --format json-pretty
17:30:55 - MainThread - tests.helpers - INFO - FileSystem ocsci-cephfs got created from Ceph Side
17:30:55 - MainThread - ocs_ci.deployment.deployment - INFO - MDS deployment is successful!
17:30:55 - MainThread - ocs_ci.deployment.deployment - INFO - Done creating rook resources, waiting for HEALTH_OK
17:30:55 - MainThread - ocs_ci.utility.utils - INFO - Executing command: oc wait --for condition=ready pod -l app=rook-ceph-tools -n openshift-storage --timeout=120s
17:30:56 - MainThread - ocs_ci.utility.utils - INFO - Executing command: oc -n openshift-storage get pod -l 'app=rook-ceph-tools' -o jsonpath='{.items[0].metadata.name}'
17:30:57 - MainThread - ocs_ci.utility.utils - INFO - Executing command: oc -n openshift-storage exec rook-ceph-tools-7654d675f7-gnhwv ceph health
17:30:59 - MainThread - ocs_ci.utility.utils - INFO - HEALTH_OK, install successful.
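Annotation: the HEALTH_OK verification above first resolves the toolbox pod name with a jsonpath query, then runs `ceph health` inside it. A minimal sketch of the same check:

    import subprocess

    # Resolve the rook-ceph-tools pod name, then exec `ceph health` in it.
    pod = subprocess.check_output(
        ["oc", "-n", "openshift-storage", "get", "pod",
         "-l", "app=rook-ceph-tools",
         "-o", "jsonpath={.items[0].metadata.name}"],
        text=True).strip()
    health = subprocess.check_output(
        ["oc", "-n", "openshift-storage", "exec", pod, "--",
         "ceph", "health"], text=True).strip()
    assert health == "HEALTH_OK", f"unexpected ceph health: {health}"
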
17:30:59 - MainThread - ocs_ci.deployment.deployment - INFO - Patch gp2 storageclass as non-default
17:30:59 - MainThread - ocs_ci.utility.utils - INFO - Executing command: oc patch storageclass gp2 -p '{"metadata": {"annotations":{"storageclass.kubernetes.io/is-default-class":"false"}}}' --request-timeout=120s
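Annotation: the gp2 patch sets the standard storageclass.kubernetes.io/is-default-class annotation to "false" so gp2 stops being the default StorageClass. The equivalent call through the Python kubernetes client would look roughly like this, assuming that library is installed and the cluster kubeconfig is valid:

    from kubernetes import client, config

    # Same patch as the oc command above, via the kubernetes client.
    config.load_kube_config(
        "/home/ocsqe/data/cluster-2019-08-15.3/auth/kubeconfig")
    body = {"metadata": {"annotations": {
        "storageclass.kubernetes.io/is-default-class": "false"}}}
    client.StorageV1Api().patch_storage_class("gp2", body)
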
-------------------------------- live log call ---------------------------------
17:31:00 - MainThread - ocs_ci.ocs.openshift_ops - INFO - Testing access to cluster with /home/ocsqe/data/cluster-2019-08-15.3/auth/kubeconfig
17:31:00 - MainThread - ocs_ci.utility.utils - INFO - Executing command: oc cluster-info
17:31:00 - MainThread - ocs_ci.ocs.openshift_ops - INFO - Access to cluster is OK!
PASSED [100%]
================== 1 passed, 91 deselected in 2140.05 seconds ==================
============================= test session starts ==============================
platform linux -- Python 3.7.4, pytest-5.0.1, py-1.8.0, pluggy-0.12.0
rootdir: /home/ocsqe/projects/ocs-ci-deploy, inifile: pytest.ini, testpaths: tests
plugins: reportportal-1.0.5, logger-0.5.1, metadata-1.8.0, html-1.22.0, marker-bugzilla-0.9.1.dev2
collected 92 items / 91 deselected / 1 selected
tests/ecosystem/deployment/test_ocs_basic_install.py::test_cluster_is_running
-------------------------------- live log setup --------------------------------
18:51:38 - MainThread - tests.conftest - INFO - All logs located at /tmp/ocs-ci-logs-1565887896
18:51:38 - MainThread - ocs_ci.deployment.factory - INFO - Deployment key = aws_ipi
18:51:38 - MainThread - ocs_ci.deployment.factory - INFO - Current deployment platform: AWS, deployment type: ipi
18:51:38 - MainThread - tests.conftest - INFO - Will teardown cluster because --teardown was provided
18:51:38 - MainThread - ocs_ci.utility.utils - INFO - Executing command: ./bin/oc version
18:51:39 - MainThread - ocs_ci.utility.utils - INFO - OpenShift Client version: Client Version: version.Info{Major:"", Minor:"", GitVersion:"v4.2.0-alpha.0-7-g9df6b7ff", GitCommit:"9df6b7ffc43530158edd582701b6c5d511e8c495", GitTreeState:"clean", BuildDate:"2019-08-14T02:34:10Z", GoVersion:"go1.12.6", Compiler:"gc", Platform:"linux/amd64"}
Server Version: version.Info{Major:"1", Minor:"14+", GitVersion:"v1.14.0+ab184f9", GitCommit:"ab184f9", GitTreeState:"clean", BuildDate:"2019-08-15T06:58:46Z", GoVersion:"go1.12.6", Compiler:"gc", Platform:"linux/amd64"}
OpenShift Version: 4.2.0-0.nightly-2019-08-15-073735
-------------------------------- live log call ---------------------------------
18:51:39 - MainThread - ocs_ci.ocs.openshift_ops - INFO - Testing access to cluster with /home/ocsqe/data/cluster-2019-08-15.3/auth/kubeconfig
18:51:39 - MainThread - ocs_ci.utility.utils - INFO - Executing command: oc cluster-info
18:51:40 - MainThread - ocs_ci.ocs.openshift_ops - INFO - Access to cluster is OK!
PASSED [100%]
------------------------------ live log teardown -------------------------------
18:51:40 - MainThread - ocs_ci.utility.utils - INFO - Executing command: ./bin/openshift-install version
18:51:40 - MainThread - ocs_ci.utility.utils - INFO - OpenShift Installer version: ./bin/openshift-install v4.2.0-201908141819-dirty
built from commit 6463cd57aa458032fa73aab811f2e6de6fa3c6c5
release image registry.svc.ci.openshift.org/ocp/release@sha256:aef2fb3071accdb07f8eb7d7fa90d1814892b73902ee11136023dfd77037398a
18:51:40 - MainThread - ocs_ci.deployment.ocp - INFO - Destroying the cluster
18:51:40 - MainThread - ocs_ci.deployment.ocp - INFO - Destroying cluster defined in /home/ocsqe/data/cluster-2019-08-15.3
18:51:40 - MainThread - ocs_ci.utility.utils - INFO - Executing command: ./bin/openshift-install destroy cluster --dir /home/ocsqe/data/cluster-2019-08-15.3 --log-level INFO
18:54:24 - MainThread - ocs_ci.utility.utils - WARNING - Command warning: level=info msg=Disassociated arn="arn:aws:ec2:us-east-2:861790564636:route-table/rtb-07d8a988f8ff107f7" id=rtbassoc-076f615f4669a0da2
level=info msg=Deleted arn="arn:aws:ec2:us-east-2:861790564636:route-table/rtb-07d8a988f8ff107f7" id=rtb-07d8a988f8ff107f7
level=info msg=Deleted arn="arn:aws:s3:::mbukatov-ocsqe-r6rmq-image-registry-us-east-2-rvelehxwwocgonnk"
level=info msg=Deleted arn="arn:aws:ec2:us-east-2:861790564636:natgateway/nat-088d03ea951504609" id=nat-088d03ea951504609
level=info msg=Deleted arn="arn:aws:elasticloadbalancing:us-east-2:861790564636:loadbalancer/net/mbukatov-ocsqe-r6rmq-ext/e26e873680f3e65e" id=net/mbukatov-ocsqe-r6rmq-ext/e26e873680f3e65e
level=info msg=Disassociated arn="arn:aws:ec2:us-east-2:861790564636:route-table/rtb-03f6a6a5bcbfa092b" id=rtbassoc-060be98875b3b3e8f
level=info msg=Deleted arn="arn:aws:ec2:us-east-2:861790564636:route-table/rtb-03f6a6a5bcbfa092b" id=rtb-03f6a6a5bcbfa092b
level=info msg=Deleted arn="arn:aws:ec2:us-east-2:861790564636:image/ami-02558b81cf00f8fe0" id=ami-02558b81cf00f8fe0
level=info msg=Disassociated arn="arn:aws:ec2:us-east-2:861790564636:instance/i-0023ce2952eb45b8d" id=i-0023ce2952eb45b8d name=mbukatov-ocsqe-r6rmq-master-profile role=mbukatov-ocsqe-r6rmq-master-role
level=info msg=Deleted InstanceProfileName=mbukatov-ocsqe-r6rmq-master-profile arn="arn:aws:iam::861790564636:instance-profile/mbukatov-ocsqe-r6rmq-master-profile" id=i-0023ce2952eb45b8d
level=info msg=Deleted arn="arn:aws:ec2:us-east-2:861790564636:instance/i-0023ce2952eb45b8d" id=i-0023ce2952eb45b8d
level=info msg=Deleted arn="arn:aws:ec2:us-east-2:861790564636:natgateway/nat-04d71f5c1a84991a2" id=nat-04d71f5c1a84991a2
level=info msg=Deleted arn="arn:aws:ec2:us-east-2:861790564636:instance/i-055b37e3a4dd2a67c" id=i-055b37e3a4dd2a67c
level=info msg=Disassociated arn="arn:aws:ec2:us-east-2:861790564636:instance/i-086a8dd0bda9379dd" id=i-086a8dd0bda9379dd name=mbukatov-ocsqe-r6rmq-worker-profile role=mbukatov-ocsqe-r6rmq-worker-role
level=info msg=Deleted InstanceProfileName=mbukatov-ocsqe-r6rmq-worker-profile arn="arn:aws:iam::861790564636:instance-profile/mbukatov-ocsqe-r6rmq-worker-profile" id=i-086a8dd0bda9379dd
level=info msg=Deleted arn="arn:aws:ec2:us-east-2:861790564636:instance/i-086a8dd0bda9379dd" id=i-086a8dd0bda9379dd
level=info msg=Deleted arn="arn:aws:ec2:us-east-2:861790564636:instance/i-0aef51d99d2add3df" id=i-0aef51d99d2add3df
level=info msg=Deleted arn="arn:aws:ec2:us-east-2:861790564636:instance/i-08ffc26c95b416922" id=i-08ffc26c95b416922
level=info msg=Deleted arn="arn:aws:ec2:us-east-2:861790564636:vpc/vpc-05895ca2d7f5e35a7" classic load balancer=aaba7f9a3bf6f11e997dd06aab04d811 id=vpc-05895ca2d7f5e35a7
level=info msg=Deleted arn="arn:aws:ec2:us-east-2:861790564636:vpc/vpc-05895ca2d7f5e35a7" id=vpc-05895ca2d7f5e35a7 load balancer=loadbalancer/net/mbukatov-ocsqe-r6rmq-int/81828982bfa55fa4
level=info msg=Deleted NAT gateway=nat-021553d83df5cb96e arn="arn:aws:ec2:us-east-2:861790564636:vpc/vpc-05895ca2d7f5e35a7" id=vpc-05895ca2d7f5e35a7
level=info msg=Deleted NAT gateway=nat-04d71f5c1a84991a2 arn="arn:aws:ec2:us-east-2:861790564636:vpc/vpc-05895ca2d7f5e35a7" id=vpc-05895ca2d7f5e35a7
level=info msg=Deleted NAT gateway=nat-088d03ea951504609 arn="arn:aws:ec2:us-east-2:861790564636:vpc/vpc-05895ca2d7f5e35a7" id=vpc-05895ca2d7f5e35a7
level=info msg=Disassociated arn="arn:aws:ec2:us-east-2:861790564636:route-table/rtb-0cdfd2c492d27f805" id=rtbassoc-0daec9e88824e82f4
level=info msg=Deleted arn="arn:aws:ec2:us-east-2:861790564636:route-table/rtb-0cdfd2c492d27f805" id=rtb-0cdfd2c492d27f805
level=info msg=Deleted arn="arn:aws:ec2:us-east-2:861790564636:natgateway/nat-021553d83df5cb96e" id=nat-021553d83df5cb96e
level=info msg=Deleted arn="arn:aws:elasticloadbalancing:us-east-2:861790564636:targetgroup/mbukatov-ocsqe-r6rmq-sint/da2e9e42e5ced9b1" id=mbukatov-ocsqe-r6rmq-sint/da2e9e42e5ced9b1
level=info msg=Deleted arn="arn:aws:ec2:us-east-2:861790564636:instance/i-02e84cd5a4092e6fa" id=i-02e84cd5a4092e6fa
level=info msg=Deleted arn="arn:aws:elasticloadbalancing:us-east-2:861790564636:targetgroup/mbukatov-ocsqe-r6rmq-aext/e28e3a6ebf0a719d" id=mbukatov-ocsqe-r6rmq-aext/e28e3a6ebf0a719d
level=info msg=Disassociated arn="arn:aws:ec2:us-east-2:861790564636:route-table/rtb-088f34282b3ff6f85" id=rtbassoc-08ecaf193b10f0975
level=info msg=Disassociated arn="arn:aws:ec2:us-east-2:861790564636:route-table/rtb-088f34282b3ff6f85" id=rtbassoc-0b103d923694cbe72
level=info msg=Disassociated arn="arn:aws:ec2:us-east-2:861790564636:route-table/rtb-088f34282b3ff6f85" id=rtbassoc-07a3de02cd65ebe91
level=info msg=Deleted arn="arn:aws:route53:::hostedzone/ZKV8YTS0DYCO7" id=ZKV8YTS0DYCO7 record set="SRV _etcd-server-ssl._tcp.mbukatov-ocsqe.qe.rh-ocs.com."
level=info msg=Deleted arn="arn:aws:route53:::hostedzone/ZKV8YTS0DYCO7" id=ZKV8YTS0DYCO7 record set="A api-int.mbukatov-ocsqe.qe.rh-ocs.com."
level=info msg=Deleted arn="arn:aws:route53:::hostedzone/ZKV8YTS0DYCO7" id=ZKV8YTS0DYCO7 public zone=/hostedzone/ZQ6XFE6BKI2L record set="A api.mbukatov-ocsqe.qe.rh-ocs.com."
level=info msg=Deleted arn="arn:aws:route53:::hostedzone/ZKV8YTS0DYCO7" id=ZKV8YTS0DYCO7 record set="A api.mbukatov-ocsqe.qe.rh-ocs.com."
level=info msg=Deleted arn="arn:aws:route53:::hostedzone/ZKV8YTS0DYCO7" id=ZKV8YTS0DYCO7 public zone=/hostedzone/ZQ6XFE6BKI2L record set="A \\052.apps.mbukatov-ocsqe.qe.rh-ocs.com."
level=info msg=Deleted arn="arn:aws:route53:::hostedzone/ZKV8YTS0DYCO7" id=ZKV8YTS0DYCO7 record set="A \\052.apps.mbukatov-ocsqe.qe.rh-ocs.com."
level=info msg=Deleted arn="arn:aws:route53:::hostedzone/ZKV8YTS0DYCO7" id=ZKV8YTS0DYCO7 record set="A etcd-0.mbukatov-ocsqe.qe.rh-ocs.com."
level=info msg=Deleted arn="arn:aws:route53:::hostedzone/ZKV8YTS0DYCO7" id=ZKV8YTS0DYCO7 record set="A etcd-1.mbukatov-ocsqe.qe.rh-ocs.com."
level=info msg=Deleted arn="arn:aws:route53:::hostedzone/ZKV8YTS0DYCO7" id=ZKV8YTS0DYCO7 record set="A etcd-2.mbukatov-ocsqe.qe.rh-ocs.com."
level=info msg=Deleted arn="arn:aws:route53:::hostedzone/ZKV8YTS0DYCO7" id=ZKV8YTS0DYCO7
level=info msg=Deleted arn="arn:aws:iam::861790564636:role/mbukatov-ocsqe-r6rmq-master-role" id=mbukatov-ocsqe-r6rmq-master-role name=mbukatov-ocsqe-r6rmq-master-role policy=mbukatov-ocsqe-r6rmq-master-policy
level=info msg=Deleted arn="arn:aws:iam::861790564636:role/mbukatov-ocsqe-r6rmq-master-role" id=mbukatov-ocsqe-r6rmq-master-role name=mbukatov-ocsqe-r6rmq-master-role
level=info msg=Deleted arn="arn:aws:iam::861790564636:role/mbukatov-ocsqe-r6rmq-worker-role" id=mbukatov-ocsqe-r6rmq-worker-role name=mbukatov-ocsqe-r6rmq-worker-role policy=mbukatov-ocsqe-r6rmq-worker-policy
level=info msg=Deleted arn="arn:aws:iam::861790564636:role/mbukatov-ocsqe-r6rmq-worker-role" id=mbukatov-ocsqe-r6rmq-worker-role name=mbukatov-ocsqe-r6rmq-worker-role
level=info msg=Deleted arn="arn:aws:iam::861790564636:user/mbukatov-ocsqe-r6rmq-cloud-credential-operator-iam-ro-nswqn" id=mbukatov-ocsqe-r6rmq-cloud-credential-operator-iam-ro-nswqn policy=mbukatov-ocsqe-r6rmq-cloud-credential-operator-iam-ro-nswqn-policy
level=info msg=Deleted arn="arn:aws:iam::861790564636:user/mbukatov-ocsqe-r6rmq-cloud-credential-operator-iam-ro-nswqn" id=mbukatov-ocsqe-r6rmq-cloud-credential-operator-iam-ro-nswqn
level=info msg=Deleted arn="arn:aws:iam::861790564636:user/mbukatov-ocsqe-r6rmq-openshift-image-registry-zf6cb" id=mbukatov-ocsqe-r6rmq-openshift-image-registry-zf6cb policy=mbukatov-ocsqe-r6rmq-openshift-image-registry-zf6cb-policy
level=info msg=Deleted arn="arn:aws:iam::861790564636:user/mbukatov-ocsqe-r6rmq-openshift-image-registry-zf6cb" id=mbukatov-ocsqe-r6rmq-openshift-image-registry-zf6cb
level=info msg=Deleted arn="arn:aws:iam::861790564636:user/mbukatov-ocsqe-r6rmq-openshift-ingress-zlhbn" id=mbukatov-ocsqe-r6rmq-openshift-ingress-zlhbn policy=mbukatov-ocsqe-r6rmq-openshift-ingress-zlhbn-policy
level=info msg=Deleted arn="arn:aws:iam::861790564636:user/mbukatov-ocsqe-r6rmq-openshift-ingress-zlhbn" id=mbukatov-ocsqe-r6rmq-openshift-ingress-zlhbn
level=info msg=Deleted arn="arn:aws:iam::861790564636:user/mbukatov-ocsqe-r6rmq-openshift-machine-api-aws-xkhbj" id=mbukatov-ocsqe-r6rmq-openshift-machine-api-aws-xkhbj policy=mbukatov-ocsqe-r6rmq-openshift-machine-api-aws-xkhbj-policy
level=info msg=Deleted arn="arn:aws:iam::861790564636:user/mbukatov-ocsqe-r6rmq-openshift-machine-api-aws-xkhbj" id=mbukatov-ocsqe-r6rmq-openshift-machine-api-aws-xkhbj
level=info msg=Deleted arn="arn:aws:ec2:us-east-2:861790564636:subnet/subnet-0929e14f0ea3f2cc2" id=subnet-0929e14f0ea3f2cc2
level=info msg=Deleted arn="arn:aws:ec2:us-east-2:861790564636:security-group/sg-096bf7d1766335e69" id=sg-096bf7d1766335e69
level=info msg=Deleted arn="arn:aws:ec2:us-east-2:861790564636:snapshot/snap-04cdd0bc257cded08" id=snap-04cdd0bc257cded08
level=info msg=Released arn="arn:aws:ec2:us-east-2:861790564636:elastic-ip/eipalloc-0ab83e2b98ae48321" id=eipalloc-0ab83e2b98ae48321
level=info msg=Deleted arn="arn:aws:ec2:us-east-2:861790564636:internet-gateway/igw-015fbf387d84f41ea" id=igw-015fbf387d84f41ea
level=info msg=Deleted arn="arn:aws:ec2:us-east-2:861790564636:subnet/subnet-00ac884a45c3e53a7" id=subnet-00ac884a45c3e53a7
level=info msg=Deleted arn="arn:aws:ec2:us-east-2:861790564636:volume/vol-086843571456cf564" id=vol-086843571456cf564
level=info msg=Deleted arn="arn:aws:elasticloadbalancing:us-east-2:861790564636:targetgroup/mbukatov-ocsqe-r6rmq-aint/7b419d4b0b531f62" id=mbukatov-ocsqe-r6rmq-aint/7b419d4b0b531f62
level=info msg=Deleted NAT gateway=nat-021553d83df5cb96e arn="arn:aws:ec2:us-east-2:861790564636:vpc/vpc-05895ca2d7f5e35a7" id=vpc-05895ca2d7f5e35a7
level=info msg=Deleted NAT gateway=nat-04d71f5c1a84991a2 arn="arn:aws:ec2:us-east-2:861790564636:vpc/vpc-05895ca2d7f5e35a7" id=vpc-05895ca2d7f5e35a7
level=info msg=Deleted NAT gateway=nat-088d03ea951504609 arn="arn:aws:ec2:us-east-2:861790564636:vpc/vpc-05895ca2d7f5e35a7" id=vpc-05895ca2d7f5e35a7
level=info msg=Deleted arn="arn:aws:ec2:us-east-2:861790564636:vpc/vpc-05895ca2d7f5e35a7" id=vpc-05895ca2d7f5e35a7 network interface=eni-02eace78220ccc5e5
level=info msg=Deleted arn="arn:aws:ec2:us-east-2:861790564636:vpc/vpc-05895ca2d7f5e35a7" id=vpc-05895ca2d7f5e35a7 network interface=eni-02db70f8d08792df7
level=info msg=Deleted arn="arn:aws:ec2:us-east-2:861790564636:vpc/vpc-05895ca2d7f5e35a7" id=vpc-05895ca2d7f5e35a7 network interface=eni-014126b2739b5a703
level=info msg=Deleted arn="arn:aws:ec2:us-east-2:861790564636:vpc/vpc-05895ca2d7f5e35a7" id=vpc-05895ca2d7f5e35a7 network interface=eni-02714c96aeb0d8354
level=info msg=Deleted arn="arn:aws:ec2:us-east-2:861790564636:vpc/vpc-05895ca2d7f5e35a7" id=vpc-05895ca2d7f5e35a7 table=rtb-046543f5db5855d4b
level=info msg=Deleted VPC endpoint=vpce-08761c9000f4ac51e arn="arn:aws:ec2:us-east-2:861790564636:vpc/vpc-05895ca2d7f5e35a7" id=vpc-05895ca2d7f5e35a7
level=info msg=Released arn="arn:aws:ec2:us-east-2:861790564636:elastic-ip/eipalloc-09a949b043cc478b0" id=eipalloc-09a949b043cc478b0
level=info msg=Deleted arn="arn:aws:ec2:us-east-2:861790564636:subnet/subnet-044a31837a1922df7" id=subnet-044a31837a1922df7
level=info msg=Deleted arn="arn:aws:ec2:us-east-2:861790564636:subnet/subnet-01d41264bcec73549" id=subnet-01d41264bcec73549
level=info msg=Released arn="arn:aws:ec2:us-east-2:861790564636:elastic-ip/eipalloc-0483b0bd56b70d454" id=eipalloc-0483b0bd56b70d454
level=info msg=Deleted arn="arn:aws:ec2:us-east-2:861790564636:subnet/subnet-09a65fcbb291aa576" id=subnet-09a65fcbb291aa576
level=info msg=Deleted arn="arn:aws:ec2:us-east-2:861790564636:subnet/subnet-0e34f8ea6b700b3f4" id=subnet-0e34f8ea6b700b3f4
level=info msg=Deleted arn="arn:aws:ec2:us-east-2:861790564636:security-group/sg-06d9b97a8e1bc2fce" id=sg-06d9b97a8e1bc2fce
level=info msg=Deleted arn="arn:aws:ec2:us-east-2:861790564636:security-group/sg-011a085ee4fe87a7f" id=sg-011a085ee4fe87a7f
level=info msg=Deleted NAT gateway=nat-021553d83df5cb96e arn="arn:aws:ec2:us-east-2:861790564636:vpc/vpc-05895ca2d7f5e35a7" id=vpc-05895ca2d7f5e35a7
level=info msg=Deleted NAT gateway=nat-04d71f5c1a84991a2 arn="arn:aws:ec2:us-east-2:861790564636:vpc/vpc-05895ca2d7f5e35a7" id=vpc-05895ca2d7f5e35a7
level=info msg=Deleted NAT gateway=nat-088d03ea951504609 arn="arn:aws:ec2:us-east-2:861790564636:vpc/vpc-05895ca2d7f5e35a7" id=vpc-05895ca2d7f5e35a7
level=info msg=Deleted arn="arn:aws:ec2:us-east-2:861790564636:vpc/vpc-05895ca2d7f5e35a7" id=vpc-05895ca2d7f5e35a7
level=info msg=Deleted arn="arn:aws:ec2:us-east-2:861790564636:dhcp-options/dopt-08ffbdc69159ee362" id=dopt-08ffbdc69159ee362
18:54:24 - MainThread - botocore.credentials - INFO - Found credentials in shared credentials file: ~/.aws/credentials
18:54:26 - MainThread - /home/ocsqe/projects/ocs-ci-deploy/ocs_ci/utility/aws.py - INFO - Deleting volume: vol-0f4102894b04ac7da
18:54:26 - MainThread - /home/ocsqe/projects/ocs-ci-deploy/ocs_ci/utility/aws.py - INFO - Deleting volume: vol-012600a054211fd0b
18:54:27 - MainThread - /home/ocsqe/projects/ocs-ci-deploy/ocs_ci/utility/aws.py - INFO - Deleting volume: vol-00847ee134f01a655
================== 1 passed, 91 deselected in 169.58 seconds ===================
cluster channel: stable-4.2
cluster version: 4.2.0-0.nightly-2019-08-15-073735
cluster image: registry.svc.ci.openshift.org/ocp/release@sha256:aef2fb3071accdb07f8eb7d7fa90d1814892b73902ee11136023dfd77037398a
storage namespace openshift-cluster-storage-operator
image quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:59b67cbba2fa28497240d87da39d1488baf53fa5ca5b67622ac9860ef3463d99
* quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:59b67cbba2fa28497240d87da39d1488baf53fa5ca5b67622ac9860ef3463d99
storage namespace openshift-storage
image quay.io/rhceph-dev/cephcsi:latest
* quay.io/rhceph-dev/cephcsi@sha256:115f231ce5767e9732580bbe9f09bd7bc6213101aff54dc3a4b31dbc3959c363
image quay.io/openshift/origin-csi-node-driver-registrar:4.2
* quay.io/openshift/origin-csi-node-driver-registrar@sha256:67506f985756d050ac0632df06c4cd6dba323ad342f0c553e35d54af432c089a
image quay.io/openshift/origin-csi-external-attacher:4.2
* quay.io/openshift/origin-csi-external-attacher@sha256:0adf8ec647862a959055e4f3075c7771b9d0d8a25e47315247531b6ae7edf27b
image quay.io/openshift/origin-csi-external-provisioner:4.2
* quay.io/openshift/origin-csi-external-provisioner@sha256:45fa12aec3e9194842f20a42ac767447ece8d3b9169a7acdc5473ce298a68783
image quay.io/openshift/origin-csi-external-snapshotter:4.2
* quay.io/openshift/origin-csi-external-snapshotter@sha256:43cdf1a7d4262276d3396e88c9790aa03b0144182a120971e4158bee267b9240
image quay.io/rhceph-dev/rhceph-4.0-rhel-8:latest
* quay.io/rhceph-dev/rhceph-4.0-rhel-8@sha256:49e59ad2595d17b22093c598e358e2c3186ec6ee461d1a45de70b504c60c9f51
image quay.io/rhceph-dev/rook:latest
* quay.io/rhceph-dev/rook@sha256:7405147c4a48a6706cc3ae7fde1e4fff0541fa030d23f99ba39b93f6668641e5
============================= test session starts ==============================
platform linux -- Python 3.7.4, pytest-5.0.1, py-1.8.0, pluggy-0.12.0
rootdir: /home/ocsqe/projects/ocs-ci-deploy, inifile: pytest.ini, testpaths: tests
plugins: reportportal-1.0.5, logger-0.5.1, metadata-1.8.0, html-1.22.0, marker-bugzilla-0.9.1.dev2
collected 92 items / 91 deselected / 1 selected
tests/ecosystem/deployment/test_ocs_basic_install.py::test_cluster_is_running
-------------------------------- live log setup --------------------------------
21:36:18 - MainThread - tests.conftest - INFO - All logs located at /tmp/ocs-ci-logs-1565811377
21:36:18 - MainThread - ocs_ci.deployment.factory - INFO - Deployment key = aws_ipi
21:36:18 - MainThread - ocs_ci.deployment.factory - INFO - Current deployment platform: AWS, deployment type: ipi
21:36:18 - MainThread - ocs_ci.utility.utils - INFO - Downloading openshift client (4.2.0-0.nightly-2019-08-14-112500).
21:36:24 - MainThread - ocs_ci.utility.utils - INFO - Executing command: tar xzvf openshift-client.tar.gz oc kubectl
21:36:25 - MainThread - ocs_ci.utility.utils - INFO - Executing command: ./bin/oc version
21:36:25 - MainThread - ocs_ci.utility.utils - INFO - OpenShift Client version: Client Version: version.Info{Major:"", Minor:"", GitVersion:"v4.2.0-alpha.0-7-g9df6b7ff", GitCommit:"9df6b7ffc43530158edd582701b6c5d511e8c495", GitTreeState:"clean", BuildDate:"2019-08-14T02:34:10Z", GoVersion:"go1.12.6", Compiler:"gc", Platform:"linux/amd64"}
Server Version: version.Info{Major:"1", Minor:"11+", GitVersion:"v1.11.0+d4cacc0", GitCommit:"d4cacc0", GitTreeState:"clean", BuildDate:"2019-05-02T11:52:09Z", GoVersion:"go1.10.8", Compiler:"gc", Platform:"linux/amd64"}
21:36:25 - MainThread - ocs_ci.ocs.openshift_ops - INFO - Testing access to cluster with /home/ocsqe/data/cluster-2019-08-14.2/auth/kubeconfig
21:36:25 - MainThread - ocs_ci.ocs.openshift_ops - WARNING - The kubeconfig file /home/ocsqe/data/cluster-2019-08-14.2/auth/kubeconfig doesn't exist!
21:36:26 - MainThread - ocs_ci.utility.utils - INFO - Downloading openshift installer (4.2.0-0.nightly-2019-08-14-112500).
21:36:42 - MainThread - ocs_ci.utility.utils - INFO - Executing command: tar xzvf openshift-install.tar.gz openshift-install
21:36:44 - MainThread - ocs_ci.utility.utils - INFO - Executing command: ./bin/openshift-install version
21:36:44 - MainThread - ocs_ci.utility.utils - INFO - OpenShift Installer version: ./bin/openshift-install v4.2.0-201908132200-dirty
built from commit 8f972b45987a32cc91bc61c39a727e9a1224693d
release image registry.svc.ci.openshift.org/ocp/release@sha256:3fc5f7de68e450cde0a0b87973c57cf5ac9d1b409cbf50b0cee711bb8b7727ef
21:36:44 - MainThread - ocs_ci.deployment.ocp - INFO - A testing cluster will be deployed and cluster information stored at: /home/ocsqe/data/cluster-2019-08-14.2
21:36:44 - MainThread - ocs_ci.deployment.ocp - INFO - Generating install-config
21:36:44 - MainThread - ocs_ci.deployment.ocp - INFO - Install config:
apiVersion: v1
baseDomain: qe.rh-ocs.com
compute:
- name: worker
  platform:
    aws:
      type: m4.large
  replicas: 3
controlPlane:
  name: master
  platform:
    aws:
      type: m4.xlarge
  replicas: 3
metadata:
  creationTimestamp: null
  name: 'mbukatov-ocsqe'
networking:
  clusterNetwork:
  - cidr: 10.128.0.0/14
    hostPrefix: 23
  machineCIDR: 10.0.0.0/16
  networkType: OpenShiftSDN
  serviceNetwork:
  - 172.30.0.0/16
platform:
  aws:
    region: us-east-2
pullSecret: ''
21:36:44 - MainThread - ocs_ci.deployment.aws - INFO - Deploying OCP cluster
21:36:44 - MainThread - ocs_ci.deployment.aws - INFO - Openshift-installer will be using loglevel: INFO
21:36:44 - MainThread - ocs_ci.utility.utils - INFO - Executing command: ./bin/openshift-install create cluster --dir /home/ocsqe/data/cluster-2019-08-14.2 --log-level INFO
22:02:11 - MainThread - ocs_ci.utility.utils - WARNING - Command warning: level=info msg="Consuming \"Install Config\" from target directory"
level=info msg="Creating infrastructure resources..."
level=info msg="Waiting up to 30m0s for the Kubernetes API at https://api.mbukatov-ocsqe.qe.rh-ocs.com:6443..."
level=info msg="API v1.14.0+73a01f4 up"
level=info msg="Waiting up to 30m0s for bootstrapping to complete..."
level=info msg="Destroying the bootstrap resources..."
level=info msg="Waiting up to 30m0s for the cluster at https://api.mbukatov-ocsqe.qe.rh-ocs.com:6443 to initialize..."
level=info msg="Waiting up to 10m0s for the openshift-console route to be created..."
level=info msg="Install complete!"
level=info msg="To access the cluster as the system:admin user when using 'oc', run 'export KUBECONFIG=/home/ocsqe/data/cluster-2019-08-14.2/auth/kubeconfig'"
level=info msg="Access the OpenShift web-console here: https://console-openshift-console.apps.mbukatov-ocsqe.qe.rh-ocs.com"
level=info msg="Login to the console with user: kubeadmin, password: ema5W-NioFk-niSkp-nKHY2"
22:02:11 - MainThread - ocs_ci.ocs.openshift_ops - INFO - Testing access to cluster with /home/ocsqe/data/cluster-2019-08-14.2/auth/kubeconfig
22:02:11 - MainThread - ocs_ci.utility.utils - INFO - Executing command: oc cluster-info
22:02:13 - MainThread - ocs_ci.ocs.openshift_ops - INFO - Access to cluster is OK!
22:02:13 - MainThread - ocs_ci.deployment.aws - INFO - Worker pattern: mbukatov-ocsqe-njdqh-worker*
22:02:13 - MainThread - botocore.credentials - INFO - Found credentials in shared credentials file: ~/.aws/credentials
22:02:13 - MainThread - ocs_ci.deployment.aws - INFO - Creating and attaching 100 GB volume to mbukatov-ocsqe-njdqh-worker-us-east-2b-b9n7m
22:02:13 - MainThread - ocs_ci.deployment.aws - INFO - Creating and attaching 100 GB volume to mbukatov-ocsqe-njdqh-worker-us-east-2a-wrllb
22:02:13 - MainThread - ocs_ci.deployment.aws - INFO - Creating and attaching 100 GB volume to mbukatov-ocsqe-njdqh-worker-us-east-2c-2t9f9
22:02:19 - MainThread - /home/ocsqe/projects/ocs-ci-deploy/ocs_ci/utility/aws.py - INFO - Attaching volume: vol-04e238d396c06589b Instance: i-0bc53f4839edefb00
22:02:26 - MainThread - /home/ocsqe/projects/ocs-ci-deploy/ocs_ci/utility/aws.py - INFO - Attaching volume: vol-0a9b5e1b71db47f17 Instance: i-09064e0948359630f
22:02:33 - MainThread - /home/ocsqe/projects/ocs-ci-deploy/ocs_ci/utility/aws.py - INFO - Attaching volume: vol-0f93f60af1282736a Instance: i-0b1bd7e417d276d2a
22:02:33 - MainThread - ocs_ci.utility.utils - INFO - Executing command: oc -n openshift-storage --kubeconfig /home/ocsqe/data/cluster-2019-08-14.2/auth/kubeconfig get CephCluster -o yaml
22:02:34 - MainThread - ocs_ci.deployment.deployment - INFO - Running OCS basic installation
22:02:34 - MainThread - ocs_ci.ocs.utils - INFO - Creating rook resource from common.yaml
22:02:34 - MainThread - ocs_ci.utility.utils - INFO - Executing command: oc --kubeconfig /home/ocsqe/data/cluster-2019-08-14.2/auth/kubeconfig create -f /home/ocsqe/data/cluster-2019-08-14.2/common.yaml -o yaml
22:02:44 - MainThread - ocs_ci.utility.utils - INFO - Executing command: oc label namespace openshift-storage "openshift.io/cluster-monitoring=true"
22:02:45 - MainThread - ocs_ci.utility.utils - INFO - Executing command: oc policy add-role-to-user view system:serviceaccount:openshift-monitoring:prometheus-k8s -n openshift-storage
22:02:46 - MainThread - ocs_ci.ocs.utils - INFO - Applying rook resource from rbac.yaml
22:02:46 - MainThread - ocs_ci.utility.utils - INFO - Executing command: oc --kubeconfig /home/ocsqe/data/cluster-2019-08-14.2/auth/kubeconfig apply -f /home/ocsqe/data/cluster-2019-08-14.2/rbac.yaml
22:02:49 - MainThread - ocs_ci.deployment.deployment - INFO - Waiting 15 seconds...
22:03:04 - MainThread - ocs_ci.ocs.utils - INFO - Creating rook resource from operator-openshift.yaml
22:03:04 - MainThread - ocs_ci.utility.utils - INFO - Executing command: oc --kubeconfig /home/ocsqe/data/cluster-2019-08-14.2/auth/kubeconfig create -f /home/ocsqe/data/cluster-2019-08-14.2/operator-openshift.yaml -o yaml
22:03:05 - MainThread - ocs_ci.deployment.deployment - INFO - Waiting 15 seconds...
22:03:20 - MainThread - ocs_ci.utility.utils - INFO - Executing command: oc wait --for condition=ready pod -l app=rook-ceph-operator -n openshift-storage --timeout=120s
22:03:47 - MainThread - ocs_ci.utility.utils - INFO - Executing command: oc wait --for condition=ready pod -l app=rook-discover -n openshift-storage --timeout=120s
22:04:30 - MainThread - ocs_ci.ocs.utils - INFO - Creating rook resource from cluster.yaml
22:04:30 - MainThread - ocs_ci.utility.utils - INFO - Executing command: oc --kubeconfig /home/ocsqe/data/cluster-2019-08-14.2/auth/kubeconfig create -f /home/ocsqe/data/cluster-2019-08-14.2/cluster.yaml -o yaml
22:04:31 - MainThread - ocs_ci.utility.utils - INFO - Executing command: oc -n openshift-storage --kubeconfig /home/ocsqe/data/cluster-2019-08-14.2/auth/kubeconfig get Pod --selector=app=rook-ceph-mon -o yaml
[... identical rook-ceph-mon pod queries repeated every 3-4 seconds, 22:04:35 through 22:05:39 (18 entries elided) ...]
22:05:43 - MainThread - ocs_ci.utility.utils - INFO - Executing command: oc -n openshift-storage --kubeconfig /home/ocsqe/data/cluster-2019-08-14.2/auth/kubeconfig get Pod --selector=app=rook-ceph-mon -o yaml
22:05:44 - MainThread - ocs_ci.utility.utils - INFO - Executing command: oc -n openshift-storage --kubeconfig /home/ocsqe/data/cluster-2019-08-14.2/auth/kubeconfig get Pod --selector=app=rook-ceph-mgr -o yaml
22:05:48 - MainThread - ocs_ci.utility.utils - INFO - Executing command: oc -n openshift-storage --kubeconfig /home/ocsqe/data/cluster-2019-08-14.2/auth/kubeconfig get Pod --selector=app=rook-ceph-mgr -o yaml
22:05:51 - MainThread - ocs_ci.utility.utils - INFO - Executing command: oc -n openshift-storage --kubeconfig /home/ocsqe/data/cluster-2019-08-14.2/auth/kubeconfig get Pod --selector=app=rook-ceph-mgr -o yaml
22:05:55 - MainThread - ocs_ci.utility.utils - INFO - Executing command: oc -n openshift-storage --kubeconfig /home/ocsqe/data/cluster-2019-08-14.2/auth/kubeconfig get Pod --selector=app=rook-ceph-mgr -o yaml
22:05:56 - MainThread - ocs_ci.utility.utils - INFO - Executing command: oc -n openshift-storage --kubeconfig /home/ocsqe/data/cluster-2019-08-14.2/auth/kubeconfig get Pod --selector=app=rook-ceph-osd -o yaml
[... identical rook-ceph-osd pod queries repeated every 3-4 seconds, 22:05:59 through 22:07:06 (19 entries elided) ...]
22:07:10 - MainThread - ocs_ci.utility.utils - INFO - Executing command: oc -n openshift-storage --kubeconfig /home/ocsqe/data/cluster-2019-08-14.2/auth/kubeconfig get Pod --selector=app=rook-ceph-osd -o yaml
22:07:11 - MainThread - ocs_ci.ocs.utils - INFO - Creating rook resource from toolbox.yaml
22:07:11 - MainThread - ocs_ci.utility.utils - INFO - Executing command: oc --kubeconfig /home/ocsqe/data/cluster-2019-08-14.2/auth/kubeconfig create -f /home/ocsqe/data/cluster-2019-08-14.2/toolbox.yaml -o yaml
22:07:11 - MainThread - ocs_ci.deployment.deployment - INFO - Waiting 15 seconds...
22:07:26 - MainThread - ocs_ci.deployment.deployment - INFO - Waiting 15 seconds...
22:07:41 - MainThread - ocs_ci.ocs.resources.ocs - INFO - Adding CephFilesystem with name ocsci-cephfs
22:07:41 - MainThread - ocs_ci.utility.utils - INFO - Executing command: oc -n openshift-storage --kubeconfig /home/ocsqe/data/cluster-2019-08-14.2/auth/kubeconfig create -f /tmp/CephFilesystembriggq5u -o yaml
22:07:42 - MainThread - ocs_ci.utility.utils - INFO - Executing command: oc -n openshift-storage --kubeconfig /home/ocsqe/data/cluster-2019-08-14.2/auth/kubeconfig get CephFilesystem ocsci-cephfs -o yaml
22:07:43 - MainThread - ocs_ci.utility.utils - INFO - Executing command: oc -n openshift-storage --kubeconfig /home/ocsqe/data/cluster-2019-08-14.2/auth/kubeconfig get Pod --selector=app=rook-ceph-mds -o yaml
22:07:46 - MainThread - ocs_ci.utility.utils - INFO - Executing command: oc -n openshift-storage --kubeconfig /home/ocsqe/data/cluster-2019-08-14.2/auth/kubeconfig get Pod --selector=app=rook-ceph-mds -o yaml
22:07:50 - MainThread - ocs_ci.utility.utils - INFO - Executing command: oc -n openshift-storage --kubeconfig /home/ocsqe/data/cluster-2019-08-14.2/auth/kubeconfig get Pod --selector=app=rook-ceph-mds -o yaml
22:07:54 - MainThread - ocs_ci.utility.utils - INFO - Executing command: oc -n openshift-storage --kubeconfig /home/ocsqe/data/cluster-2019-08-14.2/auth/kubeconfig get Pod --selector=app=rook-ceph-mds -o yaml
22:07:57 - MainThread - ocs_ci.utility.utils - INFO - Executing command: oc -n openshift-storage --kubeconfig /home/ocsqe/data/cluster-2019-08-14.2/auth/kubeconfig get Pod --selector=app=rook-ceph-mds -o yaml
22:08:01 - MainThread - ocs_ci.utility.utils - INFO - Executing command: oc -n openshift-storage --kubeconfig /home/ocsqe/data/cluster-2019-08-14.2/auth/kubeconfig get Pod --selector=app=rook-ceph-mds -o yaml
22:08:02 - MainThread - ocs_ci.utility.utils - INFO - Executing command: oc -n openshift-storage --kubeconfig /home/ocsqe/data/cluster-2019-08-14.2/auth/kubeconfig get CephFileSystem -o yaml
22:08:03 - MainThread - ocs_ci.utility.utils - INFO - Executing command: oc -n openshift-storage --kubeconfig /home/ocsqe/data/cluster-2019-08-14.2/auth/kubeconfig get Pod --selector=app=rook-ceph-tools -o yaml
22:08:03 - MainThread - ocs_ci.utility.utils - INFO - Executing command: oc -n openshift-storage --kubeconfig /home/ocsqe/data/cluster-2019-08-14.2/auth/kubeconfig get CephFileSystem ocsci-cephfs -o yaml
22:08:04 - MainThread - tests.helpers - INFO - Filesystem ocsci-cephfs was created on the OpenShift side
22:08:04 - MainThread - ocs_ci.utility.utils - INFO - Executing command: oc -n openshift-storage --kubeconfig /home/ocsqe/data/cluster-2019-08-14.2/auth/kubeconfig rsh rook-ceph-tools-5f5dc75fd5-vzdgz ceph fs ls --format json-pretty
22:08:06 - MainThread - tests.helpers - INFO - FileSystem ocsci-cephfs was created on the Ceph side
22:08:06 - MainThread - ocs_ci.deployment.deployment - INFO - MDS deployment is successful!
22:08:06 - MainThread - ocs_ci.deployment.deployment - INFO - Done creating rook resources, waiting for HEALTH_OK
22:08:06 - MainThread - ocs_ci.utility.utils - INFO - Executing command: oc wait --for condition=ready pod -l app=rook-ceph-tools -n openshift-storage --timeout=120s
22:08:07 - MainThread - ocs_ci.utility.utils - INFO - Executing command: oc -n openshift-storage get pod -l 'app=rook-ceph-tools' -o jsonpath='{.items[0].metadata.name}'
22:08:08 - MainThread - ocs_ci.utility.utils - INFO - Executing command: oc -n openshift-storage exec rook-ceph-tools-5f5dc75fd5-vzdgz ceph health
22:08:10 - MainThread - ocs_ci.utility.utils - INFO - HEALTH_OK, install successful.
22:08:10 - MainThread - ocs_ci.deployment.deployment - INFO - Patch gp2 storageclass as non-default
22:08:10 - MainThread - ocs_ci.utility.utils - INFO - Executing command: oc patch storageclass gp2 -p '{"metadata": {"annotations":{"storageclass.kubernetes.io/is-default-class":"false"}}}' --request-timeout=120s
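The `oc patch` above flips the storageclass.kubernetes.io/is-default-class annotation so gp2 stops being the cluster's default class, presumably so an OCS-backed class can take that role later. A minimal sketch of the same patch via the kubernetes Python client (ocs-ci itself shells out to `oc`, as logged above):

```python
# Equivalent of the `oc patch storageclass gp2 ...` command above, done with
# the kubernetes Python client -- an illustrative alternative, not what
# ocs-ci runs.
from kubernetes import client, config

config.load_kube_config()  # honors the KUBECONFIG the run exported
body = {"metadata": {"annotations": {
    "storageclass.kubernetes.io/is-default-class": "false"}}}
client.StorageV1Api().patch_storage_class("gp2", body)
```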
-------------------------------- live log call ---------------------------------
22:08:11 - MainThread - ocs_ci.ocs.openshift_ops - INFO - Testing access to cluster with /home/ocsqe/data/cluster-2019-08-14.2/auth/kubeconfig
22:08:11 - MainThread - ocs_ci.utility.utils - INFO - Executing command: oc cluster-info
22:08:12 - MainThread - ocs_ci.ocs.openshift_ops - INFO - Access to cluster is OK!
PASSED [100%]
================== 1 passed, 91 deselected in 1914.10 seconds ==================
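The setup phase above spends most of its wall-clock time in a poll loop: the same `oc get Pod --selector=app=rook-ceph-{mon,mgr,osd,mds}` query is reissued every few seconds until the pods come up. A minimal sketch of that pattern, assuming `oc` is on PATH and KUBECONFIG is exported; `wait_for_pods` is a hypothetical name, not the actual ocs-ci helper:

```python
# Minimal sketch of the pod polling seen in the log above; wait_for_pods is
# a hypothetical name, not the actual ocs-ci implementation.
import subprocess
import time

def wait_for_pods(selector, namespace="openshift-storage",
                  timeout=600, interval=4):
    """Re-run `oc get pod -l <selector>` until every matched pod is Running."""
    deadline = time.time() + timeout
    while time.time() < deadline:
        phases = subprocess.run(
            ["oc", "-n", namespace, "get", "pod", "-l", selector,
             "-o", "jsonpath={.items[*].status.phase}"],
            capture_output=True, text=True, check=True,
        ).stdout.split()
        if phases and all(p == "Running" for p in phases):
            return
        time.sleep(interval)
    raise TimeoutError(f"pods matching {selector!r} not Running after {timeout}s")

# wait_for_pods("app=rook-ceph-mon")
```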
============================= test session starts ==============================
platform linux -- Python 3.7.4, pytest-5.0.1, py-1.8.0, pluggy-0.12.0
rootdir: /home/ocsqe/projects/ocs-ci-deploy, inifile: pytest.ini, testpaths: tests
plugins: reportportal-1.0.5, logger-0.5.1, metadata-1.8.0, html-1.22.0, marker-bugzilla-0.9.1.dev2
collected 92 items / 91 deselected / 1 selected
tests/ecosystem/deployment/test_ocs_basic_install.py::test_cluster_is_running
-------------------------------- live log setup --------------------------------
12:57:31 - MainThread - tests.conftest - INFO - All logs located at /tmp/ocs-ci-logs-1565866648
12:57:31 - MainThread - ocs_ci.deployment.factory - INFO - Deployment key = aws_ipi
12:57:31 - MainThread - ocs_ci.deployment.factory - INFO - Current deployment platform: AWS, deployment type: ipi
12:57:31 - MainThread - tests.conftest - INFO - Will tear down the cluster because --teardown was provided
12:57:31 - MainThread - ocs_ci.utility.utils - INFO - Executing command: ./bin/oc version
12:57:32 - MainThread - ocs_ci.utility.utils - INFO - OpenShift Client version: Client Version: version.Info{Major:"", Minor:"", GitVersion:"v4.2.0-alpha.0-7-g9df6b7ff", GitCommit:"9df6b7ffc43530158edd582701b6c5d511e8c495", GitTreeState:"clean", BuildDate:"2019-08-14T02:34:10Z", GoVersion:"go1.12.6", Compiler:"gc", Platform:"linux/amd64"}
Server Version: version.Info{Major:"1", Minor:"14+", GitVersion:"v1.14.0+73a01f4", GitCommit:"73a01f4", GitTreeState:"clean", BuildDate:"2019-08-14T06:55:55Z", GoVersion:"go1.12.6", Compiler:"gc", Platform:"linux/amd64"}
OpenShift Version: 4.2.0-0.nightly-2019-08-14-112500
-------------------------------- live log call ---------------------------------
12:57:32 - MainThread - ocs_ci.ocs.openshift_ops - INFO - Testing access to cluster with /home/ocsqe/data/cluster-2019-08-14.2/auth/kubeconfig
12:57:32 - MainThread - ocs_ci.utility.utils - INFO - Executing command: oc cluster-info
12:57:32 - MainThread - ocs_ci.ocs.openshift_ops - INFO - Access to cluster is OK!
PASSED [100%]
------------------------------ live log teardown -------------------------------
12:57:33 - MainThread - ocs_ci.utility.utils - INFO - Executing command: ./bin/openshift-install version
12:57:34 - MainThread - ocs_ci.utility.utils - INFO - OpenShift Installer version: ./bin/openshift-install v4.2.0-201908132200-dirty
built from commit 8f972b45987a32cc91bc61c39a727e9a1224693d
release image registry.svc.ci.openshift.org/ocp/release@sha256:3fc5f7de68e450cde0a0b87973c57cf5ac9d1b409cbf50b0cee711bb8b7727ef
12:57:34 - MainThread - ocs_ci.deployment.ocp - INFO - Destroying the cluster
12:57:34 - MainThread - ocs_ci.deployment.ocp - INFO - Destroying cluster defined in /home/ocsqe/data/cluster-2019-08-14.2
12:57:34 - MainThread - ocs_ci.utility.utils - INFO - Executing command: ./bin/openshift-install destroy cluster --dir /home/ocsqe/data/cluster-2019-08-14.2 --log-level INFO
13:00:09 - MainThread - ocs_ci.utility.utils - WARNING - Command warning:: level=info msg=Disassociated arn="arn:aws:ec2:us-east-2:861790564636:instance/i-0bc53f4839edefb00" id=i-0bc53f4839edefb00 name=mbukatov-ocsqe-njdqh-worker-profile role=mbukatov-ocsqe-njdqh-worker-role
level=info msg=Deleted InstanceProfileName=mbukatov-ocsqe-njdqh-worker-profile arn="arn:aws:iam::861790564636:instance-profile/mbukatov-ocsqe-njdqh-worker-profile" id=i-0bc53f4839edefb00
level=info msg=Deleted arn="arn:aws:ec2:us-east-2:861790564636:instance/i-0bc53f4839edefb00" id=i-0bc53f4839edefb00
level=info msg=Disassociated arn="arn:aws:ec2:us-east-2:861790564636:instance/i-03d8425d6f743c9f5" id=i-03d8425d6f743c9f5 name=mbukatov-ocsqe-njdqh-master-profile role=mbukatov-ocsqe-njdqh-master-role
level=info msg=Deleted InstanceProfileName=mbukatov-ocsqe-njdqh-master-profile arn="arn:aws:iam::861790564636:instance-profile/mbukatov-ocsqe-njdqh-master-profile" id=i-03d8425d6f743c9f5
level=info msg=Deleted arn="arn:aws:ec2:us-east-2:861790564636:instance/i-03d8425d6f743c9f5" id=i-03d8425d6f743c9f5
level=info msg=Deleted arn="arn:aws:ec2:us-east-2:861790564636:vpc/vpc-01089a45b46b3fb19" classic load balancer=a71004b3dbecd11e9bb960a5ea3855d5 id=vpc-01089a45b46b3fb19
level=info msg=Deleted arn="arn:aws:ec2:us-east-2:861790564636:vpc/vpc-01089a45b46b3fb19" id=vpc-01089a45b46b3fb19 load balancer=loadbalancer/net/mbukatov-ocsqe-njdqh-ext/11598d8b16213dd9
level=info msg=Deleted arn="arn:aws:ec2:us-east-2:861790564636:vpc/vpc-01089a45b46b3fb19" id=vpc-01089a45b46b3fb19 load balancer=loadbalancer/net/mbukatov-ocsqe-njdqh-int/20376eba1425b027
level=info msg=Deleted NAT gateway=nat-0360ef53d364d7fcf arn="arn:aws:ec2:us-east-2:861790564636:vpc/vpc-01089a45b46b3fb19" id=vpc-01089a45b46b3fb19
level=info msg=Deleted NAT gateway=nat-054b8380023710831 arn="arn:aws:ec2:us-east-2:861790564636:vpc/vpc-01089a45b46b3fb19" id=vpc-01089a45b46b3fb19
level=info msg=Deleted NAT gateway=nat-0f52ae20e794e1b8f arn="arn:aws:ec2:us-east-2:861790564636:vpc/vpc-01089a45b46b3fb19" id=vpc-01089a45b46b3fb19
level=info msg=Deleted arn="arn:aws:elasticloadbalancing:us-east-2:861790564636:loadbalancer/net/mbukatov-ocsqe-njdqh-ext/11598d8b16213dd9" id=net/mbukatov-ocsqe-njdqh-ext/11598d8b16213dd9
level=info msg=Deleted arn="arn:aws:ec2:us-east-2:861790564636:image/ami-0c8e8fb15decc5c3f" id=ami-0c8e8fb15decc5c3f
level=info msg=Deleted arn="arn:aws:ec2:us-east-2:861790564636:natgateway/nat-054b8380023710831" id=nat-054b8380023710831
level=info msg=Deleted arn="arn:aws:ec2:us-east-2:861790564636:instance/i-04de0caa805c1ab85" id=i-04de0caa805c1ab85
level=info msg=Deleted arn="arn:aws:ec2:us-east-2:861790564636:instance/i-0b1bd7e417d276d2a" id=i-0b1bd7e417d276d2a
level=info msg=Deleted arn="arn:aws:elasticloadbalancing:us-east-2:861790564636:loadbalancer/net/mbukatov-ocsqe-njdqh-int/20376eba1425b027" id=net/mbukatov-ocsqe-njdqh-int/20376eba1425b027
level=info msg=Deleted arn="arn:aws:ec2:us-east-2:861790564636:instance/i-07354075ddb28fd8b" id=i-07354075ddb28fd8b
level=info msg=Deleted arn="arn:aws:elasticloadbalancing:us-east-2:861790564636:targetgroup/mbukatov-ocsqe-njdqh-sint/3880a3383aef8175" id=mbukatov-ocsqe-njdqh-sint/3880a3383aef8175
level=info msg=Disassociated arn="arn:aws:ec2:us-east-2:861790564636:route-table/rtb-00c7ce00383bcc78e" id=rtbassoc-0f5aff12619c2e48b
level=info msg=Deleted arn="arn:aws:ec2:us-east-2:861790564636:route-table/rtb-00c7ce00383bcc78e" id=rtb-00c7ce00383bcc78e
level=info msg=Deleted arn="arn:aws:elasticloadbalancing:us-east-2:861790564636:loadbalancer/a71004b3dbecd11e9bb960a5ea3855d5" id=a71004b3dbecd11e9bb960a5ea3855d5
level=info msg=Deleted arn="arn:aws:elasticloadbalancing:us-east-2:861790564636:targetgroup/mbukatov-ocsqe-njdqh-aint/96ab088d2702c25c" id=mbukatov-ocsqe-njdqh-aint/96ab088d2702c25c
level=info msg=Disassociated arn="arn:aws:ec2:us-east-2:861790564636:route-table/rtb-0faeaca932953c596" id=rtbassoc-00fef5e304bd8694b
level=info msg=Disassociated arn="arn:aws:ec2:us-east-2:861790564636:route-table/rtb-0faeaca932953c596" id=rtbassoc-0bffd91a2a74bb235
level=info msg=Disassociated arn="arn:aws:ec2:us-east-2:861790564636:route-table/rtb-0faeaca932953c596" id=rtbassoc-0f7c66605b20abd4f
level=info msg=Deleted arn="arn:aws:ec2:us-east-2:861790564636:natgateway/nat-0f52ae20e794e1b8f" id=nat-0f52ae20e794e1b8f
level=info msg=Disassociated arn="arn:aws:ec2:us-east-2:861790564636:route-table/rtb-0d2b7ff210eccf5cd" id=rtbassoc-05e16efb8dbe32bf7
level=info msg=Deleted arn="arn:aws:ec2:us-east-2:861790564636:route-table/rtb-0d2b7ff210eccf5cd" id=rtb-0d2b7ff210eccf5cd
level=info msg=Deleted arn="arn:aws:s3:::mbukatov-ocsqe-njdqh-image-registry-us-east-2-ttkngvfhhiaopkrl"
level=info msg=Deleted arn="arn:aws:ec2:us-east-2:861790564636:instance/i-09064e0948359630f" id=i-09064e0948359630f
level=info msg=Deleted arn="arn:aws:ec2:us-east-2:861790564636:natgateway/nat-0360ef53d364d7fcf" id=nat-0360ef53d364d7fcf
level=info msg=Disassociated arn="arn:aws:ec2:us-east-2:861790564636:route-table/rtb-0f7b2d2fd52de99a9" id=rtbassoc-0e29321135be1bb71
level=info msg=Deleted arn="arn:aws:ec2:us-east-2:861790564636:route-table/rtb-0f7b2d2fd52de99a9" id=rtb-0f7b2d2fd52de99a9
level=info msg=Deleted arn="arn:aws:ec2:us-east-2:861790564636:snapshot/snap-09ed305b714afa606" id=snap-09ed305b714afa606
level=info msg=Deleted arn="arn:aws:route53:::hostedzone/Z2ODK0WCWZYV9K" id=Z2ODK0WCWZYV9K record set="SRV _etcd-server-ssl._tcp.mbukatov-ocsqe.qe.rh-ocs.com."
level=info msg=Deleted arn="arn:aws:route53:::hostedzone/Z2ODK0WCWZYV9K" id=Z2ODK0WCWZYV9K record set="A api-int.mbukatov-ocsqe.qe.rh-ocs.com."
level=info msg=Deleted arn="arn:aws:route53:::hostedzone/Z2ODK0WCWZYV9K" id=Z2ODK0WCWZYV9K public zone=/hostedzone/ZQ6XFE6BKI2L record set="A api.mbukatov-ocsqe.qe.rh-ocs.com."
level=info msg=Deleted arn="arn:aws:route53:::hostedzone/Z2ODK0WCWZYV9K" id=Z2ODK0WCWZYV9K record set="A api.mbukatov-ocsqe.qe.rh-ocs.com."
level=info msg=Deleted arn="arn:aws:route53:::hostedzone/Z2ODK0WCWZYV9K" id=Z2ODK0WCWZYV9K public zone=/hostedzone/ZQ6XFE6BKI2L record set="A \\052.apps.mbukatov-ocsqe.qe.rh-ocs.com."
level=info msg=Deleted arn="arn:aws:route53:::hostedzone/Z2ODK0WCWZYV9K" id=Z2ODK0WCWZYV9K record set="A \\052.apps.mbukatov-ocsqe.qe.rh-ocs.com."
level=info msg=Deleted arn="arn:aws:route53:::hostedzone/Z2ODK0WCWZYV9K" id=Z2ODK0WCWZYV9K record set="A etcd-0.mbukatov-ocsqe.qe.rh-ocs.com."
level=info msg=Deleted arn="arn:aws:route53:::hostedzone/Z2ODK0WCWZYV9K" id=Z2ODK0WCWZYV9K record set="A etcd-1.mbukatov-ocsqe.qe.rh-ocs.com."
level=info msg=Deleted arn="arn:aws:route53:::hostedzone/Z2ODK0WCWZYV9K" id=Z2ODK0WCWZYV9K record set="A etcd-2.mbukatov-ocsqe.qe.rh-ocs.com."
level=info msg=Deleted arn="arn:aws:route53:::hostedzone/Z2ODK0WCWZYV9K" id=Z2ODK0WCWZYV9K
level=info msg=Deleted arn="arn:aws:iam::861790564636:role/mbukatov-ocsqe-njdqh-master-role" id=mbukatov-ocsqe-njdqh-master-role name=mbukatov-ocsqe-njdqh-master-role policy=mbukatov-ocsqe-njdqh-master-policy
level=info msg=Deleted arn="arn:aws:iam::861790564636:role/mbukatov-ocsqe-njdqh-master-role" id=mbukatov-ocsqe-njdqh-master-role name=mbukatov-ocsqe-njdqh-master-role
level=info msg=Deleted arn="arn:aws:iam::861790564636:role/mbukatov-ocsqe-njdqh-worker-role" id=mbukatov-ocsqe-njdqh-worker-role name=mbukatov-ocsqe-njdqh-worker-role policy=mbukatov-ocsqe-njdqh-worker-policy
level=info msg=Deleted arn="arn:aws:iam::861790564636:role/mbukatov-ocsqe-njdqh-worker-role" id=mbukatov-ocsqe-njdqh-worker-role name=mbukatov-ocsqe-njdqh-worker-role
level=info msg=Deleted arn="arn:aws:iam::861790564636:user/mbukatov-ocsqe-njdqh-cloud-credential-operator-iam-ro-zg2lx" id=mbukatov-ocsqe-njdqh-cloud-credential-operator-iam-ro-zg2lx policy=mbukatov-ocsqe-njdqh-cloud-credential-operator-iam-ro-zg2lx-policy
level=info msg=Deleted arn="arn:aws:iam::861790564636:user/mbukatov-ocsqe-njdqh-cloud-credential-operator-iam-ro-zg2lx" id=mbukatov-ocsqe-njdqh-cloud-credential-operator-iam-ro-zg2lx
level=info msg=Deleted arn="arn:aws:iam::861790564636:user/mbukatov-ocsqe-njdqh-openshift-image-registry-sj6gn" id=mbukatov-ocsqe-njdqh-openshift-image-registry-sj6gn policy=mbukatov-ocsqe-njdqh-openshift-image-registry-sj6gn-policy
level=info msg=Deleted arn="arn:aws:iam::861790564636:user/mbukatov-ocsqe-njdqh-openshift-image-registry-sj6gn" id=mbukatov-ocsqe-njdqh-openshift-image-registry-sj6gn
level=info msg=Deleted arn="arn:aws:iam::861790564636:user/mbukatov-ocsqe-njdqh-openshift-ingress-tpsvj" id=mbukatov-ocsqe-njdqh-openshift-ingress-tpsvj policy=mbukatov-ocsqe-njdqh-openshift-ingress-tpsvj-policy
level=info msg=Deleted arn="arn:aws:iam::861790564636:user/mbukatov-ocsqe-njdqh-openshift-ingress-tpsvj" id=mbukatov-ocsqe-njdqh-openshift-ingress-tpsvj
level=info msg=Deleted arn="arn:aws:iam::861790564636:user/mbukatov-ocsqe-njdqh-openshift-machine-api-aws-xzc9s" id=mbukatov-ocsqe-njdqh-openshift-machine-api-aws-xzc9s policy=mbukatov-ocsqe-njdqh-openshift-machine-api-aws-xzc9s-policy
level=info msg=Deleted arn="arn:aws:iam::861790564636:user/mbukatov-ocsqe-njdqh-openshift-machine-api-aws-xzc9s" id=mbukatov-ocsqe-njdqh-openshift-machine-api-aws-xzc9s
level=info msg=Deleted arn="arn:aws:ec2:us-east-2:861790564636:subnet/subnet-0dd37195a47bcb738" id=subnet-0dd37195a47bcb738
level=info msg=Deleted arn="arn:aws:elasticloadbalancing:us-east-2:861790564636:targetgroup/mbukatov-ocsqe-njdqh-aext/12b35961c5305507" id=mbukatov-ocsqe-njdqh-aext/12b35961c5305507
level=info msg=Deleted NAT gateway=nat-0360ef53d364d7fcf arn="arn:aws:ec2:us-east-2:861790564636:vpc/vpc-01089a45b46b3fb19" id=vpc-01089a45b46b3fb19
level=info msg=Deleted NAT gateway=nat-054b8380023710831 arn="arn:aws:ec2:us-east-2:861790564636:vpc/vpc-01089a45b46b3fb19" id=vpc-01089a45b46b3fb19
level=info msg=Deleted NAT gateway=nat-0f52ae20e794e1b8f arn="arn:aws:ec2:us-east-2:861790564636:vpc/vpc-01089a45b46b3fb19" id=vpc-01089a45b46b3fb19
level=info msg=Deleted arn="arn:aws:ec2:us-east-2:861790564636:vpc/vpc-01089a45b46b3fb19" id=vpc-01089a45b46b3fb19 network interface=eni-05d7a1a6cf30e6235
level=info msg=Deleted arn="arn:aws:ec2:us-east-2:861790564636:vpc/vpc-01089a45b46b3fb19" id=vpc-01089a45b46b3fb19 network interface=eni-042aabeec35d28e16
level=info msg=Deleted arn="arn:aws:ec2:us-east-2:861790564636:vpc/vpc-01089a45b46b3fb19" id=vpc-01089a45b46b3fb19 network interface=eni-0ed019a7802e56c85
level=info msg=Deleted arn="arn:aws:ec2:us-east-2:861790564636:vpc/vpc-01089a45b46b3fb19" id=vpc-01089a45b46b3fb19 network interface=eni-019593ae22100078c
level=info msg=Deleted arn="arn:aws:ec2:us-east-2:861790564636:internet-gateway/igw-0509c6f2dabfa8a64" id=igw-0509c6f2dabfa8a64
level=info msg=Deleted arn="arn:aws:ec2:us-east-2:861790564636:volume/vol-06c55ea0510897e71" id=vol-06c55ea0510897e71
level=info msg=Deleted arn="arn:aws:ec2:us-east-2:861790564636:subnet/subnet-07dbb97d886b7489d" id=subnet-07dbb97d886b7489d
level=info msg=Deleted arn="arn:aws:ec2:us-east-2:861790564636:volume/vol-0e1f660a796698a2b" id=vol-0e1f660a796698a2b
level=info msg=Deleted arn="arn:aws:ec2:us-east-2:861790564636:security-group/sg-0f483b2b3c04b37d2" id=sg-0f483b2b3c04b37d2
level=info msg=Deleted arn="arn:aws:ec2:us-east-2:861790564636:subnet/subnet-03ad8bec99864e47f" id=subnet-03ad8bec99864e47f
level=info msg=Released arn="arn:aws:ec2:us-east-2:861790564636:elastic-ip/eipalloc-0d10c7352eb8130ae" id=eipalloc-0d10c7352eb8130ae
level=info msg=Deleted arn="arn:aws:ec2:us-east-2:861790564636:subnet/subnet-01afc51391484941c" id=subnet-01afc51391484941c
level=info msg=Released arn="arn:aws:ec2:us-east-2:861790564636:elastic-ip/eipalloc-0255f4a8e4a5f29b4" id=eipalloc-0255f4a8e4a5f29b4
level=info msg=Released arn="arn:aws:ec2:us-east-2:861790564636:elastic-ip/eipalloc-0bae53547bb740731" id=eipalloc-0bae53547bb740731
level=info msg=Deleted arn="arn:aws:ec2:us-east-2:861790564636:security-group/sg-060c6b99634552ec5" id=sg-060c6b99634552ec5
level=info msg=Deleted arn="arn:aws:ec2:us-east-2:861790564636:subnet/subnet-004457fd4590a826c" id=subnet-004457fd4590a826c
level=info msg=Deleted NAT gateway=nat-0360ef53d364d7fcf arn="arn:aws:ec2:us-east-2:861790564636:vpc/vpc-01089a45b46b3fb19" id=vpc-01089a45b46b3fb19
level=info msg=Deleted NAT gateway=nat-054b8380023710831 arn="arn:aws:ec2:us-east-2:861790564636:vpc/vpc-01089a45b46b3fb19" id=vpc-01089a45b46b3fb19
level=info msg=Deleted NAT gateway=nat-0f52ae20e794e1b8f arn="arn:aws:ec2:us-east-2:861790564636:vpc/vpc-01089a45b46b3fb19" id=vpc-01089a45b46b3fb19
level=info msg=Deleted arn="arn:aws:ec2:us-east-2:861790564636:vpc/vpc-01089a45b46b3fb19" id=vpc-01089a45b46b3fb19 table=rtb-023786ef91bc811d9
level=info msg=Deleted VPC endpoint=vpce-006bb799203215425 arn="arn:aws:ec2:us-east-2:861790564636:vpc/vpc-01089a45b46b3fb19" id=vpc-01089a45b46b3fb19
level=info msg=Deleted arn="arn:aws:ec2:us-east-2:861790564636:subnet/subnet-0a8358b30a7e379bc" id=subnet-0a8358b30a7e379bc
level=info msg=Deleted arn="arn:aws:ec2:us-east-2:861790564636:security-group/sg-0d5f018c3f1f00043" id=sg-0d5f018c3f1f00043
level=info msg=Deleted NAT gateway=nat-0360ef53d364d7fcf arn="arn:aws:ec2:us-east-2:861790564636:vpc/vpc-01089a45b46b3fb19" id=vpc-01089a45b46b3fb19
level=info msg=Deleted NAT gateway=nat-054b8380023710831 arn="arn:aws:ec2:us-east-2:861790564636:vpc/vpc-01089a45b46b3fb19" id=vpc-01089a45b46b3fb19
level=info msg=Deleted NAT gateway=nat-0f52ae20e794e1b8f arn="arn:aws:ec2:us-east-2:861790564636:vpc/vpc-01089a45b46b3fb19" id=vpc-01089a45b46b3fb19
level=info msg=Deleted arn="arn:aws:ec2:us-east-2:861790564636:vpc/vpc-01089a45b46b3fb19" id=vpc-01089a45b46b3fb19
level=info msg=Deleted arn="arn:aws:ec2:us-east-2:861790564636:dhcp-options/dopt-062f39a4999664b8e" id=dopt-062f39a4999664b8e
13:00:09 - MainThread - botocore.credentials - INFO - Found credentials in shared credentials file: ~/.aws/credentials
13:00:10 - MainThread - /home/ocsqe/projects/ocs-ci-deploy/ocs_ci/utility/aws.py - INFO - Deleting volume: vol-0f93f60af1282736a
13:00:11 - MainThread - /home/ocsqe/projects/ocs-ci-deploy/ocs_ci/utility/aws.py - INFO - Deleting volume: vol-04e238d396c06589b
13:00:12 - MainThread - /home/ocsqe/projects/ocs-ci-deploy/ocs_ci/utility/aws.py - INFO - Deleting volume: vol-0a9b5e1b71db47f17
================== 1 passed, 91 deselected in 161.46 seconds ===================
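The teardown above has two steps: `openshift-install destroy cluster` removes everything the installer created, and the extra EBS volumes the run attached itself are then deleted through boto3 (the "Found credentials in shared credentials file" line is botocore reading ~/.aws/credentials). A sketch of that second step; the real code lives in ocs_ci/utility/aws.py, so this stand-in only shows the boto3 calls:

```python
# Sketch of the leftover-volume cleanup logged above; the real implementation
# is in ocs_ci/utility/aws.py.
import boto3

def delete_volumes(volume_ids, region="us-east-2"):
    ec2 = boto3.resource("ec2", region_name=region)
    for vol_id in volume_ids:
        print(f"Deleting volume: {vol_id}")
        ec2.Volume(vol_id).delete()

# delete_volumes(["vol-0f93f60af1282736a", "vol-04e238d396c06589b",
#                 "vol-0a9b5e1b71db47f17"])
```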
cluster channel: stable-4.2
cluster version: 4.2.0-0.nightly-2019-08-14-112500
cluster image: registry.svc.ci.openshift.org/ocp/release@sha256:3fc5f7de68e450cde0a0b87973c57cf5ac9d1b409cbf50b0cee711bb8b7727ef
storage namespace openshift-cluster-storage-operator
image quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:59b67cbba2fa28497240d87da39d1488baf53fa5ca5b67622ac9860ef3463d99
* quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:59b67cbba2fa28497240d87da39d1488baf53fa5ca5b67622ac9860ef3463d99
storage namespace openshift-storage
image quay.io/cephcsi/cephcsi:canary
* quay.io/cephcsi/cephcsi@sha256:3e7e14040782cb143236e103128d3e3c75caf2a356eb412ef8449d9f9c1448ca
image quay.io/k8scsi/csi-node-driver-registrar:v1.1.0
* quay.io/k8scsi/csi-node-driver-registrar@sha256:13daf82fb99e951a4bff8ae5fc7c17c3a8fe7130be6400990d8f6076c32d4599
image quay.io/k8scsi/csi-attacher:v1.1.1
* quay.io/k8scsi/csi-attacher@sha256:e4db94969e1d463807162a1115192ed70d632a61fbeb3bdc97b40fe9ce78c831
image quay.io/k8scsi/csi-provisioner:v1.2.0
* quay.io/k8scsi/csi-provisioner@sha256:0dffe9a8d39c4fdd49c5dd98ca5611a3f9726c012b082946f630e36988ba9f37
image quay.io/k8scsi/csi-snapshotter:v1.1.0
* quay.io/k8scsi/csi-snapshotter@sha256:a49e0da1af6f2bf717e41ba1eee8b5e6a1cbd66a709dd92cc43fe475fe2589eb
image docker.io/ceph/ceph:v14.2.2-20190722
* docker.io/ceph/ceph@sha256:567fe78d90a63ead11deadc2cbf5a912e42bfcc6ef4b1d6154f4b4fea4019052
image docker.io/rook/ceph:master
* docker.io/rook/ceph@sha256:749515f6d302e80ad51c035f8b6678ae5fd445e2ea592d76627a6f6efb8d8ab8
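The block above maps each configured image reference to the digest actually running in the cluster. This log does not show how that report was generated; one plausible way to reproduce it is to read `image` and `imageID` from the pods' containerStatuses:

```python
# A plausible way to reproduce the image -> digest report above; how the
# report was actually generated is not shown in this log.
import subprocess

def list_images(namespace):
    jsonpath = ('{range .items[*].status.containerStatuses[*]}'
                '{.image}{"\\t"}{.imageID}{"\\n"}{end}')
    out = subprocess.run(
        ["oc", "-n", namespace, "get", "pods", "-o", f"jsonpath={jsonpath}"],
        capture_output=True, text=True, check=True,
    ).stdout
    for line in sorted(set(out.splitlines())):
        image, _, image_id = line.partition("\t")
        print(f"image {image}\n  * {image_id}")

# list_images("openshift-storage")
```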
============================= test session starts ==============================
platform linux -- Python 3.7.4, pytest-5.0.1, py-1.8.0, pluggy-0.12.0
rootdir: /home/ocsqe/projects/ocs-ci-deploy, inifile: pytest.ini, testpaths: tests
plugins: reportportal-1.0.5, logger-0.5.1, metadata-1.8.0, html-1.22.0, marker-bugzilla-0.9.1.dev2
collected 92 items / 91 deselected / 1 selected
tests/ecosystem/deployment/test_ocs_basic_install.py::test_cluster_is_running
-------------------------------- live log setup --------------------------------
14:29:48 - MainThread - tests.conftest - INFO - All logs located at /tmp/ocs-ci-logs-1565872188
14:29:48 - MainThread - ocs_ci.deployment.factory - INFO - Deployment key = aws_ipi
14:29:48 - MainThread - ocs_ci.deployment.factory - INFO - Current deployment platform: AWS, deployment type: ipi
14:29:49 - MainThread - ocs_ci.utility.utils - INFO - Downloading openshift client (4.2.0-0.nightly-2019-08-15-073735).
14:29:52 - MainThread - ocs_ci.utility.utils - INFO - Executing command: tar xzvf openshift-client.tar.gz oc kubectl
14:29:52 - MainThread - ocs_ci.utility.utils - INFO - Executing command: ./bin/oc version
14:29:53 - MainThread - ocs_ci.utility.utils - INFO - OpenShift Client version: Client Version: version.Info{Major:"", Minor:"", GitVersion:"v4.2.0-alpha.0-7-g9df6b7ff", GitCommit:"9df6b7ffc43530158edd582701b6c5d511e8c495", GitTreeState:"clean", BuildDate:"2019-08-14T02:34:10Z", GoVersion:"go1.12.6", Compiler:"gc", Platform:"linux/amd64"}
Server Version: version.Info{Major:"1", Minor:"11+", GitVersion:"v1.11.0+d4cacc0", GitCommit:"d4cacc0", GitTreeState:"clean", BuildDate:"2019-05-02T11:52:09Z", GoVersion:"go1.10.8", Compiler:"gc", Platform:"linux/amd64"}
14:29:53 - MainThread - ocs_ci.ocs.openshift_ops - INFO - Testing access to cluster with /home/ocsqe/data/cluster-2019-08-15.2/auth/kubeconfig
14:29:53 - MainThread - ocs_ci.ocs.openshift_ops - WARNING - The kubeconfig file /home/ocsqe/data/cluster-2019-08-15.2/auth/kubeconfig doesn't exist!
14:29:54 - MainThread - ocs_ci.utility.utils - INFO - Downloading openshift installer (4.2.0-0.nightly-2019-08-15-073735).
14:30:01 - MainThread - ocs_ci.utility.utils - INFO - Executing command: tar xzvf openshift-install.tar.gz openshift-install
14:30:03 - MainThread - ocs_ci.utility.utils - INFO - Executing command: ./bin/openshift-install version
14:30:03 - MainThread - ocs_ci.utility.utils - INFO - OpenShift Installer version: ./bin/openshift-install v4.2.0-201908141819-dirty
built from commit 6463cd57aa458032fa73aab811f2e6de6fa3c6c5
release image registry.svc.ci.openshift.org/ocp/release@sha256:aef2fb3071accdb07f8eb7d7fa90d1814892b73902ee11136023dfd77037398a
14:30:03 - MainThread - ocs_ci.deployment.ocp - INFO - A testing cluster will be deployed and cluster information stored at: /home/ocsqe/data/cluster-2019-08-15.2
14:30:03 - MainThread - ocs_ci.deployment.ocp - INFO - Generating install-config
14:30:03 - MainThread - ocs_ci.deployment.ocp - INFO - Install config:
apiVersion: v1
baseDomain: qe.rh-ocs.com
compute:
- name: worker
platform:
aws:
type: m4.large
replicas: 3
controlPlane:
name: master
platform:
aws:
type: m4.xlarge
replicas: 3
metadata:
creationTimestamp: null
name: 'mbukatov-ocsqe'
networking:
clusterNetwork:
- cidr: 10.128.0.0/14
hostPrefix: 23
machineCIDR: 10.0.0.0/16
networkType: OpenShiftSDN
serviceNetwork:
- 172.30.0.0/16
platform:
aws:
region: us-east-2
pullSecret: ''
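The install-config above is generated by ocs-ci before the installer consumes it (note the blanked pullSecret). A minimal sketch of rendering an equivalent file with PyYAML, using values from the log; the networking section is omitted for brevity and the pull secret must be supplied by the caller:

```python
# Minimal sketch of rendering the install-config shown above with PyYAML.
# Values mirror the log; the networking section is omitted and the pull
# secret (blanked in the log) must be supplied.
import yaml

def render_install_config(name, pull_secret, path="install-config.yaml"):
    config = {
        "apiVersion": "v1",
        "baseDomain": "qe.rh-ocs.com",
        "compute": [{"name": "worker",
                     "platform": {"aws": {"type": "m4.large"}},
                     "replicas": 3}],
        "controlPlane": {"name": "master",
                         "platform": {"aws": {"type": "m4.xlarge"}},
                         "replicas": 3},
        "metadata": {"name": name},
        "platform": {"aws": {"region": "us-east-2"}},
        "pullSecret": pull_secret,
    }
    with open(path, "w") as f:
        yaml.safe_dump(config, f)
```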
14:30:03 - MainThread - ocs_ci.deployment.aws - INFO - Deploying OCP cluster
14:30:03 - MainThread - ocs_ci.deployment.aws - INFO - Openshift-installer will be using loglevel: INFO
14:30:03 - MainThread - ocs_ci.utility.utils - INFO - Executing command: ./bin/openshift-install create cluster --dir /home/ocsqe/data/cluster-2019-08-15.2 --log-level INFO
15:01:21 - MainThread - ocs_ci.utility.utils - WARNING - Command warning:: level=info msg="Consuming \"Install Config\" from target directory"
level=info msg="Creating infrastructure resources..."
level=info msg="Waiting up to 30m0s for the Kubernetes API at https://api.mbukatov-ocsqe.qe.rh-ocs.com:6443..."
level=info msg="API v1.14.0+ab184f9 up"
level=info msg="Waiting up to 30m0s for bootstrapping to complete..."
level=info msg="Destroying the bootstrap resources..."
level=info msg="Waiting up to 30m0s for the cluster at https://api.mbukatov-ocsqe.qe.rh-ocs.com:6443 to initialize..."
level=info msg="Waiting up to 10m0s for the openshift-console route to be created..."
level=info msg="Install complete!"
level=info msg="To access the cluster as the system:admin user when using 'oc', run 'export KUBECONFIG=/home/ocsqe/data/cluster-2019-08-15.2/auth/kubeconfig'"
level=info msg="Access the OpenShift web-console here: https://console-openshift-console.apps.mbukatov-ocsqe.qe.rh-ocs.com"
level=info msg="Login to the console with user: kubeadmin, password: 9AkEN-GjuMs-V84v3-oxvXM"
15:01:21 - MainThread - ocs_ci.ocs.openshift_ops - INFO - Testing access to cluster with /home/ocsqe/data/cluster-2019-08-15.2/auth/kubeconfig
15:01:21 - MainThread - ocs_ci.utility.utils - INFO - Executing command: oc cluster-info
15:01:23 - MainThread - ocs_ci.ocs.openshift_ops - INFO - Access to cluster is OK!
15:01:23 - MainThread - ocs_ci.deployment.aws - INFO - Worker pattern: mbukatov-ocsqe-5fxl6-worker*
15:01:23 - MainThread - botocore.credentials - INFO - Found credentials in shared credentials file: ~/.aws/credentials
15:01:24 - MainThread - ocs_ci.deployment.aws - INFO - Creating and attaching 100 GB volume to mbukatov-ocsqe-5fxl6-worker-us-east-2a-dt9kr
15:01:24 - MainThread - ocs_ci.deployment.aws - INFO - Creating and attaching 100 GB volume to mbukatov-ocsqe-5fxl6-worker-us-east-2c-t6pgg
15:01:24 - MainThread - ocs_ci.deployment.aws - INFO - Creating and attaching 100 GB volume to mbukatov-ocsqe-5fxl6-worker-us-east-2b-rqkhr
15:01:30 - MainThread - /home/ocsqe/projects/ocs-ci-deploy/ocs_ci/utility/aws.py - INFO - Attaching volume: vol-003b82814ad4e06e7 Instance: i-026a6d4b5ed6f244a
15:01:36 - MainThread - /home/ocsqe/projects/ocs-ci-deploy/ocs_ci/utility/aws.py - INFO - Attaching volume: vol-07341df5fa6bb34eb Instance: i-0da008e94cf783eac
15:01:43 - MainThread - /home/ocsqe/projects/ocs-ci-deploy/ocs_ci/utility/aws.py - INFO - Attaching volume: vol-01ca71a6f86535149 Instance: i-0af30d5da71413dce
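Each of the three workers gets a fresh 100 GB volume before the rook resources are created. A boto3 sketch of the create-and-attach step logged above; the real code is in ocs_ci/utility/aws.py, and the device name and volume type here are assumptions for illustration:

```python
# Sketch of the 100 GB volume create-and-attach step logged above. The real
# implementation is in ocs_ci/utility/aws.py; /dev/sdf and gp2 are assumed.
import boto3

def attach_data_volume(instance_id, zone, size_gb=100, region="us-east-2"):
    ec2 = boto3.client("ec2", region_name=region)
    vol_id = ec2.create_volume(AvailabilityZone=zone, Size=size_gb,
                               VolumeType="gp2")["VolumeId"]
    ec2.get_waiter("volume_available").wait(VolumeIds=[vol_id])
    print(f"Attaching volume: {vol_id} Instance: {instance_id}")
    ec2.attach_volume(Device="/dev/sdf", InstanceId=instance_id,
                      VolumeId=vol_id)
    return vol_id
```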
15:01:44 - MainThread - ocs_ci.utility.utils - INFO - Executing command: oc -n openshift-storage --kubeconfig /home/ocsqe/data/cluster-2019-08-15.2/auth/kubeconfig get CephCluster -o yaml
15:01:45 - MainThread - ocs_ci.deployment.deployment - INFO - Running OCS basic installation
15:01:45 - MainThread - ocs_ci.ocs.utils - INFO - Creating rook resource from common.yaml
15:01:45 - MainThread - ocs_ci.utility.utils - INFO - Executing command: oc --kubeconfig /home/ocsqe/data/cluster-2019-08-15.2/auth/kubeconfig create -f /home/ocsqe/data/cluster-2019-08-15.2/common.yaml -o yaml
15:01:54 - MainThread - ocs_ci.utility.utils - INFO - Executing command: oc label namespace openshift-storage "openshift.io/cluster-monitoring=true"
15:01:55 - MainThread - ocs_ci.utility.utils - INFO - Executing command: oc policy add-role-to-user view system:serviceaccount:openshift-monitoring:prometheus-k8s -n openshift-storage
15:01:56 - MainThread - ocs_ci.ocs.utils - INFO - Applying rook resource from rbac.yaml
15:01:56 - MainThread - ocs_ci.utility.utils - INFO - Executing command: oc --kubeconfig /home/ocsqe/data/cluster-2019-08-15.2/auth/kubeconfig apply -f /home/ocsqe/data/cluster-2019-08-15.2/rbac.yaml
15:01:59 - MainThread - ocs_ci.deployment.deployment - INFO - Waiting 15 seconds...
15:02:14 - MainThread - ocs_ci.ocs.utils - INFO - Creating rook resource from operator-openshift.yaml
15:02:14 - MainThread - ocs_ci.utility.utils - INFO - Executing command: oc --kubeconfig /home/ocsqe/data/cluster-2019-08-15.2/auth/kubeconfig create -f /home/ocsqe/data/cluster-2019-08-15.2/operator-openshift.yaml -o yaml
15:02:15 - MainThread - ocs_ci.deployment.deployment - INFO - Waiting 15 seconds...
15:02:30 - MainThread - ocs_ci.utility.utils - INFO - Executing command: oc wait --for condition=ready pod -l app=rook-ceph-operator -n openshift-storage --timeout=120s
15:02:57 - MainThread - ocs_ci.utility.utils - INFO - Executing command: oc wait --for condition=ready pod -l app=rook-discover -n openshift-storage --timeout=120s
15:03:39 - MainThread - ocs_ci.ocs.utils - INFO - Creating rook resource from cluster.yaml
15:03:39 - MainThread - ocs_ci.utility.utils - INFO - Executing command: oc --kubeconfig /home/ocsqe/data/cluster-2019-08-15.2/auth/kubeconfig create -f /home/ocsqe/data/cluster-2019-08-15.2/cluster.yaml -o yaml
15:03:41 - MainThread - ocs_ci.utility.utils - INFO - Executing command: oc -n openshift-storage --kubeconfig /home/ocsqe/data/cluster-2019-08-15.2/auth/kubeconfig get Pod --selector=app=rook-ceph-mon -o yaml
[... identical rook-ceph-mon pod queries repeated every 3-4 seconds, 15:03:44 through 15:04:45 (17 entries elided) ...]
15:04:49 - MainThread - ocs_ci.utility.utils - INFO - Executing command: oc -n openshift-storage --kubeconfig /home/ocsqe/data/cluster-2019-08-15.2/auth/kubeconfig get Pod --selector=app=rook-ceph-mon -o yaml
15:04:50 - MainThread - ocs_ci.utility.utils - INFO - Executing command: oc -n openshift-storage --kubeconfig /home/ocsqe/data/cluster-2019-08-15.2/auth/kubeconfig get Pod --selector=app=rook-ceph-mgr -o yaml
15:04:54 - MainThread - ocs_ci.utility.utils - INFO - Executing command: oc -n openshift-storage --kubeconfig /home/ocsqe/data/cluster-2019-08-15.2/auth/kubeconfig get Pod --selector=app=rook-ceph-mgr -o yaml
15:04:58 - MainThread - ocs_ci.utility.utils - INFO - Executing command: oc -n openshift-storage --kubeconfig /home/ocsqe/data/cluster-2019-08-15.2/auth/kubeconfig get Pod --selector=app=rook-ceph-mgr -o yaml
15:05:01 - MainThread - ocs_ci.utility.utils - INFO - Executing command: oc -n openshift-storage --kubeconfig /home/ocsqe/data/cluster-2019-08-15.2/auth/kubeconfig get Pod --selector=app=rook-ceph-mgr -o yaml
15:05:02 - MainThread - ocs_ci.utility.utils - INFO - Executing command: oc -n openshift-storage --kubeconfig /home/ocsqe/data/cluster-2019-08-15.2/auth/kubeconfig get Pod --selector=app=rook-ceph-osd -o yaml
[... identical rook-ceph-osd pod queries repeated every 3-4 seconds, 15:05:06 through 15:06:16 (20 entries elided) ...]
15:06:20 - MainThread - ocs_ci.utility.utils - INFO - Executing command: oc -n openshift-storage --kubeconfig /home/ocsqe/data/cluster-2019-08-15.2/auth/kubeconfig get Pod --selector=app=rook-ceph-osd -o yaml
15:06:21 - MainThread - ocs_ci.ocs.utils - INFO - Creating rook resource from toolbox.yaml
15:06:21 - MainThread - ocs_ci.utility.utils - INFO - Executing command: oc --kubeconfig /home/ocsqe/data/cluster-2019-08-15.2/auth/kubeconfig create -f /home/ocsqe/data/cluster-2019-08-15.2/toolbox.yaml -o yaml
15:06:21 - MainThread - ocs_ci.deployment.deployment - INFO - Waiting 15 seconds...
15:06:36 - MainThread - ocs_ci.deployment.deployment - INFO - Waiting 15 seconds...
15:06:51 - MainThread - ocs_ci.ocs.resources.ocs - INFO - Adding CephFilesystem with name ocsci-cephfs
15:06:51 - MainThread - ocs_ci.utility.utils - INFO - Executing command: oc -n openshift-storage --kubeconfig /home/ocsqe/data/cluster-2019-08-15.2/auth/kubeconfig create -f /tmp/CephFilesystemvn3yfjqp -o yaml
15:06:52 - MainThread - ocs_ci.utility.utils - INFO - Executing command: oc -n openshift-storage --kubeconfig /home/ocsqe/data/cluster-2019-08-15.2/auth/kubeconfig get CephFilesystem ocsci-cephfs -o yaml
15:06:53 - MainThread - ocs_ci.utility.utils - INFO - Executing command: oc -n openshift-storage --kubeconfig /home/ocsqe/data/cluster-2019-08-15.2/auth/kubeconfig get Pod --selector=app=rook-ceph-mds -o yaml
15:06:56 - MainThread - ocs_ci.utility.utils - INFO - Executing command: oc -n openshift-storage --kubeconfig /home/ocsqe/data/cluster-2019-08-15.2/auth/kubeconfig get Pod --selector=app=rook-ceph-mds -o yaml
15:07:00 - MainThread - ocs_ci.utility.utils - INFO - Executing command: oc -n openshift-storage --kubeconfig /home/ocsqe/data/cluster-2019-08-15.2/auth/kubeconfig get Pod --selector=app=rook-ceph-mds -o yaml
15:07:04 - MainThread - ocs_ci.utility.utils - INFO - Executing command: oc -n openshift-storage --kubeconfig /home/ocsqe/data/cluster-2019-08-15.2/auth/kubeconfig get Pod --selector=app=rook-ceph-mds -o yaml
15:07:07 - MainThread - ocs_ci.utility.utils - INFO - Executing command: oc -n openshift-storage --kubeconfig /home/ocsqe/data/cluster-2019-08-15.2/auth/kubeconfig get Pod --selector=app=rook-ceph-mds -o yaml
15:07:08 - MainThread - ocs_ci.utility.utils - INFO - Executing command: oc -n openshift-storage --kubeconfig /home/ocsqe/data/cluster-2019-08-15.2/auth/kubeconfig get CephFileSystem -o yaml
15:07:09 - MainThread - ocs_ci.utility.utils - INFO - Executing command: oc -n openshift-storage --kubeconfig /home/ocsqe/data/cluster-2019-08-15.2/auth/kubeconfig get Pod --selector=app=rook-ceph-tools -o yaml
15:07:10 - MainThread - ocs_ci.utility.utils - INFO - Executing command: oc -n openshift-storage --kubeconfig /home/ocsqe/data/cluster-2019-08-15.2/auth/kubeconfig get CephFileSystem ocsci-cephfs -o yaml
15:07:10 - MainThread - tests.helpers - INFO - Filesystem ocsci-cephfs was created on the OpenShift side
15:07:10 - MainThread - ocs_ci.utility.utils - INFO - Executing command: oc -n openshift-storage --kubeconfig /home/ocsqe/data/cluster-2019-08-15.2/auth/kubeconfig rsh rook-ceph-tools-5f5dc75fd5-7dsw2 ceph fs ls --format json-pretty
15:07:12 - MainThread - tests.helpers - INFO - FileSystem ocsci-cephfs was created on the Ceph side
15:07:12 - MainThread - ocs_ci.deployment.deployment - INFO - MDS deployment is successful!
15:07:12 - MainThread - ocs_ci.deployment.deployment - INFO - Done creating rook resources, waiting for HEALTH_OK
15:07:12 - MainThread - ocs_ci.utility.utils - INFO - Executing command: oc wait --for condition=ready pod -l app=rook-ceph-tools -n openshift-storage --timeout=120s
15:07:13 - MainThread - ocs_ci.utility.utils - INFO - Executing command: oc -n openshift-storage get pod -l 'app=rook-ceph-tools' -o jsonpath='{.items[0].metadata.name}'
15:07:14 - MainThread - ocs_ci.utility.utils - INFO - Executing command: oc -n openshift-storage exec rook-ceph-tools-5f5dc75fd5-7dsw2 ceph health
15:07:16 - MainThread - ocs_ci.utility.utils - INFO - HEALTH_OK, install successful.
15:07:16 - MainThread - ocs_ci.deployment.deployment - INFO - Patch gp2 storageclass as non-default
15:07:16 - MainThread - ocs_ci.utility.utils - INFO - Executing command: oc patch storageclass gp2 -p '{"metadata": {"annotations":{"storageclass.kubernetes.io/is-default-class":"false"}}}' --request-timeout=120s
-------------------------------- live log call ---------------------------------
15:07:17 - MainThread - ocs_ci.ocs.openshift_ops - INFO - Testing access to cluster with /home/ocsqe/data/cluster-2019-08-15.2/auth/kubeconfig
15:07:17 - MainThread - ocs_ci.utility.utils - INFO - Executing command: oc cluster-info
15:07:17 - MainThread - ocs_ci.ocs.openshift_ops - INFO - Access to cluster is OK!
PASSED [100%]
================== 1 passed, 91 deselected in 2249.11 seconds ==================
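Both deployment runs end the same way: locate the rook-ceph-tools pod by label, run `ceph health` inside it, and declare success on HEALTH_OK. A sketch of that gate, assuming `oc` and an exported KUBECONFIG; `ceph_health_ok` is a hypothetical name, not the exact ocs-ci helper:

```python
# Sketch of the final HEALTH_OK gate seen in both runs above; ceph_health_ok
# is a hypothetical name, not the exact ocs-ci helper.
import subprocess

def ceph_health_ok(namespace="openshift-storage"):
    pod = subprocess.run(
        ["oc", "-n", namespace, "get", "pod", "-l", "app=rook-ceph-tools",
         "-o", "jsonpath={.items[0].metadata.name}"],
        capture_output=True, text=True, check=True,
    ).stdout.strip()
    health = subprocess.run(
        ["oc", "-n", namespace, "exec", pod, "--", "ceph", "health"],
        capture_output=True, text=True, check=True,
    ).stdout.strip()
    return health.startswith("HEALTH_OK")
```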
============================= test session starts ==============================
platform linux -- Python 3.7.4, pytest-5.0.1, py-1.8.0, pluggy-0.12.0
rootdir: /home/ocsqe/projects/ocs-ci-deploy, inifile: pytest.ini, testpaths: tests
plugins: reportportal-1.0.5, logger-0.5.1, metadata-1.8.0, html-1.22.0, marker-bugzilla-0.9.1.dev2
collected 92 items / 91 deselected / 1 selected
tests/ecosystem/deployment/test_ocs_basic_install.py::test_cluster_is_running
-------------------------------- live log setup --------------------------------
16:18:38 - MainThread - tests.conftest - INFO - All logs located at /tmp/ocs-ci-logs-1565878716
16:18:38 - MainThread - ocs_ci.deployment.factory - INFO - Deployment key = aws_ipi
16:18:38 - MainThread - ocs_ci.deployment.factory - INFO - Current deployment platform: AWS, deployment type: ipi
16:18:38 - MainThread - tests.conftest - INFO - Will tear down the cluster because --teardown was provided
16:18:38 - MainThread - ocs_ci.utility.utils - INFO - Executing command: ./bin/oc version
16:18:39 - MainThread - ocs_ci.utility.utils - INFO - OpenShift Client version: Client Version: version.Info{Major:"", Minor:"", GitVersion:"v4.2.0-alpha.0-7-g9df6b7ff", GitCommit:"9df6b7ffc43530158edd582701b6c5d511e8c495", GitTreeState:"clean", BuildDate:"2019-08-14T02:34:10Z", GoVersion:"go1.12.6", Compiler:"gc", Platform:"linux/amd64"}
Server Version: version.Info{Major:"1", Minor:"14+", GitVersion:"v1.14.0+ab184f9", GitCommit:"ab184f9", GitTreeState:"clean", BuildDate:"2019-08-15T06:58:46Z", GoVersion:"go1.12.6", Compiler:"gc", Platform:"linux/amd64"}
OpenShift Version: 4.2.0-0.nightly-2019-08-15-073735
-------------------------------- live log call ---------------------------------
16:18:39 - MainThread - ocs_ci.ocs.openshift_ops - INFO - Testing access to cluster with /home/ocsqe/data/cluster-2019-08-15.2/auth/kubeconfig
16:18:39 - MainThread - ocs_ci.utility.utils - INFO - Executing command: oc cluster-info
16:18:40 - MainThread - ocs_ci.ocs.openshift_ops - INFO - Access to cluster is OK!
PASSED [100%]
------------------------------ live log teardown -------------------------------
16:18:40 - MainThread - ocs_ci.utility.utils - INFO - Executing command: ./bin/openshift-install version
16:18:40 - MainThread - ocs_ci.utility.utils - INFO - OpenShift Installer version: ./bin/openshift-install v4.2.0-201908141819-dirty
built from commit 6463cd57aa458032fa73aab811f2e6de6fa3c6c5
release image registry.svc.ci.openshift.org/ocp/release@sha256:aef2fb3071accdb07f8eb7d7fa90d1814892b73902ee11136023dfd77037398a
16:18:40 - MainThread - ocs_ci.deployment.ocp - INFO - Destroying the cluster
16:18:40 - MainThread - ocs_ci.deployment.ocp - INFO - Destroying cluster defined in /home/ocsqe/data/cluster-2019-08-15.2
16:18:40 - MainThread - ocs_ci.utility.utils - INFO - Executing command: ./bin/openshift-install destroy cluster --dir /home/ocsqe/data/cluster-2019-08-15.2 --log-level INFO
16:21:30 - MainThread - ocs_ci.utility.utils - WARNING - Command warning:: level=info msg=Disassociated arn="arn:aws:ec2:us-east-2:861790564636:route-table/rtb-0bc0a54a26677403d" id=rtbassoc-0b8ee4921b0ae07f7
level=info msg=Deleted arn="arn:aws:ec2:us-east-2:861790564636:route-table/rtb-0bc0a54a26677403d" id=rtb-0bc0a54a26677403d
level=info msg=Deleted arn="arn:aws:ec2:us-east-2:861790564636:natgateway/nat-06e4f2053231bb3d4" id=nat-06e4f2053231bb3d4
level=info msg=Disassociated arn="arn:aws:ec2:us-east-2:861790564636:instance/i-0f3aef44eec2a2eb4" id=i-0f3aef44eec2a2eb4 name=mbukatov-ocsqe-5fxl6-master-profile role=mbukatov-ocsqe-5fxl6-master-role
level=info msg=Deleted InstanceProfileName=mbukatov-ocsqe-5fxl6-master-profile arn="arn:aws:iam::861790564636:instance-profile/mbukatov-ocsqe-5fxl6-master-profile" id=i-0f3aef44eec2a2eb4
level=info msg=Deleted arn="arn:aws:ec2:us-east-2:861790564636:instance/i-0f3aef44eec2a2eb4" id=i-0f3aef44eec2a2eb4
level=info msg=Deleted arn="arn:aws:elasticloadbalancing:us-east-2:861790564636:loadbalancer/net/mbukatov-ocsqe-5fxl6-int/b92ae14b1023b312" id=net/mbukatov-ocsqe-5fxl6-int/b92ae14b1023b312
level=info msg=Deleted arn="arn:aws:elasticloadbalancing:us-east-2:861790564636:loadbalancer/net/mbukatov-ocsqe-5fxl6-ext/4e8da13064452ffd" id=net/mbukatov-ocsqe-5fxl6-ext/4e8da13064452ffd
level=info msg=Deleted arn="arn:aws:elasticloadbalancing:us-east-2:861790564636:loadbalancer/a36311e1ebf5b11e9934c0631d3bfb87" id=a36311e1ebf5b11e9934c0631d3bfb87
level=info msg=Deleted arn="arn:aws:elasticloadbalancing:us-east-2:861790564636:targetgroup/mbukatov-ocsqe-5fxl6-sint/f4cd5c6922cfe888" id=mbukatov-ocsqe-5fxl6-sint/f4cd5c6922cfe888
level=info msg=Deleted arn="arn:aws:ec2:us-east-2:861790564636:image/ami-0d0730ae52b9a9555" id=ami-0d0730ae52b9a9555
level=info msg=Deleted arn="arn:aws:ec2:us-east-2:861790564636:natgateway/nat-0f673e7eebfe807a6" id=nat-0f673e7eebfe807a6
level=info msg=Deleted arn="arn:aws:ec2:us-east-2:861790564636:instance/i-08294b7149559319f" id=i-08294b7149559319f
level=info msg=Disassociated arn="arn:aws:ec2:us-east-2:861790564636:route-table/rtb-07adc2a2394194000" id=rtbassoc-0552d7be62e9e353f
level=info msg=Deleted arn="arn:aws:ec2:us-east-2:861790564636:route-table/rtb-07adc2a2394194000" id=rtb-07adc2a2394194000
level=info msg=Deleted arn="arn:aws:ec2:us-east-2:861790564636:instance/i-0b98e8a77e4432d8f" id=i-0b98e8a77e4432d8f
level=info msg=Deleted arn="arn:aws:s3:::mbukatov-ocsqe-5fxl6-image-registry-us-east-2-geskpqooiuescylc"
level=info msg=Deleted arn="arn:aws:ec2:us-east-2:861790564636:natgateway/nat-087fb7c84f3001b47" id=nat-087fb7c84f3001b47
level=info msg=Deleted NAT gateway=nat-06e4f2053231bb3d4 arn="arn:aws:ec2:us-east-2:861790564636:vpc/vpc-08cc1a5c876ac1206" id=vpc-08cc1a5c876ac1206
level=info msg=Deleted NAT gateway=nat-0f673e7eebfe807a6 arn="arn:aws:ec2:us-east-2:861790564636:vpc/vpc-08cc1a5c876ac1206" id=vpc-08cc1a5c876ac1206
level=info msg=Deleted NAT gateway=nat-087fb7c84f3001b47 arn="arn:aws:ec2:us-east-2:861790564636:vpc/vpc-08cc1a5c876ac1206" id=vpc-08cc1a5c876ac1206
level=info msg=Deleted arn="arn:aws:ec2:us-east-2:861790564636:snapshot/snap-0266484fa2a1927ec" id=snap-0266484fa2a1927ec
level=info msg=Disassociated arn="arn:aws:ec2:us-east-2:861790564636:instance/i-0da008e94cf783eac" id=i-0da008e94cf783eac name=mbukatov-ocsqe-5fxl6-worker-profile role=mbukatov-ocsqe-5fxl6-worker-role
level=info msg=Deleted InstanceProfileName=mbukatov-ocsqe-5fxl6-worker-profile arn="arn:aws:iam::861790564636:instance-profile/mbukatov-ocsqe-5fxl6-worker-profile" id=i-0da008e94cf783eac
level=info msg=Deleted arn="arn:aws:ec2:us-east-2:861790564636:instance/i-0da008e94cf783eac" id=i-0da008e94cf783eac
level=info msg=Deleted arn="arn:aws:ec2:us-east-2:861790564636:instance/i-026a6d4b5ed6f244a" id=i-026a6d4b5ed6f244a
level=info msg=Disassociated arn="arn:aws:ec2:us-east-2:861790564636:route-table/rtb-004138ef93149f293" id=rtbassoc-04ef28135711a68b9
level=info msg=Disassociated arn="arn:aws:ec2:us-east-2:861790564636:route-table/rtb-004138ef93149f293" id=rtbassoc-03dbb1886b2c50c1d
level=info msg=Disassociated arn="arn:aws:ec2:us-east-2:861790564636:route-table/rtb-004138ef93149f293" id=rtbassoc-05a74ff5d2a70ff4c
level=info msg=Deleted arn="arn:aws:ec2:us-east-2:861790564636:instance/i-0af30d5da71413dce" id=i-0af30d5da71413dce
level=info msg=Disassociated arn="arn:aws:ec2:us-east-2:861790564636:route-table/rtb-0b0379b26bf5ebba1" id=rtbassoc-05c15072fe129570b
level=info msg=Deleted arn="arn:aws:ec2:us-east-2:861790564636:route-table/rtb-0b0379b26bf5ebba1" id=rtb-0b0379b26bf5ebba1
level=info msg=Deleted arn="arn:aws:route53:::hostedzone/Z2L7V92QFW91D" id=Z2L7V92QFW91D record set="SRV _etcd-server-ssl._tcp.mbukatov-ocsqe.qe.rh-ocs.com."
level=info msg=Deleted arn="arn:aws:route53:::hostedzone/Z2L7V92QFW91D" id=Z2L7V92QFW91D record set="A api-int.mbukatov-ocsqe.qe.rh-ocs.com."
level=info msg=Deleted arn="arn:aws:route53:::hostedzone/Z2L7V92QFW91D" id=Z2L7V92QFW91D public zone=/hostedzone/ZQ6XFE6BKI2L record set="A api.mbukatov-ocsqe.qe.rh-ocs.com."
level=info msg=Deleted arn="arn:aws:route53:::hostedzone/Z2L7V92QFW91D" id=Z2L7V92QFW91D record set="A api.mbukatov-ocsqe.qe.rh-ocs.com."
level=info msg=Deleted arn="arn:aws:route53:::hostedzone/Z2L7V92QFW91D" id=Z2L7V92QFW91D public zone=/hostedzone/ZQ6XFE6BKI2L record set="A \\052.apps.mbukatov-ocsqe.qe.rh-ocs.com."
level=info msg=Deleted arn="arn:aws:route53:::hostedzone/Z2L7V92QFW91D" id=Z2L7V92QFW91D record set="A \\052.apps.mbukatov-ocsqe.qe.rh-ocs.com."
level=info msg=Deleted arn="arn:aws:route53:::hostedzone/Z2L7V92QFW91D" id=Z2L7V92QFW91D record set="A etcd-0.mbukatov-ocsqe.qe.rh-ocs.com."
level=info msg=Deleted arn="arn:aws:route53:::hostedzone/Z2L7V92QFW91D" id=Z2L7V92QFW91D record set="A etcd-1.mbukatov-ocsqe.qe.rh-ocs.com."
level=info msg=Deleted arn="arn:aws:route53:::hostedzone/Z2L7V92QFW91D" id=Z2L7V92QFW91D record set="A etcd-2.mbukatov-ocsqe.qe.rh-ocs.com."
level=info msg=Deleted arn="arn:aws:route53:::hostedzone/Z2L7V92QFW91D" id=Z2L7V92QFW91D
level=info msg=Deleted arn="arn:aws:iam::861790564636:role/mbukatov-ocsqe-5fxl6-master-role" id=mbukatov-ocsqe-5fxl6-master-role name=mbukatov-ocsqe-5fxl6-master-role policy=mbukatov-ocsqe-5fxl6-master-policy
level=info msg=Deleted arn="arn:aws:iam::861790564636:role/mbukatov-ocsqe-5fxl6-master-role" id=mbukatov-ocsqe-5fxl6-master-role name=mbukatov-ocsqe-5fxl6-master-role
level=info msg=Deleted arn="arn:aws:iam::861790564636:role/mbukatov-ocsqe-5fxl6-worker-role" id=mbukatov-ocsqe-5fxl6-worker-role name=mbukatov-ocsqe-5fxl6-worker-role policy=mbukatov-ocsqe-5fxl6-worker-policy
level=info msg=Deleted arn="arn:aws:iam::861790564636:role/mbukatov-ocsqe-5fxl6-worker-role" id=mbukatov-ocsqe-5fxl6-worker-role name=mbukatov-ocsqe-5fxl6-worker-role
level=info msg=Deleted arn="arn:aws:iam::861790564636:user/mbukatov-ocsqe-5fxl6-cloud-credential-operator-iam-ro-znd8p" id=mbukatov-ocsqe-5fxl6-cloud-credential-operator-iam-ro-znd8p policy=mbukatov-ocsqe-5fxl6-cloud-credential-operator-iam-ro-znd8p-policy
level=info msg=Deleted arn="arn:aws:iam::861790564636:user/mbukatov-ocsqe-5fxl6-cloud-credential-operator-iam-ro-znd8p" id=mbukatov-ocsqe-5fxl6-cloud-credential-operator-iam-ro-znd8p
level=info msg=Deleted arn="arn:aws:iam::861790564636:user/mbukatov-ocsqe-5fxl6-openshift-image-registry-hccss" id=mbukatov-ocsqe-5fxl6-openshift-image-registry-hccss policy=mbukatov-ocsqe-5fxl6-openshift-image-registry-hccss-policy
level=info msg=Deleted arn="arn:aws:iam::861790564636:user/mbukatov-ocsqe-5fxl6-openshift-image-registry-hccss" id=mbukatov-ocsqe-5fxl6-openshift-image-registry-hccss
level=info msg=Deleted arn="arn:aws:iam::861790564636:user/mbukatov-ocsqe-5fxl6-openshift-ingress-tqc9m" id=mbukatov-ocsqe-5fxl6-openshift-ingress-tqc9m policy=mbukatov-ocsqe-5fxl6-openshift-ingress-tqc9m-policy
level=info msg=Deleted arn="arn:aws:iam::861790564636:user/mbukatov-ocsqe-5fxl6-openshift-ingress-tqc9m" id=mbukatov-ocsqe-5fxl6-openshift-ingress-tqc9m
level=info msg=Deleted arn="arn:aws:iam::861790564636:user/mbukatov-ocsqe-5fxl6-openshift-machine-api-aws-mpqdt" id=mbukatov-ocsqe-5fxl6-openshift-machine-api-aws-mpqdt policy=mbukatov-ocsqe-5fxl6-openshift-machine-api-aws-mpqdt-policy
level=info msg=Deleted arn="arn:aws:iam::861790564636:user/mbukatov-ocsqe-5fxl6-openshift-machine-api-aws-mpqdt" id=mbukatov-ocsqe-5fxl6-openshift-machine-api-aws-mpqdt
level=info msg=Deleted arn="arn:aws:ec2:us-east-2:861790564636:internet-gateway/igw-09f5c54ae33f02aeb" id=igw-09f5c54ae33f02aeb
level=info msg=Deleted arn="arn:aws:ec2:us-east-2:861790564636:subnet/subnet-0e2a92a3301cd1565" id=subnet-0e2a92a3301cd1565
level=info msg=Deleted arn="arn:aws:ec2:us-east-2:861790564636:subnet/subnet-0e7b81747fb80e584" id=subnet-0e7b81747fb80e584
level=info msg=Released arn="arn:aws:ec2:us-east-2:861790564636:elastic-ip/eipalloc-0b19f4149cdc86173" id=eipalloc-0b19f4149cdc86173
level=info msg=Deleted arn="arn:aws:elasticloadbalancing:us-east-2:861790564636:targetgroup/mbukatov-ocsqe-5fxl6-aint/625736a935e97191" id=mbukatov-ocsqe-5fxl6-aint/625736a935e97191
level=info msg=Deleted arn="arn:aws:elasticloadbalancing:us-east-2:861790564636:targetgroup/mbukatov-ocsqe-5fxl6-aext/52ee4e95856bba77" id=mbukatov-ocsqe-5fxl6-aext/52ee4e95856bba77
level=info msg=Deleted arn="arn:aws:ec2:us-east-2:861790564636:subnet/subnet-0459fd5a3681e2849" id=subnet-0459fd5a3681e2849
level=info msg=Deleted NAT gateway=nat-06e4f2053231bb3d4 arn="arn:aws:ec2:us-east-2:861790564636:vpc/vpc-08cc1a5c876ac1206" id=vpc-08cc1a5c876ac1206
level=info msg=Deleted NAT gateway=nat-0f673e7eebfe807a6 arn="arn:aws:ec2:us-east-2:861790564636:vpc/vpc-08cc1a5c876ac1206" id=vpc-08cc1a5c876ac1206
level=info msg=Deleted NAT gateway=nat-087fb7c84f3001b47 arn="arn:aws:ec2:us-east-2:861790564636:vpc/vpc-08cc1a5c876ac1206" id=vpc-08cc1a5c876ac1206
level=info msg=Deleted arn="arn:aws:ec2:us-east-2:861790564636:security-group/sg-0a66fd5d18f924d95" id=sg-0a66fd5d18f924d95
level=info msg=Released arn="arn:aws:ec2:us-east-2:861790564636:elastic-ip/eipalloc-02bc4c205645d3e0c" id=eipalloc-02bc4c205645d3e0c
level=info msg=Released arn="arn:aws:ec2:us-east-2:861790564636:elastic-ip/eipalloc-0d7ddaffd54502cec" id=eipalloc-0d7ddaffd54502cec
level=info msg=Deleted NAT gateway=nat-06e4f2053231bb3d4 arn="arn:aws:ec2:us-east-2:861790564636:vpc/vpc-08cc1a5c876ac1206" id=vpc-08cc1a5c876ac1206
level=info msg=Deleted NAT gateway=nat-0f673e7eebfe807a6 arn="arn:aws:ec2:us-east-2:861790564636:vpc/vpc-08cc1a5c876ac1206" id=vpc-08cc1a5c876ac1206
level=info msg=Deleted NAT gateway=nat-087fb7c84f3001b47 arn="arn:aws:ec2:us-east-2:861790564636:vpc/vpc-08cc1a5c876ac1206" id=vpc-08cc1a5c876ac1206
level=info msg=Deleted arn="arn:aws:ec2:us-east-2:861790564636:vpc/vpc-08cc1a5c876ac1206" id=vpc-08cc1a5c876ac1206 network interface=eni-055aa29d62b9b76d2
level=info msg=Deleted arn="arn:aws:ec2:us-east-2:861790564636:volume/vol-0a9e0e50f35879c9b" id=vol-0a9e0e50f35879c9b
level=info msg=Deleted arn="arn:aws:ec2:us-east-2:861790564636:volume/vol-0d6eba69e00a7d0ed" id=vol-0d6eba69e00a7d0ed
level=info msg=Deleted NAT gateway=nat-06e4f2053231bb3d4 arn="arn:aws:ec2:us-east-2:861790564636:vpc/vpc-08cc1a5c876ac1206" id=vpc-08cc1a5c876ac1206
level=info msg=Deleted NAT gateway=nat-0f673e7eebfe807a6 arn="arn:aws:ec2:us-east-2:861790564636:vpc/vpc-08cc1a5c876ac1206" id=vpc-08cc1a5c876ac1206
level=info msg=Deleted NAT gateway=nat-087fb7c84f3001b47 arn="arn:aws:ec2:us-east-2:861790564636:vpc/vpc-08cc1a5c876ac1206" id=vpc-08cc1a5c876ac1206
level=info msg=Deleted arn="arn:aws:ec2:us-east-2:861790564636:vpc/vpc-08cc1a5c876ac1206" id=vpc-08cc1a5c876ac1206 network interface=eni-0f70e6e3a189e7636
level=info msg=Deleted arn="arn:aws:ec2:us-east-2:861790564636:vpc/vpc-08cc1a5c876ac1206" id=vpc-08cc1a5c876ac1206 network interface=eni-0e0e4f6abf0b26a94
level=info msg=Deleted arn="arn:aws:ec2:us-east-2:861790564636:vpc/vpc-08cc1a5c876ac1206" id=vpc-08cc1a5c876ac1206 network interface=eni-03629250ef792ff60
level=info msg=Deleted arn="arn:aws:ec2:us-east-2:861790564636:vpc/vpc-08cc1a5c876ac1206" id=vpc-08cc1a5c876ac1206 table=rtb-03daf851580cb21a9
level=info msg=Deleted VPC endpoint=vpce-0c9a6b0850219115d arn="arn:aws:ec2:us-east-2:861790564636:vpc/vpc-08cc1a5c876ac1206" id=vpc-08cc1a5c876ac1206
level=info msg=Deleted arn="arn:aws:ec2:us-east-2:861790564636:security-group/sg-0cb8d13b566c970f5" id=sg-0cb8d13b566c970f5
level=info msg=Deleted arn="arn:aws:ec2:us-east-2:861790564636:security-group/sg-0b5ead3894e462593" id=sg-0b5ead3894e462593
level=info msg=Deleted arn="arn:aws:ec2:us-east-2:861790564636:subnet/subnet-04e5a58e9d2acfbd4" id=subnet-04e5a58e9d2acfbd4
level=info msg=Deleted arn="arn:aws:ec2:us-east-2:861790564636:subnet/subnet-09e7215b179384e7d" id=subnet-09e7215b179384e7d
level=info msg=Deleted arn="arn:aws:ec2:us-east-2:861790564636:subnet/subnet-00240b2055926470b" id=subnet-00240b2055926470b
level=info msg=Deleted NAT gateway=nat-06e4f2053231bb3d4 arn="arn:aws:ec2:us-east-2:861790564636:vpc/vpc-08cc1a5c876ac1206" id=vpc-08cc1a5c876ac1206
level=info msg=Deleted NAT gateway=nat-0f673e7eebfe807a6 arn="arn:aws:ec2:us-east-2:861790564636:vpc/vpc-08cc1a5c876ac1206" id=vpc-08cc1a5c876ac1206
level=info msg=Deleted NAT gateway=nat-087fb7c84f3001b47 arn="arn:aws:ec2:us-east-2:861790564636:vpc/vpc-08cc1a5c876ac1206" id=vpc-08cc1a5c876ac1206
level=info msg=Deleted arn="arn:aws:ec2:us-east-2:861790564636:vpc/vpc-08cc1a5c876ac1206" id=vpc-08cc1a5c876ac1206
level=info msg=Deleted arn="arn:aws:ec2:us-east-2:861790564636:dhcp-options/dopt-08c1df151e8a58329" id=dopt-08c1df151e8a58329
16:21:30 - MainThread - botocore.credentials - INFO - Found credentials in shared credentials file: ~/.aws/credentials
16:21:32 - MainThread - /home/ocsqe/projects/ocs-ci-deploy/ocs_ci/utility/aws.py - INFO - Deleting volume: vol-07341df5fa6bb34eb
16:21:32 - MainThread - /home/ocsqe/projects/ocs-ci-deploy/ocs_ci/utility/aws.py - INFO - Deleting volume: vol-01ca71a6f86535149
16:21:33 - MainThread - /home/ocsqe/projects/ocs-ci-deploy/ocs_ci/utility/aws.py - INFO - Deleting volume: vol-003b82814ad4e06e7
================== 1 passed, 91 deselected in 175.63 seconds ===================
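
The teardown run above finishes by removing the extra EBS volumes that ocs-ci attached to the workers, since `openshift-install destroy cluster` only cleans up resources the installer itself created. A sketch of that last step with boto3, assuming the leftover volume IDs are already known; delete_volume is a real EC2 API call, the helper name is made up:

# Sketch: remove leftover EBS volumes after cluster destroy;
# volume IDs are placeholders, the function name is illustrative.
import boto3

def delete_leftover_volumes(volume_ids, region="us-east-2"):
    ec2 = boto3.client("ec2", region_name=region)
    for vol_id in volume_ids:
        print(f"Deleting volume: {vol_id}")
        ec2.delete_volume(VolumeId=vol_id)
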
cluster channel: stable-4.2
cluster version: 4.2.0-0.nightly-2019-08-15-073735
cluster image: registry.svc.ci.openshift.org/ocp/release@sha256:aef2fb3071accdb07f8eb7d7fa90d1814892b73902ee11136023dfd77037398a
storage namespace openshift-cluster-storage-operator
image quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:59b67cbba2fa28497240d87da39d1488baf53fa5ca5b67622ac9860ef3463d99
* quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:59b67cbba2fa28497240d87da39d1488baf53fa5ca5b67622ac9860ef3463d99
storage namespace openshift-storage
image quay.io/cephcsi/cephcsi:canary
* quay.io/cephcsi/cephcsi@sha256:3e7e14040782cb143236e103128d3e3c75caf2a356eb412ef8449d9f9c1448ca
image quay.io/k8scsi/csi-node-driver-registrar:v1.1.0
* quay.io/k8scsi/csi-node-driver-registrar@sha256:13daf82fb99e951a4bff8ae5fc7c17c3a8fe7130be6400990d8f6076c32d4599
image quay.io/k8scsi/csi-attacher:v1.1.1
* quay.io/k8scsi/csi-attacher@sha256:e4db94969e1d463807162a1115192ed70d632a61fbeb3bdc97b40fe9ce78c831
image quay.io/k8scsi/csi-provisioner:v1.2.0
* quay.io/k8scsi/csi-provisioner@sha256:0dffe9a8d39c4fdd49c5dd98ca5611a3f9726c012b082946f630e36988ba9f37
image quay.io/k8scsi/csi-snapshotter:v1.1.0
* quay.io/k8scsi/csi-snapshotter@sha256:a49e0da1af6f2bf717e41ba1eee8b5e6a1cbd66a709dd92cc43fe475fe2589eb
image docker.io/ceph/ceph:v14.2.2-20190722
* docker.io/ceph/ceph@sha256:567fe78d90a63ead11deadc2cbf5a912e42bfcc6ef4b1d6154f4b4fea4019052
image docker.io/rook/ceph:master
* docker.io/rook/ceph@sha256:8f369e032c9fe41e296899824d7f68a553c92995c4e945dd71bd4e486e4fa594
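
The image report above pairs each image tag with the digest actually running in the cluster. One way such a mapping could be assembled is from the standard containerStatuses fields of each pod; this is a sketch of that approach, not the exact script used here:

# Sketch: map each running image tag to its resolved digest using
# the standard containerStatuses image/imageID fields.
import json
import subprocess

def image_digests(namespace):
    out = subprocess.run(
        ["oc", "-n", namespace, "get", "pod", "-o", "json"],
        capture_output=True, text=True, check=True).stdout
    digests = {}
    for pod in json.loads(out)["items"]:
        for status in pod.get("status", {}).get("containerStatuses", []):
            digests[status["image"]] = status["imageID"]
    return digests
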
============================= test session starts ==============================
platform linux -- Python 3.7.4, pytest-5.0.1, py-1.8.0, pluggy-0.12.0
rootdir: /home/ocsqe/projects/ocs-ci-deploy, inifile: pytest.ini, testpaths: tests
plugins: reportportal-1.0.5, logger-0.5.1, metadata-1.8.0, html-1.22.0, marker-bugzilla-0.9.1.dev2
collected 92 items / 91 deselected / 1 selected
tests/ecosystem/deployment/test_ocs_basic_install.py::test_cluster_is_running
-------------------------------- live log setup --------------------------------
13:03:21 - MainThread - tests.conftest - INFO - All logs located at /tmp/ocs-ci-logs-1565867000
13:03:21 - MainThread - ocs_ci.deployment.factory - INFO - Deployment key = aws_ipi
13:03:21 - MainThread - ocs_ci.deployment.factory - INFO - Current deployment platform: AWS, deployment type: ipi
13:03:21 - MainThread - ocs_ci.utility.utils - INFO - Downloading openshift client (4.2.0-0.nightly-2019-08-15-073735).
13:03:26 - MainThread - ocs_ci.utility.utils - INFO - Executing command: tar xzvf openshift-client.tar.gz oc kubectl
13:03:27 - MainThread - ocs_ci.utility.utils - INFO - Executing command: ./bin/oc version
13:03:27 - MainThread - ocs_ci.utility.utils - INFO - OpenShift Client version: Client Version: version.Info{Major:"", Minor:"", GitVersion:"v4.2.0-alpha.0-7-g9df6b7ff", GitCommit:"9df6b7ffc43530158edd582701b6c5d511e8c495", GitTreeState:"clean", BuildDate:"2019-08-14T02:34:10Z", GoVersion:"go1.12.6", Compiler:"gc", Platform:"linux/amd64"}
Server Version: version.Info{Major:"1", Minor:"11+", GitVersion:"v1.11.0+d4cacc0", GitCommit:"d4cacc0", GitTreeState:"clean", BuildDate:"2019-05-02T11:52:09Z", GoVersion:"go1.10.8", Compiler:"gc", Platform:"linux/amd64"}
13:03:27 - MainThread - ocs_ci.ocs.openshift_ops - INFO - Testing access to cluster with /home/ocsqe/data/cluster-2019-08-15.1/auth/kubeconfig
13:03:27 - MainThread - ocs_ci.ocs.openshift_ops - WARNING - The kubeconfig file /home/ocsqe/data/cluster-2019-08-15.1/auth/kubeconfig doesn't exist!
13:03:28 - MainThread - ocs_ci.utility.utils - INFO - Downloading openshift installer (4.2.0-0.nightly-2019-08-15-073735).
13:04:01 - MainThread - ocs_ci.utility.utils - INFO - Executing command: tar xzvf openshift-install.tar.gz openshift-install
13:04:02 - MainThread - ocs_ci.utility.utils - INFO - Executing command: ./bin/openshift-install version
13:04:02 - MainThread - ocs_ci.utility.utils - INFO - OpenShift Installer version: ./bin/openshift-install v4.2.0-201908141819-dirty
built from commit 6463cd57aa458032fa73aab811f2e6de6fa3c6c5
release image registry.svc.ci.openshift.org/ocp/release@sha256:aef2fb3071accdb07f8eb7d7fa90d1814892b73902ee11136023dfd77037398a
13:04:02 - MainThread - ocs_ci.deployment.ocp - INFO - A testing cluster will be deployed and cluster information stored at: /home/ocsqe/data/cluster-2019-08-15.1
13:04:02 - MainThread - ocs_ci.deployment.ocp - INFO - Generating install-config
13:04:02 - MainThread - ocs_ci.deployment.ocp - INFO - Install config:
apiVersion: v1
baseDomain: qe.rh-ocs.com
compute:
- name: worker
platform:
aws:
type: m4.large
replicas: 3
controlPlane:
name: master
platform:
aws:
type: m4.xlarge
replicas: 3
metadata:
creationTimestamp: null
name: 'mbukatov-ocsqe'
networking:
clusterNetwork:
- cidr: 10.128.0.0/14
hostPrefix: 23
machineCIDR: 10.0.0.0/16
networkType: OpenShiftSDN
serviceNetwork:
- 172.30.0.0/16
platform:
aws:
region: us-east-2
pullSecret: ''
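
This install-config is generated by ocs-ci and then consumed by `openshift-install create cluster` from the cluster directory. A minimal sketch of rendering such a file with PyYAML, using the values shown in the log; the function and its parameters are illustrative:

# Sketch: write an install-config.yaml like the one logged above;
# yaml.safe_dump is real PyYAML, the helper itself is illustrative.
import os
import yaml

def write_install_config(cluster_dir, name, base_domain, region):
    config = {
        "apiVersion": "v1",
        "baseDomain": base_domain,
        "compute": [{
            "name": "worker",
            "platform": {"aws": {"type": "m4.large"}},
            "replicas": 3,
        }],
        "controlPlane": {
            "name": "master",
            "platform": {"aws": {"type": "m4.xlarge"}},
            "replicas": 3,
        },
        "metadata": {"name": name},
        "platform": {"aws": {"region": region}},
    }
    with open(os.path.join(cluster_dir, "install-config.yaml"), "w") as f:
        yaml.safe_dump(config, f)
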
13:04:02 - MainThread - ocs_ci.deployment.aws - INFO - Deploying OCP cluster
13:04:02 - MainThread - ocs_ci.deployment.aws - INFO - Openshift-installer will be using log level: INFO
13:04:02 - MainThread - ocs_ci.utility.utils - INFO - Executing command: ./bin/openshift-install create cluster --dir /home/ocsqe/data/cluster-2019-08-15.1 --log-level INFO
13:29:45 - MainThread - ocs_ci.utility.utils - WARNING - Command warning: level=info msg="Consuming \"Install Config\" from target directory"
level=info msg="Creating infrastructure resources..."
level=info msg="Waiting up to 30m0s for the Kubernetes API at https://api.mbukatov-ocsqe.qe.rh-ocs.com:6443..."
level=info msg="API v1.14.0+ab184f9 up"
level=info msg="Waiting up to 30m0s for bootstrapping to complete..."
level=info msg="Destroying the bootstrap resources..."
level=info msg="Waiting up to 30m0s for the cluster at https://api.mbukatov-ocsqe.qe.rh-ocs.com:6443 to initialize..."
level=info msg="Waiting up to 10m0s for the openshift-console route to be created..."
level=info msg="Install complete!"
level=info msg="To access the cluster as the system:admin user when using 'oc', run 'export KUBECONFIG=/home/ocsqe/data/cluster-2019-08-15.1/auth/kubeconfig'"
level=info msg="Access the OpenShift web-console here: https://console-openshift-console.apps.mbukatov-ocsqe.qe.rh-ocs.com"
level=info msg="Login to the console with user: kubeadmin, password: FgjtT-aLyyc-m2PXC-E6JUa"
13:29:45 - MainThread - ocs_ci.ocs.openshift_ops - INFO - Testing access to cluster with /home/ocsqe/data/cluster-2019-08-15.1/auth/kubeconfig
13:29:45 - MainThread - ocs_ci.utility.utils - INFO - Executing command: oc cluster-info
13:29:46 - MainThread - ocs_ci.ocs.openshift_ops - INFO - Access to cluster is OK!
13:29:46 - MainThread - ocs_ci.deployment.aws - INFO - Worker pattern: mbukatov-ocsqe-5z4sl-worker*
13:29:46 - MainThread - botocore.credentials - INFO - Found credentials in shared credentials file: ~/.aws/credentials
13:29:47 - MainThread - ocs_ci.deployment.aws - INFO - Creating and attaching 100 GB volume to mbukatov-ocsqe-5z4sl-worker-us-east-2a-lfx9g
13:29:47 - MainThread - ocs_ci.deployment.aws - INFO - Creating and attaching 100 GB volume to mbukatov-ocsqe-5z4sl-worker-us-east-2c-x7sfg
13:29:47 - MainThread - ocs_ci.deployment.aws - INFO - Creating and attaching 100 GB volume to mbukatov-ocsqe-5z4sl-worker-us-east-2b-4n74g
13:29:53 - MainThread - /home/ocsqe/projects/ocs-ci-deploy/ocs_ci/utility/aws.py - INFO - Attaching volume: vol-0e1e0c492d873607d Instance: i-0f72c960c6d935c75
13:30:00 - MainThread - /home/ocsqe/projects/ocs-ci-deploy/ocs_ci/utility/aws.py - INFO - Attaching volume: vol-0c677f55be8e0616a Instance: i-0301bd7cb4bc59286
13:30:07 - MainThread - /home/ocsqe/projects/ocs-ci-deploy/ocs_ci/utility/aws.py - INFO - Attaching volume: vol-0d830df5ba60ff027 Instance: i-0a8c319b093ca5bbd
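
The three 100 GB volumes above are created in each worker's availability zone and then attached over the EC2 API. A sketch of that sequence with boto3; create_volume, the volume_available waiter, and attach_volume are real APIs, while the device name here is an assumption:

# Sketch: create a 100 GB gp2 volume in the worker's availability
# zone and attach it to the instance; device name is an assumption.
import boto3

def create_and_attach_volume(instance_id, zone, size_gb=100,
                             device="/dev/sdb", region="us-east-2"):
    ec2 = boto3.client("ec2", region_name=region)
    vol_id = ec2.create_volume(
        AvailabilityZone=zone, Size=size_gb, VolumeType="gp2",
    )["VolumeId"]
    ec2.get_waiter("volume_available").wait(VolumeIds=[vol_id])
    print(f"Attaching volume: {vol_id} Instance: {instance_id}")
    ec2.attach_volume(VolumeId=vol_id, InstanceId=instance_id,
                      Device=device)
    return vol_id
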
13:30:07 - MainThread - ocs_ci.utility.utils - INFO - Executing command: oc -n openshift-storage --kubeconfig /home/ocsqe/data/cluster-2019-08-15.1/auth/kubeconfig get CephCluster -o yaml
13:30:08 - MainThread - ocs_ci.deployment.deployment - INFO - Running OCS basic installation
13:30:08 - MainThread - ocs_ci.ocs.utils - INFO - Creating rook resource from common.yaml
13:30:08 - MainThread - ocs_ci.utility.utils - INFO - Executing command: oc --kubeconfig /home/ocsqe/data/cluster-2019-08-15.1/auth/kubeconfig create -f /home/ocsqe/data/cluster-2019-08-15.1/common.yaml -o yaml
13:30:17 - MainThread - ocs_ci.utility.utils - INFO - Executing command: oc label namespace openshift-storage "openshift.io/cluster-monitoring=true"
13:30:18 - MainThread - ocs_ci.utility.utils - INFO - Executing command: oc policy add-role-to-user view system:serviceaccount:openshift-monitoring:prometheus-k8s -n openshift-storage
13:30:19 - MainThread - ocs_ci.deployment.deployment - INFO - Waiting 15 seconds...
13:30:34 - MainThread - ocs_ci.ocs.utils - INFO - Creating rook resource from operator-openshift.yaml
13:30:34 - MainThread - ocs_ci.utility.utils - INFO - Executing command: oc --kubeconfig /home/ocsqe/data/cluster-2019-08-15.1/auth/kubeconfig create -f /home/ocsqe/data/cluster-2019-08-15.1/operator-openshift.yaml -o yaml
13:30:35 - MainThread - ocs_ci.deployment.deployment - INFO - Waiting 15 seconds...
13:30:50 - MainThread - ocs_ci.utility.utils - INFO - Executing command: oc wait --for condition=ready pod -l app=rook-ceph-operator -n openshift-storage --timeout=120s
13:31:14 - MainThread - ocs_ci.utility.utils - INFO - Executing command: oc wait --for condition=ready pod -l app=rook-discover -n openshift-storage --timeout=120s
13:31:59 - MainThread - ocs_ci.ocs.utils - INFO - Creating rook resource from cluster.yaml
13:31:59 - MainThread - ocs_ci.utility.utils - INFO - Executing command: oc --kubeconfig /home/ocsqe/data/cluster-2019-08-15.1/auth/kubeconfig create -f /home/ocsqe/data/cluster-2019-08-15.1/cluster.yaml -o yaml
13:32:00 - MainThread - ocs_ci.utility.utils - INFO - Executing command: oc -n openshift-storage --kubeconfig /home/ocsqe/data/cluster-2019-08-15.1/auth/kubeconfig get Pod --selector=app=rook-ceph-mon -o yaml
13:32:04 - MainThread - ocs_ci.utility.utils - INFO - Executing command: oc -n openshift-storage --kubeconfig /home/ocsqe/data/cluster-2019-08-15.1/auth/kubeconfig get Pod --selector=app=rook-ceph-mon -o yaml
13:32:07 - MainThread - ocs_ci.utility.utils - INFO - Executing command: oc -n openshift-storage --kubeconfig /home/ocsqe/data/cluster-2019-08-15.1/auth/kubeconfig get Pod --selector=app=rook-ceph-mon -o yaml
13:32:11 - MainThread - ocs_ci.utility.utils - INFO - Executing command: oc -n openshift-storage --kubeconfig /home/ocsqe/data/cluster-2019-08-15.1/auth/kubeconfig get Pod --selector=app=rook-ceph-mon -o yaml
13:32:15 - MainThread - ocs_ci.utility.utils - INFO - Executing command: oc -n openshift-storage --kubeconfig /home/ocsqe/data/cluster-2019-08-15.1/auth/kubeconfig get Pod --selector=app=rook-ceph-mon -o yaml
13:32:18 - MainThread - ocs_ci.utility.utils - INFO - Executing command: oc -n openshift-storage --kubeconfig /home/ocsqe/data/cluster-2019-08-15.1/auth/kubeconfig get Pod --selector=app=rook-ceph-mon -o yaml
13:32:22 - MainThread - ocs_ci.utility.utils - INFO - Executing command: oc -n openshift-storage --kubeconfig /home/ocsqe/data/cluster-2019-08-15.1/auth/kubeconfig get Pod --selector=app=rook-ceph-mon -o yaml
13:32:26 - MainThread - ocs_ci.utility.utils - INFO - Executing command: oc -n openshift-storage --kubeconfig /home/ocsqe/data/cluster-2019-08-15.1/auth/kubeconfig get Pod --selector=app=rook-ceph-mon -o yaml
13:32:29 - MainThread - ocs_ci.utility.utils - INFO - Executing command: oc -n openshift-storage --kubeconfig /home/ocsqe/data/cluster-2019-08-15.1/auth/kubeconfig get Pod --selector=app=rook-ceph-mon -o yaml
13:32:33 - MainThread - ocs_ci.utility.utils - INFO - Executing command: oc -n openshift-storage --kubeconfig /home/ocsqe/data/cluster-2019-08-15.1/auth/kubeconfig get Pod --selector=app=rook-ceph-mon -o yaml
13:32:37 - MainThread - ocs_ci.utility.utils - INFO - Executing command: oc -n openshift-storage --kubeconfig /home/ocsqe/data/cluster-2019-08-15.1/auth/kubeconfig get Pod --selector=app=rook-ceph-mon -o yaml
13:32:41 - MainThread - ocs_ci.utility.utils - INFO - Executing command: oc -n openshift-storage --kubeconfig /home/ocsqe/data/cluster-2019-08-15.1/auth/kubeconfig get Pod --selector=app=rook-ceph-mon -o yaml
13:32:45 - MainThread - ocs_ci.utility.utils - INFO - Executing command: oc -n openshift-storage --kubeconfig /home/ocsqe/data/cluster-2019-08-15.1/auth/kubeconfig get Pod --selector=app=rook-ceph-mon -o yaml
13:32:49 - MainThread - ocs_ci.utility.utils - INFO - Executing command: oc -n openshift-storage --kubeconfig /home/ocsqe/data/cluster-2019-08-15.1/auth/kubeconfig get Pod --selector=app=rook-ceph-mon -o yaml
13:32:53 - MainThread - ocs_ci.utility.utils - INFO - Executing command: oc -n openshift-storage --kubeconfig /home/ocsqe/data/cluster-2019-08-15.1/auth/kubeconfig get Pod --selector=app=rook-ceph-mon -o yaml
13:32:56 - MainThread - ocs_ci.utility.utils - INFO - Executing command: oc -n openshift-storage --kubeconfig /home/ocsqe/data/cluster-2019-08-15.1/auth/kubeconfig get Pod --selector=app=rook-ceph-mon -o yaml
13:33:00 - MainThread - ocs_ci.utility.utils - INFO - Executing command: oc -n openshift-storage --kubeconfig /home/ocsqe/data/cluster-2019-08-15.1/auth/kubeconfig get Pod --selector=app=rook-ceph-mon -o yaml
13:33:04 - MainThread - ocs_ci.utility.utils - INFO - Executing command: oc -n openshift-storage --kubeconfig /home/ocsqe/data/cluster-2019-08-15.1/auth/kubeconfig get Pod --selector=app=rook-ceph-mon -o yaml
13:33:05 - MainThread - ocs_ci.utility.utils - INFO - Executing command: oc -n openshift-storage --kubeconfig /home/ocsqe/data/cluster-2019-08-15.1/auth/kubeconfig get Pod --selector=app=rook-ceph-mgr -o yaml
13:33:09 - MainThread - ocs_ci.utility.utils - INFO - Executing command: oc -n openshift-storage --kubeconfig /home/ocsqe/data/cluster-2019-08-15.1/auth/kubeconfig get Pod --selector=app=rook-ceph-mgr -o yaml
13:33:12 - MainThread - ocs_ci.utility.utils - INFO - Executing command: oc -n openshift-storage --kubeconfig /home/ocsqe/data/cluster-2019-08-15.1/auth/kubeconfig get Pod --selector=app=rook-ceph-mgr -o yaml
13:33:16 - MainThread - ocs_ci.utility.utils - INFO - Executing command: oc -n openshift-storage --kubeconfig /home/ocsqe/data/cluster-2019-08-15.1/auth/kubeconfig get Pod --selector=app=rook-ceph-mgr -o yaml
13:33:20 - MainThread - ocs_ci.utility.utils - INFO - Executing command: oc -n openshift-storage --kubeconfig /home/ocsqe/data/cluster-2019-08-15.1/auth/kubeconfig get Pod --selector=app=rook-ceph-mgr -o yaml
13:33:20 - MainThread - ocs_ci.utility.utils - INFO - Executing command: oc -n openshift-storage --kubeconfig /home/ocsqe/data/cluster-2019-08-15.1/auth/kubeconfig get Pod --selector=app=rook-ceph-osd -o yaml
13:33:24 - MainThread - ocs_ci.utility.utils - INFO - Executing command: oc -n openshift-storage --kubeconfig /home/ocsqe/data/cluster-2019-08-15.1/auth/kubeconfig get Pod --selector=app=rook-ceph-osd -o yaml
13:33:28 - MainThread - ocs_ci.utility.utils - INFO - Executing command: oc -n openshift-storage --kubeconfig /home/ocsqe/data/cluster-2019-08-15.1/auth/kubeconfig get Pod --selector=app=rook-ceph-osd -o yaml
13:33:31 - MainThread - ocs_ci.utility.utils - INFO - Executing command: oc -n openshift-storage --kubeconfig /home/ocsqe/data/cluster-2019-08-15.1/auth/kubeconfig get Pod --selector=app=rook-ceph-osd -o yaml
13:33:35 - MainThread - ocs_ci.utility.utils - INFO - Executing command: oc -n openshift-storage --kubeconfig /home/ocsqe/data/cluster-2019-08-15.1/auth/kubeconfig get Pod --selector=app=rook-ceph-osd -o yaml
13:33:39 - MainThread - ocs_ci.utility.utils - INFO - Executing command: oc -n openshift-storage --kubeconfig /home/ocsqe/data/cluster-2019-08-15.1/auth/kubeconfig get Pod --selector=app=rook-ceph-osd -o yaml
13:33:42 - MainThread - ocs_ci.utility.utils - INFO - Executing command: oc -n openshift-storage --kubeconfig /home/ocsqe/data/cluster-2019-08-15.1/auth/kubeconfig get Pod --selector=app=rook-ceph-osd -o yaml
13:33:46 - MainThread - ocs_ci.utility.utils - INFO - Executing command: oc -n openshift-storage --kubeconfig /home/ocsqe/data/cluster-2019-08-15.1/auth/kubeconfig get Pod --selector=app=rook-ceph-osd -o yaml
13:33:49 - MainThread - ocs_ci.utility.utils - INFO - Executing command: oc -n openshift-storage --kubeconfig /home/ocsqe/data/cluster-2019-08-15.1/auth/kubeconfig get Pod --selector=app=rook-ceph-osd -o yaml
13:33:53 - MainThread - ocs_ci.utility.utils - INFO - Executing command: oc -n openshift-storage --kubeconfig /home/ocsqe/data/cluster-2019-08-15.1/auth/kubeconfig get Pod --selector=app=rook-ceph-osd -o yaml
13:33:57 - MainThread - ocs_ci.utility.utils - INFO - Executing command: oc -n openshift-storage --kubeconfig /home/ocsqe/data/cluster-2019-08-15.1/auth/kubeconfig get Pod --selector=app=rook-ceph-osd -o yaml
13:34:00 - MainThread - ocs_ci.utility.utils - INFO - Executing command: oc -n openshift-storage --kubeconfig /home/ocsqe/data/cluster-2019-08-15.1/auth/kubeconfig get Pod --selector=app=rook-ceph-osd -o yaml
13:34:04 - MainThread - ocs_ci.utility.utils - INFO - Executing command: oc -n openshift-storage --kubeconfig /home/ocsqe/data/cluster-2019-08-15.1/auth/kubeconfig get Pod --selector=app=rook-ceph-osd -o yaml
13:34:08 - MainThread - ocs_ci.utility.utils - INFO - Executing command: oc -n openshift-storage --kubeconfig /home/ocsqe/data/cluster-2019-08-15.1/auth/kubeconfig get Pod --selector=app=rook-ceph-osd -o yaml
13:34:11 - MainThread - ocs_ci.utility.utils - INFO - Executing command: oc -n openshift-storage --kubeconfig /home/ocsqe/data/cluster-2019-08-15.1/auth/kubeconfig get Pod --selector=app=rook-ceph-osd -o yaml
13:34:15 - MainThread - ocs_ci.utility.utils - INFO - Executing command: oc -n openshift-storage --kubeconfig /home/ocsqe/data/cluster-2019-08-15.1/auth/kubeconfig get Pod --selector=app=rook-ceph-osd -o yaml
13:34:19 - MainThread - ocs_ci.utility.utils - INFO - Executing command: oc -n openshift-storage --kubeconfig /home/ocsqe/data/cluster-2019-08-15.1/auth/kubeconfig get Pod --selector=app=rook-ceph-osd -o yaml
13:34:22 - MainThread - ocs_ci.utility.utils - INFO - Executing command: oc -n openshift-storage --kubeconfig /home/ocsqe/data/cluster-2019-08-15.1/auth/kubeconfig get Pod --selector=app=rook-ceph-osd -o yaml
13:34:26 - MainThread - ocs_ci.utility.utils - INFO - Executing command: oc -n openshift-storage --kubeconfig /home/ocsqe/data/cluster-2019-08-15.1/auth/kubeconfig get Pod --selector=app=rook-ceph-osd -o yaml
13:34:30 - MainThread - ocs_ci.utility.utils - INFO - Executing command: oc -n openshift-storage --kubeconfig /home/ocsqe/data/cluster-2019-08-15.1/auth/kubeconfig get Pod --selector=app=rook-ceph-osd -o yaml
13:34:34 - MainThread - ocs_ci.utility.utils - INFO - Executing command: oc -n openshift-storage --kubeconfig /home/ocsqe/data/cluster-2019-08-15.1/auth/kubeconfig get Pod --selector=app=rook-ceph-osd -o yaml
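
The long run of identical `oc get Pod --selector=...` commands above is a polling loop: ocs-ci re-queries the pods every few seconds until the expected mon/mgr/osd pods reach Running. A condensed sketch of that pattern; the helper name and timeout are illustrative:

# Sketch: poll pods matching a selector until the expected number
# reach the Running phase or the timeout expires.
import json
import subprocess
import time

def wait_for_running_pods(selector, namespace, count, timeout=600):
    deadline = time.time() + timeout
    while time.time() < deadline:
        out = subprocess.run(
            ["oc", "-n", namespace, "get", "pod",
             "--selector", selector, "-o", "json"],
            capture_output=True, text=True, check=True).stdout
        pods = json.loads(out)["items"]
        if sum(p["status"].get("phase") == "Running"
               for p in pods) >= count:
            return True
        time.sleep(3)
    return False
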
13:34:35 - MainThread - ocs_ci.ocs.utils - INFO - Creating rook resource from toolbox.yaml
13:34:35 - MainThread - ocs_ci.utility.utils - INFO - Executing command: oc --kubeconfig /home/ocsqe/data/cluster-2019-08-15.1/auth/kubeconfig create -f /home/ocsqe/data/cluster-2019-08-15.1/toolbox.yaml -o yaml
13:34:35 - MainThread - ocs_ci.deployment.deployment - INFO - Waiting 15 seconds...
13:34:50 - MainThread - ocs_ci.ocs.utils - INFO - Creating rook resource from service-monitor.yaml
13:34:50 - MainThread - ocs_ci.utility.utils - INFO - Executing command: oc --kubeconfig /home/ocsqe/data/cluster-2019-08-15.1/auth/kubeconfig create -f /home/ocsqe/data/cluster-2019-08-15.1/service-monitor.yaml -o yaml
13:34:51 - MainThread - ocs_ci.ocs.utils - INFO - Creating rook resource from prometheus-rules.yaml
13:34:51 - MainThread - ocs_ci.utility.utils - INFO - Executing command: oc --kubeconfig /home/ocsqe/data/cluster-2019-08-15.1/auth/kubeconfig create -f /home/ocsqe/data/cluster-2019-08-15.1/prometheus-rules.yaml -o yaml
13:34:52 - MainThread - ocs_ci.deployment.deployment - INFO - Waiting 15 seconds...
13:35:07 - MainThread - ocs_ci.ocs.resources.ocs - INFO - Adding CephFilesystem with name ocsci-cephfs
13:35:07 - MainThread - ocs_ci.utility.utils - INFO - Executing command: oc -n openshift-storage --kubeconfig /home/ocsqe/data/cluster-2019-08-15.1/auth/kubeconfig create -f /tmp/CephFilesystemmlkneg_x -o yaml
13:35:07 - MainThread - ocs_ci.utility.utils - INFO - Executing command: oc -n openshift-storage --kubeconfig /home/ocsqe/data/cluster-2019-08-15.1/auth/kubeconfig get CephFilesystem ocsci-cephfs -o yaml
13:35:08 - MainThread - ocs_ci.utility.utils - INFO - Executing command: oc -n openshift-storage --kubeconfig /home/ocsqe/data/cluster-2019-08-15.1/auth/kubeconfig get Pod --selector=app=rook-ceph-mds -o yaml
13:35:12 - MainThread - ocs_ci.utility.utils - INFO - Executing command: oc -n openshift-storage --kubeconfig /home/ocsqe/data/cluster-2019-08-15.1/auth/kubeconfig get Pod --selector=app=rook-ceph-mds -o yaml
13:35:15 - MainThread - ocs_ci.utility.utils - INFO - Executing command: oc -n openshift-storage --kubeconfig /home/ocsqe/data/cluster-2019-08-15.1/auth/kubeconfig get Pod --selector=app=rook-ceph-mds -o yaml
13:35:19 - MainThread - ocs_ci.utility.utils - INFO - Executing command: oc -n openshift-storage --kubeconfig /home/ocsqe/data/cluster-2019-08-15.1/auth/kubeconfig get Pod --selector=app=rook-ceph-mds -o yaml
13:35:23 - MainThread - ocs_ci.utility.utils - INFO - Executing command: oc -n openshift-storage --kubeconfig /home/ocsqe/data/cluster-2019-08-15.1/auth/kubeconfig get Pod --selector=app=rook-ceph-mds -o yaml
13:35:26 - MainThread - ocs_ci.utility.utils - INFO - Executing command: oc -n openshift-storage --kubeconfig /home/ocsqe/data/cluster-2019-08-15.1/auth/kubeconfig get Pod --selector=app=rook-ceph-mds -o yaml
13:35:27 - MainThread - ocs_ci.utility.utils - INFO - Executing command: oc -n openshift-storage --kubeconfig /home/ocsqe/data/cluster-2019-08-15.1/auth/kubeconfig get CephFileSystem -o yaml
13:35:28 - MainThread - ocs_ci.utility.utils - INFO - Executing command: oc -n openshift-storage --kubeconfig /home/ocsqe/data/cluster-2019-08-15.1/auth/kubeconfig get Pod --selector=app=rook-ceph-tools -o yaml
13:35:29 - MainThread - ocs_ci.utility.utils - INFO - Executing command: oc -n openshift-storage --kubeconfig /home/ocsqe/data/cluster-2019-08-15.1/auth/kubeconfig get CephFileSystem ocsci-cephfs -o yaml
13:35:29 - MainThread - tests.helpers - INFO - Filesystem ocsci-cephfs was created on the OpenShift side
13:35:29 - MainThread - ocs_ci.utility.utils - INFO - Executing command: oc -n openshift-storage --kubeconfig /home/ocsqe/data/cluster-2019-08-15.1/auth/kubeconfig rsh rook-ceph-tools-5f5dc75fd5-tvprt ceph fs ls --format json-pretty
13:35:31 - MainThread - tests.helpers - INFO - Filesystem ocsci-cephfs was created on the Ceph side
13:35:31 - MainThread - ocs_ci.deployment.deployment - INFO - MDS deployment is successful!
13:35:31 - MainThread - ocs_ci.deployment.deployment - INFO - Done creating rook resources, waiting for HEALTH_OK
13:35:31 - MainThread - ocs_ci.utility.utils - INFO - Executing command: oc wait --for condition=ready pod -l app=rook-ceph-tools -n openshift-storage --timeout=120s
13:35:32 - MainThread - ocs_ci.utility.utils - INFO - Executing command: oc -n openshift-storage get pod -l 'app=rook-ceph-tools' -o jsonpath='{.items[0].metadata.name}'
13:35:33 - MainThread - ocs_ci.utility.utils - INFO - Executing command: oc -n openshift-storage exec rook-ceph-tools-5f5dc75fd5-tvprt ceph health
13:35:35 - MainThread - ocs_ci.utility.utils - INFO - HEALTH_OK, install successful.
13:35:35 - MainThread - ocs_ci.deployment.deployment - INFO - Patch gp2 storageclass as non-default
13:35:35 - MainThread - ocs_ci.utility.utils - INFO - Executing command: oc patch storageclass gp2 -p '{"metadata": {"annotations":{"storageclass.kubernetes.io/is-default-class":"false"}}}' --request-timeout=120s
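
Setting `storageclass.kubernetes.io/is-default-class` to "false" on gp2, as the patch above does, stops new PVCs from silently defaulting to plain AWS gp2 storage. A sketch of verifying the result by reading the annotation back; escaping the dots inside the annotation key is standard jsonpath behavior:

# Sketch: read back gp2's is-default-class annotation via oc's
# jsonpath output; dots in the key are escaped per jsonpath rules.
import subprocess

def gp2_is_default():
    out = subprocess.run(
        ["oc", "get", "storageclass", "gp2", "-o",
         "jsonpath={.metadata.annotations."
         "storageclass\\.kubernetes\\.io/is-default-class}"],
        capture_output=True, text=True, check=True).stdout
    return out.strip() == "true"
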
-------------------------------- live log call ---------------------------------
13:35:35 - MainThread - ocs_ci.ocs.openshift_ops - INFO - Testing access to cluster with /home/ocsqe/data/cluster-2019-08-15.1/auth/kubeconfig
13:35:36 - MainThread - ocs_ci.utility.utils - INFO - Executing command: oc cluster-info
13:35:36 - MainThread - ocs_ci.ocs.openshift_ops - INFO - Access to cluster is OK!
PASSED [100%]
================== 1 passed, 91 deselected in 1935.31 seconds ==================
============================= test session starts ==============================
platform linux -- Python 3.7.4, pytest-5.0.1, py-1.8.0, pluggy-0.12.0
rootdir: /home/ocsqe/projects/ocs-ci-deploy, inifile: pytest.ini, testpaths: tests
plugins: reportportal-1.0.5, logger-0.5.1, metadata-1.8.0, html-1.22.0, marker-bugzilla-0.9.1.dev2
collected 92 items / 91 deselected / 1 selected
tests/ecosystem/deployment/test_ocs_basic_install.py::test_cluster_is_running
-------------------------------- live log setup --------------------------------
14:22:41 - MainThread - tests.conftest - INFO - All logs located at /tmp/ocs-ci-logs-1565871759
14:22:41 - MainThread - ocs_ci.deployment.factory - INFO - Deployment key = aws_ipi
14:22:41 - MainThread - ocs_ci.deployment.factory - INFO - Current deployment platform: AWS, deployment type: ipi
14:22:41 - MainThread - tests.conftest - INFO - Will tear down the cluster because --teardown was provided
14:22:41 - MainThread - ocs_ci.utility.utils - INFO - Executing command: ./bin/oc version
14:22:42 - MainThread - ocs_ci.utility.utils - INFO - OpenShift Client version: Client Version: version.Info{Major:"", Minor:"", GitVersion:"v4.2.0-alpha.0-7-g9df6b7ff", GitCommit:"9df6b7ffc43530158edd582701b6c5d511e8c495", GitTreeState:"clean", BuildDate:"2019-08-14T02:34:10Z", GoVersion:"go1.12.6", Compiler:"gc", Platform:"linux/amd64"}
Server Version: version.Info{Major:"1", Minor:"14+", GitVersion:"v1.14.0+ab184f9", GitCommit:"ab184f9", GitTreeState:"clean", BuildDate:"2019-08-15T06:58:46Z", GoVersion:"go1.12.6", Compiler:"gc", Platform:"linux/amd64"}
OpenShift Version: 4.2.0-0.nightly-2019-08-15-073735
-------------------------------- live log call ---------------------------------
14:22:42 - MainThread - ocs_ci.ocs.openshift_ops - INFO - Testing access to cluster with /home/ocsqe/data/cluster-2019-08-15.1/auth/kubeconfig
14:22:42 - MainThread - ocs_ci.utility.utils - INFO - Executing command: oc cluster-info
14:22:43 - MainThread - ocs_ci.ocs.openshift_ops - INFO - Access to cluster is OK!
PASSED [100%]
------------------------------ live log teardown -------------------------------
14:22:43 - MainThread - ocs_ci.utility.utils - INFO - Executing command: ./bin/openshift-install version
14:22:43 - MainThread - ocs_ci.utility.utils - INFO - OpenShift Installer version: ./bin/openshift-install v4.2.0-201908141819-dirty
built from commit 6463cd57aa458032fa73aab811f2e6de6fa3c6c5
release image registry.svc.ci.openshift.org/ocp/release@sha256:aef2fb3071accdb07f8eb7d7fa90d1814892b73902ee11136023dfd77037398a
14:22:43 - MainThread - ocs_ci.deployment.ocp - INFO - Destroying the cluster
14:22:43 - MainThread - ocs_ci.deployment.ocp - INFO - Destroying cluster defined in /home/ocsqe/data/cluster-2019-08-15.1
14:22:43 - MainThread - ocs_ci.utility.utils - INFO - Executing command: ./bin/openshift-install destroy cluster --dir /home/ocsqe/data/cluster-2019-08-15.1 --log-level INFO
14:25:30 - MainThread - ocs_ci.utility.utils - WARNING - Command warning: level=info msg=Disassociated arn="arn:aws:ec2:us-east-2:861790564636:route-table/rtb-025c5c0c1f137f0d2" id=rtbassoc-045c909d5c63b09d7
level=info msg=Disassociated arn="arn:aws:ec2:us-east-2:861790564636:route-table/rtb-025c5c0c1f137f0d2" id=rtbassoc-036987e4661936320
level=info msg=Disassociated arn="arn:aws:ec2:us-east-2:861790564636:route-table/rtb-025c5c0c1f137f0d2" id=rtbassoc-0f186835302d913bd
level=info msg=Deleted arn="arn:aws:ec2:us-east-2:861790564636:natgateway/nat-06902abd55233b4dd" id=nat-06902abd55233b4dd
level=info msg=Deleted arn="arn:aws:elasticloadbalancing:us-east-2:861790564636:loadbalancer/net/mbukatov-ocsqe-5z4sl-int/8a1f7091c3b41a00" id=net/mbukatov-ocsqe-5z4sl-int/8a1f7091c3b41a00
level=info msg=Deleted arn="arn:aws:ec2:us-east-2:861790564636:image/ami-031a9e060dd25431d" id=ami-031a9e060dd25431d
level=info msg=Disassociated arn="arn:aws:ec2:us-east-2:861790564636:instance/i-0ecc8e910e2a30256" id=i-0ecc8e910e2a30256 name=mbukatov-ocsqe-5z4sl-master-profile role=mbukatov-ocsqe-5z4sl-master-role
level=info msg=Deleted InstanceProfileName=mbukatov-ocsqe-5z4sl-master-profile arn="arn:aws:iam::861790564636:instance-profile/mbukatov-ocsqe-5z4sl-master-profile" id=i-0ecc8e910e2a30256
level=info msg=Deleted arn="arn:aws:ec2:us-east-2:861790564636:instance/i-0ecc8e910e2a30256" id=i-0ecc8e910e2a30256
level=info msg=Disassociated arn="arn:aws:ec2:us-east-2:861790564636:route-table/rtb-091ceed7c2ebb1513" id=rtbassoc-001fbdeb2ab5d0124
level=info msg=Deleted arn="arn:aws:ec2:us-east-2:861790564636:route-table/rtb-091ceed7c2ebb1513" id=rtb-091ceed7c2ebb1513
level=info msg=Deleted arn="arn:aws:elasticloadbalancing:us-east-2:861790564636:loadbalancer/net/mbukatov-ocsqe-5z4sl-ext/ddbc9f00b6ca309f" id=net/mbukatov-ocsqe-5z4sl-ext/ddbc9f00b6ca309f
level=info msg=Deleted arn="arn:aws:ec2:us-east-2:861790564636:instance/i-08985d7451c649e81" id=i-08985d7451c649e81
level=info msg=Deleted arn="arn:aws:ec2:us-east-2:861790564636:vpc/vpc-0c5a179df7b273e57" classic load balancer=aee0155f1bf4e11e9a900026f5c12513 id=vpc-0c5a179df7b273e57
level=info msg=Deleted NAT gateway=nat-06902abd55233b4dd arn="arn:aws:ec2:us-east-2:861790564636:vpc/vpc-0c5a179df7b273e57" id=vpc-0c5a179df7b273e57
level=info msg=Deleted NAT gateway=nat-00df4d111acd0b829 arn="arn:aws:ec2:us-east-2:861790564636:vpc/vpc-0c5a179df7b273e57" id=vpc-0c5a179df7b273e57
level=info msg=Deleted NAT gateway=nat-081142b09abf980e6 arn="arn:aws:ec2:us-east-2:861790564636:vpc/vpc-0c5a179df7b273e57" id=vpc-0c5a179df7b273e57
level=info msg=Deleted arn="arn:aws:s3:::mbukatov-ocsqe-5z4sl-image-registry-us-east-2-ykdiiiggsckmcfiu"
level=info msg=Disassociated arn="arn:aws:ec2:us-east-2:861790564636:route-table/rtb-0cb55d801845d2d99" id=rtbassoc-0da05b0493bc69f73
level=info msg=Deleted arn="arn:aws:ec2:us-east-2:861790564636:route-table/rtb-0cb55d801845d2d99" id=rtb-0cb55d801845d2d99
level=info msg=Deleted arn="arn:aws:ec2:us-east-2:861790564636:snapshot/snap-00af7ca8196c9ba4c" id=snap-00af7ca8196c9ba4c
level=info msg=Disassociated arn="arn:aws:ec2:us-east-2:861790564636:instance/i-0301bd7cb4bc59286" id=i-0301bd7cb4bc59286 name=mbukatov-ocsqe-5z4sl-worker-profile role=mbukatov-ocsqe-5z4sl-worker-role
level=info msg=Deleted InstanceProfileName=mbukatov-ocsqe-5z4sl-worker-profile arn="arn:aws:iam::861790564636:instance-profile/mbukatov-ocsqe-5z4sl-worker-profile" id=i-0301bd7cb4bc59286
level=info msg=Deleted arn="arn:aws:ec2:us-east-2:861790564636:instance/i-0301bd7cb4bc59286" id=i-0301bd7cb4bc59286
level=info msg=Deleted arn="arn:aws:ec2:us-east-2:861790564636:natgateway/nat-081142b09abf980e6" id=nat-081142b09abf980e6
level=info msg=Disassociated arn="arn:aws:ec2:us-east-2:861790564636:route-table/rtb-046c701defef22b44" id=rtbassoc-0c51a882ee5950533
level=info msg=Deleted arn="arn:aws:ec2:us-east-2:861790564636:route-table/rtb-046c701defef22b44" id=rtb-046c701defef22b44
level=info msg=Deleted arn="arn:aws:ec2:us-east-2:861790564636:instance/i-0f72c960c6d935c75" id=i-0f72c960c6d935c75
level=info msg=Deleted arn="arn:aws:ec2:us-east-2:861790564636:natgateway/nat-00df4d111acd0b829" id=nat-00df4d111acd0b829
level=info msg=Deleted arn="arn:aws:ec2:us-east-2:861790564636:instance/i-0cf57a7b6e5efb68f" id=i-0cf57a7b6e5efb68f
level=info msg=Deleted arn="arn:aws:ec2:us-east-2:861790564636:instance/i-0a8c319b093ca5bbd" id=i-0a8c319b093ca5bbd
level=info msg=Deleted arn="arn:aws:route53:::hostedzone/Z1S5D9BR65WBQ3" id=Z1S5D9BR65WBQ3 record set="SRV _etcd-server-ssl._tcp.mbukatov-ocsqe.qe.rh-ocs.com."
level=info msg=Deleted arn="arn:aws:route53:::hostedzone/Z1S5D9BR65WBQ3" id=Z1S5D9BR65WBQ3 record set="A api-int.mbukatov-ocsqe.qe.rh-ocs.com."
level=info msg=Deleted arn="arn:aws:route53:::hostedzone/Z1S5D9BR65WBQ3" id=Z1S5D9BR65WBQ3 public zone=/hostedzone/ZQ6XFE6BKI2L record set="A api.mbukatov-ocsqe.qe.rh-ocs.com."
level=info msg=Deleted arn="arn:aws:route53:::hostedzone/Z1S5D9BR65WBQ3" id=Z1S5D9BR65WBQ3 record set="A api.mbukatov-ocsqe.qe.rh-ocs.com."
level=info msg=Deleted arn="arn:aws:route53:::hostedzone/Z1S5D9BR65WBQ3" id=Z1S5D9BR65WBQ3 public zone=/hostedzone/ZQ6XFE6BKI2L record set="A \\052.apps.mbukatov-ocsqe.qe.rh-ocs.com."
level=info msg=Deleted arn="arn:aws:route53:::hostedzone/Z1S5D9BR65WBQ3" id=Z1S5D9BR65WBQ3 record set="A \\052.apps.mbukatov-ocsqe.qe.rh-ocs.com."
level=info msg=Deleted arn="arn:aws:route53:::hostedzone/Z1S5D9BR65WBQ3" id=Z1S5D9BR65WBQ3 record set="A etcd-0.mbukatov-ocsqe.qe.rh-ocs.com."
level=info msg=Deleted arn="arn:aws:route53:::hostedzone/Z1S5D9BR65WBQ3" id=Z1S5D9BR65WBQ3 record set="A etcd-1.mbukatov-ocsqe.qe.rh-ocs.com."
level=info msg=Deleted arn="arn:aws:route53:::hostedzone/Z1S5D9BR65WBQ3" id=Z1S5D9BR65WBQ3 record set="A etcd-2.mbukatov-ocsqe.qe.rh-ocs.com."
level=info msg=Deleted arn="arn:aws:route53:::hostedzone/Z1S5D9BR65WBQ3" id=Z1S5D9BR65WBQ3
level=info msg=Deleted arn="arn:aws:iam::861790564636:role/mbukatov-ocsqe-5z4sl-master-role" id=mbukatov-ocsqe-5z4sl-master-role name=mbukatov-ocsqe-5z4sl-master-role policy=mbukatov-ocsqe-5z4sl-master-policy
level=info msg=Deleted arn="arn:aws:iam::861790564636:role/mbukatov-ocsqe-5z4sl-master-role" id=mbukatov-ocsqe-5z4sl-master-role name=mbukatov-ocsqe-5z4sl-master-role
level=info msg=Deleted arn="arn:aws:iam::861790564636:role/mbukatov-ocsqe-5z4sl-worker-role" id=mbukatov-ocsqe-5z4sl-worker-role name=mbukatov-ocsqe-5z4sl-worker-role policy=mbukatov-ocsqe-5z4sl-worker-policy
level=info msg=Deleted arn="arn:aws:iam::861790564636:role/mbukatov-ocsqe-5z4sl-worker-role" id=mbukatov-ocsqe-5z4sl-worker-role name=mbukatov-ocsqe-5z4sl-worker-role
level=info msg=Deleted arn="arn:aws:iam::861790564636:user/mbukatov-ocsqe-5z4sl-cloud-credential-operator-iam-ro-svd5x" id=mbukatov-ocsqe-5z4sl-cloud-credential-operator-iam-ro-svd5x policy=mbukatov-ocsqe-5z4sl-cloud-credential-operator-iam-ro-svd5x-policy
level=info msg=Deleted arn="arn:aws:iam::861790564636:user/mbukatov-ocsqe-5z4sl-cloud-credential-operator-iam-ro-svd5x" id=mbukatov-ocsqe-5z4sl-cloud-credential-operator-iam-ro-svd5x
level=info msg=Deleted arn="arn:aws:iam::861790564636:user/mbukatov-ocsqe-5z4sl-openshift-image-registry-7xrgc" id=mbukatov-ocsqe-5z4sl-openshift-image-registry-7xrgc policy=mbukatov-ocsqe-5z4sl-openshift-image-registry-7xrgc-policy
level=info msg=Deleted arn="arn:aws:iam::861790564636:user/mbukatov-ocsqe-5z4sl-openshift-image-registry-7xrgc" id=mbukatov-ocsqe-5z4sl-openshift-image-registry-7xrgc
level=info msg=Deleted arn="arn:aws:iam::861790564636:user/mbukatov-ocsqe-5z4sl-openshift-ingress-6l9vq" id=mbukatov-ocsqe-5z4sl-openshift-ingress-6l9vq policy=mbukatov-ocsqe-5z4sl-openshift-ingress-6l9vq-policy
level=info msg=Deleted arn="arn:aws:iam::861790564636:user/mbukatov-ocsqe-5z4sl-openshift-ingress-6l9vq" id=mbukatov-ocsqe-5z4sl-openshift-ingress-6l9vq
level=info msg=Deleted arn="arn:aws:iam::861790564636:user/mbukatov-ocsqe-5z4sl-openshift-machine-api-aws-ckff2" id=mbukatov-ocsqe-5z4sl-openshift-machine-api-aws-ckff2 policy=mbukatov-ocsqe-5z4sl-openshift-machine-api-aws-ckff2-policy
level=info msg=Deleted arn="arn:aws:iam::861790564636:user/mbukatov-ocsqe-5z4sl-openshift-machine-api-aws-ckff2" id=mbukatov-ocsqe-5z4sl-openshift-machine-api-aws-ckff2
level=info msg=Deleted arn="arn:aws:elasticloadbalancing:us-east-2:861790564636:targetgroup/mbukatov-ocsqe-5z4sl-sint/469dc2e79b10b941" id=mbukatov-ocsqe-5z4sl-sint/469dc2e79b10b941
level=info msg=Deleted arn="arn:aws:elasticloadbalancing:us-east-2:861790564636:targetgroup/mbukatov-ocsqe-5z4sl-aint/2bd3d5520752dda8" id=mbukatov-ocsqe-5z4sl-aint/2bd3d5520752dda8
level=info msg=Deleted arn="arn:aws:ec2:us-east-2:861790564636:subnet/subnet-0559c5045cfe87c09" id=subnet-0559c5045cfe87c09
level=info msg=Released arn="arn:aws:ec2:us-east-2:861790564636:elastic-ip/eipalloc-055e1bf147e889c8f" id=eipalloc-055e1bf147e889c8f
level=info msg=Deleted arn="arn:aws:elasticloadbalancing:us-east-2:861790564636:targetgroup/mbukatov-ocsqe-5z4sl-aext/e8efd37d299de2c7" id=mbukatov-ocsqe-5z4sl-aext/e8efd37d299de2c7
level=info msg=Deleted arn="arn:aws:ec2:us-east-2:861790564636:subnet/subnet-0d3e3806182fe4b60" id=subnet-0d3e3806182fe4b60
level=info msg=Deleted arn="arn:aws:ec2:us-east-2:861790564636:subnet/subnet-0e43527b9c7c37e61" id=subnet-0e43527b9c7c37e61
level=info msg=Released arn="arn:aws:ec2:us-east-2:861790564636:elastic-ip/eipalloc-01a4e5724cc842641" id=eipalloc-01a4e5724cc842641
level=info msg=Released arn="arn:aws:ec2:us-east-2:861790564636:elastic-ip/eipalloc-0df61f20579ac1b52" id=eipalloc-0df61f20579ac1b52
level=info msg=Deleted NAT gateway=nat-06902abd55233b4dd arn="arn:aws:ec2:us-east-2:861790564636:vpc/vpc-0c5a179df7b273e57" id=vpc-0c5a179df7b273e57
level=info msg=Deleted NAT gateway=nat-00df4d111acd0b829 arn="arn:aws:ec2:us-east-2:861790564636:vpc/vpc-0c5a179df7b273e57" id=vpc-0c5a179df7b273e57
level=info msg=Deleted NAT gateway=nat-081142b09abf980e6 arn="arn:aws:ec2:us-east-2:861790564636:vpc/vpc-0c5a179df7b273e57" id=vpc-0c5a179df7b273e57
level=info msg=Deleted arn="arn:aws:ec2:us-east-2:861790564636:internet-gateway/igw-0dd38577a3c24134c" id=igw-0dd38577a3c24134c
level=info msg=Deleted arn="arn:aws:ec2:us-east-2:861790564636:security-group/sg-017f3595aab0b9926" id=sg-017f3595aab0b9926
level=info msg=Deleted NAT gateway=nat-06902abd55233b4dd arn="arn:aws:ec2:us-east-2:861790564636:vpc/vpc-0c5a179df7b273e57" id=vpc-0c5a179df7b273e57
level=info msg=Deleted NAT gateway=nat-00df4d111acd0b829 arn="arn:aws:ec2:us-east-2:861790564636:vpc/vpc-0c5a179df7b273e57" id=vpc-0c5a179df7b273e57
level=info msg=Deleted NAT gateway=nat-081142b09abf980e6 arn="arn:aws:ec2:us-east-2:861790564636:vpc/vpc-0c5a179df7b273e57" id=vpc-0c5a179df7b273e57
level=info msg=Deleted arn="arn:aws:ec2:us-east-2:861790564636:volume/vol-04e15980f2fb6a19b" id=vol-04e15980f2fb6a19b
level=info msg=Deleted NAT gateway=nat-06902abd55233b4dd arn="arn:aws:ec2:us-east-2:861790564636:vpc/vpc-0c5a179df7b273e57" id=vpc-0c5a179df7b273e57
level=info msg=Deleted NAT gateway=nat-00df4d111acd0b829 arn="arn:aws:ec2:us-east-2:861790564636:vpc/vpc-0c5a179df7b273e57" id=vpc-0c5a179df7b273e57
level=info msg=Deleted NAT gateway=nat-081142b09abf980e6 arn="arn:aws:ec2:us-east-2:861790564636:vpc/vpc-0c5a179df7b273e57" id=vpc-0c5a179df7b273e57
level=info msg=Deleted arn="arn:aws:ec2:us-east-2:861790564636:vpc/vpc-0c5a179df7b273e57" id=vpc-0c5a179df7b273e57 network interface=eni-04b9b016bac5fa4f8
level=info msg=Deleted arn="arn:aws:ec2:us-east-2:861790564636:vpc/vpc-0c5a179df7b273e57" id=vpc-0c5a179df7b273e57 network interface=eni-095bac1ce491d8798
level=info msg=Deleted arn="arn:aws:ec2:us-east-2:861790564636:vpc/vpc-0c5a179df7b273e57" id=vpc-0c5a179df7b273e57 network interface=eni-0a86db4c3cc331c4d
level=info msg=Deleted arn="arn:aws:ec2:us-east-2:861790564636:vpc/vpc-0c5a179df7b273e57" id=vpc-0c5a179df7b273e57 table=rtb-0ab9f51e6380b50a7
level=info msg=Deleted VPC endpoint=vpce-09dc3e83c5e64d51f arn="arn:aws:ec2:us-east-2:861790564636:vpc/vpc-0c5a179df7b273e57" id=vpc-0c5a179df7b273e57
level=info msg=Deleted arn="arn:aws:ec2:us-east-2:861790564636:security-group/sg-01af77ecfa254b351" id=sg-01af77ecfa254b351
level=info msg=Deleted arn="arn:aws:ec2:us-east-2:861790564636:subnet/subnet-027a8d44c58974fde" id=subnet-027a8d44c58974fde
level=info msg=Deleted arn="arn:aws:ec2:us-east-2:861790564636:subnet/subnet-09bf5a82ae25b978e" id=subnet-09bf5a82ae25b978e
level=info msg=Deleted arn="arn:aws:ec2:us-east-2:861790564636:subnet/subnet-0d0d14c7fc9e398b7" id=subnet-0d0d14c7fc9e398b7
level=info msg=Deleted arn="arn:aws:ec2:us-east-2:861790564636:security-group/sg-08a089d0d3306f8de" id=sg-08a089d0d3306f8de
level=info msg=Deleted NAT gateway=nat-06902abd55233b4dd arn="arn:aws:ec2:us-east-2:861790564636:vpc/vpc-0c5a179df7b273e57" id=vpc-0c5a179df7b273e57
level=info msg=Deleted NAT gateway=nat-00df4d111acd0b829 arn="arn:aws:ec2:us-east-2:861790564636:vpc/vpc-0c5a179df7b273e57" id=vpc-0c5a179df7b273e57
level=info msg=Deleted NAT gateway=nat-081142b09abf980e6 arn="arn:aws:ec2:us-east-2:861790564636:vpc/vpc-0c5a179df7b273e57" id=vpc-0c5a179df7b273e57
level=info msg=Deleted arn="arn:aws:ec2:us-east-2:861790564636:vpc/vpc-0c5a179df7b273e57" id=vpc-0c5a179df7b273e57
level=info msg=Deleted arn="arn:aws:ec2:us-east-2:861790564636:dhcp-options/dopt-00d1f92333ae2fcb8" id=dopt-00d1f92333ae2fcb8
14:25:30 - MainThread - botocore.credentials - INFO - Found credentials in shared credentials file: ~/.aws/credentials
14:25:31 - MainThread - /home/ocsqe/projects/ocs-ci-deploy/ocs_ci/utility/aws.py - INFO - Deleting volume: vol-0c677f55be8e0616a
14:25:32 - MainThread - /home/ocsqe/projects/ocs-ci-deploy/ocs_ci/utility/aws.py - INFO - Deleting volume: vol-0d830df5ba60ff027
14:25:32 - MainThread - /home/ocsqe/projects/ocs-ci-deploy/ocs_ci/utility/aws.py - INFO - Deleting volume: vol-0e1e0c492d873607d
================== 1 passed, 91 deselected in 172.06 seconds ===================
cluster channel: stable-4.2
cluster version: 4.2.0-0.nightly-2019-08-15-073735
cluster image: registry.svc.ci.openshift.org/ocp/release@sha256:aef2fb3071accdb07f8eb7d7fa90d1814892b73902ee11136023dfd77037398a
storage namespace openshift-cluster-storage-operator
image quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:59b67cbba2fa28497240d87da39d1488baf53fa5ca5b67622ac9860ef3463d99
* quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:59b67cbba2fa28497240d87da39d1488baf53fa5ca5b67622ac9860ef3463d99
storage namespace openshift-storage
image quay.io/cephcsi/cephcsi:canary
* quay.io/cephcsi/cephcsi@sha256:3e7e14040782cb143236e103128d3e3c75caf2a356eb412ef8449d9f9c1448ca
image quay.io/k8scsi/csi-node-driver-registrar:v1.1.0
* quay.io/k8scsi/csi-node-driver-registrar@sha256:13daf82fb99e951a4bff8ae5fc7c17c3a8fe7130be6400990d8f6076c32d4599
image quay.io/k8scsi/csi-attacher:v1.1.1
* quay.io/k8scsi/csi-attacher@sha256:e4db94969e1d463807162a1115192ed70d632a61fbeb3bdc97b40fe9ce78c831
image quay.io/k8scsi/csi-provisioner:v1.2.0
* quay.io/k8scsi/csi-provisioner@sha256:0dffe9a8d39c4fdd49c5dd98ca5611a3f9726c012b082946f630e36988ba9f37
image quay.io/k8scsi/csi-snapshotter:v1.1.0
* quay.io/k8scsi/csi-snapshotter@sha256:a49e0da1af6f2bf717e41ba1eee8b5e6a1cbd66a709dd92cc43fe475fe2589eb
image docker.io/ceph/ceph:v14.2.2-20190722
* docker.io/ceph/ceph@sha256:567fe78d90a63ead11deadc2cbf5a912e42bfcc6ef4b1d6154f4b4fea4019052
image docker.io/rook/ceph:master
* docker.io/rook/ceph@sha256:8f369e032c9fe41e296899824d7f68a553c92995c4e945dd71bd4e486e4fa594