@surajssd
Created May 5, 2020 15:53
This file has been truncated, but you can view the full file.
$ ./hyperkube kubelet --node-ip=10.88.81.5 --anonymous-auth=false --authentication-token-webhook --authorization-mode=Webhook --client-ca-file=/etc/kubernetes/ca.crt --cluster_dns=10.3.0.10 --cluster_domain=cluster.local --cni-conf-dir=/etc/kubernetes/cni/net.d --config=/etc/kubernetes/kubelet.config --kubeconfig=/etc/kubernetes/kubeconfig --lock-file=/var/run/lock/kubelet.lock --network-plugin=cni --pod-manifest-path=/etc/kubernetes/manifests --read-only-port=0 --volume-plugin-dir=/var/lib/kubelet/volumeplugins --node-labels=node.kubernetes.io/node,metallb.universe.tf/my-asn=65000,metallb.universe.tf/peer-asn=65530 --register-with-taints= --address=10.88.81.5
Flag --anonymous-auth has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Flag --authentication-token-webhook has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Flag --authorization-mode has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Flag --client-ca-file has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Flag --cluster-dns has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Flag --cluster-domain has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Flag --pod-manifest-path has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Flag --read-only-port has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
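The deprecation warnings above all point to the same fix: move these flags into the file passed via --config. A minimal KubeletConfiguration sketch for the values used on this command line (field names per the kubelet.config.k8s.io/v1beta1 API; this file is illustrative, not the actual /etc/kubernetes/kubelet.config from this node):

```yaml
kind: KubeletConfiguration
apiVersion: kubelet.config.k8s.io/v1beta1
authentication:
  anonymous:
    enabled: false                         # replaces --anonymous-auth=false
  webhook:
    enabled: true                          # replaces --authentication-token-webhook
  x509:
    clientCAFile: /etc/kubernetes/ca.crt   # replaces --client-ca-file
authorization:
  mode: Webhook                            # replaces --authorization-mode=Webhook
clusterDNS:
  - 10.3.0.10                              # replaces --cluster-dns
clusterDomain: cluster.local               # replaces --cluster-domain
staticPodPath: /etc/kubernetes/manifests   # replaces --pod-manifest-path
readOnlyPort: 0                            # replaces --read-only-port
```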
I0505 07:05:26.005609 2827 server.go:417] Version: v1.18.2
I0505 07:05:26.005807 2827 server.go:495] acquiring file lock on "/var/run/lock/kubelet.lock"
I0505 07:05:26.023799 2827 plugins.go:100] No cloud provider specified.
I0505 07:05:26.056201 2827 server.go:646] --cgroups-per-qos enabled, but --cgroup-root was not specified. defaulting to /
I0505 07:05:26.056610 2827 container_manager_linux.go:266] container manager verified user specified cgroup-root exists: []
I0505 07:05:26.056622 2827 container_manager_linux.go:271] Creating Container Manager object based on Node Config: {RuntimeCgroupsName: SystemCgroupsName: KubeletCgroupsName: ContainerRuntime:docker CgroupsPerQOS:true CgroupRoot:/ CgroupDriver:cgroupfs KubeletRootDir:/var/lib/kubelet ProtectKernelDefaults:false NodeAllocatableConfig:{KubeReservedCgroupName: SystemReservedCgroupName: ReservedSystemCPUs: EnforceNodeAllocatable:map[pods:{}] KubeReserved:map[] SystemReserved:map[] HardEvictionThresholds:[{Signal:imagefs.available Operator:LessThan Value:{Quantity:<nil> Percentage:0.15} GracePeriod:0s MinReclaim:<nil>} {Signal:memory.available Operator:LessThan Value:{Quantity:100Mi Percentage:0} GracePeriod:0s MinReclaim:<nil>} {Signal:nodefs.available Operator:LessThan Value:{Quantity:<nil> Percentage:0.1} GracePeriod:0s MinReclaim:<nil>} {Signal:nodefs.inodesFree Operator:LessThan Value:{Quantity:<nil> Percentage:0.05} GracePeriod:0s MinReclaim:<nil>}]} QOSReserved:map[] ExperimentalCPUManagerPolicy:none ExperimentalCPUManagerReconcilePeriod:10s ExperimentalPodPidsLimit:-1 EnforceCPULimits:true CPUCFSQuotaPeriod:100ms ExperimentalTopologyManagerPolicy:none}
I0505 07:05:26.056808 2827 topology_manager.go:126] [topologymanager] Creating topology manager with none policy
I0505 07:05:26.056819 2827 container_manager_linux.go:301] [topologymanager] Initializing Topology Manager with none policy
I0505 07:05:26.056828 2827 container_manager_linux.go:306] Creating device plugin manager: true
I0505 07:05:26.056962 2827 client.go:75] Connecting to docker on unix:///var/run/docker.sock
I0505 07:05:26.056973 2827 client.go:92] Start docker client with request timeout=2m0s
W0505 07:05:26.058338 2827 docker_service.go:561] Hairpin mode set to "promiscuous-bridge" but kubenet is not enabled, falling back to "hairpin-veth"
I0505 07:05:26.058373 2827 docker_service.go:238] Hairpin mode set to "hairpin-veth"
W0505 07:05:26.058544 2827 cni.go:237] Unable to update cni config: no networks found in /etc/kubernetes/cni/net.d
W0505 07:05:26.063210 2827 cni.go:237] Unable to update cni config: no networks found in /etc/kubernetes/cni/net.d
I0505 07:05:26.063285 2827 docker_service.go:253] Docker cri networking managed by cni
W0505 07:05:26.063443 2827 cni.go:237] Unable to update cni config: no networks found in /etc/kubernetes/cni/net.d
W0505 07:05:26.063480 2827 cni.go:237] Unable to update cni config: no networks found in /etc/kubernetes/cni/net.d
I0505 07:05:26.073244 2827 docker_service.go:258] Docker Info: &{ID:IR3I:VKJF:RCWR:ZMVK:IPUG:UR5K:KLOS:UHB6:JBPL:FOXG:MKTH:AWHO Containers:4 ContainersRunning:4 ContainersPaused:0 ContainersStopped:0 Images:2 Driver:overlay2 DriverStatus:[[Backing Filesystem extfs] [Supports d_type true] [Native Overlay Diff true]] SystemStatus:[] Plugins:{Volume:[local] Network:[bridge host macvlan null overlay] Authorization:[] Log:[awslogs fluentd gcplogs gelf journald json-file logentries splunk syslog]} MemoryLimit:true SwapLimit:true KernelMemory:true KernelMemoryTCP:false CPUCfsPeriod:true CPUCfsQuota:true CPUShares:true CPUSet:true PidsLimit:false IPv4Forwarding:true BridgeNfIptables:true BridgeNfIP6tables:true Debug:false NFd:51 OomKillDisable:true NGoroutines:81 SystemTime:2020-05-05T07:05:26.064727855Z LoggingDriver:json-file CgroupDriver:cgroupfs NEventsListener:0 KernelVersion:4.19.84-flatcar OperatingSystem:Flatcar Container Linux by Kinvolk 2247.7.0 (Rhyolite) OSType:linux Architecture:x86_64 IndexServerAddress:https://index.docker.io/v1/ RegistryConfig:0xc00076b810 NCPU:32 MemTotal:135146483712 GenericResources:[] DockerRootDir:/var/lib/docker HTTPProxy: HTTPSProxy: NoProxy: Name:suraj-lk-cluster-pool-1-worker-2 Labels:[] ExperimentalBuild:false ServerVersion:18.06.3-ce ClusterStore: ClusterAdvertise: Runtimes:map[runc:{Path:docker-runc Args:[]}] DefaultRuntime:runc Swarm:{NodeID: NodeAddr: LocalNodeState:inactive ControlAvailable:false Error: RemoteManagers:[] Nodes:0 Managers:0 Cluster:<nil> Warnings:[]} LiveRestoreEnabled:false Isolation: InitBinary:docker-init ContainerdCommit:{ID:468a545b9edcd5932818eb9de8e72413e616e86e Expected:468a545b9edcd5932818eb9de8e72413e616e86e} RuncCommit:{ID:a592beb5bc4c4092b1b1bac971afed27687340c5 Expected:a592beb5bc4c4092b1b1bac971afed27687340c5} InitCommit:{ID:fec3683b971d9c3ef73f284f176672c44b448662 Expected:fec3683b971d9c3ef73f284f176672c44b448662} SecurityOptions:[name=seccomp,profile=default name=selinux] ProductLicense: Warnings:[]}
I0505 07:05:26.073327 2827 docker_service.go:271] Setting cgroupDriver to cgroupfs
I0505 07:05:26.084230 2827 remote_runtime.go:59] parsed scheme: ""
I0505 07:05:26.084246 2827 remote_runtime.go:59] scheme "" not registered, fallback to default scheme
I0505 07:05:26.084276 2827 passthrough.go:48] ccResolverWrapper: sending update to cc: {[{/var/run/dockershim.sock <nil> 0 <nil>}] <nil> <nil>}
I0505 07:05:26.084286 2827 clientconn.go:933] ClientConn switching balancer to "pick_first"
I0505 07:05:26.084346 2827 remote_image.go:50] parsed scheme: ""
I0505 07:05:26.084354 2827 remote_image.go:50] scheme "" not registered, fallback to default scheme
I0505 07:05:26.084363 2827 passthrough.go:48] ccResolverWrapper: sending update to cc: {[{/var/run/dockershim.sock <nil> 0 <nil>}] <nil> <nil>}
I0505 07:05:26.084378 2827 clientconn.go:933] ClientConn switching balancer to "pick_first"
I0505 07:05:26.084408 2827 kubelet.go:292] Adding pod path: /etc/kubernetes/manifests
I0505 07:05:26.084465 2827 kubelet.go:317] Watching apiserver
W0505 07:05:31.063744 2827 cni.go:237] Unable to update cni config: no networks found in /etc/kubernetes/cni/net.d
E0505 07:05:32.347578 2827 aws_credentials.go:77] while getting AWS credentials NoCredentialProviders: no valid providers in chain. Deprecated.
For verbose messaging see aws.Config.CredentialsChainVerboseErrors
I0505 07:05:32.348772 2827 kuberuntime_manager.go:211] Container runtime docker initialized, version: 18.06.3-ce, apiVersion: 1.38.0
I0505 07:05:32.349330 2827 server.go:1125] Started kubelet
I0505 07:05:32.349367 2827 server.go:145] Starting to listen on 0.0.0.0:10250
E0505 07:05:32.350077 2827 kubelet.go:1305] Image garbage collection failed once. Stats initialization may not have completed yet: failed to get imageFs info: unable to find data in memory cache
E0505 07:05:32.350102 2827 kubelet.go:1301] Image garbage collection failed multiple times in a row: failed to get imageFs info: unable to find data in memory cache
I0505 07:05:32.352795 2827 server.go:393] Adding debug handlers to kubelet server.
I0505 07:05:32.352824 2827 fs_resource_analyzer.go:64] Starting FS ResourceAnalyzer
I0505 07:05:32.352896 2827 volume_manager.go:265] Starting Kubelet Volume Manager
I0505 07:05:32.352940 2827 desired_state_of_world_populator.go:139] Desired state populator starts to run
E0505 07:05:32.353823 2827 kubelet.go:2187] Container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:docker: network plugin is not ready: cni config uninitialized
I0505 07:05:32.368910 2827 status_manager.go:158] Starting to sync pod status with apiserver
I0505 07:05:32.368935 2827 kubelet.go:1821] Starting kubelet main sync loop.
E0505 07:05:32.368976 2827 kubelet.go:1845] skipping pod synchronization - [container runtime status check may not have completed yet, PLEG is not healthy: pleg has yet to be successful]
I0505 07:05:32.453031 2827 kuberuntime_manager.go:978] updating runtime config through cri with podcidr 10.2.2.0/24
I0505 07:05:32.453035 2827 kubelet_node_status.go:294] Setting node annotation to enable volume controller attach/detach
I0505 07:05:32.453170 2827 cpu_manager.go:184] [cpumanager] starting with none policy
I0505 07:05:32.453181 2827 cpu_manager.go:185] [cpumanager] reconciling every 10s
I0505 07:05:32.453201 2827 state_mem.go:36] [cpumanager] initializing new in-memory state store
I0505 07:05:32.453196 2827 docker_service.go:353] docker cri received runtime config &RuntimeConfig{NetworkConfig:&NetworkConfig{PodCidr:10.2.2.0/24,},}
I0505 07:05:32.453325 2827 kubelet_network.go:77] Setting Pod CIDR: -> 10.2.2.0/24
I0505 07:05:32.453420 2827 state_mem.go:88] [cpumanager] updated default cpuset: ""
I0505 07:05:32.453432 2827 state_mem.go:96] [cpumanager] updated cpuset assignments: "map[]"
I0505 07:05:32.453446 2827 policy_none.go:43] [cpumanager] none policy: Start
I0505 07:05:32.454662 2827 kubelet_node_status.go:70] Attempting to register node suraj-lk-cluster-pool-1-worker-2
I0505 07:05:32.455391 2827 plugin_manager.go:114] Starting Kubelet Plugin Manager
E0505 07:05:32.455814 2827 kubelet.go:2187] Container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:docker: network plugin is not ready: cni config uninitialized
E0505 07:05:32.456286 2827 kubelet.go:2187] Container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:docker: network plugin is not ready: cni config uninitialized
I0505 07:05:32.469169 2827 topology_manager.go:233] [topologymanager] Topology Admit Handler
I0505 07:05:32.471886 2827 kubelet_node_status.go:112] Node suraj-lk-cluster-pool-1-worker-2 was previously registered
I0505 07:05:32.498728 2827 kubelet_node_status.go:73] Successfully registered node suraj-lk-cluster-pool-1-worker-2
I0505 07:05:32.498831 2827 topology_manager.go:233] [topologymanager] Topology Admit Handler
I0505 07:05:32.504793 2827 topology_manager.go:233] [topologymanager] Topology Admit Handler
I0505 07:05:32.653373 2827 reconciler.go:224] operationExecutor.VerifyControllerAttachedVolume started for volume "etc-machine-id" (UniqueName: "kubernetes.io/host-path/4f35dbfa-0e6a-4979-ac05-383bfc37cd01-etc-machine-id") pod "kubelet-56n5x" (UID: "4f35dbfa-0e6a-4979-ac05-383bfc37cd01")
I0505 07:05:32.653458 2827 reconciler.go:224] operationExecutor.VerifyControllerAttachedVolume started for volume "kubeconfig" (UniqueName: "kubernetes.io/configmap/4faaa862-18b5-4174-8d98-77f02b9f20a2-kubeconfig") pod "kube-proxy-xg828" (UID: "4faaa862-18b5-4174-8d98-77f02b9f20a2")
I0505 07:05:32.653492 2827 reconciler.go:224] operationExecutor.VerifyControllerAttachedVolume started for volume "mnt" (UniqueName: "kubernetes.io/host-path/4f35dbfa-0e6a-4979-ac05-383bfc37cd01-mnt") pod "kubelet-56n5x" (UID: "4f35dbfa-0e6a-4979-ac05-383bfc37cd01")
I0505 07:05:32.653512 2827 reconciler.go:224] operationExecutor.VerifyControllerAttachedVolume started for volume "logs" (UniqueName: "kubernetes.io/host-path/4f35dbfa-0e6a-4979-ac05-383bfc37cd01-logs") pod "kubelet-56n5x" (UID: "4f35dbfa-0e6a-4979-ac05-383bfc37cd01")
I0505 07:05:32.653531 2827 reconciler.go:224] operationExecutor.VerifyControllerAttachedVolume started for volume "os-release" (UniqueName: "kubernetes.io/host-path/4f35dbfa-0e6a-4979-ac05-383bfc37cd01-os-release") pod "kubelet-56n5x" (UID: "4f35dbfa-0e6a-4979-ac05-383bfc37cd01")
I0505 07:05:32.653609 2827 reconciler.go:224] operationExecutor.VerifyControllerAttachedVolume started for volume "sys" (UniqueName: "kubernetes.io/host-path/4f35dbfa-0e6a-4979-ac05-383bfc37cd01-sys") pod "kubelet-56n5x" (UID: "4f35dbfa-0e6a-4979-ac05-383bfc37cd01")
I0505 07:05:32.653679 2827 reconciler.go:224] operationExecutor.VerifyControllerAttachedVolume started for volume "lib-modules" (UniqueName: "kubernetes.io/host-path/4faaa862-18b5-4174-8d98-77f02b9f20a2-lib-modules") pod "kube-proxy-xg828" (UID: "4faaa862-18b5-4174-8d98-77f02b9f20a2")
I0505 07:05:32.653795 2827 reconciler.go:224] operationExecutor.VerifyControllerAttachedVolume started for volume "var-lib-calico" (UniqueName: "kubernetes.io/host-path/d4a0a18f-d30c-460a-9357-88cb3f1914f0-var-lib-calico") pod "calico-node-mrj2d" (UID: "d4a0a18f-d30c-460a-9357-88cb3f1914f0")
I0505 07:05:32.653825 2827 reconciler.go:224] operationExecutor.VerifyControllerAttachedVolume started for volume "cni-conf-dir" (UniqueName: "kubernetes.io/host-path/d4a0a18f-d30c-460a-9357-88cb3f1914f0-cni-conf-dir") pod "calico-node-mrj2d" (UID: "d4a0a18f-d30c-460a-9357-88cb3f1914f0")
I0505 07:05:32.653846 2827 reconciler.go:224] operationExecutor.VerifyControllerAttachedVolume started for volume "var-lib-kubelet" (UniqueName: "kubernetes.io/host-path/4f35dbfa-0e6a-4979-ac05-383bfc37cd01-var-lib-kubelet") pod "kubelet-56n5x" (UID: "4f35dbfa-0e6a-4979-ac05-383bfc37cd01")
I0505 07:05:32.653880 2827 reconciler.go:224] operationExecutor.VerifyControllerAttachedVolume started for volume "etc-resolv" (UniqueName: "kubernetes.io/host-path/4f35dbfa-0e6a-4979-ac05-383bfc37cd01-etc-resolv") pod "kubelet-56n5x" (UID: "4f35dbfa-0e6a-4979-ac05-383bfc37cd01")
I0505 07:05:32.653902 2827 reconciler.go:224] operationExecutor.VerifyControllerAttachedVolume started for volume "modules" (UniqueName: "kubernetes.io/host-path/4f35dbfa-0e6a-4979-ac05-383bfc37cd01-modules") pod "kubelet-56n5x" (UID: "4f35dbfa-0e6a-4979-ac05-383bfc37cd01")
I0505 07:05:32.653929 2827 reconciler.go:224] operationExecutor.VerifyControllerAttachedVolume started for volume "coreos-opt-cni-bin" (UniqueName: "kubernetes.io/host-path/4f35dbfa-0e6a-4979-ac05-383bfc37cd01-coreos-opt-cni-bin") pod "kubelet-56n5x" (UID: "4f35dbfa-0e6a-4979-ac05-383bfc37cd01")
I0505 07:05:32.653969 2827 reconciler.go:224] operationExecutor.VerifyControllerAttachedVolume started for volume "calico-node-token-hcpx7" (UniqueName: "kubernetes.io/secret/d4a0a18f-d30c-460a-9357-88cb3f1914f0-calico-node-token-hcpx7") pod "calico-node-mrj2d" (UID: "d4a0a18f-d30c-460a-9357-88cb3f1914f0")
I0505 07:05:32.654013 2827 reconciler.go:224] operationExecutor.VerifyControllerAttachedVolume started for volume "etc-kubernetes" (UniqueName: "kubernetes.io/host-path/4f35dbfa-0e6a-4979-ac05-383bfc37cd01-etc-kubernetes") pod "kubelet-56n5x" (UID: "4f35dbfa-0e6a-4979-ac05-383bfc37cd01")
I0505 07:05:32.654061 2827 reconciler.go:224] operationExecutor.VerifyControllerAttachedVolume started for volume "lib-modules" (UniqueName: "kubernetes.io/host-path/d4a0a18f-d30c-460a-9357-88cb3f1914f0-lib-modules") pod "calico-node-mrj2d" (UID: "d4a0a18f-d30c-460a-9357-88cb3f1914f0")
I0505 07:05:32.654140 2827 reconciler.go:224] operationExecutor.VerifyControllerAttachedVolume started for volume "var-run-calico" (UniqueName: "kubernetes.io/host-path/d4a0a18f-d30c-460a-9357-88cb3f1914f0-var-run-calico") pod "calico-node-mrj2d" (UID: "d4a0a18f-d30c-460a-9357-88cb3f1914f0")
I0505 07:05:32.654169 2827 reconciler.go:224] operationExecutor.VerifyControllerAttachedVolume started for volume "xtables-lock" (UniqueName: "kubernetes.io/host-path/d4a0a18f-d30c-460a-9357-88cb3f1914f0-xtables-lock") pod "calico-node-mrj2d" (UID: "d4a0a18f-d30c-460a-9357-88cb3f1914f0")
I0505 07:05:32.654198 2827 reconciler.go:224] operationExecutor.VerifyControllerAttachedVolume started for volume "kube-proxy-token-vrtp2" (UniqueName: "kubernetes.io/secret/4faaa862-18b5-4174-8d98-77f02b9f20a2-kube-proxy-token-vrtp2") pod "kube-proxy-xg828" (UID: "4faaa862-18b5-4174-8d98-77f02b9f20a2")
I0505 07:05:32.654221 2827 reconciler.go:224] operationExecutor.VerifyControllerAttachedVolume started for volume "coreos-var-lib-cni" (UniqueName: "kubernetes.io/host-path/4f35dbfa-0e6a-4979-ac05-383bfc37cd01-coreos-var-lib-cni") pod "kubelet-56n5x" (UID: "4f35dbfa-0e6a-4979-ac05-383bfc37cd01")
I0505 07:05:32.654253 2827 reconciler.go:224] operationExecutor.VerifyControllerAttachedVolume started for volume "coreos-var-lib-calico" (UniqueName: "kubernetes.io/host-path/4f35dbfa-0e6a-4979-ac05-383bfc37cd01-coreos-var-lib-calico") pod "kubelet-56n5x" (UID: "4f35dbfa-0e6a-4979-ac05-383bfc37cd01")
I0505 07:05:32.654307 2827 reconciler.go:224] operationExecutor.VerifyControllerAttachedVolume started for volume "dev" (UniqueName: "kubernetes.io/host-path/4f35dbfa-0e6a-4979-ac05-383bfc37cd01-dev") pod "kubelet-56n5x" (UID: "4f35dbfa-0e6a-4979-ac05-383bfc37cd01")
I0505 07:05:32.654337 2827 reconciler.go:224] operationExecutor.VerifyControllerAttachedVolume started for volume "ssl-certs-host" (UniqueName: "kubernetes.io/host-path/4faaa862-18b5-4174-8d98-77f02b9f20a2-ssl-certs-host") pod "kube-proxy-xg828" (UID: "4faaa862-18b5-4174-8d98-77f02b9f20a2")
I0505 07:05:32.654357 2827 reconciler.go:224] operationExecutor.VerifyControllerAttachedVolume started for volume "run" (UniqueName: "kubernetes.io/host-path/4f35dbfa-0e6a-4979-ac05-383bfc37cd01-run") pod "kubelet-56n5x" (UID: "4f35dbfa-0e6a-4979-ac05-383bfc37cd01")
I0505 07:05:32.654379 2827 reconciler.go:224] operationExecutor.VerifyControllerAttachedVolume started for volume "var-lib-docker" (UniqueName: "kubernetes.io/host-path/4f35dbfa-0e6a-4979-ac05-383bfc37cd01-var-lib-docker") pod "kubelet-56n5x" (UID: "4f35dbfa-0e6a-4979-ac05-383bfc37cd01")
I0505 07:05:32.654399 2827 reconciler.go:224] operationExecutor.VerifyControllerAttachedVolume started for volume "iscsiadm" (UniqueName: "kubernetes.io/host-path/4f35dbfa-0e6a-4979-ac05-383bfc37cd01-iscsiadm") pod "kubelet-56n5x" (UID: "4f35dbfa-0e6a-4979-ac05-383bfc37cd01")
I0505 07:05:32.654450 2827 reconciler.go:224] operationExecutor.VerifyControllerAttachedVolume started for volume "cni-bin-dir" (UniqueName: "kubernetes.io/host-path/d4a0a18f-d30c-460a-9357-88cb3f1914f0-cni-bin-dir") pod "calico-node-mrj2d" (UID: "d4a0a18f-d30c-460a-9357-88cb3f1914f0")
I0505 07:05:32.654468 2827 reconciler.go:157] Reconciler: start to sync state
W0505 07:05:36.064054 2827 cni.go:237] Unable to update cni config: no networks found in /etc/kubernetes/cni/net.d
E0505 07:05:37.456703 2827 kubelet.go:2187] Container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:docker: network plugin is not ready: cni config uninitialized
W0505 07:05:41.064313 2827 cni.go:237] Unable to update cni config: no networks found in /etc/kubernetes/cni/net.d
E0505 07:05:42.457618 2827 kubelet.go:2187] Container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:docker: network plugin is not ready: cni config uninitialized
W0505 07:05:46.064557 2827 cni.go:237] Unable to update cni config: no networks found in /etc/kubernetes/cni/net.d
E0505 07:05:47.458952 2827 kubelet.go:2187] Container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:docker: network plugin is not ready: cni config uninitialized
E0505 07:06:03.528230 2827 reflector.go:382] object-"kube-system"/"calico-node-token-hcpx7": Failed to watch *v1.Secret: Get https://surajcluster.net.:6443/api/v1/namespaces/kube-system/secrets?allowWatchBookmarks=true&fieldSelector=metadata.name%3Dcalico-node-token-hcpx7&resourceVersion=411&timeout=6m38s&timeoutSeconds=398&watch=true: dial tcp 10.88.81.3:6443: connect: connection refused
E0505 07:06:03.528230 2827 reflector.go:382] k8s.io/kubernetes/pkg/kubelet/kubelet.go:517: Failed to watch *v1.Service: Get https://surajcluster.net.:6443/api/v1/services?allowWatchBookmarks=true&resourceVersion=292&timeoutSeconds=500&watch=true: dial tcp 10.88.81.3:6443: connect: connection refused
E0505 07:06:03.528232 2827 reflector.go:382] k8s.io/kubernetes/pkg/kubelet/config/apiserver.go:46: Failed to watch *v1.Pod: Get https://surajcluster.net.:6443/api/v1/pods?allowWatchBookmarks=true&fieldSelector=spec.nodeName%3Dsuraj-lk-cluster-pool-1-worker-2&resourceVersion=1268&timeoutSeconds=351&watch=true: dial tcp 10.88.81.3:6443: connect: connection refused
E0505 07:06:03.528258 2827 reflector.go:382] k8s.io/client-go/informers/factory.go:135: Failed to watch *v1beta1.RuntimeClass: Get https://surajcluster.net.:6443/apis/node.k8s.io/v1beta1/runtimeclasses?allowWatchBookmarks=true&resourceVersion=1&timeout=8m49s&timeoutSeconds=529&watch=true: dial tcp 10.88.81.3:6443: connect: connection refused
E0505 07:06:03.528234 2827 reflector.go:382] object-"kube-system"/"calico-config": Failed to watch *v1.ConfigMap: Get https://surajcluster.net.:6443/api/v1/namespaces/kube-system/configmaps?allowWatchBookmarks=true&fieldSelector=metadata.name%3Dcalico-config&resourceVersion=263&timeout=5m20s&timeoutSeconds=320&watch=true: dial tcp 10.88.81.3:6443: connect: connection refused
E0505 07:06:03.528232 2827 reflector.go:382] k8s.io/client-go/informers/factory.go:135: Failed to watch *v1.CSIDriver: Get https://surajcluster.net.:6443/apis/storage.k8s.io/v1/csidrivers?allowWatchBookmarks=true&resourceVersion=1&timeout=6m6s&timeoutSeconds=366&watch=true: dial tcp 10.88.81.3:6443: connect: connection refused
E0505 07:06:03.528288 2827 reflector.go:382] k8s.io/kubernetes/pkg/kubelet/kubelet.go:526: Failed to watch *v1.Node: Get https://surajcluster.net.:6443/api/v1/nodes?allowWatchBookmarks=true&fieldSelector=metadata.name%3Dsuraj-lk-cluster-pool-1-worker-2&resourceVersion=1284&timeoutSeconds=527&watch=true: dial tcp 10.88.81.3:6443: connect: connection refused
E0505 07:06:03.528297 2827 reflector.go:382] object-"kube-system"/"kube-proxy-token-vrtp2": Failed to watch *v1.Secret: Get https://surajcluster.net.:6443/api/v1/namespaces/kube-system/secrets?allowWatchBookmarks=true&fieldSelector=metadata.name%3Dkube-proxy-token-vrtp2&resourceVersion=411&timeout=8m13s&timeoutSeconds=493&watch=true: dial tcp 10.88.81.3:6443: connect: connection refused
E0505 07:06:03.528324 2827 reflector.go:382] object-"kube-system"/"kubeconfig-in-cluster": Failed to watch *v1.ConfigMap: Get https://surajcluster.net.:6443/api/v1/namespaces/kube-system/configmaps?allowWatchBookmarks=true&fieldSelector=metadata.name%3Dkubeconfig-in-cluster&resourceVersion=263&timeout=8m54s&timeoutSeconds=534&watch=true: dial tcp 10.88.81.3:6443: connect: connection refused
E0505 07:06:12.203829 2827 reflector.go:382] object-"kube-system"/"calico-config": Failed to watch *v1.ConfigMap: unknown (get configmaps)
E0505 07:06:12.203876 2827 reflector.go:382] k8s.io/kubernetes/pkg/kubelet/kubelet.go:526: Failed to watch *v1.Node: unknown (get nodes)
E0505 07:06:12.203825 2827 reflector.go:382] k8s.io/client-go/informers/factory.go:135: Failed to watch *v1beta1.RuntimeClass: unknown (get runtimeclasses.node.k8s.io)
E0505 07:06:12.203851 2827 reflector.go:382] k8s.io/kubernetes/pkg/kubelet/config/apiserver.go:46: Failed to watch *v1.Pod: unknown (get pods)
E0505 07:06:12.203880 2827 reflector.go:382] object-"kube-system"/"calico-node-token-hcpx7": Failed to watch *v1.Secret: unknown (get secrets)
E0505 07:06:12.203880 2827 reflector.go:382] k8s.io/client-go/informers/factory.go:135: Failed to watch *v1.CSIDriver: unknown (get csidrivers.storage.k8s.io)
E0505 07:06:12.203880 2827 reflector.go:382] k8s.io/kubernetes/pkg/kubelet/kubelet.go:517: Failed to watch *v1.Service: unknown (get services)
E0505 07:06:12.203850 2827 reflector.go:382] object-"kube-system"/"kubeconfig-in-cluster": Failed to watch *v1.ConfigMap: unknown (get configmaps)
E0505 07:06:12.203934 2827 reflector.go:382] object-"kube-system"/"kube-proxy-token-vrtp2": Failed to watch *v1.Secret: unknown (get secrets)
E0505 07:06:12.216876 2827 reflector.go:178] k8s.io/client-go/informers/factory.go:135: Failed to list *v1beta1.RuntimeClass: runtimeclasses.node.k8s.io is forbidden: User "kubelet" cannot list resource "runtimeclasses" in API group "node.k8s.io" at the cluster scope
E0505 07:06:12.216879 2827 reflector.go:178] k8s.io/kubernetes/pkg/kubelet/config/apiserver.go:46: Failed to list *v1.Pod: pods is forbidden: User "kubelet" cannot list resource "pods" in API group "" at the cluster scope
E0505 07:06:12.216877 2827 reflector.go:178] object-"kube-system"/"calico-config": Failed to list *v1.ConfigMap: configmaps "calico-config" is forbidden: User "kubelet" cannot list resource "configmaps" in API group "" in the namespace "kube-system"
E0505 07:06:12.216915 2827 reflector.go:178] object-"kube-system"/"calico-node-token-hcpx7": Failed to list *v1.Secret: secrets "calico-node-token-hcpx7" is forbidden: User "kubelet" cannot list resource "secrets" in API group "" in the namespace "kube-system"
E0505 07:06:12.220821 2827 reflector.go:178] k8s.io/kubernetes/pkg/kubelet/kubelet.go:526: Failed to list *v1.Node: nodes "suraj-lk-cluster-pool-1-worker-2" is forbidden: User "kubelet" cannot list resource "nodes" in API group "" at the cluster scope
E0505 07:06:12.226543 2827 reflector.go:178] k8s.io/client-go/informers/factory.go:135: Failed to list *v1.CSIDriver: csidrivers.storage.k8s.io is forbidden: User "kubelet" cannot list resource "csidrivers" in API group "storage.k8s.io" at the cluster scope
E0505 07:06:12.226596 2827 reflector.go:178] k8s.io/kubernetes/pkg/kubelet/kubelet.go:517: Failed to list *v1.Service: services is forbidden: User "kubelet" cannot list resource "services" in API group "" at the cluster scope
E0505 07:06:12.226889 2827 reflector.go:178] object-"kube-system"/"kubeconfig-in-cluster": Failed to list *v1.ConfigMap: configmaps "kubeconfig-in-cluster" is forbidden: User "kubelet" cannot list resource "configmaps" in API group "" in the namespace "kube-system"
E0505 07:06:12.227027 2827 reflector.go:178] object-"kube-system"/"kube-proxy-token-vrtp2": Failed to list *v1.Secret: secrets "kube-proxy-token-vrtp2" is forbidden: User "kubelet" cannot list resource "secrets" in API group "" in the namespace "kube-system"
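The run of `forbidden` errors above means the API server's authorizer is rejecting the kubelet's credentials: it authenticates as the user "kubelet" but that user has no RBAC grant covering nodes, pods, services, secrets, or configmaps. One hypothetical way to restore these permissions is to bind that user to the built-in system:node ClusterRole; this manifest is an illustrative sketch, not taken from the cluster in this log, and setups using the Node authorizer would instead place the kubelet in the system:nodes group:

```yaml
apiVersion: rbac.authorization.k8s.io/v1
kind: ClusterRoleBinding
metadata:
  name: kubelet-nodes          # hypothetical name
roleRef:
  apiGroup: rbac.authorization.k8s.io
  kind: ClusterRole
  name: system:node            # built-in role covering the resources the kubelet reads
subjects:
  - apiGroup: rbac.authorization.k8s.io
    kind: User
    name: kubelet              # the username appearing in the forbidden errors above
```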
I0505 07:46:19.058261 2827 topology_manager.go:233] [topologymanager] Topology Admit Handler
I0505 07:46:19.192046 2827 reconciler.go:224] operationExecutor.VerifyControllerAttachedVolume started for volume "udev" (UniqueName: "kubernetes.io/host-path/c02b25b1-fc55-4507-b5f7-9cf3e4b370d3-udev") pod "rook-discover-6pkd6" (UID: "c02b25b1-fc55-4507-b5f7-9cf3e4b370d3")
I0505 07:46:19.192094 2827 reconciler.go:224] operationExecutor.VerifyControllerAttachedVolume started for volume "dev" (UniqueName: "kubernetes.io/host-path/c02b25b1-fc55-4507-b5f7-9cf3e4b370d3-dev") pod "rook-discover-6pkd6" (UID: "c02b25b1-fc55-4507-b5f7-9cf3e4b370d3")
I0505 07:46:19.192121 2827 reconciler.go:224] operationExecutor.VerifyControllerAttachedVolume started for volume "rook-ceph-system-token-5ttlq" (UniqueName: "kubernetes.io/secret/c02b25b1-fc55-4507-b5f7-9cf3e4b370d3-rook-ceph-system-token-5ttlq") pod "rook-discover-6pkd6" (UID: "c02b25b1-fc55-4507-b5f7-9cf3e4b370d3")
I0505 07:46:19.192142 2827 reconciler.go:224] operationExecutor.VerifyControllerAttachedVolume started for volume "sys" (UniqueName: "kubernetes.io/host-path/c02b25b1-fc55-4507-b5f7-9cf3e4b370d3-sys") pod "rook-discover-6pkd6" (UID: "c02b25b1-fc55-4507-b5f7-9cf3e4b370d3")
2020-05-05 07:46:19.878 [INFO][16178] plugin.go 129: Extracted identifiers EndpointIDs=&utils.WEPIdentifiers{Namespace:"rook", WEPName:"", WorkloadEndpointIdentifiers:names.WorkloadEndpointIdentifiers{Node:"suraj-lk-cluster-pool-1-worker-2", Orchestrator:"k8s", Endpoint:"eth0", Workload:"", Pod:"rook-discover-6pkd6", ContainerID:"9a9088779553e28609509a6a1d3c9dea76aa8e29f6ded525c7b13283e4c1a8e2"}}
2020-05-05 07:46:19.913 [INFO][16178] plugin.go 220: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {suraj--lk--cluster--pool--1--worker--2-k8s-rook--discover--6pkd6-eth0 rook-discover- rook c02b25b1-fc55-4507-b5f7-9cf3e4b370d3 7831 0 2020-05-05 07:46:19 +0000 UTC <nil> <nil> map[app:rook-discover controller-revision-hash:b4d97f4cc pod-template-generation:1 projectcalico.org/namespace:rook projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:rook-ceph-system] map[] [] [] []} {k8s suraj-lk-cluster-pool-1-worker-2 rook-discover-6pkd6 eth0 [] [] [kns.rook ksa.rook.rook-ceph-system] calid4a16cc8d5b []}} ContainerID="9a9088779553e28609509a6a1d3c9dea76aa8e29f6ded525c7b13283e4c1a8e2" Namespace="rook" Pod="rook-discover-6pkd6" WorkloadEndpoint="suraj--lk--cluster--pool--1--worker--2-k8s-rook--discover--6pkd6-"
2020-05-05 07:46:19.913 [INFO][16178] k8s.go 60: Extracted identifiers for CmdAddK8s ContainerID="9a9088779553e28609509a6a1d3c9dea76aa8e29f6ded525c7b13283e4c1a8e2" Namespace="rook" Pod="rook-discover-6pkd6" WorkloadEndpoint="suraj--lk--cluster--pool--1--worker--2-k8s-rook--discover--6pkd6-eth0"
2020-05-05 07:46:19.918 [INFO][16178] k8s.go 775: namespace info &Namespace{ObjectMeta:{rook /api/v1/namespaces/rook 346082b0-52fc-4520-8796-5afff38e4f6c 7478 0 2020-05-05 07:45:13 +0000 UTC <nil> <nil> map[] map[] [] [] [{lokoctl Update v1 2020-05-05 07:45:13 +0000 UTC FieldsV1 FieldsV1{Raw:*[123 34 102 58 115 116 97 116 117 115 34 58 123 34 102 58 112 104 97 115 101 34 58 123 125 125 125],}}]},Spec:NamespaceSpec{Finalizers:[kubernetes],},Status:NamespaceStatus{Phase:Active,Conditions:[]NamespaceCondition{},},}
2020-05-05 07:46:19.922 [INFO][16178] k8s.go 784: pod info &Pod{ObjectMeta:{rook-discover-6pkd6 rook-discover- rook /api/v1/namespaces/rook/pods/rook-discover-6pkd6 c02b25b1-fc55-4507-b5f7-9cf3e4b370d3 7831 0 2020-05-05 07:46:19 +0000 UTC <nil> <nil> map[app:rook-discover controller-revision-hash:b4d97f4cc pod-template-generation:1] map[kubernetes.io/psp:00-rook-ceph-operator seccomp.security.alpha.kubernetes.io/pod:runtime/default] [{apps/v1 DaemonSet rook-discover 816a87c7-8cdd-4cce-b096-59351e73ad1a 0xc00043d799 0xc00043d79a}] [] [{kube-controller-manager Update v1 2020-05-05 07:46:19 +0000 UTC FieldsV1 FieldsV1{Raw:*[123 34 102 58 109 101 116 97 100 97 116 97 34 58 123 34 102 58 103 101 110 101 114 97 116 101 78 97 109 101 34 58 123 125 44 34 102 58 108 97 98 101 108 115 34 58 123 34 46 34 58 123 125 44 34 102 58 97 112 112 34 58 123 125 44 34 102 58 99 111 110 116 114 111 108 108 101 114 45 114 101 118 105 115 105 111 110 45 104 97 115 104 34 58 123 125 44 34 102 58 112 111 100 45 116 101 109 112 108 97 116 101 45 103 101 110 101 114 97 116 105 111 110 34 58 123 125 125 44 34 102 58 111 119 110 101 114 82 101 102 101 114 101 110 99 101 115 34 58 123 34 46 34 58 123 125 44 34 107 58 123 92 34 117 105 100 92 34 58 92 34 56 49 54 97 56 55 99 55 45 56 99 100 100 45 52 99 99 101 45 98 48 57 54 45 53 57 51 53 49 101 55 51 97 100 49 97 92 34 125 34 58 123 34 46 34 58 123 125 44 34 102 58 97 112 105 86 101 114 115 105 111 110 34 58 123 125 44 34 102 58 98 108 111 99 107 79 119 110 101 114 68 101 108 101 116 105 111 110 34 58 123 125 44 34 102 58 99 111 110 116 114 111 108 108 101 114 34 58 123 125 44 34 102 58 107 105 110 100 34 58 123 125 44 34 102 58 110 97 109 101 34 58 123 125 44 34 102 58 117 105 100 34 58 123 125 125 125 125 44 34 102 58 115 112 101 99 34 58 123 34 102 58 97 102 102 105 110 105 116 121 34 58 123 34 46 34 58 123 125 44 34 102 58 110 111 100 101 65 102 102 105 110 105 116 121 34 58 123 34 46 34 58 123 125 44 34 102 58 114 101 113 117 105 114 101 
100 68 117 114 105 110 103 83 99 104 101 100 117 108 105 110 103 73 103 110 111 114 101 100 68 117 114 105 110 103 69 120 101 99 117 116 105 111 110 34 58 123 34 46 34 58 123 125 44 34 102 58 110 111 100 101 83 101 108 101 99 116 111 114 84 101 114 109 115 34 58 123 125 125 125 125 44 34 102 58 99 111 110 116 97 105 110 101 114 115 34 58 123 34 107 58 123 92 34 110 97 109 101 92 34 58 92 34 114 111 111 107 45 100 105 115 99 111 118 101 114 92 34 125 34 58 123 34 46 34 58 123 125 44 34 102 58 97 114 103 115 34 58 123 125 44 34 102 58 101 110 118 34 58 123 34 46 34 58 123 125 44 34 107 58 123 92 34 110 97 109 101 92 34 58 92 34 78 79 68 69 95 78 65 77 69 92 34 125 34 58 123 34 46 34 58 123 125 44 34 102 58 110 97 109 101 34 58 123 125 44 34 102 58 118 97 108 117 101 70 114 111 109 34 58 123 34 46 34 58 123 125 44 34 102 58 102 105 101 108 100 82 101 102 34 58 123 34 46 34 58 123 125 44 34 102 58 97 112 105 86 101 114 115 105 111 110 34 58 123 125 44 34 102 58 102 105 101 108 100 80 97 116 104 34 58 123 125 125 125 125 44 34 107 58 123 92 34 110 97 109 101 92 34 58 92 34 80 79 68 95 78 65 77 69 92 34 125 34 58 123 34 46 34 58 123 125 44 34 102 58 110 97 109 101 34 58 123 125 44 34 102 58 118 97 108 117 101 70 114 111 109 34 58 123 34 46 34 58 123 125 44 34 102 58 102 105 101 108 100 82 101 102 34 58 123 34 46 34 58 123 125 44 34 102 58 97 112 105 86 101 114 115 105 111 110 34 58 123 125 44 34 102 58 102 105 101 108 100 80 97 116 104 34 58 123 125 125 125 125 44 34 107 58 123 92 34 110 97 109 101 92 34 58 92 34 80 79 68 95 78 65 77 69 83 80 65 67 69 92 34 125 34 58 123 34 46 34 58 123 125 44 34 102 58 110 97 109 101 34 58 123 125 44 34 102 58 118 97 108 117 101 70 114 111 109 34 58 123 34 46 34 58 123 125 44 34 102 58 102 105 101 108 100 82 101 102 34 58 123 34 46 34 58 123 125 44 34 102 58 97 112 105 86 101 114 115 105 111 110 34 58 123 125 44 34 102 58 102 105 101 108 100 80 97 116 104 34 58 123 125 125 125 125 125 44 34 102 58 105 109 97 103 101 34 58 123 125 44 34 
102 58 105 109 97 103 101 80 117 108 108 80 111 108 105 99 121 34 58 123 125 44 34 102 58 110 97 109 101 34 58 123 125 44 34 102 58 114 101 115 111 117 114 99 101 115 34 58 123 125 44 34 102 58 115 101 99 117 114 105 116 121 67 111 110 116 101 120 116 34 58 123 34 46 34 58 123 125 44 34 102 58 112 114 105 118 105 108 101 103 101 100 34 58 123 125 125 44 34 102 58 116 101 114 109 105 110 97 116 105 111 110 77 101 115 115 97 103 101 80 97 116 104 34 58 123 125 44 34 102 58 116 101 114 109 105 110 97 116 105 111 110 77 101 115 115 97 103 101 80 111 108 105 99 121 34 58 123 125 44 34 102 58 118 111 108 117 109 101 77 111 117 110 116 115 34 58 123 34 46 34 58 123 125 44 34 107 58 123 92 34 109 111 117 110 116 80 97 116 104 92 34 58 92 34 47 100 101 118 92 34 125 34 58 123 34 46 34 58 123 125 44 34 102 58 109 111 117 110 116 80 97 116 104 34 58 123 125 44 34 102 58 110 97 109 101 34 58 123 125 125 44 34 107 58 123 92 34 109 111 117 110 116 80 97 116 104 92 34 58 92 34 47 114 117 110 47 117 100 101 118 92 34 125 34 58 123 34 46 34 58 123 125 44 34 102 58 109 111 117 110 116 80 97 116 104 34 58 123 125 44 34 102 58 110 97 109 101 34 58 123 125 44 34 102 58 114 101 97 100 79 110 108 121 34 58 123 125 125 44 34 107 58 123 92 34 109 111 117 110 116 80 97 116 104 92 34 58 92 34 47 115 121 115 92 34 125 34 58 123 34 46 34 58 123 125 44 34 102 58 109 111 117 110 116 80 97 116 104 34 58 123 125 44 34 102 58 110 97 109 101 34 58 123 125 44 34 102 58 114 101 97 100 79 110 108 121 34 58 123 125 125 125 125 125 44 34 102 58 100 110 115 80 111 108 105 99 121 34 58 123 125 44 34 102 58 101 110 97 98 108 101 83 101 114 118 105 99 101 76 105 110 107 115 34 58 123 125 44 34 102 58 114 101 115 116 97 114 116 80 111 108 105 99 121 34 58 123 125 44 34 102 58 115 99 104 101 100 117 108 101 114 78 97 109 101 34 58 123 125 44 34 102 58 115 101 99 117 114 105 116 121 67 111 110 116 101 120 116 34 58 123 125 44 34 102 58 115 101 114 118 105 99 101 65 99 99 111 117 110 116 34 58 123 125 44 34 102 
58 115 101 114 118 105 99 101 65 99 99 111 117 110 116 78 97 109 101 34 58 123 125 44 34 102 58 116 101 114 109 105 110 97 116 105 111 110 71 114 97 99 101 80 101 114 105 111 100 83 101 99 111 110 100 115 34 58 123 125 44 34 102 58 116 111 108 101 114 97 116 105 111 110 115 34 58 123 125 44 34 102 58 118 111 108 117 109 101 115 34 58 123 34 46 34 58 123 125 44 34 107 58 123 92 34 110 97 109 101 92 34 58 92 34 100 101 118 92 34 125 34 58 123 34 46 34 58 123 125 44 34 102 58 104 111 115 116 80 97 116 104 34 58 123 34 46 34 58 123 125 44 34 102 58 112 97 116 104 34 58 123 125 44 34 102 58 116 121 112 101 34 58 123 125 125 44 34 102 58 110 97 109 101 34 58 123 125 125 44 34 107 58 123 92 34 110 97 109 101 92 34 58 92 34 115 121 115 92 34 125 34 58 123 34 46 34 58 123 125 44 34 102 58 104 111 115 116 80 97 116 104 34 58 123 34 46 34 58 123 125 44 34 102 58 112 97 116 104 34 58 123 125 44 34 102 58 116 121 112 101 34 58 123 125 125 44 34 102 58 110 97 109 101 34 58 123 125 125 44 34 107 58 123 92 34 110 97 109 101 92 34 58 92 34 117 100 101 118 92 34 125 34 58 123 34 46 34 58 123 125 44 34 102 58 104 111 115 116 80 97 116 104 34 58 123 34 46 34 58 123 125 44 34 102 58 112 97 116 104 34 58 123 125 44 34 102 58 116 121 112 101 34 58 123 125 125 44 34 102 58 110 97 109 101 34 58 123 125 125 125 125 125],}} {kubelet Update v1 2020-05-05 07:46:19 +0000 UTC FieldsV1 &FieldsV1{Raw:*[123 34 102 58 115 116 97 116 117 115 34 58 123 34 102 58 99 111 110 100 105 116 105 111 110 115 34 58 123 34 107 58 123 92 34 116 121 112 101 92 34 58 92 34 67 111 110 116 97 105 110 101 114 115 82 101 97 100 121 92 34 125 34 58 123 34 46 34 58 123 125 44 34 102 58 108 97 115 116 80 114 111 98 101 84 105 109 101 34 58 123 125 44 34 102 58 108 97 115 116 84 114 97 110 115 105 116 105 111 110 84 105 109 101 34 58 123 125 44 34 102 58 109 101 115 115 97 103 101 34 58 123 125 44 34 102 58 114 101 97 115 111 110 34 58 123 125 44 34 102 58 115 116 97 116 117 115 34 58 123 125 44 34 102 58 116 121 112 101 
34 58 123 125 125 44 34 107 58 123 92 34 116 121 112 101 92 34 58 92 34 73 110 105 116 105 97 108 105 122 101 100 92 34 125 34 58 123 34 46 34 58 123 125 44 34 102 58 108 97 115 116 80 114 111 98 101 84 105 109 101 34 58 123 125 44 34 102 58 108 97 115 116 84 114 97 110 115 105 116 105 111 110 84 105 109 101 34 58 123 125 44 34 102 58 115 116 97 116 117 115 34 58 123 125 44 34 102 58 116 121 112 101 34 58 123 125 125 44 34 107 58 123 92 34 116 121 112 101 92 34 58 92 34 82 101 97 100 121 92 34 125 34 58 123 34 46 34 58 123 125 44 34 102 58 108 97 115 116 80 114 111 98 101 84 105 109 101 34 58 123 125 44 34 102 58 108 97 115 116 84 114 97 110 115 105 116 105 111 110 84 105 109 101 34 58 123 125 44 34 102 58 109 101 115 115 97 103 101 34 58 123 125 44 34 102 58 114 101 97 115 111 110 34 58 123 125 44 34 102 58 115 116 97 116 117 115 34 58 123 125 44 34 102 58 116 121 112 101 34 58 123 125 125 125 44 34 102 58 99 111 110 116 97 105 110 101 114 83 116 97 116 117 115 101 115 34 58 123 125 44 34 102 58 104 111 115 116 73 80 34 58 123 125 44 34 102 58 115 116 97 114 116 84 105 109 101 34 58 123 125 125 
125],}}]},Spec:PodSpec{Volumes:[]Volume{Volume{Name:dev,VolumeSource:VolumeSource{HostPath:&HostPathVolumeSource{Path:/dev,Type:*,},EmptyDir:nil,GCEPersistentDisk:nil,AWSElasticBlockStore:nil,GitRepo:nil,Secret:nil,NFS:nil,ISCSI:nil,Glusterfs:nil,PersistentVolumeClaim:nil,RBD:nil,FlexVolume:nil,Cinder:nil,CephFS:nil,Flocker:nil,DownwardAPI:nil,FC:nil,AzureFile:nil,ConfigMap:nil,VsphereVolume:nil,Quobyte:nil,AzureDisk:nil,PhotonPersistentDisk:nil,PortworxVolume:nil,ScaleIO:nil,Projected:nil,StorageOS:nil,CSI:nil,},},Volume{Name:sys,VolumeSource:VolumeSource{HostPath:&HostPathVolumeSource{Path:/sys,Type:*,},EmptyDir:nil,GCEPersistentDisk:nil,AWSElasticBlockStore:nil,GitRepo:nil,Secret:nil,NFS:nil,ISCSI:nil,Glusterfs:nil,PersistentVolumeClaim:nil,RBD:nil,FlexVolume:nil,Cinder:nil,CephFS:nil,Flocker:nil,DownwardAPI:nil,FC:nil,AzureFile:nil,ConfigMap:nil,VsphereVolume:nil,Quobyte:nil,AzureDisk:nil,PhotonPersistentDisk:nil,PortworxVolume:nil,ScaleIO:nil,Projected:nil,StorageOS:nil,CSI:nil,},},Volume{Name:udev,VolumeSource:VolumeSource{HostPath:&HostPathVolumeSource{Path:/run/udev,Type:*,},EmptyDir:nil,GCEPersistentDisk:nil,AWSElasticBlockStore:nil,GitRepo:nil,Secret:nil,NFS:nil,ISCSI:nil,Glusterfs:nil,PersistentVolumeClaim:nil,RBD:nil,FlexVolume:nil,Cinder:nil,CephFS:nil,Flocker:nil,DownwardAPI:nil,FC:nil,AzureFile:nil,ConfigMap:nil,VsphereVolume:nil,Quobyte:nil,AzureDisk:nil,PhotonPersistentDisk:nil,PortworxVolume:nil,ScaleIO:nil,Projected:nil,StorageOS:nil,CSI:nil,},},Volume{Name:rook-ceph-system-token-5ttlq,VolumeSource:VolumeSource{HostPath:nil,EmptyDir:nil,GCEPersistentDisk:nil,AWSElasticBlockStore:nil,GitRepo:nil,Secret:&SecretVolumeSource{SecretName:rook-ceph-system-token-5ttlq,Items:[]KeyToPath{},DefaultMode:*420,Optional:nil,},NFS:nil,ISCSI:nil,Glusterfs:nil,PersistentVolumeClaim:nil,RBD:nil,FlexVolume:nil,Cinder:nil,CephFS:nil,Flocker:nil,DownwardAPI:nil,FC:nil,AzureFile:nil,ConfigMap:nil,VsphereVolume:nil,Quobyte:nil,AzureDisk:nil,PhotonPersistentDisk:nil,Portw
orxVolume:nil,ScaleIO:nil,Projected:nil,StorageOS:nil,CSI:nil,},},},Containers:[]Container{Container{Name:rook-discover,Image:rook/ceph:v1.3.2,Command:[],Args:[discover --discover-interval 60m --use-ceph-volume],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:POD_NAMESPACE,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:metadata.namespace,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},EnvVar{Name:NODE_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:spec.nodeName,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},EnvVar{Name:POD_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:metadata.name,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:dev,ReadOnly:false,MountPath:/dev,SubPath:,MountPropagation:nil,SubPathExpr:,},VolumeMount{Name:sys,ReadOnly:true,MountPath:/sys,SubPath:,MountPropagation:nil,SubPathExpr:,},VolumeMount{Name:udev,ReadOnly:true,MountPath:/run/udev,SubPath:,MountPropagation:nil,SubPathExpr:,},VolumeMount{Name:rook-ceph-system-token-5ttlq,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:nil,Privileged:*true,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,},},RestartPolicy:Always,TerminationGracePeriodSeconds:*30,ActiveDeadlineSeconds:nil,DNSPolicy:ClusterFirst,NodeSelector:map
[string]string{},ServiceAccountName:rook-ceph-system,DeprecatedServiceAccount:rook-ceph-system,NodeName:suraj-lk-cluster-pool-1-worker-2,HostNetwork:false,HostPID:false,HostIPC:false,SecurityContext:&PodSecurityContext{SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,SupplementalGroups:[],FSGroup:nil,RunAsGroup:nil,Sysctls:[]Sysctl{},WindowsOptions:nil,},ImagePullSecrets:[]LocalObjectReference{},Hostname:,Subdomain:,Affinity:&Affinity{NodeAffinity:&NodeAffinity{RequiredDuringSchedulingIgnoredDuringExecution:&NodeSelector{NodeSelectorTerms:[]NodeSelectorTerm{NodeSelectorTerm{MatchExpressions:[]NodeSelectorRequirement{},MatchFields:[]NodeSelectorRequirement{NodeSelectorRequirement{Key:metadata.name,Operator:In,Values:[suraj-lk-cluster-pool-1-worker-2],},},},},},PreferredDuringSchedulingIgnoredDuringExecution:[]PreferredSchedulingTerm{},},PodAffinity:nil,PodAntiAffinity:nil,},SchedulerName:default-scheduler,InitContainers:[]Container{},AutomountServiceAccountToken:nil,Tolerations:[]Toleration{Toleration{Key:storage.lokomotive.io,Operator:Exists,Value:,Effect:NoSchedule,TolerationSeconds:nil,},Toleration{Key:storage.lokomotive.io,Operator:Equal,Value:ceph,Effect:NoSchedule,TolerationSeconds:nil,},Toleration{Key:node.kubernetes.io/not-ready,Operator:Exists,Value:,Effect:NoExecute,TolerationSeconds:nil,},Toleration{Key:node.kubernetes.io/unreachable,Operator:Exists,Value:,Effect:NoExecute,TolerationSeconds:nil,},Toleration{Key:node.kubernetes.io/disk-pressure,Operator:Exists,Value:,Effect:NoSchedule,TolerationSeconds:nil,},Toleration{Key:node.kubernetes.io/memory-pressure,Operator:Exists,Value:,Effect:NoSchedule,TolerationSeconds:nil,},Toleration{Key:node.kubernetes.io/pid-pressure,Operator:Exists,Value:,Effect:NoSchedule,TolerationSeconds:nil,},Toleration{Key:node.kubernetes.io/unschedulable,Operator:Exists,Value:,Effect:NoSchedule,TolerationSeconds:nil,},},HostAliases:[]HostAlias{},PriorityClassName:,Priority:*0,DNSConfig:nil,ShareProcessNamespace:nil,ReadinessGates:[]
PodReadinessGate{},RuntimeClassName:nil,EnableServiceLinks:*true,PreemptionPolicy:nil,Overhead:ResourceList{},TopologySpreadConstraints:[]TopologySpreadConstraint{},EphemeralContainers:[]EphemeralContainer{},},Status:PodStatus{Phase:Pending,Conditions:[]PodCondition{PodCondition{Type:Initialized,Status:True,LastProbeTime:0001-01-01 00:00:00 +0000 UTC,LastTransitionTime:2020-05-05 07:46:19 +0000 UTC,Reason:,Message:,},PodCondition{Type:Ready,Status:False,LastProbeTime:0001-01-01 00:00:00 +0000 UTC,LastTransitionTime:2020-05-05 07:46:19 +0000 UTC,Reason:ContainersNotReady,Message:containers with unready status: [rook-discover],},PodCondition{Type:ContainersReady,Status:False,LastProbeTime:0001-01-01 00:00:00 +0000 UTC,LastTransitionTime:2020-05-05 07:46:19 +0000 UTC,Reason:ContainersNotReady,Message:containers with unready status: [rook-discover],},PodCondition{Type:PodScheduled,Status:True,LastProbeTime:0001-01-01 00:00:00 +0000 UTC,LastTransitionTime:2020-05-05 07:46:19 +0000 UTC,Reason:,Message:,},},Message:,Reason:,HostIP:10.88.81.5,PodIP:,StartTime:2020-05-05 07:46:19 +0000 UTC,ContainerStatuses:[]ContainerStatus{ContainerStatus{Name:rook-discover,State:ContainerState{Waiting:&ContainerStateWaiting{Reason:ContainerCreating,Message:,},Running:nil,Terminated:nil,},LastTerminationState:ContainerState{Waiting:nil,Running:nil,Terminated:nil,},Ready:false,RestartCount:0,Image:rook/ceph:v1.3.2,ImageID:,ContainerID:,Started:*false,},},QOSClass:BestEffort,InitContainerStatuses:[]ContainerStatus{},NominatedNodeName:,PodIPs:[]PodIP{},EphemeralContainerStatuses:[]ContainerStatus{},},}
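The long runs of decimal numbers in the `FieldsV1{Raw:*[123 34 102 …]}` fields above are Go's default rendering of a `[]byte` slice holding the pod's managed-fields JSON, printed byte-by-byte. A minimal sketch of recovering the JSON (the helper name `decodeRaw` is illustrative, not part of any Kubernetes API; the sample input is the short namespace entry from the log above):

```go
package main

import (
	"fmt"
	"strings"
)

// decodeRaw converts the space-separated decimal byte dump that Go's %v verb
// prints for a []byte (as seen in FieldsV1{Raw:*[...]}) back into the string
// — here, managed-fields JSON — that the bytes encode.
func decodeRaw(dump string) string {
	var sb strings.Builder
	for _, field := range strings.Fields(dump) {
		var b int
		fmt.Sscanf(field, "%d", &b)
		sb.WriteByte(byte(b))
	}
	return sb.String()
}

func main() {
	// The managed-fields entry from the namespace log line above.
	raw := "123 34 102 58 115 116 97 116 117 115 34 58 123 34 102 58 112 104 97 115 101 34 58 123 125 125 125"
	fmt.Println(decodeRaw(raw)) // {"f:status":{"f:phase":{}}}
}
```

Decoded this way, the dumps turn out to be ordinary server-side-apply field ownership records (`f:metadata`, `f:spec`, `f:status`, …) rather than anything unusual.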
2020-05-05 07:46:19.938 [INFO][16194] ipam_plugin.go 211: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="9a9088779553e28609509a6a1d3c9dea76aa8e29f6ded525c7b13283e4c1a8e2" HandleID="k8s-pod-network.9a9088779553e28609509a6a1d3c9dea76aa8e29f6ded525c7b13283e4c1a8e2" Workload="suraj--lk--cluster--pool--1--worker--2-k8s-rook--discover--6pkd6-eth0"
2020-05-05 07:46:19.961 [INFO][16194] ipam_plugin.go 223: Calico CNI IPAM handle=k8s-pod-network.9a9088779553e28609509a6a1d3c9dea76aa8e29f6ded525c7b13283e4c1a8e2 ContainerID="9a9088779553e28609509a6a1d3c9dea76aa8e29f6ded525c7b13283e4c1a8e2" HandleID="k8s-pod-network.9a9088779553e28609509a6a1d3c9dea76aa8e29f6ded525c7b13283e4c1a8e2" Workload="suraj--lk--cluster--pool--1--worker--2-k8s-rook--discover--6pkd6-eth0"
2020-05-05 07:46:19.961 [INFO][16194] ipam_plugin.go 233: Auto assigning IP ContainerID="9a9088779553e28609509a6a1d3c9dea76aa8e29f6ded525c7b13283e4c1a8e2" HandleID="k8s-pod-network.9a9088779553e28609509a6a1d3c9dea76aa8e29f6ded525c7b13283e4c1a8e2" Workload="suraj--lk--cluster--pool--1--worker--2-k8s-rook--discover--6pkd6-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc0002c8d40), Attrs:map[string]string{"namespace":"rook", "node":"suraj-lk-cluster-pool-1-worker-2", "pod":"rook-discover-6pkd6"}, Hostname:"suraj-lk-cluster-pool-1-worker-2", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0}
2020-05-05 07:46:19.961 [INFO][16194] ipam.go 87: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'suraj-lk-cluster-pool-1-worker-2'
2020-05-05 07:46:19.970 [INFO][16194] ipam.go 313: Looking up existing affinities for host handle="k8s-pod-network.9a9088779553e28609509a6a1d3c9dea76aa8e29f6ded525c7b13283e4c1a8e2" host="suraj-lk-cluster-pool-1-worker-2"
2020-05-05 07:46:19.978 [INFO][16194] ipam.go 381: Trying affinity for 10.2.76.0/24 handle="k8s-pod-network.9a9088779553e28609509a6a1d3c9dea76aa8e29f6ded525c7b13283e4c1a8e2" host="suraj-lk-cluster-pool-1-worker-2"
2020-05-05 07:46:19.982 [INFO][16194] ipam.go 135: Attempting to load block cidr=10.2.76.0/24 host="suraj-lk-cluster-pool-1-worker-2"
2020-05-05 07:46:19.987 [INFO][16194] ipam.go 212: Affinity is confirmed and block has been loaded cidr=10.2.76.0/24 host="suraj-lk-cluster-pool-1-worker-2"
2020-05-05 07:46:19.987 [INFO][16194] ipam.go 804: Attempting to assign 1 addresses from block block=10.2.76.0/24 handle="k8s-pod-network.9a9088779553e28609509a6a1d3c9dea76aa8e29f6ded525c7b13283e4c1a8e2" host="suraj-lk-cluster-pool-1-worker-2"
2020-05-05 07:46:19.990 [INFO][16194] ipam.go 1265: Creating new handle: k8s-pod-network.9a9088779553e28609509a6a1d3c9dea76aa8e29f6ded525c7b13283e4c1a8e2
2020-05-05 07:46:19.997 [INFO][16194] ipam.go 827: Writing block in order to claim IPs block=10.2.76.0/24 handle="k8s-pod-network.9a9088779553e28609509a6a1d3c9dea76aa8e29f6ded525c7b13283e4c1a8e2" host="suraj-lk-cluster-pool-1-worker-2"
2020-05-05 07:46:20.007 [INFO][16194] ipam.go 840: Successfully claimed IPs: [10.2.76.1/24] block=10.2.76.0/24 handle="k8s-pod-network.9a9088779553e28609509a6a1d3c9dea76aa8e29f6ded525c7b13283e4c1a8e2" host="suraj-lk-cluster-pool-1-worker-2"
2020-05-05 07:46:20.007 [INFO][16194] ipam.go 413: Block '10.2.76.0/24' provided addresses: [10.2.76.1/24] handle="k8s-pod-network.9a9088779553e28609509a6a1d3c9dea76aa8e29f6ded525c7b13283e4c1a8e2" host="suraj-lk-cluster-pool-1-worker-2"
2020-05-05 07:46:20.012 [INFO][16194] ipam.go 569: Auto-assigned 1 out of 1 IPv4s: [10.2.76.1/24] handle="k8s-pod-network.9a9088779553e28609509a6a1d3c9dea76aa8e29f6ded525c7b13283e4c1a8e2" host="suraj-lk-cluster-pool-1-worker-2"
2020-05-05 07:46:20.012 [INFO][16194] ipam_plugin.go 235: Calico CNI IPAM assigned addresses IPv4=[10.2.76.1/24] IPv6=[] ContainerID="9a9088779553e28609509a6a1d3c9dea76aa8e29f6ded525c7b13283e4c1a8e2" HandleID="k8s-pod-network.9a9088779553e28609509a6a1d3c9dea76aa8e29f6ded525c7b13283e4c1a8e2" Workload="suraj--lk--cluster--pool--1--worker--2-k8s-rook--discover--6pkd6-eth0"
2020-05-05 07:46:20.012 [INFO][16194] ipam_plugin.go 261: IPAM Result ContainerID="9a9088779553e28609509a6a1d3c9dea76aa8e29f6ded525c7b13283e4c1a8e2" HandleID="k8s-pod-network.9a9088779553e28609509a6a1d3c9dea76aa8e29f6ded525c7b13283e4c1a8e2" Workload="suraj--lk--cluster--pool--1--worker--2-k8s-rook--discover--6pkd6-eth0" result.IPs=[]*current.IPConfig{(*current.IPConfig)(0xc0001086c0)}
2020-05-05 07:46:20.013 [INFO][16178] k8s.go 358: Populated endpoint ContainerID="9a9088779553e28609509a6a1d3c9dea76aa8e29f6ded525c7b13283e4c1a8e2" Namespace="rook" Pod="rook-discover-6pkd6" WorkloadEndpoint="suraj--lk--cluster--pool--1--worker--2-k8s-rook--discover--6pkd6-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"suraj--lk--cluster--pool--1--worker--2-k8s-rook--discover--6pkd6-eth0", GenerateName:"rook-discover-", Namespace:"rook", SelfLink:"", UID:"c02b25b1-fc55-4507-b5f7-9cf3e4b370d3", ResourceVersion:"7831", Generation:0, CreationTimestamp:v1.Time{Time:time.Time{wall:0x0, ext:63724261579, loc:(*time.Location)(0x299c340)}}, DeletionTimestamp:(*v1.Time)(nil), DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app":"rook-discover", "controller-revision-hash":"b4d97f4cc", "pod-template-generation":"1", "projectcalico.org/namespace":"rook", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"rook-ceph-system"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ClusterName:"", ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"suraj-lk-cluster-pool-1-worker-2", ContainerID:"", Pod:"rook-discover-6pkd6", Endpoint:"eth0", IPNetworks:[]string{"10.2.76.1/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.rook", "ksa.rook.rook-ceph-system"}, InterfaceName:"calid4a16cc8d5b", MAC:"", Ports:[]v3.EndpointPort(nil)}}
2020-05-05 07:46:20.013 [INFO][16178] k8s.go 359: Calico CNI using IPs: [10.2.76.1/32] ContainerID="9a9088779553e28609509a6a1d3c9dea76aa8e29f6ded525c7b13283e4c1a8e2" Namespace="rook" Pod="rook-discover-6pkd6" WorkloadEndpoint="suraj--lk--cluster--pool--1--worker--2-k8s-rook--discover--6pkd6-eth0"
2020-05-05 07:46:20.013 [INFO][16178] network_linux.go 76: Setting the host side veth name to calid4a16cc8d5b ContainerID="9a9088779553e28609509a6a1d3c9dea76aa8e29f6ded525c7b13283e4c1a8e2" Namespace="rook" Pod="rook-discover-6pkd6" WorkloadEndpoint="suraj--lk--cluster--pool--1--worker--2-k8s-rook--discover--6pkd6-eth0"
2020-05-05 07:46:20.014 [INFO][16178] network_linux.go 396: Disabling IPv4 forwarding ContainerID="9a9088779553e28609509a6a1d3c9dea76aa8e29f6ded525c7b13283e4c1a8e2" Namespace="rook" Pod="rook-discover-6pkd6" WorkloadEndpoint="suraj--lk--cluster--pool--1--worker--2-k8s-rook--discover--6pkd6-eth0"
2020-05-05 07:46:20.027 [INFO][16178] k8s.go 385: Added Mac, interface name, and active container ID to endpoint ContainerID="9a9088779553e28609509a6a1d3c9dea76aa8e29f6ded525c7b13283e4c1a8e2" Namespace="rook" Pod="rook-discover-6pkd6" WorkloadEndpoint="suraj--lk--cluster--pool--1--worker--2-k8s-rook--discover--6pkd6-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"suraj--lk--cluster--pool--1--worker--2-k8s-rook--discover--6pkd6-eth0", GenerateName:"rook-discover-", Namespace:"rook", SelfLink:"", UID:"c02b25b1-fc55-4507-b5f7-9cf3e4b370d3", ResourceVersion:"7831", Generation:0, CreationTimestamp:v1.Time{Time:time.Time{wall:0x0, ext:63724261579, loc:(*time.Location)(0x299c340)}}, DeletionTimestamp:(*v1.Time)(nil), DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app":"rook-discover", "controller-revision-hash":"b4d97f4cc", "pod-template-generation":"1", "projectcalico.org/namespace":"rook", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"rook-ceph-system"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ClusterName:"", ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"suraj-lk-cluster-pool-1-worker-2", ContainerID:"9a9088779553e28609509a6a1d3c9dea76aa8e29f6ded525c7b13283e4c1a8e2", Pod:"rook-discover-6pkd6", Endpoint:"eth0", IPNetworks:[]string{"10.2.76.1/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.rook", "ksa.rook.rook-ceph-system"}, InterfaceName:"calid4a16cc8d5b", MAC:"ea:e6:58:07:63:69", Ports:[]v3.EndpointPort(nil)}}
2020-05-05 07:46:20.038 [INFO][16178] k8s.go 417: Wrote updated endpoint to datastore ContainerID="9a9088779553e28609509a6a1d3c9dea76aa8e29f6ded525c7b13283e4c1a8e2" Namespace="rook" Pod="rook-discover-6pkd6" WorkloadEndpoint="suraj--lk--cluster--pool--1--worker--2-k8s-rook--discover--6pkd6-eth0"
I0505 07:47:23.059859 2827 topology_manager.go:233] [topologymanager] Topology Admit Handler
I0505 07:47:23.128584 2827 reconciler.go:224] operationExecutor.VerifyControllerAttachedVolume started for volume "pods-mount-dir" (UniqueName: "kubernetes.io/host-path/f6fd34c7-7fa9-4e8c-8356-4542c6037b15-pods-mount-dir") pod "csi-rbdplugin-vfj6g" (UID: "f6fd34c7-7fa9-4e8c-8356-4542c6037b15")
I0505 07:47:23.128624 2827 reconciler.go:224] operationExecutor.VerifyControllerAttachedVolume started for volume "lib-modules" (UniqueName: "kubernetes.io/host-path/f6fd34c7-7fa9-4e8c-8356-4542c6037b15-lib-modules") pod "csi-rbdplugin-vfj6g" (UID: "f6fd34c7-7fa9-4e8c-8356-4542c6037b15")
I0505 07:47:23.128676 2827 reconciler.go:224] operationExecutor.VerifyControllerAttachedVolume started for volume "host-run-mount" (UniqueName: "kubernetes.io/host-path/f6fd34c7-7fa9-4e8c-8356-4542c6037b15-host-run-mount") pod "csi-rbdplugin-vfj6g" (UID: "f6fd34c7-7fa9-4e8c-8356-4542c6037b15")
I0505 07:47:23.128766 2827 reconciler.go:224] operationExecutor.VerifyControllerAttachedVolume started for volume "host-dev" (UniqueName: "kubernetes.io/host-path/f6fd34c7-7fa9-4e8c-8356-4542c6037b15-host-dev") pod "csi-rbdplugin-vfj6g" (UID: "f6fd34c7-7fa9-4e8c-8356-4542c6037b15")
I0505 07:47:23.128805 2827 reconciler.go:224] operationExecutor.VerifyControllerAttachedVolume started for volume "plugin-dir" (UniqueName: "kubernetes.io/host-path/f6fd34c7-7fa9-4e8c-8356-4542c6037b15-plugin-dir") pod "csi-rbdplugin-vfj6g" (UID: "f6fd34c7-7fa9-4e8c-8356-4542c6037b15")
I0505 07:47:23.128861 2827 reconciler.go:224] operationExecutor.VerifyControllerAttachedVolume started for volume "host-sys" (UniqueName: "kubernetes.io/host-path/f6fd34c7-7fa9-4e8c-8356-4542c6037b15-host-sys") pod "csi-rbdplugin-vfj6g" (UID: "f6fd34c7-7fa9-4e8c-8356-4542c6037b15")
I0505 07:47:23.128917 2827 reconciler.go:224] operationExecutor.VerifyControllerAttachedVolume started for volume "ceph-csi-config" (UniqueName: "kubernetes.io/configmap/f6fd34c7-7fa9-4e8c-8356-4542c6037b15-ceph-csi-config") pod "csi-rbdplugin-vfj6g" (UID: "f6fd34c7-7fa9-4e8c-8356-4542c6037b15")
I0505 07:47:23.128954 2827 reconciler.go:224] operationExecutor.VerifyControllerAttachedVolume started for volume "rook-csi-rbd-plugin-sa-token-22j52" (UniqueName: "kubernetes.io/secret/f6fd34c7-7fa9-4e8c-8356-4542c6037b15-rook-csi-rbd-plugin-sa-token-22j52") pod "csi-rbdplugin-vfj6g" (UID: "f6fd34c7-7fa9-4e8c-8356-4542c6037b15")
I0505 07:47:23.128980 2827 reconciler.go:224] operationExecutor.VerifyControllerAttachedVolume started for volume "plugin-mount-dir" (UniqueName: "kubernetes.io/host-path/f6fd34c7-7fa9-4e8c-8356-4542c6037b15-plugin-mount-dir") pod "csi-rbdplugin-vfj6g" (UID: "f6fd34c7-7fa9-4e8c-8356-4542c6037b15")
I0505 07:47:23.129000 2827 reconciler.go:224] operationExecutor.VerifyControllerAttachedVolume started for volume "registration-dir" (UniqueName: "kubernetes.io/host-path/f6fd34c7-7fa9-4e8c-8356-4542c6037b15-registration-dir") pod "csi-rbdplugin-vfj6g" (UID: "f6fd34c7-7fa9-4e8c-8356-4542c6037b15")
I0505 07:47:23.129021 2827 reconciler.go:224] operationExecutor.VerifyControllerAttachedVolume started for volume "keys-tmp-dir" (UniqueName: "kubernetes.io/empty-dir/f6fd34c7-7fa9-4e8c-8356-4542c6037b15-keys-tmp-dir") pod "csi-rbdplugin-vfj6g" (UID: "f6fd34c7-7fa9-4e8c-8356-4542c6037b15")
I0505 07:47:23.152136 2827 topology_manager.go:233] [topologymanager] Topology Admit Handler
I0505 07:47:23.229362 2827 reconciler.go:224] operationExecutor.VerifyControllerAttachedVolume started for volume "ceph-csi-config" (UniqueName: "kubernetes.io/configmap/b6c9888f-7e9e-4b90-b1c5-089efc45b5fa-ceph-csi-config") pod "csi-rbdplugin-provisioner-8886bfcfb-t2rln" (UID: "b6c9888f-7e9e-4b90-b1c5-089efc45b5fa")
I0505 07:47:23.229716 2827 reconciler.go:224] operationExecutor.VerifyControllerAttachedVolume started for volume "host-sys" (UniqueName: "kubernetes.io/host-path/b6c9888f-7e9e-4b90-b1c5-089efc45b5fa-host-sys") pod "csi-rbdplugin-provisioner-8886bfcfb-t2rln" (UID: "b6c9888f-7e9e-4b90-b1c5-089efc45b5fa")
I0505 07:47:23.229750 2827 reconciler.go:224] operationExecutor.VerifyControllerAttachedVolume started for volume "socket-dir" (UniqueName: "kubernetes.io/empty-dir/b6c9888f-7e9e-4b90-b1c5-089efc45b5fa-socket-dir") pod "csi-rbdplugin-provisioner-8886bfcfb-t2rln" (UID: "b6c9888f-7e9e-4b90-b1c5-089efc45b5fa")
I0505 07:47:23.229810 2827 reconciler.go:224] operationExecutor.VerifyControllerAttachedVolume started for volume "rook-csi-rbd-provisioner-sa-token-6zqrp" (UniqueName: "kubernetes.io/secret/b6c9888f-7e9e-4b90-b1c5-089efc45b5fa-rook-csi-rbd-provisioner-sa-token-6zqrp") pod "csi-rbdplugin-provisioner-8886bfcfb-t2rln" (UID: "b6c9888f-7e9e-4b90-b1c5-089efc45b5fa")
I0505 07:47:23.229963 2827 reconciler.go:224] operationExecutor.VerifyControllerAttachedVolume started for volume "host-dev" (UniqueName: "kubernetes.io/host-path/b6c9888f-7e9e-4b90-b1c5-089efc45b5fa-host-dev") pod "csi-rbdplugin-provisioner-8886bfcfb-t2rln" (UID: "b6c9888f-7e9e-4b90-b1c5-089efc45b5fa")
I0505 07:47:23.229992 2827 reconciler.go:224] operationExecutor.VerifyControllerAttachedVolume started for volume "lib-modules" (UniqueName: "kubernetes.io/host-path/b6c9888f-7e9e-4b90-b1c5-089efc45b5fa-lib-modules") pod "csi-rbdplugin-provisioner-8886bfcfb-t2rln" (UID: "b6c9888f-7e9e-4b90-b1c5-089efc45b5fa")
I0505 07:47:23.230077 2827 reconciler.go:224] operationExecutor.VerifyControllerAttachedVolume started for volume "keys-tmp-dir" (UniqueName: "kubernetes.io/empty-dir/b6c9888f-7e9e-4b90-b1c5-089efc45b5fa-keys-tmp-dir") pod "csi-rbdplugin-provisioner-8886bfcfb-t2rln" (UID: "b6c9888f-7e9e-4b90-b1c5-089efc45b5fa")
E0505 07:47:23.434039 2827 kuberuntime_manager.go:937] PodSandboxStatus of sandbox "1e7568276036f1b08be069df7853f543296b7898dcab9e73c1b1bf921175177f" for pod "csi-rbdplugin-vfj6g_rook(f6fd34c7-7fa9-4e8c-8356-4542c6037b15)" error: rpc error: code = Unknown desc = Error: No such container: 1e7568276036f1b08be069df7853f543296b7898dcab9e73c1b1bf921175177f
I0505 07:47:23.682923 2827 topology_manager.go:233] [topologymanager] Topology Admit Handler
I0505 07:47:23.717960 2827 topology_manager.go:233] [topologymanager] Topology Admit Handler
I0505 07:47:23.832597 2827 reconciler.go:224] operationExecutor.VerifyControllerAttachedVolume started for volume "rook-csi-cephfs-plugin-sa-token-qthf8" (UniqueName: "kubernetes.io/secret/f9fd357f-fdaa-410c-8b5c-9a44feb0bb75-rook-csi-cephfs-plugin-sa-token-qthf8") pod "csi-cephfsplugin-m52rm" (UID: "f9fd357f-fdaa-410c-8b5c-9a44feb0bb75")
I0505 07:47:23.832667 2827 reconciler.go:224] operationExecutor.VerifyControllerAttachedVolume started for volume "csi-plugins-dir" (UniqueName: "kubernetes.io/host-path/f9fd357f-fdaa-410c-8b5c-9a44feb0bb75-csi-plugins-dir") pod "csi-cephfsplugin-m52rm" (UID: "f9fd357f-fdaa-410c-8b5c-9a44feb0bb75")
I0505 07:47:23.832761 2827 reconciler.go:224] operationExecutor.VerifyControllerAttachedVolume started for volume "host-run-mount" (UniqueName: "kubernetes.io/host-path/f9fd357f-fdaa-410c-8b5c-9a44feb0bb75-host-run-mount") pod "csi-cephfsplugin-m52rm" (UID: "f9fd357f-fdaa-410c-8b5c-9a44feb0bb75")
I0505 07:47:23.832809 2827 reconciler.go:224] operationExecutor.VerifyControllerAttachedVolume started for volume "socket-dir" (UniqueName: "kubernetes.io/empty-dir/e03c25a1-e894-4016-b65d-5a928e97376c-socket-dir") pod "csi-cephfsplugin-provisioner-6944f4676d-l6jzs" (UID: "e03c25a1-e894-4016-b65d-5a928e97376c")
I0505 07:47:23.832833 2827 reconciler.go:224] operationExecutor.VerifyControllerAttachedVolume started for volume "lib-modules" (UniqueName: "kubernetes.io/host-path/f9fd357f-fdaa-410c-8b5c-9a44feb0bb75-lib-modules") pod "csi-cephfsplugin-m52rm" (UID: "f9fd357f-fdaa-410c-8b5c-9a44feb0bb75")
I0505 07:47:23.832856 2827 reconciler.go:224] operationExecutor.VerifyControllerAttachedVolume started for volume "host-sys" (UniqueName: "kubernetes.io/host-path/f9fd357f-fdaa-410c-8b5c-9a44feb0bb75-host-sys") pod "csi-cephfsplugin-m52rm" (UID: "f9fd357f-fdaa-410c-8b5c-9a44feb0bb75")
I0505 07:47:23.832881 2827 reconciler.go:224] operationExecutor.VerifyControllerAttachedVolume started for volume "plugin-dir" (UniqueName: "kubernetes.io/host-path/f9fd357f-fdaa-410c-8b5c-9a44feb0bb75-plugin-dir") pod "csi-cephfsplugin-m52rm" (UID: "f9fd357f-fdaa-410c-8b5c-9a44feb0bb75")
I0505 07:47:23.833217 2827 reconciler.go:224] operationExecutor.VerifyControllerAttachedVolume started for volume "host-sys" (UniqueName: "kubernetes.io/host-path/e03c25a1-e894-4016-b65d-5a928e97376c-host-sys") pod "csi-cephfsplugin-provisioner-6944f4676d-l6jzs" (UID: "e03c25a1-e894-4016-b65d-5a928e97376c")
I0505 07:47:23.833290 2827 reconciler.go:224] operationExecutor.VerifyControllerAttachedVolume started for volume "host-dev" (UniqueName: "kubernetes.io/host-path/f9fd357f-fdaa-410c-8b5c-9a44feb0bb75-host-dev") pod "csi-cephfsplugin-m52rm" (UID: "f9fd357f-fdaa-410c-8b5c-9a44feb0bb75")
I0505 07:47:23.833344 2827 reconciler.go:224] operationExecutor.VerifyControllerAttachedVolume started for volume "ceph-csi-config" (UniqueName: "kubernetes.io/configmap/e03c25a1-e894-4016-b65d-5a928e97376c-ceph-csi-config") pod "csi-cephfsplugin-provisioner-6944f4676d-l6jzs" (UID: "e03c25a1-e894-4016-b65d-5a928e97376c")
I0505 07:47:23.833391 2827 reconciler.go:224] operationExecutor.VerifyControllerAttachedVolume started for volume "registration-dir" (UniqueName: "kubernetes.io/host-path/f9fd357f-fdaa-410c-8b5c-9a44feb0bb75-registration-dir") pod "csi-cephfsplugin-m52rm" (UID: "f9fd357f-fdaa-410c-8b5c-9a44feb0bb75")
I0505 07:47:23.833445 2827 reconciler.go:224] operationExecutor.VerifyControllerAttachedVolume started for volume "pods-mount-dir" (UniqueName: "kubernetes.io/host-path/f9fd357f-fdaa-410c-8b5c-9a44feb0bb75-pods-mount-dir") pod "csi-cephfsplugin-m52rm" (UID: "f9fd357f-fdaa-410c-8b5c-9a44feb0bb75")
I0505 07:47:23.833496 2827 reconciler.go:224] operationExecutor.VerifyControllerAttachedVolume started for volume "keys-tmp-dir" (UniqueName: "kubernetes.io/empty-dir/f9fd357f-fdaa-410c-8b5c-9a44feb0bb75-keys-tmp-dir") pod "csi-cephfsplugin-m52rm" (UID: "f9fd357f-fdaa-410c-8b5c-9a44feb0bb75")
I0505 07:47:23.833542 2827 reconciler.go:224] operationExecutor.VerifyControllerAttachedVolume started for volume "mount-cache-dir" (UniqueName: "kubernetes.io/empty-dir/f9fd357f-fdaa-410c-8b5c-9a44feb0bb75-mount-cache-dir") pod "csi-cephfsplugin-m52rm" (UID: "f9fd357f-fdaa-410c-8b5c-9a44feb0bb75")
I0505 07:47:23.833609 2827 reconciler.go:224] operationExecutor.VerifyControllerAttachedVolume started for volume "host-dev" (UniqueName: "kubernetes.io/host-path/e03c25a1-e894-4016-b65d-5a928e97376c-host-dev") pod "csi-cephfsplugin-provisioner-6944f4676d-l6jzs" (UID: "e03c25a1-e894-4016-b65d-5a928e97376c")
I0505 07:47:23.833693 2827 reconciler.go:224] operationExecutor.VerifyControllerAttachedVolume started for volume "ceph-csi-config" (UniqueName: "kubernetes.io/configmap/f9fd357f-fdaa-410c-8b5c-9a44feb0bb75-ceph-csi-config") pod "csi-cephfsplugin-m52rm" (UID: "f9fd357f-fdaa-410c-8b5c-9a44feb0bb75")
I0505 07:47:23.833737 2827 reconciler.go:224] operationExecutor.VerifyControllerAttachedVolume started for volume "rook-csi-cephfs-provisioner-sa-token-6zxss" (UniqueName: "kubernetes.io/secret/e03c25a1-e894-4016-b65d-5a928e97376c-rook-csi-cephfs-provisioner-sa-token-6zxss") pod "csi-cephfsplugin-provisioner-6944f4676d-l6jzs" (UID: "e03c25a1-e894-4016-b65d-5a928e97376c")
I0505 07:47:23.833766 2827 reconciler.go:224] operationExecutor.VerifyControllerAttachedVolume started for volume "lib-modules" (UniqueName: "kubernetes.io/host-path/e03c25a1-e894-4016-b65d-5a928e97376c-lib-modules") pod "csi-cephfsplugin-provisioner-6944f4676d-l6jzs" (UID: "e03c25a1-e894-4016-b65d-5a928e97376c")
I0505 07:47:23.833804 2827 reconciler.go:224] operationExecutor.VerifyControllerAttachedVolume started for volume "keys-tmp-dir" (UniqueName: "kubernetes.io/empty-dir/e03c25a1-e894-4016-b65d-5a928e97376c-keys-tmp-dir") pod "csi-cephfsplugin-provisioner-6944f4676d-l6jzs" (UID: "e03c25a1-e894-4016-b65d-5a928e97376c")
2020-05-05 07:47:24.037 [INFO][18278] plugin.go 129: Extracted identifiers EndpointIDs=&utils.WEPIdentifiers{Namespace:"rook", WEPName:"", WorkloadEndpointIdentifiers:names.WorkloadEndpointIdentifiers{Node:"suraj-lk-cluster-pool-1-worker-2", Orchestrator:"k8s", Endpoint:"eth0", Workload:"", Pod:"csi-rbdplugin-provisioner-8886bfcfb-t2rln", ContainerID:"faf097e583877b92cdce6da1fb86b1ed518e9c488474a50f7719e6f0ff907f5d"}}
2020-05-05 07:47:24.078 [INFO][18278] plugin.go 220: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {suraj--lk--cluster--pool--1--worker--2-k8s-csi--rbdplugin--provisioner--8886bfcfb--t2rln-eth0 csi-rbdplugin-provisioner-8886bfcfb- rook b6c9888f-7e9e-4b90-b1c5-089efc45b5fa 8155 0 2020-05-05 07:47:23 +0000 UTC <nil> <nil> map[app:csi-rbdplugin-provisioner contains:csi-rbdplugin-metrics pod-template-hash:8886bfcfb projectcalico.org/namespace:rook projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:rook-csi-rbd-provisioner-sa] map[] [] [] []} {k8s suraj-lk-cluster-pool-1-worker-2 csi-rbdplugin-provisioner-8886bfcfb-t2rln eth0 [] [] [kns.rook ksa.rook.rook-csi-rbd-provisioner-sa] cali1a917eefeff []}} ContainerID="faf097e583877b92cdce6da1fb86b1ed518e9c488474a50f7719e6f0ff907f5d" Namespace="rook" Pod="csi-rbdplugin-provisioner-8886bfcfb-t2rln" WorkloadEndpoint="suraj--lk--cluster--pool--1--worker--2-k8s-csi--rbdplugin--provisioner--8886bfcfb--t2rln-"
2020-05-05 07:47:24.078 [INFO][18278] k8s.go 60: Extracted identifiers for CmdAddK8s ContainerID="faf097e583877b92cdce6da1fb86b1ed518e9c488474a50f7719e6f0ff907f5d" Namespace="rook" Pod="csi-rbdplugin-provisioner-8886bfcfb-t2rln" WorkloadEndpoint="suraj--lk--cluster--pool--1--worker--2-k8s-csi--rbdplugin--provisioner--8886bfcfb--t2rln-eth0"
2020-05-05 07:47:24.084 [INFO][18278] k8s.go 775: namespace info &Namespace{ObjectMeta:{rook /api/v1/namespaces/rook 346082b0-52fc-4520-8796-5afff38e4f6c 7478 0 2020-05-05 07:45:13 +0000 UTC <nil> <nil> map[] map[] [] [] [{lokoctl Update v1 2020-05-05 07:45:13 +0000 UTC FieldsV1 FieldsV1{Raw:*[123 34 102 58 115 116 97 116 117 115 34 58 123 34 102 58 112 104 97 115 101 34 58 123 125 125 125],}}]},Spec:NamespaceSpec{Finalizers:[kubernetes],},Status:NamespaceStatus{Phase:Active,Conditions:[]NamespaceCondition{},},}
2020-05-05 07:47:24.090 [INFO][18278] k8s.go 784: pod info &Pod{ObjectMeta:{csi-rbdplugin-provisioner-8886bfcfb-t2rln csi-rbdplugin-provisioner-8886bfcfb- rook /api/v1/namespaces/rook/pods/csi-rbdplugin-provisioner-8886bfcfb-t2rln b6c9888f-7e9e-4b90-b1c5-089efc45b5fa 8155 0 2020-05-05 07:47:23 +0000 UTC <nil> <nil> map[app:csi-rbdplugin-provisioner contains:csi-rbdplugin-metrics pod-template-hash:8886bfcfb] map[kubernetes.io/psp:00-rook-ceph-operator seccomp.security.alpha.kubernetes.io/pod:runtime/default] [{apps/v1 ReplicaSet csi-rbdplugin-provisioner-8886bfcfb 59643314-d016-443e-a5b6-bbd861fad0c7 0xc00044e1ba 0xc00044e1bb}] [] [{kube-controller-manager Update v1 2020-05-05 07:47:23 +0000 UTC FieldsV1 FieldsV1{Raw:*[123 34 102 58 109 101 116 97 100 97 116 97 34 58 123 34 102 58 103 101 110 101 114 97 116 101 78 97 109 101 34 58 123 125 44 34 102 58 108 97 98 101 108 115 34 58 123 34 46 34 58 123 125 44 34 102 58 97 112 112 34 58 123 125 44 34 102 58 99 111 110 116 97 105 110 115 34 58 123 125 44 34 102 58 112 111 100 45 116 101 109 112 108 97 116 101 45 104 97 115 104 34 58 123 125 125 44 34 102 58 111 119 110 101 114 82 101 102 101 114 101 110 99 101 115 34 58 123 34 46 34 58 123 125 44 34 107 58 123 92 34 117 105 100 92 34 58 92 34 53 57 54 52 51 51 49 52 45 100 48 49 54 45 52 52 51 101 45 97 53 98 54 45 98 98 100 56 54 49 102 97 100 48 99 55 92 34 125 34 58 123 34 46 34 58 123 125 44 34 102 58 97 112 105 86 101 114 115 105 111 110 34 58 123 125 44 34 102 58 98 108 111 99 107 79 119 110 101 114 68 101 108 101 116 105 111 110 34 58 123 125 44 34 102 58 99 111 110 116 114 111 108 108 101 114 34 58 123 125 44 34 102 58 107 105 110 100 34 58 123 125 44 34 102 58 110 97 109 101 34 58 123 125 44 34 102 58 117 105 100 34 58 123 125 125 125 125 44 34 102 58 115 112 101 99 34 58 123 34 102 58 97 102 102 105 110 105 116 121 34 58 123 34 46 34 58 123 125 44 34 102 58 110 111 100 101 65 102 102 105 110 105 116 121 34 58 123 34 46 34 58 123 125 44 34 102 58 114 101 113 117 
105 114 101 100 68 117 114 105 110 103 83 99 104 101 100 117 108 105 110 103 73 103 110 111 114 101 100 68 117 114 105 110 103 69 120 101 99 117 116 105 111 110 34 58 123 34 46 34 58 123 125 44 34 102 58 110 111 100 101 83 101 108 101 99 116 111 114 84 101 114 109 115 34 58 123 125 125 125 125 44 34 102 58 99 111 110 116 97 105 110 101 114 115 34 58 123 34 107 58 123 92 34 110 97 109 101 92 34 58 92 34 99 115 105 45 112 114 111 118 105 115 105 111 110 101 114 92 34 125 34 58 123 34 46 34 58 123 125 44 34 102 58 97 114 103 115 34 58 123 125 44 34 102 58 101 110 118 34 58 123 34 46 34 58 123 125 44 34 107 58 123 92 34 110 97 109 101 92 34 58 92 34 65 68 68 82 69 83 83 92 34 125 34 58 123 34 46 34 58 123 125 44 34 102 58 110 97 109 101 34 58 123 125 44 34 102 58 118 97 108 117 101 34 58 123 125 125 125 44 34 102 58 105 109 97 103 101 34 58 123 125 44 34 102 58 105 109 97 103 101 80 117 108 108 80 111 108 105 99 121 34 58 123 125 44 34 102 58 110 97 109 101 34 58 123 125 44 34 102 58 114 101 115 111 117 114 99 101 115 34 58 123 125 44 34 102 58 116 101 114 109 105 110 97 116 105 111 110 77 101 115 115 97 103 101 80 97 116 104 34 58 123 125 44 34 102 58 116 101 114 109 105 110 97 116 105 111 110 77 101 115 115 97 103 101 80 111 108 105 99 121 34 58 123 125 44 34 102 58 118 111 108 117 109 101 77 111 117 110 116 115 34 58 123 34 46 34 58 123 125 44 34 107 58 123 92 34 109 111 117 110 116 80 97 116 104 92 34 58 92 34 47 99 115 105 92 34 125 34 58 123 34 46 34 58 123 125 44 34 102 58 109 111 117 110 116 80 97 116 104 34 58 123 125 44 34 102 58 110 97 109 101 34 58 123 125 125 125 125 44 34 107 58 123 92 34 110 97 109 101 92 34 58 92 34 99 115 105 45 114 98 100 112 108 117 103 105 110 92 34 125 34 58 123 34 46 34 58 123 125 44 34 102 58 97 114 103 115 34 58 123 125 44 34 102 58 101 110 118 34 58 123 34 46 34 58 123 125 44 34 107 58 123 92 34 110 97 109 101 92 34 58 92 34 67 83 73 95 69 78 68 80 79 73 78 84 92 34 125 34 58 123 34 46 34 58 123 125 44 34 102 58 110 97 109 101 
34 58 123 125 44 34 102 58 118 97 108 117 101 34 58 123 125 125 44 34 107 58 123 92 34 110 97 109 101 92 34 58 92 34 78 79 68 69 95 73 68 92 34 125 34 58 123 34 46 34 58 123 125 44 34 102 58 110 97 109 101 34 58 123 125 44 34 102 58 118 97 108 117 101 70 114 111 109 34 58 123 34 46 34 58 123 125 44 34 102 58 102 105 101 108 100 82 101 102 34 58 123 34 46 34 58 123 125 44 34 102 58 97 112 105 86 101 114 115 105 111 110 34 58 123 125 44 34 102 58 102 105 101 108 100 80 97 116 104 34 58 123 125 125 125 125 44 34 107 58 123 92 34 110 97 109 101 92 34 58 92 34 80 79 68 95 73 80 92 34 125 34 58 123 34 46 34 58 123 125 44 34 102 58 110 97 109 101 34 58 123 125 44 34 102 58 118 97 108 117 101 70 114 111 109 34 58 123 34 46 34 58 123 125 44 34 102 58 102 105 101 108 100 82 101 102 34 58 123 34 46 34 58 123 125 44 34 102 58 97 112 105 86 101 114 115 105 111 110 34 58 123 125 44 34 102 58 102 105 101 108 100 80 97 116 104 34 58 123 125 125 125 125 44 34 107 58 123 92 34 110 97 109 101 92 34 58 92 34 80 79 68 95 78 65 77 69 83 80 65 67 69 92 34 125 34 58 123 34 46 34 58 123 125 44 34 102 58 110 97 109 101 34 58 123 125 44 34 102 58 118 97 108 117 101 70 114 111 109 34 58 123 34 46 34 58 123 125 44 34 102 58 102 105 101 108 100 82 101 102 34 58 123 34 46 34 58 123 125 44 34 102 58 97 112 105 86 101 114 115 105 111 110 34 58 123 125 44 34 102 58 102 105 101 108 100 80 97 116 104 34 58 123 125 125 125 125 125 44 34 102 58 105 109 97 103 101 34 58 123 125 44 34 102 58 105 109 97 103 101 80 117 108 108 80 111 108 105 99 121 34 58 123 125 44 34 102 58 110 97 109 101 34 58 123 125 44 34 102 58 114 101 115 111 117 114 99 101 115 34 58 123 125 44 34 102 58 116 101 114 109 105 110 97 116 105 111 110 77 101 115 115 97 103 101 80 97 116 104 34 58 123 125 44 34 102 58 116 101 114 109 105 110 97 116 105 111 110 77 101 115 115 97 103 101 80 111 108 105 99 121 34 58 123 125 44 34 102 58 118 111 108 117 109 101 77 111 117 110 116 115 34 58 123 34 46 34 58 123 125 44 34 107 58 123 92 34 109 111 
117 110 116 80 97 116 104 92 34 58 92 34 47 99 115 105 92 34 125 34 58 123 34 46 34 58 123 125 44 34 102 58 109 111 117 110 116 80 97 116 104 34 58 123 125 44 34 102 58 110 97 109 101 34 58 123 125 125 44 34 107 58 123 92 34 109 111 117 110 116 80 97 116 104 92 34 58 92 34 47 100 101 118 92 34 125 34 58 123 34 46 34 58 123 125 44 34 102 58 109 111 117 110 116 80 97 116 104 34 58 123 125 44 34 102 58 110 97 109 101 34 58 123 125 125 44 34 107 58 123 92 34 109 111 117 110 116 80 97 116 104 92 34 58 92 34 47 101 116 99 47 99 101 112 104 45 99 115 105 45 99 111 110 102 105 103 47 92 34 125 34 58 123 34 46 34 58 123 125 44 34 102 58 109 111 117 110 116 80 97 116 104 34 58 123 125 44 34 102 58 110 97 109 101 34 58 123 125 125 44 34 107 58 123 92 34 109 111 117 110 116 80 97 116 104 92 34 58 92 34 47 108 105 98 47 109 111 100 117 108 101 115 92 34 125 34 58 123 34 46 34 58 123 125 44 34 102 58 109 111 117 110 116 80 97 116 104 34 58 123 125 44 34 102 58 110 97 109 101 34 58 123 125 44 34 102 58 114 101 97 100 79 110 108 121 34 58 123 125 125 44 34 107 58 123 92 34 109 111 117 110 116 80 97 116 104 92 34 58 92 34 47 115 121 115 92 34 125 34 58 123 34 46 34 58 123 125 44 34 102 58 109 111 117 110 116 80 97 116 104 34 58 123 125 44 34 102 58 110 97 109 101 34 58 123 125 125 44 34 107 58 123 92 34 109 111 117 110 116 80 97 116 104 92 34 58 92 34 47 116 109 112 47 99 115 105 47 107 101 121 115 92 34 125 34 58 123 34 46 34 58 123 125 44 34 102 58 109 111 117 110 116 80 97 116 104 34 58 123 125 44 34 102 58 110 97 109 101 34 58 123 125 125 125 125 44 34 107 58 123 92 34 110 97 109 101 92 34 58 92 34 99 115 105 45 114 98 100 112 108 117 103 105 110 45 97 116 116 97 99 104 101 114 92 34 125 34 58 123 34 46 34 58 123 125 44 34 102 58 97 114 103 115 34 58 123 125 44 34 102 58 101 110 118 34 58 123 34 46 34 58 123 125 44 34 107 58 123 92 34 110 97 109 101 92 34 58 92 34 65 68 68 82 69 83 83 92 34 125 34 58 123 34 46 34 58 123 125 44 34 102 58 110 97 109 101 34 58 123 125 44 34 102 58 
118 97 108 117 101 34 58 123 125 125 125 44 34 102 58 105 109 97 103 101 34 58 123 125 44 34 102 58 105 109 97 103 101 80 117 108 108 80 111 108 105 99 121 34 58 123 125 44 34 102 58 110 97 109 101 34 58 123 125 44 34 102 58 114 101 115 111 117 114 99 101 115 34 58 123 125 44 34 102 58 116 101 114 109 105 110 97 116 105 111 110 77 101 115 115 97 103 101 80 97 116 104 34 58 123 125 44 34 102 58 116 101 114 109 105 110 97 116 105 111 110 77 101 115 115 97 103 101 80 111 108 105 99 121 34 58 123 125 44 34 102 58 118 111 108 117 109 101 77 111 117 110 116 115 34 58 123 34 46 34 58 123 125 44 34 107 58 123 92 34 109 111 117 110 116 80 97 116 104 92 34 58 92 34 47 99 115 105 92 34 125 34 58 123 34 46 34 58 123 125 44 34 102 58 109 111 117 110 116 80 97 116 104 34 58 123 125 44 34 102 58 110 97 109 101 34 58 123 125 125 125 125 44 34 107 58 123 92 34 110 97 109 101 92 34 58 92 34 99 115 105 45 114 101 115 105 122 101 114 92 34 125 34 58 123 34 46 34 58 123 125 44 34 102 58 97 114 103 115 34 58 123 125 44 34 102 58 101 110 118 34 58 123 34 46 34 58 123 125 44 34 107 58 123 92 34 110 97 109 101 92 34 58 92 34 65 68 68 82 69 83 83 92 34 125 34 58 123 34 46 34 58 123 125 44 34 102 58 110 97 109 101 34 58 123 125 44 34 102 58 118 97 108 117 101 34 58 123 125 125 125 44 34 102 58 105 109 97 103 101 34 58 123 125 44 34 102 58 105 109 97 103 101 80 117 108 108 80 111 108 105 99 121 34 58 123 125 44 34 102 58 110 97 109 101 34 58 123 125 44 34 102 58 114 101 115 111 117 114 99 101 115 34 58 123 125 44 34 102 58 116 101 114 109 105 110 97 116 105 111 110 77 101 115 115 97 103 101 80 97 116 104 34 58 123 125 44 34 102 58 116 101 114 109 105 110 97 116 105 111 110 77 101 115 115 97 103 101 80 111 108 105 99 121 34 58 123 125 44 34 102 58 118 111 108 117 109 101 77 111 117 110 116 115 34 58 123 34 46 34 58 123 125 44 34 107 58 123 92 34 109 111 117 110 116 80 97 116 104 92 34 58 92 34 47 99 115 105 92 34 125 34 58 123 34 46 34 58 123 125 44 34 102 58 109 111 117 110 116 80 97 116 104 
34 58 123 125 44 34 102 58 110 97 109 101 34 58 123 125 125 125 125 44 34 107 58 123 92 34 110 97 109 101 92 34 58 92 34 99 115 105 45 115 110 97 112 115 104 111 116 116 101 114 92 34 125 34 58 123 34 46 34 58 123 125 44 34 102 58 97 114 103 115 34 58 123 125 44 34 102 58 101 110 118 34 58 123 34 46 34 58 123 125 44 34 107 58 123 92 34 110 97 109 101 92 34 58 92 34 65 68 68 82 69 83 83 92 34 125 34 58 123 34 46 34 58 123 125 44 34 102 58 110 97 109 101 34 58 123 125 44 34 102 58 118 97 108 117 101 34 58 123 125 125 125 44 34 102 58 105 109 97 103 101 34 58 123 125 44 34 102 58 105 109 97 103 101 80 117 108 108 80 111 108 105 99 121 34 58 123 125 44 34 102 58 110 97 109 101 34 58 123 125 44 34 102 58 114 101 115 111 117 114 99 101 115 34 58 123 125 44 34 102 58 116 101 114 109 105 110 97 116 105 111 110 77 101 115 115 97 103 101 80 97 116 104 34 58 123 125 44 34 102 58 116 101 114 109 105 110 97 116 105 111 110 77 101 115 115 97 103 101 80 111 108 105 99 121 34 58 123 125 44 34 102 58 118 111 108 117 109 101 77 111 117 110 116 115 34 58 123 34 46 34 58 123 125 44 34 107 58 123 92 34 109 111 117 110 116 80 97 116 104 92 34 58 92 34 47 99 115 105 92 34 125 34 58 123 34 46 34 58 123 125 44 34 102 58 109 111 117 110 116 80 97 116 104 34 58 123 125 44 34 102 58 110 97 109 101 34 58 123 125 125 125 125 44 34 107 58 123 92 34 110 97 109 101 92 34 58 92 34 108 105 118 101 110 101 115 115 45 112 114 111 109 101 116 104 101 117 115 92 34 125 34 58 123 34 46 34 58 123 125 44 34 102 58 97 114 103 115 34 58 123 125 44 34 102 58 101 110 118 34 58 123 34 46 34 58 123 125 44 34 107 58 123 92 34 110 97 109 101 92 34 58 92 34 67 83 73 95 69 78 68 80 79 73 78 84 92 34 125 34 58 123 34 46 34 58 123 125 44 34 102 58 110 97 109 101 34 58 123 125 44 34 102 58 118 97 108 117 101 34 58 123 125 125 44 34 107 58 123 92 34 110 97 109 101 92 34 58 92 34 80 79 68 95 73 80 92 34 125 34 58 123 34 46 34 58 123 125 44 34 102 58 110 97 109 101 34 58 123 125 44 34 102 58 118 97 108 117 101 70 114 111 
109 34 58 123 34 46 34 58 123 125 44 34 102 58 102 105 101 108 100 82 101 102 34 58 123 34 46 34 58 123 125 44 34 102 58 97 112 105 86 101 114 115 105 111 110 34 58 123 125 44 34 102 58 102 105 101 108 100 80 97 116 104 34 58 123 125 125 125 125 125 44 34 102 58 105 109 97 103 101 34 58 123 125 44 34 102 58 105 109 97 103 101 80 117 108 108 80 111 108 105 99 121 34 58 123 125 44 34 102 58 110 97 109 101 34 58 123 125 44 34 102 58 114 101 115 111 117 114 99 101 115 34 58 123 125 44 34 102 58 116 101 114 109 105 110 97 116 105 111 110 77 101 115 115 97 103 101 80 97 116 104 34 58 123 125 44 34 102 58 116 101 114 109 105 110 97 116 105 111 110 77 101 115 115 97 103 101 80 111 108 105 99 121 34 58 123 125 44 34 102 58 118 111 108 117 109 101 77 111 117 110 116 115 34 58 123 34 46 34 58 123 125 44 34 107 58 123 92 34 109 111 117 110 116 80 97 116 104 92 34 58 92 34 47 99 115 105 92 34 125 34 58 123 34 46 34 58 123 125 44 34 102 58 109 111 117 110 116 80 97 116 104 34 58 123 125 44 34 102 58 110 97 109 101 34 58 123 125 125 125 125 125 44 34 102 58 100 110 115 80 111 108 105 99 121 34 58 123 125 44 34 102 58 101 110 97 98 108 101 83 101 114 118 105 99 101 76 105 110 107 115 34 58 123 125 44 34 102 58 114 101 115 116 97 114 116 80 111 108 105 99 121 34 58 123 125 44 34 102 58 115 99 104 101 100 117 108 101 114 78 97 109 101 34 58 123 125 44 34 102 58 115 101 99 117 114 105 116 121 67 111 110 116 101 120 116 34 58 123 125 44 34 102 58 115 101 114 118 105 99 101 65 99 99 111 117 110 116 34 58 123 125 44 34 102 58 115 101 114 118 105 99 101 65 99 99 111 117 110 116 78 97 109 101 34 58 123 125 44 34 102 58 116 101 114 109 105 110 97 116 105 111 110 71 114 97 99 101 80 101 114 105 111 100 83 101 99 111 110 100 115 34 58 123 125 44 34 102 58 116 111 108 101 114 97 116 105 111 110 115 34 58 123 125 44 34 102 58 118 111 108 117 109 101 115 34 58 123 34 46 34 58 123 125 44 34 107 58 123 92 34 110 97 109 101 92 34 58 92 34 99 101 112 104 45 99 115 105 45 99 111 110 102 105 103 92 
34 125 34 58 123 34 46 34 58 123 125 44 34 102 58 99 111 110 102 105 103 77 97 112 34 58 123 34 46 34 58 123 125 44 34 102 58 100 101 102 97 117 108 116 77 111 100 101 34 58 123 125 44 34 102 58 105 116 101 109 115 34 58 123 125 44 34 102 58 110 97 109 101 34 58 123 125 125 44 34 102 58 110 97 109 101 34 58 123 125 125 44 34 107 58 123 92 34 110 97 109 101 92 34 58 92 34 104 111 115 116 45 100 101 118 92 34 125 34 58 123 34 46 34 58 123 125 44 34 102 58 104 111 115 116 80 97 116 104 34 58 123 34 46 34 58 123 125 44 34 102 58 112 97 116 104 34 58 123 125 44 34 102 58 116 121 112 101 34 58 123 125 125 44 34 102 58 110 97 109 101 34 58 123 125 125 44 34 107 58 123 92 34 110 97 109 101 92 34 58 92 34 104 111 115 116 45 115 121 115 92 34 125 34 58 123 34 46 34 58 123 125 44 34 102 58 104 111 115 116 80 97 116 104 34 58 123 34 46 34 58 123 125 44 34 102 58 112 97 116 104 34 58 123 125 44 34 102 58 116 121 112 101 34 58 123 125 125 44 34 102 58 110 97 109 101 34 58 123 125 125 44 34 107 58 123 92 34 110 97 109 101 92 34 58 92 34 107 101 121 115 45 116 109 112 45 100 105 114 92 34 125 34 58 123 34 46 34 58 123 125 44 34 102 58 101 109 112 116 121 68 105 114 34 58 123 34 46 34 58 123 125 44 34 102 58 109 101 100 105 117 109 34 58 123 125 125 44 34 102 58 110 97 109 101 34 58 123 125 125 44 34 107 58 123 92 34 110 97 109 101 92 34 58 92 34 108 105 98 45 109 111 100 117 108 101 115 92 34 125 34 58 123 34 46 34 58 123 125 44 34 102 58 104 111 115 116 80 97 116 104 34 58 123 34 46 34 58 123 125 44 34 102 58 112 97 116 104 34 58 123 125 44 34 102 58 116 121 112 101 34 58 123 125 125 44 34 102 58 110 97 109 101 34 58 123 125 125 44 34 107 58 123 92 34 110 97 109 101 92 34 58 92 34 115 111 99 107 101 116 45 100 105 114 92 34 125 34 58 123 34 46 34 58 123 125 44 34 102 58 101 109 112 116 121 68 105 114 34 58 123 34 46 34 58 123 125 44 34 102 58 109 101 100 105 117 109 34 58 123 125 125 44 34 102 58 110 97 109 101 34 58 123 125 125 125 125 125],}} {kubelet Update v1 2020-05-05 
07:47:23 +0000 UTC FieldsV1 &FieldsV1{Raw:*[123 34 102 58 115 116 97 116 117 115 34 58 123 34 102 58 99 111 110 100 105 116 105 111 110 115 34 58 123 34 107 58 123 92 34 116 121 112 101 92 34 58 92 34 67 111 110 116 97 105 110 101 114 115 82 101 97 100 121 92 34 125 34 58 123 34 46 34 58 123 125 44 34 102 58 108 97 115 116 80 114 111 98 101 84 105 109 101 34 58 123 125 44 34 102 58 108 97 115 116 84 114 97 110 115 105 116 105 111 110 84 105 109 101 34 58 123 125 44 34 102 58 109 101 115 115 97 103 101 34 58 123 125 44 34 102 58 114 101 97 115 111 110 34 58 123 125 44 34 102 58 115 116 97 116 117 115 34 58 123 125 44 34 102 58 116 121 112 101 34 58 123 125 125 44 34 107 58 123 92 34 116 121 112 101 92 34 58 92 34 73 110 105 116 105 97 108 105 122 101 100 92 34 125 34 58 123 34 46 34 58 123 125 44 34 102 58 108 97 115 116 80 114 111 98 101 84 105 109 101 34 58 123 125 44 34 102 58 108 97 115 116 84 114 97 110 115 105 116 105 111 110 84 105 109 101 34 58 123 125 44 34 102 58 115 116 97 116 117 115 34 58 123 125 44 34 102 58 116 121 112 101 34 58 123 125 125 44 34 107 58 123 92 34 116 121 112 101 92 34 58 92 34 82 101 97 100 121 92 34 125 34 58 123 34 46 34 58 123 125 44 34 102 58 108 97 115 116 80 114 111 98 101 84 105 109 101 34 58 123 125 44 34 102 58 108 97 115 116 84 114 97 110 115 105 116 105 111 110 84 105 109 101 34 58 123 125 44 34 102 58 109 101 115 115 97 103 101 34 58 123 125 44 34 102 58 114 101 97 115 111 110 34 58 123 125 44 34 102 58 115 116 97 116 117 115 34 58 123 125 44 34 102 58 116 121 112 101 34 58 123 125 125 125 44 34 102 58 99 111 110 116 97 105 110 101 114 83 116 97 116 117 115 101 115 34 58 123 125 44 34 102 58 104 111 115 116 73 80 34 58 123 125 44 34 102 58 115 116 97 114 116 84 105 109 101 34 58 123 125 125 
125],}}]},Spec:PodSpec{Volumes:[]Volume{Volume{Name:host-dev,VolumeSource:VolumeSource{HostPath:&HostPathVolumeSource{Path:/dev,Type:*,},EmptyDir:nil,GCEPersistentDisk:nil,AWSElasticBlockStore:nil,GitRepo:nil,Secret:nil,NFS:nil,ISCSI:nil,Glusterfs:nil,PersistentVolumeClaim:nil,RBD:nil,FlexVolume:nil,Cinder:nil,CephFS:nil,Flocker:nil,DownwardAPI:nil,FC:nil,AzureFile:nil,ConfigMap:nil,VsphereVolume:nil,Quobyte:nil,AzureDisk:nil,PhotonPersistentDisk:nil,PortworxVolume:nil,ScaleIO:nil,Projected:nil,StorageOS:nil,CSI:nil,},},Volume{Name:host-sys,VolumeSource:VolumeSource{HostPath:&HostPathVolumeSource{Path:/sys,Type:*,},EmptyDir:nil,GCEPersistentDisk:nil,AWSElasticBlockStore:nil,GitRepo:nil,Secret:nil,NFS:nil,ISCSI:nil,Glusterfs:nil,PersistentVolumeClaim:nil,RBD:nil,FlexVolume:nil,Cinder:nil,CephFS:nil,Flocker:nil,DownwardAPI:nil,FC:nil,AzureFile:nil,ConfigMap:nil,VsphereVolume:nil,Quobyte:nil,AzureDisk:nil,PhotonPersistentDisk:nil,PortworxVolume:nil,ScaleIO:nil,Projected:nil,StorageOS:nil,CSI:nil,},},Volume{Name:lib-modules,VolumeSource:VolumeSource{HostPath:&HostPathVolumeSource{Path:/lib/modules,Type:*,},EmptyDir:nil,GCEPersistentDisk:nil,AWSElasticBlockStore:nil,GitRepo:nil,Secret:nil,NFS:nil,ISCSI:nil,Glusterfs:nil,PersistentVolumeClaim:nil,RBD:nil,FlexVolume:nil,Cinder:nil,CephFS:nil,Flocker:nil,DownwardAPI:nil,FC:nil,AzureFile:nil,ConfigMap:nil,VsphereVolume:nil,Quobyte:nil,AzureDisk:nil,PhotonPersistentDisk:nil,PortworxVolume:nil,ScaleIO:nil,Projected:nil,StorageOS:nil,CSI:nil,},},Volume{Name:socket-dir,VolumeSource:VolumeSource{HostPath:nil,EmptyDir:&EmptyDirVolumeSource{Medium:Memory,SizeLimit:<nil>,},GCEPersistentDisk:nil,AWSElasticBlockStore:nil,GitRepo:nil,Secret:nil,NFS:nil,ISCSI:nil,Glusterfs:nil,PersistentVolumeClaim:nil,RBD:nil,FlexVolume:nil,Cinder:nil,CephFS:nil,Flocker:nil,DownwardAPI:nil,FC:nil,AzureFile:nil,ConfigMap:nil,VsphereVolume:nil,Quobyte:nil,AzureDisk:nil,PhotonPersistentDisk:nil,PortworxVolume:nil,ScaleIO:nil,Projected:nil,StorageOS:nil,CS
I:nil,},},Volume{Name:ceph-csi-config,VolumeSource:VolumeSource{HostPath:nil,EmptyDir:nil,GCEPersistentDisk:nil,AWSElasticBlockStore:nil,GitRepo:nil,Secret:nil,NFS:nil,ISCSI:nil,Glusterfs:nil,PersistentVolumeClaim:nil,RBD:nil,FlexVolume:nil,Cinder:nil,CephFS:nil,Flocker:nil,DownwardAPI:nil,FC:nil,AzureFile:nil,ConfigMap:&ConfigMapVolumeSource{LocalObjectReference:LocalObjectReference{Name:rook-ceph-csi-config,},Items:[]KeyToPath{KeyToPath{Key:csi-cluster-config-json,Path:config.json,Mode:nil,},},DefaultMode:*420,Optional:nil,},VsphereVolume:nil,Quobyte:nil,AzureDisk:nil,PhotonPersistentDisk:nil,PortworxVolume:nil,ScaleIO:nil,Projected:nil,StorageOS:nil,CSI:nil,},},Volume{Name:keys-tmp-dir,VolumeSource:VolumeSource{HostPath:nil,EmptyDir:&EmptyDirVolumeSource{Medium:Memory,SizeLimit:<nil>,},GCEPersistentDisk:nil,AWSElasticBlockStore:nil,GitRepo:nil,Secret:nil,NFS:nil,ISCSI:nil,Glusterfs:nil,PersistentVolumeClaim:nil,RBD:nil,FlexVolume:nil,Cinder:nil,CephFS:nil,Flocker:nil,DownwardAPI:nil,FC:nil,AzureFile:nil,ConfigMap:nil,VsphereVolume:nil,Quobyte:nil,AzureDisk:nil,PhotonPersistentDisk:nil,PortworxVolume:nil,ScaleIO:nil,Projected:nil,StorageOS:nil,CSI:nil,},},Volume{Name:rook-csi-rbd-provisioner-sa-token-6zqrp,VolumeSource:VolumeSource{HostPath:nil,EmptyDir:nil,GCEPersistentDisk:nil,AWSElasticBlockStore:nil,GitRepo:nil,Secret:&SecretVolumeSource{SecretName:rook-csi-rbd-provisioner-sa-token-6zqrp,Items:[]KeyToPath{},DefaultMode:*420,Optional:nil,},NFS:nil,ISCSI:nil,Glusterfs:nil,PersistentVolumeClaim:nil,RBD:nil,FlexVolume:nil,Cinder:nil,CephFS:nil,Flocker:nil,DownwardAPI:nil,FC:nil,AzureFile:nil,ConfigMap:nil,VsphereVolume:nil,Quobyte:nil,AzureDisk:nil,PhotonPersistentDisk:nil,PortworxVolume:nil,ScaleIO:nil,Projected:nil,StorageOS:nil,CSI:nil,},},},Containers:[]Container{Container{Name:csi-provisioner,Image:quay.io/k8scsi/csi-provisioner:v1.4.0,Command:[],Args:[--csi-address=$(ADDRESS) --v=0 --timeout=150s --retry-interval-start=500ms --enable-leader-election=true 
--leader-election-type=leases --leader-election-namespace=rook],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:ADDRESS,Value:unix:///csi/csi-provisioner.sock,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:socket-dir,ReadOnly:false,MountPath:/csi,SubPath:,MountPropagation:nil,SubPathExpr:,},VolumeMount{Name:rook-csi-rbd-provisioner-sa-token-6zqrp,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:nil,Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,},Container{Name:csi-resizer,Image:quay.io/k8scsi/csi-resizer:v0.4.0,Command:[],Args:[--csi-address=$(ADDRESS) --v=0 --csiTimeout=150s --leader-election --leader-election-namespace=rook],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:ADDRESS,Value:unix:///csi/csi-provisioner.sock,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:socket-dir,ReadOnly:false,MountPath:/csi,SubPath:,MountPropagation:nil,SubPathExpr:,},VolumeMount{Name:rook-csi-rbd-provisioner-sa-token-6zqrp,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:nil,Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,},Container{Name:csi-rbdplugin-attacher,Image:quay.io/k8scsi/csi-attacher:v2.1.0,Command:[],Args:[--v=0 --timeout=150s --csi-address=$(ADDRESS) 
--leader-election=true --leader-election-namespace=rook],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:ADDRESS,Value:/csi/csi-provisioner.sock,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:socket-dir,ReadOnly:false,MountPath:/csi,SubPath:,MountPropagation:nil,SubPathExpr:,},VolumeMount{Name:rook-csi-rbd-provisioner-sa-token-6zqrp,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:nil,Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,},Container{Name:csi-snapshotter,Image:quay.io/k8scsi/csi-snapshotter:v1.2.2,Command:[],Args:[--csi-address=$(ADDRESS) --v=0 --timeout=150s --leader-election=true --leader-election-namespace=rook],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:ADDRESS,Value:unix:///csi/csi-provisioner.sock,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:socket-dir,ReadOnly:false,MountPath:/csi,SubPath:,MountPropagation:nil,SubPathExpr:,},VolumeMount{Name:rook-csi-rbd-provisioner-sa-token-6zqrp,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:nil,Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,},Container{Name:csi-rbdplugin,Image:quay.io/cephcsi/cephcsi:v2.1.0,Command:[],Args:[--nodeid=$(NODE_ID) --endpoint=$(CSI_ENDPOINT) --v=0 --type=rbd 
--controllerserver=true --drivername=rook.rbd.csi.ceph.com --pidlimit=-1 --metricsport=9090 --metricspath=/metrics --enablegrpcmetrics=true],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},EnvVar{Name:NODE_ID,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:spec.nodeName,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},EnvVar{Name:POD_NAMESPACE,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:metadata.namespace,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},EnvVar{Name:CSI_ENDPOINT,Value:unix:///csi/csi-provisioner.sock,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:socket-dir,ReadOnly:false,MountPath:/csi,SubPath:,MountPropagation:nil,SubPathExpr:,},VolumeMount{Name:host-dev,ReadOnly:false,MountPath:/dev,SubPath:,MountPropagation:nil,SubPathExpr:,},VolumeMount{Name:host-sys,ReadOnly:false,MountPath:/sys,SubPath:,MountPropagation:nil,SubPathExpr:,},VolumeMount{Name:lib-modules,ReadOnly:true,MountPath:/lib/modules,SubPath:,MountPropagation:nil,SubPathExpr:,},VolumeMount{Name:ceph-csi-config,ReadOnly:false,MountPath:/etc/ceph-csi-config/,SubPath:,MountPropagation:nil,SubPathExpr:,},VolumeMount{Name:keys-tmp-dir,ReadOnly:false,MountPath:/tmp/csi/keys,SubPath:,MountPropagation:nil,SubPathExpr:,},VolumeMount{Name:rook-csi-rbd-provisioner-sa-token-6zqrp,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:nil,Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:Fi
le,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,},Container{Name:liveness-prometheus,Image:quay.io/cephcsi/cephcsi:v2.1.0,Command:[],Args:[--type=liveness --endpoint=$(CSI_ENDPOINT) --metricsport=9080 --metricspath=/metrics --polltime=60s --timeout=3s],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CSI_ENDPOINT,Value:unix:///csi/csi-provisioner.sock,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:socket-dir,ReadOnly:false,MountPath:/csi,SubPath:,MountPropagation:nil,SubPathExpr:,},VolumeMount{Name:rook-csi-rbd-provisioner-sa-token-6zqrp,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:nil,Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,},},RestartPolicy:Always,TerminationGracePeriodSeconds:*30,ActiveDeadlineSeconds:nil,DNSPolicy:ClusterFirst,NodeSelector:map[string]string{},ServiceAccountName:rook-csi-rbd-provisioner-sa,DeprecatedServiceAccount:rook-csi-rbd-provisioner-sa,NodeName:suraj-lk-cluster-pool-1-worker-2,HostNetwork:false,HostPID:false,HostIPC:false,SecurityContext:&PodSecurityContext{SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,SupplementalGroups:[],FSGroup:nil,RunAsGroup:nil,Sysctls:[]Sysctl{},WindowsOptions:nil,},ImagePullSecrets:[]LocalObjectReference{},Hostname:,Subdomain:,Affinity:&Affinity{NodeAffinity:&NodeAffinity{RequiredDuringSchedulingIgnoredDuringExecution:&NodeSelector{NodeSelectorTerms:[]NodeSelectorTerm{NodeSelectorTerm{MatchExpressions:[]NodeSelectorRequ
irement{NodeSelectorRequirement{Key:storage.lokomotive.io,Operator:In,Values:[ceph],},},MatchFields:[]NodeSelectorRequirement{},},},},PreferredDuringSchedulingIgnoredDuringExecution:[]PreferredSchedulingTerm{},},PodAffinity:nil,PodAntiAffinity:nil,},SchedulerName:default-scheduler,InitContainers:[]Container{},AutomountServiceAccountToken:nil,Tolerations:[]Toleration{Toleration{Key:storage.lokomotive.io,Operator:Equal,Value:ceph,Effect:NoSchedule,TolerationSeconds:nil,},Toleration{Key:node.kubernetes.io/not-ready,Operator:Exists,Value:,Effect:NoExecute,TolerationSeconds:*300,},Toleration{Key:node.kubernetes.io/unreachable,Operator:Exists,Value:,Effect:NoExecute,TolerationSeconds:*300,},},HostAliases:[]HostAlias{},PriorityClassName:,Priority:*0,DNSConfig:nil,ShareProcessNamespace:nil,ReadinessGates:[]PodReadinessGate{},RuntimeClassName:nil,EnableServiceLinks:*true,PreemptionPolicy:nil,Overhead:ResourceList{},TopologySpreadConstraints:[]TopologySpreadConstraint{},EphemeralContainers:[]EphemeralContainer{},},Status:PodStatus{Phase:Pending,Conditions:[]PodCondition{PodCondition{Type:Initialized,Status:True,LastProbeTime:0001-01-01 00:00:00 +0000 UTC,LastTransitionTime:2020-05-05 07:47:23 +0000 UTC,Reason:,Message:,},PodCondition{Type:Ready,Status:False,LastProbeTime:0001-01-01 00:00:00 +0000 UTC,LastTransitionTime:2020-05-05 07:47:23 +0000 UTC,Reason:ContainersNotReady,Message:containers with unready status: [csi-provisioner csi-resizer csi-rbdplugin-attacher csi-snapshotter csi-rbdplugin liveness-prometheus],},PodCondition{Type:ContainersReady,Status:False,LastProbeTime:0001-01-01 00:00:00 +0000 UTC,LastTransitionTime:2020-05-05 07:47:23 +0000 UTC,Reason:ContainersNotReady,Message:containers with unready status: [csi-provisioner csi-resizer csi-rbdplugin-attacher csi-snapshotter csi-rbdplugin liveness-prometheus],},PodCondition{Type:PodScheduled,Status:True,LastProbeTime:0001-01-01 00:00:00 +0000 UTC,LastTransitionTime:2020-05-05 07:47:23 +0000 
UTC,Reason:,Message:,},},Message:,Reason:,HostIP:10.88.81.5,PodIP:,StartTime:2020-05-05 07:47:23 +0000 UTC,ContainerStatuses:[]ContainerStatus{ContainerStatus{Name:csi-provisioner,State:ContainerState{Waiting:&ContainerStateWaiting{Reason:ContainerCreating,Message:,},Running:nil,Terminated:nil,},LastTerminationState:ContainerState{Waiting:nil,Running:nil,Terminated:nil,},Ready:false,RestartCount:0,Image:quay.io/k8scsi/csi-provisioner:v1.4.0,ImageID:,ContainerID:,Started:*false,},ContainerStatus{Name:csi-rbdplugin,State:ContainerState{Waiting:&ContainerStateWaiting{Reason:ContainerCreating,Message:,},Running:nil,Terminated:nil,},LastTerminationState:ContainerState{Waiting:nil,Running:nil,Terminated:nil,},Ready:false,RestartCount:0,Image:quay.io/cephcsi/cephcsi:v2.1.0,ImageID:,ContainerID:,Started:*false,},ContainerStatus{Name:csi-rbdplugin-attacher,State:ContainerState{Waiting:&ContainerStateWaiting{Reason:ContainerCreating,Message:,},Running:nil,Terminated:nil,},LastTerminationState:ContainerState{Waiting:nil,Running:nil,Terminated:nil,},Ready:false,RestartCount:0,Image:quay.io/k8scsi/csi-attacher:v2.1.0,ImageID:,ContainerID:,Started:*false,},ContainerStatus{Name:csi-resizer,State:ContainerState{Waiting:&ContainerStateWaiting{Reason:ContainerCreating,Message:,},Running:nil,Terminated:nil,},LastTerminationState:ContainerState{Waiting:nil,Running:nil,Terminated:nil,},Ready:false,RestartCount:0,Image:quay.io/k8scsi/csi-resizer:v0.4.0,ImageID:,ContainerID:,Started:*false,},ContainerStatus{Name:csi-snapshotter,State:ContainerState{Waiting:&ContainerStateWaiting{Reason:ContainerCreating,Message:,},Running:nil,Terminated:nil,},LastTerminationState:ContainerState{Waiting:nil,Running:nil,Terminated:nil,},Ready:false,RestartCount:0,Image:quay.io/k8scsi/csi-snapshotter:v1.2.2,ImageID:,ContainerID:,Started:*false,},ContainerStatus{Name:liveness-prometheus,State:ContainerState{Waiting:&ContainerStateWaiting{Reason:ContainerCreating,Message:,},Running:nil,Terminated:nil,},LastTer
minationState:ContainerState{Waiting:nil,Running:nil,Terminated:nil,},Ready:false,RestartCount:0,Image:quay.io/cephcsi/cephcsi:v2.1.0,ImageID:,ContainerID:,Started:*false,},},QOSClass:BestEffort,InitContainerStatuses:[]ContainerStatus{},NominatedNodeName:,PodIPs:[]PodIP{},EphemeralContainerStatuses:[]ContainerStatus{},},}
2020-05-05 07:47:24.107 [INFO][18328] ipam_plugin.go 211: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="faf097e583877b92cdce6da1fb86b1ed518e9c488474a50f7719e6f0ff907f5d" HandleID="k8s-pod-network.faf097e583877b92cdce6da1fb86b1ed518e9c488474a50f7719e6f0ff907f5d" Workload="suraj--lk--cluster--pool--1--worker--2-k8s-csi--rbdplugin--provisioner--8886bfcfb--t2rln-eth0"
2020-05-05 07:47:24.131 [INFO][18328] ipam_plugin.go 223: Calico CNI IPAM handle=k8s-pod-network.faf097e583877b92cdce6da1fb86b1ed518e9c488474a50f7719e6f0ff907f5d ContainerID="faf097e583877b92cdce6da1fb86b1ed518e9c488474a50f7719e6f0ff907f5d" HandleID="k8s-pod-network.faf097e583877b92cdce6da1fb86b1ed518e9c488474a50f7719e6f0ff907f5d" Workload="suraj--lk--cluster--pool--1--worker--2-k8s-csi--rbdplugin--provisioner--8886bfcfb--t2rln-eth0"
2020-05-05 07:47:24.131 [INFO][18328] ipam_plugin.go 233: Auto assigning IP ContainerID="faf097e583877b92cdce6da1fb86b1ed518e9c488474a50f7719e6f0ff907f5d" HandleID="k8s-pod-network.faf097e583877b92cdce6da1fb86b1ed518e9c488474a50f7719e6f0ff907f5d" Workload="suraj--lk--cluster--pool--1--worker--2-k8s-csi--rbdplugin--provisioner--8886bfcfb--t2rln-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc00020f9c0), Attrs:map[string]string{"namespace":"rook", "node":"suraj-lk-cluster-pool-1-worker-2", "pod":"csi-rbdplugin-provisioner-8886bfcfb-t2rln"}, Hostname:"suraj-lk-cluster-pool-1-worker-2", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0}
2020-05-05 07:47:24.131 [INFO][18328] ipam.go 87: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'suraj-lk-cluster-pool-1-worker-2'
2020-05-05 07:47:24.141 [INFO][18328] ipam.go 313: Looking up existing affinities for host handle="k8s-pod-network.faf097e583877b92cdce6da1fb86b1ed518e9c488474a50f7719e6f0ff907f5d" host="suraj-lk-cluster-pool-1-worker-2"
2020-05-05 07:47:24.149 [INFO][18328] ipam.go 381: Trying affinity for 10.2.76.0/24 handle="k8s-pod-network.faf097e583877b92cdce6da1fb86b1ed518e9c488474a50f7719e6f0ff907f5d" host="suraj-lk-cluster-pool-1-worker-2"
2020-05-05 07:47:24.156 [INFO][18328] ipam.go 135: Attempting to load block cidr=10.2.76.0/24 host="suraj-lk-cluster-pool-1-worker-2"
2020-05-05 07:47:24.162 [INFO][18328] ipam.go 212: Affinity is confirmed and block has been loaded cidr=10.2.76.0/24 host="suraj-lk-cluster-pool-1-worker-2"
2020-05-05 07:47:24.162 [INFO][18328] ipam.go 804: Attempting to assign 1 addresses from block block=10.2.76.0/24 handle="k8s-pod-network.faf097e583877b92cdce6da1fb86b1ed518e9c488474a50f7719e6f0ff907f5d" host="suraj-lk-cluster-pool-1-worker-2"
2020-05-05 07:47:24.165 [INFO][18328] ipam.go 1265: Creating new handle: k8s-pod-network.faf097e583877b92cdce6da1fb86b1ed518e9c488474a50f7719e6f0ff907f5d
2020-05-05 07:47:24.173 [INFO][18328] ipam.go 827: Writing block in order to claim IPs block=10.2.76.0/24 handle="k8s-pod-network.faf097e583877b92cdce6da1fb86b1ed518e9c488474a50f7719e6f0ff907f5d" host="suraj-lk-cluster-pool-1-worker-2"
2020-05-05 07:47:24.184 [INFO][18328] ipam.go 840: Successfully claimed IPs: [10.2.76.2/24] block=10.2.76.0/24 handle="k8s-pod-network.faf097e583877b92cdce6da1fb86b1ed518e9c488474a50f7719e6f0ff907f5d" host="suraj-lk-cluster-pool-1-worker-2"
2020-05-05 07:47:24.184 [INFO][18328] ipam.go 413: Block '10.2.76.0/24' provided addresses: [10.2.76.2/24] handle="k8s-pod-network.faf097e583877b92cdce6da1fb86b1ed518e9c488474a50f7719e6f0ff907f5d" host="suraj-lk-cluster-pool-1-worker-2"
2020-05-05 07:47:24.188 [INFO][18328] ipam.go 569: Auto-assigned 1 out of 1 IPv4s: [10.2.76.2/24] handle="k8s-pod-network.faf097e583877b92cdce6da1fb86b1ed518e9c488474a50f7719e6f0ff907f5d" host="suraj-lk-cluster-pool-1-worker-2"
2020-05-05 07:47:24.188 [INFO][18328] ipam_plugin.go 235: Calico CNI IPAM assigned addresses IPv4=[10.2.76.2/24] IPv6=[] ContainerID="faf097e583877b92cdce6da1fb86b1ed518e9c488474a50f7719e6f0ff907f5d" HandleID="k8s-pod-network.faf097e583877b92cdce6da1fb86b1ed518e9c488474a50f7719e6f0ff907f5d" Workload="suraj--lk--cluster--pool--1--worker--2-k8s-csi--rbdplugin--provisioner--8886bfcfb--t2rln-eth0"
2020-05-05 07:47:24.188 [INFO][18328] ipam_plugin.go 261: IPAM Result ContainerID="faf097e583877b92cdce6da1fb86b1ed518e9c488474a50f7719e6f0ff907f5d" HandleID="k8s-pod-network.faf097e583877b92cdce6da1fb86b1ed518e9c488474a50f7719e6f0ff907f5d" Workload="suraj--lk--cluster--pool--1--worker--2-k8s-csi--rbdplugin--provisioner--8886bfcfb--t2rln-eth0" result.IPs=[]*current.IPConfig{(*current.IPConfig)(0xc0000c6300)}
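The IPAM sequence in the log lines above (confirm the host's affinity for the `10.2.76.0/24` block, load the block, claim the next free address under a handle, then write the block back) can be sketched as a toy model. This is an illustrative simplification, not Calico's actual implementation; the `Block` class, its fields, and the pre-seeded allocations are hypothetical.

```python
import ipaddress

# Hypothetical, simplified model of block-based IPAM as seen in the log:
# each /24 block is affine to one host, and addresses are claimed by
# recording a handle in the block's allocation map.
class Block:
    def __init__(self, cidr: str, host: str):
        self.cidr = ipaddress.ip_network(cidr)
        self.host = host          # host this block is affine to
        self.allocations = {}     # str(ip) -> handle that claimed it

    def auto_assign(self, num: int, handle: str, host: str):
        """Return up to `num` newly claimed addresses for `handle`."""
        if host != self.host:     # mirrors 'Trying affinity for 10.2.76.0/24'
            return []
        claimed = []
        for ip in self.cidr:      # iterate 10.2.76.0, 10.2.76.1, ...
            if len(claimed) == num:
                break
            if str(ip) not in self.allocations:
                # mirrors 'Writing block in order to claim IPs'
                self.allocations[str(ip)] = handle
                claimed.append(f"{ip}/{self.cidr.prefixlen}")
        return claimed

block = Block("10.2.76.0/24", "suraj-lk-cluster-pool-1-worker-2")
# Assume two addresses were already in use on this node (hypothetical).
block.allocations = {"10.2.76.0": "tunnel", "10.2.76.1": "earlier-pod"}

handle = ("k8s-pod-network.faf097e583877b92cdce6da1fb86b1ed"
          "518e9c488474a50f7719e6f0ff907f5d")
print(block.auto_assign(1, handle, "suraj-lk-cluster-pool-1-worker-2"))
# -> ['10.2.76.2/24'], matching 'Successfully claimed IPs: [10.2.76.2/24]'
```

The real implementation performs the "write block" step as a compare-and-swap against the datastore, which is why the log distinguishes loading the block from writing it back.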
2020-05-05 07:47:24.190 [INFO][18278] k8s.go 358: Populated endpoint ContainerID="faf097e583877b92cdce6da1fb86b1ed518e9c488474a50f7719e6f0ff907f5d" Namespace="rook" Pod="csi-rbdplugin-provisioner-8886bfcfb-t2rln" WorkloadEndpoint="suraj--lk--cluster--pool--1--worker--2-k8s-csi--rbdplugin--provisioner--8886bfcfb--t2rln-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"suraj--lk--cluster--pool--1--worker--2-k8s-csi--rbdplugin--provisioner--8886bfcfb--t2rln-eth0", GenerateName:"csi-rbdplugin-provisioner-8886bfcfb-", Namespace:"rook", SelfLink:"", UID:"b6c9888f-7e9e-4b90-b1c5-089efc45b5fa", ResourceVersion:"8155", Generation:0, CreationTimestamp:v1.Time{Time:time.Time{wall:0x0, ext:63724261643, loc:(*time.Location)(0x299c340)}}, DeletionTimestamp:(*v1.Time)(nil), DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app":"csi-rbdplugin-provisioner", "contains":"csi-rbdplugin-metrics", "pod-template-hash":"8886bfcfb", "projectcalico.org/namespace":"rook", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"rook-csi-rbd-provisioner-sa"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ClusterName:"", ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"suraj-lk-cluster-pool-1-worker-2", ContainerID:"", Pod:"csi-rbdplugin-provisioner-8886bfcfb-t2rln", Endpoint:"eth0", IPNetworks:[]string{"10.2.76.2/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.rook", "ksa.rook.rook-csi-rbd-provisioner-sa"}, InterfaceName:"cali1a917eefeff", MAC:"", Ports:[]v3.EndpointPort(nil)}}
2020-05-05 07:47:24.191 [INFO][18278] k8s.go 359: Calico CNI using IPs: [10.2.76.2/32] ContainerID="faf097e583877b92cdce6da1fb86b1ed518e9c488474a50f7719e6f0ff907f5d" Namespace="rook" Pod="csi-rbdplugin-provisioner-8886bfcfb-t2rln" WorkloadEndpoint="suraj--lk--cluster--pool--1--worker--2-k8s-csi--rbdplugin--provisioner--8886bfcfb--t2rln-eth0"
2020-05-05 07:47:24.191 [INFO][18278] network_linux.go 76: Setting the host side veth name to cali1a917eefeff ContainerID="faf097e583877b92cdce6da1fb86b1ed518e9c488474a50f7719e6f0ff907f5d" Namespace="rook" Pod="csi-rbdplugin-provisioner-8886bfcfb-t2rln" WorkloadEndpoint="suraj--lk--cluster--pool--1--worker--2-k8s-csi--rbdplugin--provisioner--8886bfcfb--t2rln-eth0"
2020-05-05 07:47:24.192 [INFO][18278] network_linux.go 396: Disabling IPv4 forwarding ContainerID="faf097e583877b92cdce6da1fb86b1ed518e9c488474a50f7719e6f0ff907f5d" Namespace="rook" Pod="csi-rbdplugin-provisioner-8886bfcfb-t2rln" WorkloadEndpoint="suraj--lk--cluster--pool--1--worker--2-k8s-csi--rbdplugin--provisioner--8886bfcfb--t2rln-eth0"
2020-05-05 07:47:24.211 [INFO][18278] k8s.go 385: Added Mac, interface name, and active container ID to endpoint ContainerID="faf097e583877b92cdce6da1fb86b1ed518e9c488474a50f7719e6f0ff907f5d" Namespace="rook" Pod="csi-rbdplugin-provisioner-8886bfcfb-t2rln" WorkloadEndpoint="suraj--lk--cluster--pool--1--worker--2-k8s-csi--rbdplugin--provisioner--8886bfcfb--t2rln-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"suraj--lk--cluster--pool--1--worker--2-k8s-csi--rbdplugin--provisioner--8886bfcfb--t2rln-eth0", GenerateName:"csi-rbdplugin-provisioner-8886bfcfb-", Namespace:"rook", SelfLink:"", UID:"b6c9888f-7e9e-4b90-b1c5-089efc45b5fa", ResourceVersion:"8155", Generation:0, CreationTimestamp:v1.Time{Time:time.Time{wall:0x0, ext:63724261643, loc:(*time.Location)(0x299c340)}}, DeletionTimestamp:(*v1.Time)(nil), DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app":"csi-rbdplugin-provisioner", "contains":"csi-rbdplugin-metrics", "pod-template-hash":"8886bfcfb", "projectcalico.org/namespace":"rook", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"rook-csi-rbd-provisioner-sa"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ClusterName:"", ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"suraj-lk-cluster-pool-1-worker-2", ContainerID:"faf097e583877b92cdce6da1fb86b1ed518e9c488474a50f7719e6f0ff907f5d", Pod:"csi-rbdplugin-provisioner-8886bfcfb-t2rln", Endpoint:"eth0", IPNetworks:[]string{"10.2.76.2/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.rook", "ksa.rook.rook-csi-rbd-provisioner-sa"}, InterfaceName:"cali1a917eefeff", MAC:"66:3c:f6:83:17:84", Ports:[]v3.EndpointPort(nil)}}
2020-05-05 07:47:24.229 [INFO][18278] k8s.go 417: Wrote updated endpoint to datastore ContainerID="faf097e583877b92cdce6da1fb86b1ed518e9c488474a50f7719e6f0ff907f5d" Namespace="rook" Pod="csi-rbdplugin-provisioner-8886bfcfb-t2rln" WorkloadEndpoint="suraj--lk--cluster--pool--1--worker--2-k8s-csi--rbdplugin--provisioner--8886bfcfb--t2rln-eth0"
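Notice how the WorkloadEndpoint names in these dumps double every dash in the node and pod names (`suraj-lk-cluster-pool-1-worker-2` becomes `suraj--lk--cluster--pool--1--worker--2`), so that single dashes remain unambiguous separators between the node, orchestrator, pod, and endpoint fields. A minimal sketch of that composition, assuming this escaping rule (this is not Calico's actual code, just a reconstruction from the names in the log):

```python
def escape(field: str) -> str:
    """Double any dash so a single dash can act as a field separator."""
    return field.replace("-", "--")

def wep_name(node: str, orchestrator: str, pod: str, endpoint: str) -> str:
    """Compose a WorkloadEndpoint name from its identifiers,
    as the names in the log above appear to be built."""
    return "-".join(escape(f) for f in (node, orchestrator, pod, endpoint))

print(wep_name("suraj-lk-cluster-pool-1-worker-2", "k8s",
               "csi-rbdplugin-provisioner-8886bfcfb-t2rln", "eth0"))
# -> suraj--lk--cluster--pool--1--worker--2-k8s-csi--rbdplugin--provisioner--8886bfcfb--t2rln-eth0
```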
W0505 07:47:24.555845 2827 pod_container_deletor.go:77] Container "9ebd382433a8986e29bd3dbe35422c82cb62457b8e0709cf0276040cf6e462ee" not found in pod's containers
W0505 07:47:24.565124 2827 pod_container_deletor.go:77] Container "f0b581abee1c2e25f401267548ffa130ed453bd804c780718a2d7148eda6c6c0" not found in pod's containers
2020-05-05 07:47:24.574 [INFO][18487] plugin.go 129: Extracted identifiers EndpointIDs=&utils.WEPIdentifiers{Namespace:"rook", WEPName:"", WorkloadEndpointIdentifiers:names.WorkloadEndpointIdentifiers{Node:"suraj-lk-cluster-pool-1-worker-2", Orchestrator:"k8s", Endpoint:"eth0", Workload:"", Pod:"csi-cephfsplugin-provisioner-6944f4676d-l6jzs", ContainerID:"9ebd382433a8986e29bd3dbe35422c82cb62457b8e0709cf0276040cf6e462ee"}}
2020-05-05 07:47:24.605 [INFO][18487] plugin.go 220: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {suraj--lk--cluster--pool--1--worker--2-k8s-csi--cephfsplugin--provisioner--6944f4676d--l6jzs-eth0 csi-cephfsplugin-provisioner-6944f4676d- rook e03c25a1-e894-4016-b65d-5a928e97376c 8195 0 2020-05-05 07:47:23 +0000 UTC <nil> <nil> map[app:csi-cephfsplugin-provisioner contains:csi-cephfsplugin-metrics pod-template-hash:6944f4676d projectcalico.org/namespace:rook projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:rook-csi-cephfs-provisioner-sa] map[] [] [] []} {k8s suraj-lk-cluster-pool-1-worker-2 csi-cephfsplugin-provisioner-6944f4676d-l6jzs eth0 [] [] [kns.rook ksa.rook.rook-csi-cephfs-provisioner-sa] cali49faf4b818c []}} ContainerID="9ebd382433a8986e29bd3dbe35422c82cb62457b8e0709cf0276040cf6e462ee" Namespace="rook" Pod="csi-cephfsplugin-provisioner-6944f4676d-l6jzs" WorkloadEndpoint="suraj--lk--cluster--pool--1--worker--2-k8s-csi--cephfsplugin--provisioner--6944f4676d--l6jzs-"
2020-05-05 07:47:24.605 [INFO][18487] k8s.go 60: Extracted identifiers for CmdAddK8s ContainerID="9ebd382433a8986e29bd3dbe35422c82cb62457b8e0709cf0276040cf6e462ee" Namespace="rook" Pod="csi-cephfsplugin-provisioner-6944f4676d-l6jzs" WorkloadEndpoint="suraj--lk--cluster--pool--1--worker--2-k8s-csi--cephfsplugin--provisioner--6944f4676d--l6jzs-eth0"
2020-05-05 07:47:24.610 [INFO][18487] k8s.go 775: namespace info &Namespace{ObjectMeta:{rook /api/v1/namespaces/rook 346082b0-52fc-4520-8796-5afff38e4f6c 7478 0 2020-05-05 07:45:13 +0000 UTC <nil> <nil> map[] map[] [] [] [{lokoctl Update v1 2020-05-05 07:45:13 +0000 UTC FieldsV1 FieldsV1{Raw:*[123 34 102 58 115 116 97 116 117 115 34 58 123 34 102 58 112 104 97 115 101 34 58 123 125 125 125],}}]},Spec:NamespaceSpec{Finalizers:[kubernetes],},Status:NamespaceStatus{Phase:Active,Conditions:[]NamespaceCondition{},},}
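The long `Raw:*[123 34 102 …]` runs in these namespace and pod dumps are Go's default rendering of a `[]byte` value: each decimal number is one byte of UTF-8 encoded JSON (the server-side-apply `managedFields` data). Decoding the short array from the namespace dump above recovers the JSON:

```python
import json

# Bytes copied from the 'Raw:*[...]' array in the namespace dump above.
raw = [123, 34, 102, 58, 115, 116, 97, 116, 117, 115, 34, 58,
       123, 34, 102, 58, 112, 104, 97, 115, 101, 34, 58, 123, 125, 125, 125]

decoded = bytes(raw).decode("utf-8")
print(decoded)  # {"f:status":{"f:phase":{}}}
print(json.loads(decoded))
```

The much longer arrays in the pod dump that follows decode the same way, into the field-ownership map (`f:metadata`, `f:spec`, and so on) that the kube-controller-manager recorded for this pod.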
2020-05-05 07:47:24.614 [INFO][18487] k8s.go 784: pod info &Pod{ObjectMeta:{csi-cephfsplugin-provisioner-6944f4676d-l6jzs csi-cephfsplugin-provisioner-6944f4676d- rook /api/v1/namespaces/rook/pods/csi-cephfsplugin-provisioner-6944f4676d-l6jzs e03c25a1-e894-4016-b65d-5a928e97376c 8195 0 2020-05-05 07:47:23 +0000 UTC <nil> <nil> map[app:csi-cephfsplugin-provisioner contains:csi-cephfsplugin-metrics pod-template-hash:6944f4676d] map[kubernetes.io/psp:00-rook-ceph-operator seccomp.security.alpha.kubernetes.io/pod:runtime/default] [{apps/v1 ReplicaSet csi-cephfsplugin-provisioner-6944f4676d 1ddca393-ab19-4212-9aff-d2fa74c03728 0xc000102cea 0xc000102ceb}] [] [{kube-controller-manager Update v1 2020-05-05 07:47:23 +0000 UTC FieldsV1 FieldsV1{Raw:*[123 34 102 58 109 101 116 97 100 97 116 97 34 58 123 34 102 58 103 101 110 101 114 97 116 101 78 97 109 101 34 58 123 125 44 34 102 58 108 97 98 101 108 115 34 58 123 34 46 34 58 123 125 44 34 102 58 97 112 112 34 58 123 125 44 34 102 58 99 111 110 116 97 105 110 115 34 58 123 125 44 34 102 58 112 111 100 45 116 101 109 112 108 97 116 101 45 104 97 115 104 34 58 123 125 125 44 34 102 58 111 119 110 101 114 82 101 102 101 114 101 110 99 101 115 34 58 123 34 46 34 58 123 125 44 34 107 58 123 92 34 117 105 100 92 34 58 92 34 49 100 100 99 97 51 57 51 45 97 98 49 57 45 52 50 49 50 45 57 97 102 102 45 100 50 102 97 55 52 99 48 51 55 50 56 92 34 125 34 58 123 34 46 34 58 123 125 44 34 102 58 97 112 105 86 101 114 115 105 111 110 34 58 123 125 44 34 102 58 98 108 111 99 107 79 119 110 101 114 68 101 108 101 116 105 111 110 34 58 123 125 44 34 102 58 99 111 110 116 114 111 108 108 101 114 34 58 123 125 44 34 102 58 107 105 110 100 34 58 123 125 44 34 102 58 110 97 109 101 34 58 123 125 44 34 102 58 117 105 100 34 58 123 125 125 125 125 44 34 102 58 115 112 101 99 34 58 123 34 102 58 97 102 102 105 110 105 116 121 34 58 123 34 46 34 58 123 125 44 34 102 58 110 111 100 101 65 102 102 105 110 105 116 121 34 58 123 34 46 34 58 123 125 44 34 
102 58 114 101 113 117 105 114 101 100 68 117 114 105 110 103 83 99 104 101 100 117 108 105 110 103 73 103 110 111 114 101 100 68 117 114 105 110 103 69 120 101 99 117 116 105 111 110 34 58 123 34 46 34 58 123 125 44 34 102 58 110 111 100 101 83 101 108 101 99 116 111 114 84 101 114 109 115 34 58 123 125 125 125 125 44 34 102 58 99 111 110 116 97 105 110 101 114 115 34 58 123 34 107 58 123 92 34 110 97 109 101 92 34 58 92 34 99 115 105 45 97 116 116 97 99 104 101 114 92 34 125 34 58 123 34 46 34 58 123 125 44 34 102 58 97 114 103 115 34 58 123 125 44 34 102 58 101 110 118 34 58 123 34 46 34 58 123 125 44 34 107 58 123 92 34 110 97 109 101 92 34 58 92 34 65 68 68 82 69 83 83 92 34 125 34 58 123 34 46 34 58 123 125 44 34 102 58 110 97 109 101 34 58 123 125 44 34 102 58 118 97 108 117 101 34 58 123 125 125 125 44 34 102 58 105 109 97 103 101 34 58 123 125 44 34 102 58 105 109 97 103 101 80 117 108 108 80 111 108 105 99 121 34 58 123 125 44 34 102 58 110 97 109 101 34 58 123 125 44 34 102 58 114 101 115 111 117 114 99 101 115 34 58 123 125 44 34 102 58 116 101 114 109 105 110 97 116 105 111 110 77 101 115 115 97 103 101 80 97 116 104 34 58 123 125 44 34 102 58 116 101 114 109 105 110 97 116 105 111 110 77 101 115 115 97 103 101 80 111 108 105 99 121 34 58 123 125 44 34 102 58 118 111 108 117 109 101 77 111 117 110 116 115 34 58 123 34 46 34 58 123 125 44 34 107 58 123 92 34 109 111 117 110 116 80 97 116 104 92 34 58 92 34 47 99 115 105 92 34 125 34 58 123 34 46 34 58 123 125 44 34 102 58 109 111 117 110 116 80 97 116 104 34 58 123 125 44 34 102 58 110 97 109 101 34 58 123 125 125 125 125 44 34 107 58 123 92 34 110 97 109 101 92 34 58 92 34 99 115 105 45 99 101 112 104 102 115 112 108 117 103 105 110 92 34 125 34 58 123 34 46 34 58 123 125 44 34 102 58 97 114 103 115 34 58 123 125 44 34 102 58 101 110 118 34 58 123 34 46 34 58 123 125 44 34 107 58 123 92 34 110 97 109 101 92 34 58 92 34 67 83 73 95 69 78 68 80 79 73 78 84 92 34 125 34 58 123 34 46 34 58 123 125 44 34 
102 58 110 97 109 101 34 58 123 125 44 34 102 58 118 97 108 117 101 34 58 123 125 125 44 34 107 58 123 92 34 110 97 109 101 92 34 58 92 34 78 79 68 69 95 73 68 92 34 125 34 58 123 34 46 34 58 123 125 44 34 102 58 110 97 109 101 34 58 123 125 44 34 102 58 118 97 108 117 101 70 114 111 109 34 58 123 34 46 34 58 123 125 44 34 102 58 102 105 101 108 100 82 101 102 34 58 123 34 46 34 58 123 125 44 34 102 58 97 112 105 86 101 114 115 105 111 110 34 58 123 125 44 34 102 58 102 105 101 108 100 80 97 116 104 34 58 123 125 125 125 125 44 34 107 58 123 92 34 110 97 109 101 92 34 58 92 34 80 79 68 95 73 80 92 34 125 34 58 123 34 46 34 58 123 125 44 34 102 58 110 97 109 101 34 58 123 125 44 34 102 58 118 97 108 117 101 70 114 111 109 34 58 123 34 46 34 58 123 125 44 34 102 58 102 105 101 108 100 82 101 102 34 58 123 34 46 34 58 123 125 44 34 102 58 97 112 105 86 101 114 115 105 111 110 34 58 123 125 44 34 102 58 102 105 101 108 100 80 97 116 104 34 58 123 125 125 125 125 44 34 107 58 123 92 34 110 97 109 101 92 34 58 92 34 80 79 68 95 78 65 77 69 83 80 65 67 69 92 34 125 34 58 123 34 46 34 58 123 125 44 34 102 58 110 97 109 101 34 58 123 125 44 34 102 58 118 97 108 117 101 70 114 111 109 34 58 123 34 46 34 58 123 125 44 34 102 58 102 105 101 108 100 82 101 102 34 58 123 34 46 34 58 123 125 44 34 102 58 97 112 105 86 101 114 115 105 111 110 34 58 123 125 44 34 102 58 102 105 101 108 100 80 97 116 104 34 58 123 125 125 125 125 125 44 34 102 58 105 109 97 103 101 34 58 123 125 44 34 102 58 105 109 97 103 101 80 117 108 108 80 111 108 105 99 121 34 58 123 125 44 34 102 58 110 97 109 101 34 58 123 125 44 34 102 58 114 101 115 111 117 114 99 101 115 34 58 123 125 44 34 102 58 116 101 114 109 105 110 97 116 105 111 110 77 101 115 115 97 103 101 80 97 116 104 34 58 123 125 44 34 102 58 116 101 114 109 105 110 97 116 105 111 110 77 101 115 115 97 103 101 80 111 108 105 99 121 34 58 123 125 44 34 102 58 118 111 108 117 109 101 77 111 117 110 116 115 34 58 123 34 46 34 58 123 125 44 34 
107 58 123 92 34 109 111 117 110 116 80 97 116 104 92 34 58 92 34 47 99 115 105 92 34 125 34 58 123 34 46 34 58 123 125 44 34 102 58 109 111 117 110 116 80 97 116 104 34 58 123 125 44 34 102 58 110 97 109 101 34 58 123 125 125 44 34 107 58 123 92 34 109 111 117 110 116 80 97 116 104 92 34 58 92 34 47 100 101 118 92 34 125 34 58 123 34 46 34 58 123 125 44 34 102 58 109 111 117 110 116 80 97 116 104 34 58 123 125 44 34 102 58 110 97 109 101 34 58 123 125 125 44 34 107 58 123 92 34 109 111 117 110 116 80 97 116 104 92 34 58 92 34 47 101 116 99 47 99 101 112 104 45 99 115 105 45 99 111 110 102 105 103 47 92 34 125 34 58 123 34 46 34 58 123 125 44 34 102 58 109 111 117 110 116 80 97 116 104 34 58 123 125 44 34 102 58 110 97 109 101 34 58 123 125 125 44 34 107 58 123 92 34 109 111 117 110 116 80 97 116 104 92 34 58 92 34 47 108 105 98 47 109 111 100 117 108 101 115 92 34 125 34 58 123 34 46 34 58 123 125 44 34 102 58 109 111 117 110 116 80 97 116 104 34 58 123 125 44 34 102 58 110 97 109 101 34 58 123 125 44 34 102 58 114 101 97 100 79 110 108 121 34 58 123 125 125 44 34 107 58 123 92 34 109 111 117 110 116 80 97 116 104 92 34 58 92 34 47 115 121 115 92 34 125 34 58 123 34 46 34 58 123 125 44 34 102 58 109 111 117 110 116 80 97 116 104 34 58 123 125 44 34 102 58 110 97 109 101 34 58 123 125 125 44 34 107 58 123 92 34 109 111 117 110 116 80 97 116 104 92 34 58 92 34 47 116 109 112 47 99 115 105 47 107 101 121 115 92 34 125 34 58 123 34 46 34 58 123 125 44 34 102 58 109 111 117 110 116 80 97 116 104 34 58 123 125 44 34 102 58 110 97 109 101 34 58 123 125 125 125 125 44 34 107 58 123 92 34 110 97 109 101 92 34 58 92 34 99 115 105 45 112 114 111 118 105 115 105 111 110 101 114 92 34 125 34 58 123 34 46 34 58 123 125 44 34 102 58 97 114 103 115 34 58 123 125 44 34 102 58 101 110 118 34 58 123 34 46 34 58 123 125 44 34 107 58 123 92 34 110 97 109 101 92 34 58 92 34 65 68 68 82 69 83 83 92 34 125 34 58 123 34 46 34 58 123 125 44 34 102 58 110 97 109 101 34 58 123 125 44 34 102 
58 118 97 108 117 101 34 58 123 125 125 125 44 34 102 58 105 109 97 103 101 34 58 123 125 44 34 102 58 105 109 97 103 101 80 117 108 108 80 111 108 105 99 121 34 58 123 125 44 34 102 58 110 97 109 101 34 58 123 125 44 34 102 58 114 101 115 111 117 114 99 101 115 34 58 123 125 44 34 102 58 116 101 114 109 105 110 97 116 105 111 110 77 101 115 115 97 103 101 80 97 116 104 34 58 123 125 44 34 102 58 116 101 114 109 105 110 97 116 105 111 110 77 101 115 115 97 103 101 80 111 108 105 99 121 34 58 123 125 44 34 102 58 118 111 108 117 109 101 77 111 117 110 116 115 34 58 123 34 46 34 58 123 125 44 34 107 58 123 92 34 109 111 117 110 116 80 97 116 104 92 34 58 92 34 47 99 115 105 92 34 125 34 58 123 34 46 34 58 123 125 44 34 102 58 109 111 117 110 116 80 97 116 104 34 58 123 125 44 34 102 58 110 97 109 101 34 58 123 125 125 125 125 44 34 107 58 123 92 34 110 97 109 101 92 34 58 92 34 99 115 105 45 114 101 115 105 122 101 114 92 34 125 34 58 123 34 46 34 58 123 125 44 34 102 58 97 114 103 115 34 58 123 125 44 34 102 58 101 110 118 34 58 123 34 46 34 58 123 125 44 34 107 58 123 92 34 110 97 109 101 92 34 58 92 34 65 68 68 82 69 83 83 92 34 125 34 58 123 34 46 34 58 123 125 44 34 102 58 110 97 109 101 34 58 123 125 44 34 102 58 118 97 108 117 101 34 58 123 125 125 125 44 34 102 58 105 109 97 103 101 34 58 123 125 44 34 102 58 105 109 97 103 101 80 117 108 108 80 111 108 105 99 121 34 58 123 125 44 34 102 58 110 97 109 101 34 58 123 125 44 34 102 58 114 101 115 111 117 114 99 101 115 34 58 123 125 44 34 102 58 116 101 114 109 105 110 97 116 105 111 110 77 101 115 115 97 103 101 80 97 116 104 34 58 123 125 44 34 102 58 116 101 114 109 105 110 97 116 105 111 110 77 101 115 115 97 103 101 80 111 108 105 99 121 34 58 123 125 44 34 102 58 118 111 108 117 109 101 77 111 117 110 116 115 34 58 123 34 46 34 58 123 125 44 34 107 58 123 92 34 109 111 117 110 116 80 97 116 104 92 34 58 92 34 47 99 115 105 92 34 125 34 58 123 34 46 34 58 123 125 44 34 102 58 109 111 117 110 116 80 97 116 
104 34 58 123 125 44 34 102 58 110 97 109 101 34 58 123 125 125 125 125 44 34 107 58 123 92 34 110 97 109 101 92 34 58 92 34 108 105 118 101 110 101 115 115 45 112 114 111 109 101 116 104 101 117 115 92 34 125 34 58 123 34 46 34 58 123 125 44 34 102 58 97 114 103 115 34 58 123 125 44 34 102 58 101 110 118 34 58 123 34 46 34 58 123 125 44 34 107 58 123 92 34 110 97 109 101 92 34 58 92 34 67 83 73 95 69 78 68 80 79 73 78 84 92 34 125 34 58 123 34 46 34 58 123 125 44 34 102 58 110 97 109 101 34 58 123 125 44 34 102 58 118 97 108 117 101 34 58 123 125 125 44 34 107 58 123 92 34 110 97 109 101 92 34 58 92 34 80 79 68 95 73 80 92 34 125 34 58 123 34 46 34 58 123 125 44 34 102 58 110 97 109 101 34 58 123 125 44 34 102 58 118 97 108 117 101 70 114 111 109 34 58 123 34 46 34 58 123 125 44 34 102 58 102 105 101 108 100 82 101 102 34 58 123 34 46 34 58 123 125 44 34 102 58 97 112 105 86 101 114 115 105 111 110 34 58 123 125 44 34 102 58 102 105 101 108 100 80 97 116 104 34 58 123 125 125 125 125 125 44 34 102 58 105 109 97 103 101 34 58 123 125 44 34 102 58 105 109 97 103 101 80 117 108 108 80 111 108 105 99 121 34 58 123 125 44 34 102 58 110 97 109 101 34 58 123 125 44 34 102 58 114 101 115 111 117 114 99 101 115 34 58 123 125 44 34 102 58 116 101 114 109 105 110 97 116 105 111 110 77 101 115 115 97 103 101 80 97 116 104 34 58 123 125 44 34 102 58 116 101 114 109 105 110 97 116 105 111 110 77 101 115 115 97 103 101 80 111 108 105 99 121 34 58 123 125 44 34 102 58 118 111 108 117 109 101 77 111 117 110 116 115 34 58 123 34 46 34 58 123 125 44 34 107 58 123 92 34 109 111 117 110 116 80 97 116 104 92 34 58 92 34 47 99 115 105 92 34 125 34 58 123 34 46 34 58 123 125 44 34 102 58 109 111 117 110 116 80 97 116 104 34 58 123 125 44 34 102 58 110 97 109 101 34 58 123 125 125 125 125 125 44 34 102 58 100 110 115 80 111 108 105 99 121 34 58 123 125 44 34 102 58 101 110 97 98 108 101 83 101 114 118 105 99 101 76 105 110 107 115 34 58 123 125 44 34 102 58 114 101 115 116 97 114 116 80 
111 108 105 99 121 34 58 123 125 44 34 102 58 115 99 104 101 100 117 108 101 114 78 97 109 101 34 58 123 125 44 34 102 58 115 101 99 117 114 105 116 121 67 111 110 116 101 120 116 34 58 123 125 44 34 102 58 115 101 114 118 105 99 101 65 99 99 111 117 110 116 34 58 123 125 44 34 102 58 115 101 114 118 105 99 101 65 99 99 111 117 110 116 78 97 109 101 34 58 123 125 44 34 102 58 116 101 114 109 105 110 97 116 105 111 110 71 114 97 99 101 80 101 114 105 111 100 83 101 99 111 110 100 115 34 58 123 125 44 34 102 58 116 111 108 101 114 97 116 105 111 110 115 34 58 123 125 44 34 102 58 118 111 108 117 109 101 115 34 58 123 34 46 34 58 123 125 44 34 107 58 123 92 34 110 97 109 101 92 34 58 92 34 99 101 112 104 45 99 115 105 45 99 111 110 102 105 103 92 34 125 34 58 123 34 46 34 58 123 125 44 34 102 58 99 111 110 102 105 103 77 97 112 34 58 123 34 46 34 58 123 125 44 34 102 58 100 101 102 97 117 108 116 77 111 100 101 34 58 123 125 44 34 102 58 105 116 101 109 115 34 58 123 125 44 34 102 58 110 97 109 101 34 58 123 125 125 44 34 102 58 110 97 109 101 34 58 123 125 125 44 34 107 58 123 92 34 110 97 109 101 92 34 58 92 34 104 111 115 116 45 100 101 118 92 34 125 34 58 123 34 46 34 58 123 125 44 34 102 58 104 111 115 116 80 97 116 104 34 58 123 34 46 34 58 123 125 44 34 102 58 112 97 116 104 34 58 123 125 44 34 102 58 116 121 112 101 34 58 123 125 125 44 34 102 58 110 97 109 101 34 58 123 125 125 44 34 107 58 123 92 34 110 97 109 101 92 34 58 92 34 104 111 115 116 45 115 121 115 92 34 125 34 58 123 34 46 34 58 123 125 44 34 102 58 104 111 115 116 80 97 116 104 34 58 123 34 46 34 58 123 125 44 34 102 58 112 97 116 104 34 58 123 125 44 34 102 58 116 121 112 101 34 58 123 125 125 44 34 102 58 110 97 109 101 34 58 123 125 125 44 34 107 58 123 92 34 110 97 109 101 92 34 58 92 34 107 101 121 115 45 116 109 112 45 100 105 114 92 34 125 34 58 123 34 46 34 58 123 125 44 34 102 58 101 109 112 116 121 68 105 114 34 58 123 34 46 34 58 123 125 44 34 102 58 109 101 100 105 117 109 34 58 123 
125 125 44 34 102 58 110 97 109 101 34 58 123 125 125 44 34 107 58 123 92 34 110 97 109 101 92 34 58 92 34 108 105 98 45 109 111 100 117 108 101 115 92 34 125 34 58 123 34 46 34 58 123 125 44 34 102 58 104 111 115 116 80 97 116 104 34 58 123 34 46 34 58 123 125 44 34 102 58 112 97 116 104 34 58 123 125 44 34 102 58 116 121 112 101 34 58 123 125 125 44 34 102 58 110 97 109 101 34 58 123 125 125 44 34 107 58 123 92 34 110 97 109 101 92 34 58 92 34 115 111 99 107 101 116 45 100 105 114 92 34 125 34 58 123 34 46 34 58 123 125 44 34 102 58 101 109 112 116 121 68 105 114 34 58 123 34 46 34 58 123 125 44 34 102 58 109 101 100 105 117 109 34 58 123 125 125 44 34 102 58 110 97 109 101 34 58 123 125 125 125 125 125],}} {kubelet Update v1 2020-05-05 07:47:23 +0000 UTC FieldsV1 &FieldsV1{Raw:*[123 34 102 58 115 116 97 116 117 115 34 58 123 34 102 58 99 111 110 100 105 116 105 111 110 115 34 58 123 34 107 58 123 92 34 116 121 112 101 92 34 58 92 34 67 111 110 116 97 105 110 101 114 115 82 101 97 100 121 92 34 125 34 58 123 34 46 34 58 123 125 44 34 102 58 108 97 115 116 80 114 111 98 101 84 105 109 101 34 58 123 125 44 34 102 58 108 97 115 116 84 114 97 110 115 105 116 105 111 110 84 105 109 101 34 58 123 125 44 34 102 58 109 101 115 115 97 103 101 34 58 123 125 44 34 102 58 114 101 97 115 111 110 34 58 123 125 44 34 102 58 115 116 97 116 117 115 34 58 123 125 44 34 102 58 116 121 112 101 34 58 123 125 125 44 34 107 58 123 92 34 116 121 112 101 92 34 58 92 34 73 110 105 116 105 97 108 105 122 101 100 92 34 125 34 58 123 34 46 34 58 123 125 44 34 102 58 108 97 115 116 80 114 111 98 101 84 105 109 101 34 58 123 125 44 34 102 58 108 97 115 116 84 114 97 110 115 105 116 105 111 110 84 105 109 101 34 58 123 125 44 34 102 58 115 116 97 116 117 115 34 58 123 125 44 34 102 58 116 121 112 101 34 58 123 125 125 44 34 107 58 123 92 34 116 121 112 101 92 34 58 92 34 82 101 97 100 121 92 34 125 34 58 123 34 46 34 58 123 125 44 34 102 58 108 97 115 116 80 114 111 98 101 84 105 109 101 34 58 
123 125 44 34 102 58 108 97 115 116 84 114 97 110 115 105 116 105 111 110 84 105 109 101 34 58 123 125 44 34 102 58 109 101 115 115 97 103 101 34 58 123 125 44 34 102 58 114 101 97 115 111 110 34 58 123 125 44 34 102 58 115 116 97 116 117 115 34 58 123 125 44 34 102 58 116 121 112 101 34 58 123 125 125 125 44 34 102 58 99 111 110 116 97 105 110 101 114 83 116 97 116 117 115 101 115 34 58 123 125 44 34 102 58 104 111 115 116 73 80 34 58 123 125 44 34 102 58 115 116 97 114 116 84 105 109 101 34 58 123 125 125 125],}}]},Spec:PodSpec{Volumes:[]Volume{Volume{Name:socket-dir,VolumeSource:VolumeSource{HostPath:nil,EmptyDir:&EmptyDirVolumeSource{Medium:Memory,SizeLimit:<nil>,},GCEPersistentDisk:nil,AWSElasticBlockStore:nil,GitRepo:nil,Secret:nil,NFS:nil,ISCSI:nil,Glusterfs:nil,PersistentVolumeClaim:nil,RBD:nil,FlexVolume:nil,Cinder:nil,CephFS:nil,Flocker:nil,DownwardAPI:nil,FC:nil,AzureFile:nil,ConfigMap:nil,VsphereVolume:nil,Quobyte:nil,AzureDisk:nil,PhotonPersistentDisk:nil,PortworxVolume:nil,ScaleIO:nil,Projected:nil,StorageOS:nil,CSI:nil,},},Volume{Name:host-sys,VolumeSource:VolumeSource{HostPath:&HostPathVolumeSource{Path:/sys,Type:*,},EmptyDir:nil,GCEPersistentDisk:nil,AWSElasticBlockStore:nil,GitRepo:nil,Secret:nil,NFS:nil,ISCSI:nil,Glusterfs:nil,PersistentVolumeClaim:nil,RBD:nil,FlexVolume:nil,Cinder:nil,CephFS:nil,Flocker:nil,DownwardAPI:nil,FC:nil,AzureFile:nil,ConfigMap:nil,VsphereVolume:nil,Quobyte:nil,AzureDisk:nil,PhotonPersistentDisk:nil,PortworxVolume:nil,ScaleIO:nil,Projected:nil,StorageOS:nil,CSI:nil,},},Volume{Name:lib-modules,VolumeSource:VolumeSource{HostPath:&HostPathVolumeSource{Path:/lib/modules,Type:*,},EmptyDir:nil,GCEPersistentDisk:nil,AWSElasticBlockStore:nil,GitRepo:nil,Secret:nil,NFS:nil,ISCSI:nil,Glusterfs:nil,PersistentVolumeClaim:nil,RBD:nil,FlexVolume:nil,Cinder:nil,CephFS:nil,Flocker:nil,DownwardAPI:nil,FC:nil,AzureFile:nil,ConfigMap:nil,VsphereVolume:nil,Quobyte:nil,AzureDisk:nil,PhotonPersistentDisk:nil,PortworxVolume:nil,ScaleIO:nil,Pro
jected:nil,StorageOS:nil,CSI:nil,},},Volume{Name:host-dev,VolumeSource:VolumeSource{HostPath:&HostPathVolumeSource{Path:/dev,Type:*,},EmptyDir:nil,GCEPersistentDisk:nil,AWSElasticBlockStore:nil,GitRepo:nil,Secret:nil,NFS:nil,ISCSI:nil,Glusterfs:nil,PersistentVolumeClaim:nil,RBD:nil,FlexVolume:nil,Cinder:nil,CephFS:nil,Flocker:nil,DownwardAPI:nil,FC:nil,AzureFile:nil,ConfigMap:nil,VsphereVolume:nil,Quobyte:nil,AzureDisk:nil,PhotonPersistentDisk:nil,PortworxVolume:nil,ScaleIO:nil,Projected:nil,StorageOS:nil,CSI:nil,},},Volume{Name:ceph-csi-config,VolumeSource:VolumeSource{HostPath:nil,EmptyDir:nil,GCEPersistentDisk:nil,AWSElasticBlockStore:nil,GitRepo:nil,Secret:nil,NFS:nil,ISCSI:nil,Glusterfs:nil,PersistentVolumeClaim:nil,RBD:nil,FlexVolume:nil,Cinder:nil,CephFS:nil,Flocker:nil,DownwardAPI:nil,FC:nil,AzureFile:nil,ConfigMap:&ConfigMapVolumeSource{LocalObjectReference:LocalObjectReference{Name:rook-ceph-csi-config,},Items:[]KeyToPath{KeyToPath{Key:csi-cluster-config-json,Path:config.json,Mode:nil,},},DefaultMode:*420,Optional:nil,},VsphereVolume:nil,Quobyte:nil,AzureDisk:nil,PhotonPersistentDisk:nil,PortworxVolume:nil,ScaleIO:nil,Projected:nil,StorageOS:nil,CSI:nil,},},Volume{Name:keys-tmp-dir,VolumeSource:VolumeSource{HostPath:nil,EmptyDir:&EmptyDirVolumeSource{Medium:Memory,SizeLimit:<nil>,},GCEPersistentDisk:nil,AWSElasticBlockStore:nil,GitRepo:nil,Secret:nil,NFS:nil,ISCSI:nil,Glusterfs:nil,PersistentVolumeClaim:nil,RBD:nil,FlexVolume:nil,Cinder:nil,CephFS:nil,Flocker:nil,DownwardAPI:nil,FC:nil,AzureFile:nil,ConfigMap:nil,VsphereVolume:nil,Quobyte:nil,AzureDisk:nil,PhotonPersistentDisk:nil,PortworxVolume:nil,ScaleIO:nil,Projected:nil,StorageOS:nil,CSI:nil,},},Volume{Name:rook-csi-cephfs-provisioner-sa-token-6zxss,VolumeSource:VolumeSource{HostPath:nil,EmptyDir:nil,GCEPersistentDisk:nil,AWSElasticBlockStore:nil,GitRepo:nil,Secret:&SecretVolumeSource{SecretName:rook-csi-cephfs-provisioner-sa-token-6zxss,Items:[]KeyToPath{},DefaultMode:*420,Optional:nil,},NFS:nil,ISCS
I:nil,Glusterfs:nil,PersistentVolumeClaim:nil,RBD:nil,FlexVolume:nil,Cinder:nil,CephFS:nil,Flocker:nil,DownwardAPI:nil,FC:nil,AzureFile:nil,ConfigMap:nil,VsphereVolume:nil,Quobyte:nil,AzureDisk:nil,PhotonPersistentDisk:nil,PortworxVolume:nil,ScaleIO:nil,Projected:nil,StorageOS:nil,CSI:nil,},},},Containers:[]Container{Container{Name:csi-attacher,Image:quay.io/k8scsi/csi-attacher:v2.1.0,Command:[],Args:[--v=0 --csi-address=$(ADDRESS) --leader-election=true --timeout=150s --leader-election-namespace=rook],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:ADDRESS,Value:/csi/csi-provisioner.sock,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:socket-dir,ReadOnly:false,MountPath:/csi,SubPath:,MountPropagation:nil,SubPathExpr:,},VolumeMount{Name:rook-csi-cephfs-provisioner-sa-token-6zxss,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:nil,Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,},Container{Name:csi-resizer,Image:quay.io/k8scsi/csi-resizer:v0.4.0,Command:[],Args:[--csi-address=$(ADDRESS) --v=0 --csiTimeout=150s --leader-election 
--leader-election-namespace=rook],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:ADDRESS,Value:unix:///csi/csi-provisioner.sock,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:socket-dir,ReadOnly:false,MountPath:/csi,SubPath:,MountPropagation:nil,SubPathExpr:,},VolumeMount{Name:rook-csi-cephfs-provisioner-sa-token-6zxss,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:nil,Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,},Container{Name:csi-provisioner,Image:quay.io/k8scsi/csi-provisioner:v1.4.0,Command:[],Args:[--csi-address=$(ADDRESS) --v=0 --timeout=150s --retry-interval-start=500ms --enable-leader-election=true --leader-election-type=leases 
--leader-election-namespace=rook],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:ADDRESS,Value:unix:///csi/csi-provisioner.sock,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:socket-dir,ReadOnly:false,MountPath:/csi,SubPath:,MountPropagation:nil,SubPathExpr:,},VolumeMount{Name:rook-csi-cephfs-provisioner-sa-token-6zxss,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:nil,Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,},Container{Name:csi-cephfsplugin,Image:quay.io/cephcsi/cephcsi:v2.1.0,Command:[],Args:[--nodeid=$(NODE_ID) --type=cephfs --endpoint=$(CSI_ENDPOINT) --v=0 --controllerserver=true --drivername=rook.cephfs.csi.ceph.com --metadatastorage=k8s_configmap --pidlimit=-1 --metricsport=9092 --forcecephkernelclient=true --metricspath=/metrics 
--enablegrpcmetrics=true],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},EnvVar{Name:NODE_ID,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:spec.nodeName,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},EnvVar{Name:POD_NAMESPACE,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:metadata.namespace,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},EnvVar{Name:CSI_ENDPOINT,Value:unix:///csi/csi-provisioner.sock,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:socket-dir,ReadOnly:false,MountPath:/csi,SubPath:,MountPropagation:nil,SubPathExpr:,},VolumeMount{Name:host-sys,ReadOnly:false,MountPath:/sys,SubPath:,MountPropagation:nil,SubPathExpr:,},VolumeMount{Name:lib-modules,ReadOnly:true,MountPath:/lib/modules,SubPath:,MountPropagation:nil,SubPathExpr:,},VolumeMount{Name:host-dev,ReadOnly:false,MountPath:/dev,SubPath:,MountPropagation:nil,SubPathExpr:,},VolumeMount{Name:ceph-csi-config,ReadOnly:false,MountPath:/etc/ceph-csi-config/,SubPath:,MountPropagation:nil,SubPathExpr:,},VolumeMount{Name:keys-tmp-dir,ReadOnly:false,MountPath:/tmp/csi/keys,SubPath:,MountPropagation:nil,SubPathExpr:,},VolumeMount{Name:rook-csi-cephfs-provisioner-sa-token-6zxss,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:nil,Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,},Container{Name:liveness-prometheus,Image:quay.io/cephcsi/ce
phcsi:v2.1.0,Command:[],Args:[--type=liveness --endpoint=$(CSI_ENDPOINT) --metricsport=9081 --metricspath=/metrics --polltime=60s --timeout=3s],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CSI_ENDPOINT,Value:unix:///csi/csi-provisioner.sock,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:socket-dir,ReadOnly:false,MountPath:/csi,SubPath:,MountPropagation:nil,SubPathExpr:,},VolumeMount{Name:rook-csi-cephfs-provisioner-sa-token-6zxss,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:nil,Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,},},RestartPolicy:Always,TerminationGracePeriodSeconds:*30,ActiveDeadlineSeconds:nil,DNSPolicy:ClusterFirst,NodeSelector:map[string]string{},ServiceAccountName:rook-csi-cephfs-provisioner-sa,DeprecatedServiceAccount:rook-csi-cephfs-provisioner-sa,NodeName:suraj-lk-cluster-pool-1-worker-2,HostNetwork:false,HostPID:false,HostIPC:false,SecurityContext:&PodSecurityContext{SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,SupplementalGroups:[],FSGroup:nil,RunAsGroup:nil,Sysctls:[]Sysctl{},WindowsOptions:nil,},ImagePullSecrets:[]LocalObjectReference{},Hostname:,Subdomain:,Affinity:&Affinity{NodeAffinity:&NodeAffinity{RequiredDuringSchedulingIgnoredDuringExecution:&NodeSelector{NodeSelectorTerms:[]NodeSelectorTerm{NodeSelectorTerm{MatchExpressions:[]NodeSelectorRequirement{NodeSelectorRequirement{Key:storage.lokomotive.io,Operator:In,Values:[ceph],},},MatchFields:[]N
odeSelectorRequirement{},},},},PreferredDuringSchedulingIgnoredDuringExecution:[]PreferredSchedulingTerm{},},PodAffinity:nil,PodAntiAffinity:nil,},SchedulerName:default-scheduler,InitContainers:[]Container{},AutomountServiceAccountToken:nil,Tolerations:[]Toleration{Toleration{Key:storage.lokomotive.io,Operator:Equal,Value:ceph,Effect:NoSchedule,TolerationSeconds:nil,},Toleration{Key:node.kubernetes.io/not-ready,Operator:Exists,Value:,Effect:NoExecute,TolerationSeconds:*300,},Toleration{Key:node.kubernetes.io/unreachable,Operator:Exists,Value:,Effect:NoExecute,TolerationSeconds:*300,},},HostAliases:[]HostAlias{},PriorityClassName:,Priority:*0,DNSConfig:nil,ShareProcessNamespace:nil,ReadinessGates:[]PodReadinessGate{},RuntimeClassName:nil,EnableServiceLinks:*true,PreemptionPolicy:nil,Overhead:ResourceList{},TopologySpreadConstraints:[]TopologySpreadConstraint{},EphemeralContainers:[]EphemeralContainer{},},Status:PodStatus{Phase:Pending,Conditions:[]PodCondition{PodCondition{Type:Initialized,Status:True,LastProbeTime:0001-01-01 00:00:00 +0000 UTC,LastTransitionTime:2020-05-05 07:47:23 +0000 UTC,Reason:,Message:,},PodCondition{Type:Ready,Status:False,LastProbeTime:0001-01-01 00:00:00 +0000 UTC,LastTransitionTime:2020-05-05 07:47:23 +0000 UTC,Reason:ContainersNotReady,Message:containers with unready status: [csi-attacher csi-resizer csi-provisioner csi-cephfsplugin liveness-prometheus],},PodCondition{Type:ContainersReady,Status:False,LastProbeTime:0001-01-01 00:00:00 +0000 UTC,LastTransitionTime:2020-05-05 07:47:23 +0000 UTC,Reason:ContainersNotReady,Message:containers with unready status: [csi-attacher csi-resizer csi-provisioner csi-cephfsplugin liveness-prometheus],},PodCondition{Type:PodScheduled,Status:True,LastProbeTime:0001-01-01 00:00:00 +0000 UTC,LastTransitionTime:2020-05-05 07:47:23 +0000 UTC,Reason:,Message:,},},Message:,Reason:,HostIP:10.88.81.5,PodIP:,StartTime:2020-05-05 07:47:23 +0000 
UTC,ContainerStatuses:[]ContainerStatus{ContainerStatus{Name:csi-attacher,State:ContainerState{Waiting:&ContainerStateWaiting{Reason:ContainerCreating,Message:,},Running:nil,Terminated:nil,},LastTerminationState:ContainerState{Waiting:nil,Running:nil,Terminated:nil,},Ready:false,RestartCount:0,Image:quay.io/k8scsi/csi-attacher:v2.1.0,ImageID:,ContainerID:,Started:*false,},ContainerStatus{Name:csi-cephfsplugin,State:ContainerState{Waiting:&ContainerStateWaiting{Reason:ContainerCreating,Message:,},Running:nil,Terminated:nil,},LastTerminationState:ContainerState{Waiting:nil,Running:nil,Terminated:nil,},Ready:false,RestartCount:0,Image:quay.io/cephcsi/cephcsi:v2.1.0,ImageID:,ContainerID:,Started:*false,},ContainerStatus{Name:csi-provisioner,State:ContainerState{Waiting:&ContainerStateWaiting{Reason:ContainerCreating,Message:,},Running:nil,Terminated:nil,},LastTerminationState:ContainerState{Waiting:nil,Running:nil,Terminated:nil,},Ready:false,RestartCount:0,Image:quay.io/k8scsi/csi-provisioner:v1.4.0,ImageID:,ContainerID:,Started:*false,},ContainerStatus{Name:csi-resizer,State:ContainerState{Waiting:&ContainerStateWaiting{Reason:ContainerCreating,Message:,},Running:nil,Terminated:nil,},LastTerminationState:ContainerState{Waiting:nil,Running:nil,Terminated:nil,},Ready:false,RestartCount:0,Image:quay.io/k8scsi/csi-resizer:v0.4.0,ImageID:,ContainerID:,Started:*false,},ContainerStatus{Name:liveness-prometheus,State:ContainerState{Waiting:&ContainerStateWaiting{Reason:ContainerCreating,Message:,},Running:nil,Terminated:nil,},LastTerminationState:ContainerState{Waiting:nil,Running:nil,Terminated:nil,},Ready:false,RestartCount:0,Image:quay.io/cephcsi/cephcsi:v2.1.0,ImageID:,ContainerID:,Started:*false,},},QOSClass:BestEffort,InitContainerStatuses:[]ContainerStatus{},NominatedNodeName:,PodIPs:[]PodIP{},EphemeralContainerStatuses:[]ContainerStatus{},},}
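The `FieldsV1{Raw:*[...]}` blocks above are the pod's managed-fields JSON, which the kubelet prints as space-separated decimal byte values. A minimal Python helper (a reading aid, not part of the log; the function name is ours) decodes such a run back into readable text:

```python
def decode_fieldsv1(raw: str) -> str:
    """Convert a space-separated run of decimal byte values
    (as printed inside FieldsV1{Raw:*[...]}) into a UTF-8 string."""
    return bytes(int(b) for b in raw.split()).decode("utf-8")

# First 26 values from the dump above:
sample = "107 58 123 92 34 109 111 117 110 116 80 97 116 104 92 34 58 92 34 47 99 115 105 92 34 125"
print(decode_fieldsv1(sample))  # → k:{\"mountPath\":\"/csi\"}
```

Run over each full `Raw:*[...]` array, this recovers the server-side-apply field ownership map (`f:mountPath`, `f:name`, and so on) that the dump encodes.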
2020-05-05 07:47:24.629 [INFO][18501] ipam_plugin.go 211: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="9ebd382433a8986e29bd3dbe35422c82cb62457b8e0709cf0276040cf6e462ee" HandleID="k8s-pod-network.9ebd382433a8986e29bd3dbe35422c82cb62457b8e0709cf0276040cf6e462ee" Workload="suraj--lk--cluster--pool--1--worker--2-k8s-csi--cephfsplugin--provisioner--6944f4676d--l6jzs-eth0"
2020-05-05 07:47:24.652 [INFO][18501] ipam_plugin.go 223: Calico CNI IPAM handle=k8s-pod-network.9ebd382433a8986e29bd3dbe35422c82cb62457b8e0709cf0276040cf6e462ee ContainerID="9ebd382433a8986e29bd3dbe35422c82cb62457b8e0709cf0276040cf6e462ee" HandleID="k8s-pod-network.9ebd382433a8986e29bd3dbe35422c82cb62457b8e0709cf0276040cf6e462ee" Workload="suraj--lk--cluster--pool--1--worker--2-k8s-csi--cephfsplugin--provisioner--6944f4676d--l6jzs-eth0"
2020-05-05 07:47:24.652 [INFO][18501] ipam_plugin.go 233: Auto assigning IP ContainerID="9ebd382433a8986e29bd3dbe35422c82cb62457b8e0709cf0276040cf6e462ee" HandleID="k8s-pod-network.9ebd382433a8986e29bd3dbe35422c82cb62457b8e0709cf0276040cf6e462ee" Workload="suraj--lk--cluster--pool--1--worker--2-k8s-csi--cephfsplugin--provisioner--6944f4676d--l6jzs-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc000231d40), Attrs:map[string]string{"namespace":"rook", "node":"suraj-lk-cluster-pool-1-worker-2", "pod":"csi-cephfsplugin-provisioner-6944f4676d-l6jzs"}, Hostname:"suraj-lk-cluster-pool-1-worker-2", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0}
2020-05-05 07:47:24.652 [INFO][18501] ipam.go 87: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'suraj-lk-cluster-pool-1-worker-2'
2020-05-05 07:47:24.661 [INFO][18501] ipam.go 313: Looking up existing affinities for host handle="k8s-pod-network.9ebd382433a8986e29bd3dbe35422c82cb62457b8e0709cf0276040cf6e462ee" host="suraj-lk-cluster-pool-1-worker-2"
2020-05-05 07:47:24.668 [INFO][18501] ipam.go 381: Trying affinity for 10.2.76.0/24 handle="k8s-pod-network.9ebd382433a8986e29bd3dbe35422c82cb62457b8e0709cf0276040cf6e462ee" host="suraj-lk-cluster-pool-1-worker-2"
2020-05-05 07:47:24.672 [INFO][18501] ipam.go 135: Attempting to load block cidr=10.2.76.0/24 host="suraj-lk-cluster-pool-1-worker-2"
2020-05-05 07:47:24.677 [INFO][18501] ipam.go 212: Affinity is confirmed and block has been loaded cidr=10.2.76.0/24 host="suraj-lk-cluster-pool-1-worker-2"
2020-05-05 07:47:24.677 [INFO][18501] ipam.go 804: Attempting to assign 1 addresses from block block=10.2.76.0/24 handle="k8s-pod-network.9ebd382433a8986e29bd3dbe35422c82cb62457b8e0709cf0276040cf6e462ee" host="suraj-lk-cluster-pool-1-worker-2"
2020-05-05 07:47:24.681 [INFO][18501] ipam.go 1265: Creating new handle: k8s-pod-network.9ebd382433a8986e29bd3dbe35422c82cb62457b8e0709cf0276040cf6e462ee
2020-05-05 07:47:24.687 [INFO][18501] ipam.go 827: Writing block in order to claim IPs block=10.2.76.0/24 handle="k8s-pod-network.9ebd382433a8986e29bd3dbe35422c82cb62457b8e0709cf0276040cf6e462ee" host="suraj-lk-cluster-pool-1-worker-2"
2020-05-05 07:47:24.702 [INFO][18501] ipam.go 840: Successfully claimed IPs: [10.2.76.3/24] block=10.2.76.0/24 handle="k8s-pod-network.9ebd382433a8986e29bd3dbe35422c82cb62457b8e0709cf0276040cf6e462ee" host="suraj-lk-cluster-pool-1-worker-2"
2020-05-05 07:47:24.702 [INFO][18501] ipam.go 413: Block '10.2.76.0/24' provided addresses: [10.2.76.3/24] handle="k8s-pod-network.9ebd382433a8986e29bd3dbe35422c82cb62457b8e0709cf0276040cf6e462ee" host="suraj-lk-cluster-pool-1-worker-2"
2020-05-05 07:47:24.706 [INFO][18501] ipam.go 569: Auto-assigned 1 out of 1 IPv4s: [10.2.76.3/24] handle="k8s-pod-network.9ebd382433a8986e29bd3dbe35422c82cb62457b8e0709cf0276040cf6e462ee" host="suraj-lk-cluster-pool-1-worker-2"
2020-05-05 07:47:24.706 [INFO][18501] ipam_plugin.go 235: Calico CNI IPAM assigned addresses IPv4=[10.2.76.3/24] IPv6=[] ContainerID="9ebd382433a8986e29bd3dbe35422c82cb62457b8e0709cf0276040cf6e462ee" HandleID="k8s-pod-network.9ebd382433a8986e29bd3dbe35422c82cb62457b8e0709cf0276040cf6e462ee" Workload="suraj--lk--cluster--pool--1--worker--2-k8s-csi--cephfsplugin--provisioner--6944f4676d--l6jzs-eth0"
2020-05-05 07:47:24.706 [INFO][18501] ipam_plugin.go 261: IPAM Result ContainerID="9ebd382433a8986e29bd3dbe35422c82cb62457b8e0709cf0276040cf6e462ee" HandleID="k8s-pod-network.9ebd382433a8986e29bd3dbe35422c82cb62457b8e0709cf0276040cf6e462ee" Workload="suraj--lk--cluster--pool--1--worker--2-k8s-csi--cephfsplugin--provisioner--6944f4676d--l6jzs-eth0" result.IPs=[]*current.IPConfig{(*current.IPConfig)(0xc000150cc0)}
2020-05-05 07:47:24.708 [INFO][18487] k8s.go 358: Populated endpoint ContainerID="9ebd382433a8986e29bd3dbe35422c82cb62457b8e0709cf0276040cf6e462ee" Namespace="rook" Pod="csi-cephfsplugin-provisioner-6944f4676d-l6jzs" WorkloadEndpoint="suraj--lk--cluster--pool--1--worker--2-k8s-csi--cephfsplugin--provisioner--6944f4676d--l6jzs-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"suraj--lk--cluster--pool--1--worker--2-k8s-csi--cephfsplugin--provisioner--6944f4676d--l6jzs-eth0", GenerateName:"csi-cephfsplugin-provisioner-6944f4676d-", Namespace:"rook", SelfLink:"", UID:"e03c25a1-e894-4016-b65d-5a928e97376c", ResourceVersion:"8195", Generation:0, CreationTimestamp:v1.Time{Time:time.Time{wall:0x0, ext:63724261643, loc:(*time.Location)(0x299c340)}}, DeletionTimestamp:(*v1.Time)(nil), DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app":"csi-cephfsplugin-provisioner", "contains":"csi-cephfsplugin-metrics", "pod-template-hash":"6944f4676d", "projectcalico.org/namespace":"rook", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"rook-csi-cephfs-provisioner-sa"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ClusterName:"", ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"suraj-lk-cluster-pool-1-worker-2", ContainerID:"", Pod:"csi-cephfsplugin-provisioner-6944f4676d-l6jzs", Endpoint:"eth0", IPNetworks:[]string{"10.2.76.3/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.rook", "ksa.rook.rook-csi-cephfs-provisioner-sa"}, InterfaceName:"cali49faf4b818c", MAC:"", Ports:[]v3.EndpointPort(nil)}}
2020-05-05 07:47:24.708 [INFO][18487] k8s.go 359: Calico CNI using IPs: [10.2.76.3/32] ContainerID="9ebd382433a8986e29bd3dbe35422c82cb62457b8e0709cf0276040cf6e462ee" Namespace="rook" Pod="csi-cephfsplugin-provisioner-6944f4676d-l6jzs" WorkloadEndpoint="suraj--lk--cluster--pool--1--worker--2-k8s-csi--cephfsplugin--provisioner--6944f4676d--l6jzs-eth0"
2020-05-05 07:47:24.708 [INFO][18487] network_linux.go 76: Setting the host side veth name to cali49faf4b818c ContainerID="9ebd382433a8986e29bd3dbe35422c82cb62457b8e0709cf0276040cf6e462ee" Namespace="rook" Pod="csi-cephfsplugin-provisioner-6944f4676d-l6jzs" WorkloadEndpoint="suraj--lk--cluster--pool--1--worker--2-k8s-csi--cephfsplugin--provisioner--6944f4676d--l6jzs-eth0"
2020-05-05 07:47:24.709 [INFO][18487] network_linux.go 396: Disabling IPv4 forwarding ContainerID="9ebd382433a8986e29bd3dbe35422c82cb62457b8e0709cf0276040cf6e462ee" Namespace="rook" Pod="csi-cephfsplugin-provisioner-6944f4676d-l6jzs" WorkloadEndpoint="suraj--lk--cluster--pool--1--worker--2-k8s-csi--cephfsplugin--provisioner--6944f4676d--l6jzs-eth0"
2020-05-05 07:47:24.726 [INFO][18487] k8s.go 385: Added Mac, interface name, and active container ID to endpoint ContainerID="9ebd382433a8986e29bd3dbe35422c82cb62457b8e0709cf0276040cf6e462ee" Namespace="rook" Pod="csi-cephfsplugin-provisioner-6944f4676d-l6jzs" WorkloadEndpoint="suraj--lk--cluster--pool--1--worker--2-k8s-csi--cephfsplugin--provisioner--6944f4676d--l6jzs-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"suraj--lk--cluster--pool--1--worker--2-k8s-csi--cephfsplugin--provisioner--6944f4676d--l6jzs-eth0", GenerateName:"csi-cephfsplugin-provisioner-6944f4676d-", Namespace:"rook", SelfLink:"", UID:"e03c25a1-e894-4016-b65d-5a928e97376c", ResourceVersion:"8195", Generation:0, CreationTimestamp:v1.Time{Time:time.Time{wall:0x0, ext:63724261643, loc:(*time.Location)(0x299c340)}}, DeletionTimestamp:(*v1.Time)(nil), DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app":"csi-cephfsplugin-provisioner", "contains":"csi-cephfsplugin-metrics", "pod-template-hash":"6944f4676d", "projectcalico.org/namespace":"rook", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"rook-csi-cephfs-provisioner-sa"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ClusterName:"", ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"suraj-lk-cluster-pool-1-worker-2", ContainerID:"9ebd382433a8986e29bd3dbe35422c82cb62457b8e0709cf0276040cf6e462ee", Pod:"csi-cephfsplugin-provisioner-6944f4676d-l6jzs", Endpoint:"eth0", IPNetworks:[]string{"10.2.76.3/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.rook", "ksa.rook.rook-csi-cephfs-provisioner-sa"}, InterfaceName:"cali49faf4b818c", MAC:"52:b8:aa:65:8a:c4", Ports:[]v3.EndpointPort(nil)}}
2020-05-05 07:47:24.742 [INFO][18487] k8s.go 417: Wrote updated endpoint to datastore ContainerID="9ebd382433a8986e29bd3dbe35422c82cb62457b8e0709cf0276040cf6e462ee" Namespace="rook" Pod="csi-cephfsplugin-provisioner-6944f4676d-l6jzs" WorkloadEndpoint="suraj--lk--cluster--pool--1--worker--2-k8s-csi--cephfsplugin--provisioner--6944f4676d--l6jzs-eth0"
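The Calico lines above trace one complete IPAM round trip: affinity lookup (`ipam.go 313`/`381`), block load and confirmation (`ipam.go 135`/`212`), claiming the address (`ipam.go 827`/`840`), and the final summary (`ipam_plugin.go 235`). When sifting through many such dumps, a small helper (a sketch only, not part of the log; the function name is ours) can pull the assigned addresses out of the summary line:

```python
import re

def extract_ipv4(line: str) -> list[str]:
    """Return the IPv4 addresses listed in a Calico
    'CNI IPAM assigned addresses IPv4=[...]' log line."""
    m = re.search(r"IPv4=\[([^\]]*)\]", line)
    return m.group(1).split() if m else []

sample = ("2020-05-05 07:47:24.706 [INFO][18501] ipam_plugin.go 235: "
          "Calico CNI IPAM assigned addresses IPv4=[10.2.76.3/24] IPv6=[]")
print(extract_ipv4(sample))  # → ['10.2.76.3/24']
```

Grepping the extracted addresses against `k8s.go 358/385` endpoint dumps confirms the same IP (`10.2.76.3/32` here) lands in the workload endpoint's `IPNetworks`.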
I0505 07:48:06.977681 2827 operation_generator.go:195] parsed scheme: ""
I0505 07:48:06.977696 2827 operation_generator.go:195] scheme "" not registered, fallback to default scheme
I0505 07:48:06.977714 2827 passthrough.go:48] ccResolverWrapper: sending update to cc: {[{/var/lib/kubelet/plugins_registry/rook.rbd.csi.ceph.com-reg.sock <nil> 0 <nil>}] <nil> <nil>}
I0505 07:48:06.977723 2827 clientconn.go:933] ClientConn switching balancer to "pick_first"
I0505 07:48:06.978584 2827 csi_plugin.go:98] kubernetes.io/csi: Trying to validate a new CSI Driver with name: rook.rbd.csi.ceph.com endpoint: /var/lib/kubelet/plugins/rook.rbd.csi.ceph.com/csi.sock versions: 1.0.0
I0505 07:48:06.978622 2827 csi_plugin.go:111] kubernetes.io/csi: Register new plugin with name: rook.rbd.csi.ceph.com at endpoint: /var/lib/kubelet/plugins/rook.rbd.csi.ceph.com/csi.sock
I0505 07:48:06.978646 2827 clientconn.go:106] parsed scheme: ""
I0505 07:48:06.978669 2827 clientconn.go:106] scheme "" not registered, fallback to default scheme
I0505 07:48:06.978706 2827 passthrough.go:48] ccResolverWrapper: sending update to cc: {[{/var/lib/kubelet/plugins/rook.rbd.csi.ceph.com/csi.sock <nil> 0 <nil>}] <nil> <nil>}
I0505 07:48:06.978716 2827 clientconn.go:933] ClientConn switching balancer to "pick_first"
I0505 07:48:06.978742 2827 clientconn.go:882] blockingPicker: the picked transport is not ready, loop back to repick
E0505 07:48:07.001015 2827 nodeinfomanager.go:568] Invalid attach limit value 0 cannot be added to CSINode object for "rook.rbd.csi.ceph.com"
I0505 07:48:07.977870 2827 operation_generator.go:195] parsed scheme: ""
I0505 07:48:07.977886 2827 operation_generator.go:195] scheme "" not registered, fallback to default scheme
I0505 07:48:07.977937 2827 passthrough.go:48] ccResolverWrapper: sending update to cc: {[{/var/lib/kubelet/plugins_registry/rook.cephfs.csi.ceph.com-reg.sock <nil> 0 <nil>}] <nil> <nil>}
I0505 07:48:07.977948 2827 clientconn.go:933] ClientConn switching balancer to "pick_first"
I0505 07:48:07.978803 2827 csi_plugin.go:98] kubernetes.io/csi: Trying to validate a new CSI Driver with name: rook.cephfs.csi.ceph.com endpoint: /var/lib/kubelet/plugins/rook.cephfs.csi.ceph.com/csi.sock versions: 1.0.0
I0505 07:48:07.978840 2827 csi_plugin.go:111] kubernetes.io/csi: Register new plugin with name: rook.cephfs.csi.ceph.com at endpoint: /var/lib/kubelet/plugins/rook.cephfs.csi.ceph.com/csi.sock
I0505 07:48:07.978873 2827 clientconn.go:106] parsed scheme: ""
I0505 07:48:07.978882 2827 clientconn.go:106] scheme "" not registered, fallback to default scheme
I0505 07:48:07.978935 2827 passthrough.go:48] ccResolverWrapper: sending update to cc: {[{/var/lib/kubelet/plugins/rook.cephfs.csi.ceph.com/csi.sock <nil> 0 <nil>}] <nil> <nil>}
I0505 07:48:07.978947 2827 clientconn.go:933] ClientConn switching balancer to "pick_first"
I0505 07:48:07.978989 2827 clientconn.go:882] blockingPicker: the picked transport is not ready, loop back to repick
E0505 07:48:07.999417 2827 nodeinfomanager.go:568] Invalid attach limit value 0 cannot be added to CSINode object for "rook.cephfs.csi.ceph.com"
I0505 07:48:12.252245 2827 topology_manager.go:233] [topologymanager] Topology Admit Handler
I0505 07:48:12.364614 2827 reconciler.go:224] operationExecutor.VerifyControllerAttachedVolume started for volume "rook-ceph-log" (UniqueName: "kubernetes.io/host-path/d9da635b-1258-4907-8c10-5f125faa3d9b-rook-ceph-log") pod "rook-ceph-mon-c-canary-684d9878cf-twts6" (UID: "d9da635b-1258-4907-8c10-5f125faa3d9b")
I0505 07:48:12.364646 2827 reconciler.go:224] operationExecutor.VerifyControllerAttachedVolume started for volume "rook-ceph-crash" (UniqueName: "kubernetes.io/host-path/d9da635b-1258-4907-8c10-5f125faa3d9b-rook-ceph-crash") pod "rook-ceph-mon-c-canary-684d9878cf-twts6" (UID: "d9da635b-1258-4907-8c10-5f125faa3d9b")
I0505 07:48:12.364704 2827 reconciler.go:224] operationExecutor.VerifyControllerAttachedVolume started for volume "default-token-cdzd4" (UniqueName: "kubernetes.io/secret/d9da635b-1258-4907-8c10-5f125faa3d9b-default-token-cdzd4") pod "rook-ceph-mon-c-canary-684d9878cf-twts6" (UID: "d9da635b-1258-4907-8c10-5f125faa3d9b")
I0505 07:48:12.364759 2827 reconciler.go:224] operationExecutor.VerifyControllerAttachedVolume started for volume "rook-config-override" (UniqueName: "kubernetes.io/configmap/d9da635b-1258-4907-8c10-5f125faa3d9b-rook-config-override") pod "rook-ceph-mon-c-canary-684d9878cf-twts6" (UID: "d9da635b-1258-4907-8c10-5f125faa3d9b")
I0505 07:48:12.364783 2827 reconciler.go:224] operationExecutor.VerifyControllerAttachedVolume started for volume "ceph-daemon-data" (UniqueName: "kubernetes.io/host-path/d9da635b-1258-4907-8c10-5f125faa3d9b-ceph-daemon-data") pod "rook-ceph-mon-c-canary-684d9878cf-twts6" (UID: "d9da635b-1258-4907-8c10-5f125faa3d9b")
I0505 07:48:12.364836 2827 reconciler.go:224] operationExecutor.VerifyControllerAttachedVolume started for volume "rook-ceph-mons-keyring" (UniqueName: "kubernetes.io/secret/d9da635b-1258-4907-8c10-5f125faa3d9b-rook-ceph-mons-keyring") pod "rook-ceph-mon-c-canary-684d9878cf-twts6" (UID: "d9da635b-1258-4907-8c10-5f125faa3d9b")
2020-05-05 07:48:13.092 [INFO][21178] plugin.go 129: Extracted identifiers EndpointIDs=&utils.WEPIdentifiers{Namespace:"rook", WEPName:"", WorkloadEndpointIdentifiers:names.WorkloadEndpointIdentifiers{Node:"suraj-lk-cluster-pool-1-worker-2", Orchestrator:"k8s", Endpoint:"eth0", Workload:"", Pod:"rook-ceph-mon-c-canary-684d9878cf-twts6", ContainerID:"48990cd8c7c6ddad749e103281e01b4d0e7982aab91d951e59da3712ebd61833"}}
2020-05-05 07:48:13.124 [INFO][21178] plugin.go 220: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {suraj--lk--cluster--pool--1--worker--2-k8s-rook--ceph--mon--c--canary--684d9878cf--twts6-eth0 rook-ceph-mon-c-canary-684d9878cf- rook d9da635b-1258-4907-8c10-5f125faa3d9b 8682 0 2020-05-05 07:48:12 +0000 UTC <nil> <nil> map[app:rook-ceph-mon ceph_daemon_id:c mon:c mon_canary:true mon_cluster:rook pod-template-hash:684d9878cf projectcalico.org/namespace:rook projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:default rook_cluster:rook] map[] [] [] []} {k8s suraj-lk-cluster-pool-1-worker-2 rook-ceph-mon-c-canary-684d9878cf-twts6 eth0 [] [] [kns.rook ksa.rook.default] cali6e7c0d49c1c [{client TCP 6789}]}} ContainerID="48990cd8c7c6ddad749e103281e01b4d0e7982aab91d951e59da3712ebd61833" Namespace="rook" Pod="rook-ceph-mon-c-canary-684d9878cf-twts6" WorkloadEndpoint="suraj--lk--cluster--pool--1--worker--2-k8s-rook--ceph--mon--c--canary--684d9878cf--twts6-"
2020-05-05 07:48:13.124 [INFO][21178] k8s.go 60: Extracted identifiers for CmdAddK8s ContainerID="48990cd8c7c6ddad749e103281e01b4d0e7982aab91d951e59da3712ebd61833" Namespace="rook" Pod="rook-ceph-mon-c-canary-684d9878cf-twts6" WorkloadEndpoint="suraj--lk--cluster--pool--1--worker--2-k8s-rook--ceph--mon--c--canary--684d9878cf--twts6-eth0"
2020-05-05 07:48:13.131 [INFO][21178] k8s.go 775: namespace info &Namespace{ObjectMeta:{rook /api/v1/namespaces/rook 346082b0-52fc-4520-8796-5afff38e4f6c 7478 0 2020-05-05 07:45:13 +0000 UTC <nil> <nil> map[] map[] [] [] [{lokoctl Update v1 2020-05-05 07:45:13 +0000 UTC FieldsV1 FieldsV1{Raw:*[123 34 102 58 115 116 97 116 117 115 34 58 123 34 102 58 112 104 97 115 101 34 58 123 125 125 125],}}]},Spec:NamespaceSpec{Finalizers:[kubernetes],},Status:NamespaceStatus{Phase:Active,Conditions:[]NamespaceCondition{},},}
2020-05-05 07:48:13.135 [INFO][21178] k8s.go 784: pod info &Pod{ObjectMeta:{rook-ceph-mon-c-canary-684d9878cf-twts6 rook-ceph-mon-c-canary-684d9878cf- rook /api/v1/namespaces/rook/pods/rook-ceph-mon-c-canary-684d9878cf-twts6 d9da635b-1258-4907-8c10-5f125faa3d9b 8682 0 2020-05-05 07:48:12 +0000 UTC <nil> <nil> map[app:rook-ceph-mon ceph_daemon_id:c mon:c mon_canary:true mon_cluster:rook pod-template-hash:684d9878cf rook_cluster:rook] map[kubernetes.io/psp:00-rook-ceph-operator seccomp.security.alpha.kubernetes.io/pod:runtime/default] [{apps/v1 ReplicaSet rook-ceph-mon-c-canary-684d9878cf 9bd50c18-b456-4e1c-9849-ca9d617cb25a 0xc0003e4817 0xc0003e4818}] [] [{kube-controller-manager Update v1 2020-05-05 07:48:12 +0000 UTC FieldsV1 FieldsV1{Raw:*[123 34 102 58 109 101 116 97 100 97 116 97 34 58 123 34 102 58 103 101 110 101 114 97 116 101 78 97 109 101 34 58 123 125 44 34 102 58 108 97 98 101 108 115 34 58 123 34 46 34 58 123 125 44 34 102 58 97 112 112 34 58 123 125 44 34 102 58 99 101 112 104 95 100 97 101 109 111 110 95 105 100 34 58 123 125 44 34 102 58 109 111 110 34 58 123 125 44 34 102 58 109 111 110 95 99 97 110 97 114 121 34 58 123 125 44 34 102 58 109 111 110 95 99 108 117 115 116 101 114 34 58 123 125 44 34 102 58 112 111 100 45 116 101 109 112 108 97 116 101 45 104 97 115 104 34 58 123 125 44 34 102 58 114 111 111 107 95 99 108 117 115 116 101 114 34 58 123 125 125 44 34 102 58 111 119 110 101 114 82 101 102 101 114 101 110 99 101 115 34 58 123 34 46 34 58 123 125 44 34 107 58 123 92 34 117 105 100 92 34 58 92 34 57 98 100 53 48 99 49 56 45 98 52 53 54 45 52 101 49 99 45 57 56 52 57 45 99 97 57 100 54 49 55 99 98 50 53 97 92 34 125 34 58 123 34 46 34 58 123 125 44 34 102 58 97 112 105 86 101 114 115 105 111 110 34 58 123 125 44 34 102 58 98 108 111 99 107 79 119 110 101 114 68 101 108 101 116 105 111 110 34 58 123 125 44 34 102 58 99 111 110 116 114 111 108 108 101 114 34 58 123 125 44 34 102 58 107 105 110 100 34 58 123 125 44 34 102 58 110 97 109 101 34 58 
123 125 44 34 102 58 117 105 100 34 58 123 125 125 125 125 44 34 102 58 115 112 101 99 34 58 123 34 102 58 97 102 102 105 110 105 116 121 34 58 123 34 46 34 58 123 125 44 34 102 58 110 111 100 101 65 102 102 105 110 105 116 121 34 58 123 34 46 34 58 123 125 44 34 102 58 114 101 113 117 105 114 101 100 68 117 114 105 110 103 83 99 104 101 100 117 108 105 110 103 73 103 110 111 114 101 100 68 117 114 105 110 103 69 120 101 99 117 116 105 111 110 34 58 123 34 46 34 58 123 125 44 34 102 58 110 111 100 101 83 101 108 101 99 116 111 114 84 101 114 109 115 34 58 123 125 125 125 44 34 102 58 112 111 100 65 110 116 105 65 102 102 105 110 105 116 121 34 58 123 34 46 34 58 123 125 44 34 102 58 114 101 113 117 105 114 101 100 68 117 114 105 110 103 83 99 104 101 100 117 108 105 110 103 73 103 110 111 114 101 100 68 117 114 105 110 103 69 120 101 99 117 116 105 111 110 34 58 123 125 125 125 44 34 102 58 99 111 110 116 97 105 110 101 114 115 34 58 123 34 107 58 123 92 34 110 97 109 101 92 34 58 92 34 109 111 110 92 34 125 34 58 123 34 46 34 58 123 125 44 34 102 58 97 114 103 115 34 58 123 125 44 34 102 58 99 111 109 109 97 110 100 34 58 123 125 44 34 102 58 101 110 118 34 58 123 34 46 34 58 123 125 44 34 107 58 123 92 34 110 97 109 101 92 34 58 92 34 67 79 78 84 65 73 78 69 82 95 73 77 65 71 69 92 34 125 34 58 123 34 46 34 58 123 125 44 34 102 58 110 97 109 101 34 58 123 125 44 34 102 58 118 97 108 117 101 34 58 123 125 125 44 34 107 58 123 92 34 110 97 109 101 92 34 58 92 34 78 79 68 69 95 78 65 77 69 92 34 125 34 58 123 34 46 34 58 123 125 44 34 102 58 110 97 109 101 34 58 123 125 44 34 102 58 118 97 108 117 101 70 114 111 109 34 58 123 34 46 34 58 123 125 44 34 102 58 102 105 101 108 100 82 101 102 34 58 123 34 46 34 58 123 125 44 34 102 58 97 112 105 86 101 114 115 105 111 110 34 58 123 125 44 34 102 58 102 105 101 108 100 80 97 116 104 34 58 123 125 125 125 125 44 34 107 58 123 92 34 110 97 109 101 92 34 58 92 34 80 79 68 95 67 80 85 95 76 73 77 73 84 92 34 125 34 58 123 34 
46 34 58 123 125 44 34 102 58 110 97 109 101 34 58 123 125 44 34 102 58 118 97 108 117 101 70 114 111 109 34 58 123 34 46 34 58 123 125 44 34 102 58 114 101 115 111 117 114 99 101 70 105 101 108 100 82 101 102 34 58 123 34 46 34 58 123 125 44 34 102 58 100 105 118 105 115 111 114 34 58 123 125 44 34 102 58 114 101 115 111 117 114 99 101 34 58 123 125 125 125 125 44 34 107 58 123 92 34 110 97 109 101 92 34 58 92 34 80 79 68 95 67 80 85 95 82 69 81 85 69 83 84 92 34 125 34 58 123 34 46 34 58 123 125 44 34 102 58 110 97 109 101 34 58 123 125 44 34 102 58 118 97 108 117 101 70 114 111 109 34 58 123 34 46 34 58 123 125 44 34 102 58 114 101 115 111 117 114 99 101 70 105 101 108 100 82 101 102 34 58 123 34 46 34 58 123 125 44 34 102 58 100 105 118 105 115 111 114 34 58 123 125 44 34 102 58 114 101 115 111 117 114 99 101 34 58 123 125 125 125 125 44 34 107 58 123 92 34 110 97 109 101 92 34 58 92 34 80 79 68 95 77 69 77 79 82 89 95 76 73 77 73 84 92 34 125 34 58 123 34 46 34 58 123 125 44 34 102 58 110 97 109 101 34 58 123 125 44 34 102 58 118 97 108 117 101 70 114 111 109 34 58 123 34 46 34 58 123 125 44 34 102 58 114 101 115 111 117 114 99 101 70 105 101 108 100 82 101 102 34 58 123 34 46 34 58 123 125 44 34 102 58 100 105 118 105 115 111 114 34 58 123 125 44 34 102 58 114 101 115 111 117 114 99 101 34 58 123 125 125 125 125 44 34 107 58 123 92 34 110 97 109 101 92 34 58 92 34 80 79 68 95 77 69 77 79 82 89 95 82 69 81 85 69 83 84 92 34 125 34 58 123 34 46 34 58 123 125 44 34 102 58 110 97 109 101 34 58 123 125 44 34 102 58 118 97 108 117 101 70 114 111 109 34 58 123 34 46 34 58 123 125 44 34 102 58 114 101 115 111 117 114 99 101 70 105 101 108 100 82 101 102 34 58 123 34 46 34 58 123 125 44 34 102 58 100 105 118 105 115 111 114 34 58 123 125 44 34 102 58 114 101 115 111 117 114 99 101 34 58 123 125 125 125 125 44 34 107 58 123 92 34 110 97 109 101 92 34 58 92 34 80 79 68 95 78 65 77 69 92 34 125 34 58 123 34 46 34 58 123 125 44 34 102 58 110 97 109 101 34 58 123 125 44 34 
102 58 118 97 108 117 101 70 114 111 109 34 58 123 34 46 34 58 123 125 44 34 102 58 102 105 101 108 100 82 101 102 34 58 123 34 46 34 58 123 125 44 34 102 58 97 112 105 86 101 114 115 105 111 110 34 58 123 125 44 34 102 58 102 105 101 108 100 80 97 116 104 34 58 123 125 125 125 125 44 34 107 58 123 92 34 110 97 109 101 92 34 58 92 34 80 79 68 95 78 65 77 69 83 80 65 67 69 92 34 125 34 58 123 34 46 34 58 123 125 44 34 102 58 110 97 109 101 34 58 123 125 44 34 102 58 118 97 108 117 101 70 114 111 109 34 58 123 34 46 34 58 123 125 44 34 102 58 102 105 101 108 100 82 101 102 34 58 123 34 46 34 58 123 125 44 34 102 58 97 112 105 86 101 114 115 105 111 110 34 58 123 125 44 34 102 58 102 105 101 108 100 80 97 116 104 34 58 123 125 125 125 125 44 34 107 58 123 92 34 110 97 109 101 92 34 58 92 34 82 79 79 75 95 67 69 80 72 95 77 79 78 95 72 79 83 84 92 34 125 34 58 123 34 46 34 58 123 125 44 34 102 58 110 97 109 101 34 58 123 125 44 34 102 58 118 97 108 117 101 70 114 111 109 34 58 123 34 46 34 58 123 125 44 34 102 58 115 101 99 114 101 116 75 101 121 82 101 102 34 58 123 34 46 34 58 123 125 44 34 102 58 107 101 121 34 58 123 125 44 34 102 58 110 97 109 101 34 58 123 125 125 125 125 44 34 107 58 123 92 34 110 97 109 101 92 34 58 92 34 82 79 79 75 95 67 69 80 72 95 77 79 78 95 73 78 73 84 73 65 76 95 77 69 77 66 69 82 83 92 34 125 34 58 123 34 46 34 58 123 125 44 34 102 58 110 97 109 101 34 58 123 125 44 34 102 58 118 97 108 117 101 70 114 111 109 34 58 123 34 46 34 58 123 125 44 34 102 58 115 101 99 114 101 116 75 101 121 82 101 102 34 58 123 34 46 34 58 123 125 44 34 102 58 107 101 121 34 58 123 125 44 34 102 58 110 97 109 101 34 58 123 125 125 125 125 44 34 107 58 123 92 34 110 97 109 101 92 34 58 92 34 82 79 79 75 95 80 79 68 95 73 80 92 34 125 34 58 123 34 46 34 58 123 125 44 34 102 58 110 97 109 101 34 58 123 125 44 34 102 58 118 97 108 117 101 70 114 111 109 34 58 123 34 46 34 58 123 125 44 34 102 58 102 105 101 108 100 82 101 102 34 58 123 34 46 34 58 123 125 44 34 
102 58 97 112 105 86 101 114 115 105 111 110 34 58 123 125 44 34 102 58 102 105 101 108 100 80 97 116 104 34 58 123 125 125 125 125 125 44 34 102 58 105 109 97 103 101 34 58 123 125 44 34 102 58 105 109 97 103 101 80 117 108 108 80 111 108 105 99 121 34 58 123 125 44 34 102 58 110 97 109 101 34 58 123 125 44 34 102 58 112 111 114 116 115 34 58 123 34 46 34 58 123 125 44 34 107 58 123 92 34 99 111 110 116 97 105 110 101 114 80 111 114 116 92 34 58 54 55 56 57 44 92 34 112 114 111 116 111 99 111 108 92 34 58 92 34 84 67 80 92 34 125 34 58 123 34 46 34 58 123 125 44 34 102 58 99 111 110 116 97 105 110 101 114 80 111 114 116 34 58 123 125 44 34 102 58 110 97 109 101 34 58 123 125 44 34 102 58 112 114 111 116 111 99 111 108 34 58 123 125 125 125 44 34 102 58 114 101 115 111 117 114 99 101 115 34 58 123 125 44 34 102 58 115 101 99 117 114 105 116 121 67 111 110 116 101 120 116 34 58 123 34 46 34 58 123 125 44 34 102 58 112 114 105 118 105 108 101 103 101 100 34 58 123 125 125 44 34 102 58 116 101 114 109 105 110 97 116 105 111 110 77 101 115 115 97 103 101 80 97 116 104 34 58 123 125 44 34 102 58 116 101 114 109 105 110 97 116 105 111 110 77 101 115 115 97 103 101 80 111 108 105 99 121 34 58 123 125 44 34 102 58 118 111 108 117 109 101 77 111 117 110 116 115 34 58 123 34 46 34 58 123 125 44 34 107 58 123 92 34 109 111 117 110 116 80 97 116 104 92 34 58 92 34 47 101 116 99 47 99 101 112 104 92 34 125 34 58 123 34 46 34 58 123 125 44 34 102 58 109 111 117 110 116 80 97 116 104 34 58 123 125 44 34 102 58 110 97 109 101 34 58 123 125 44 34 102 58 114 101 97 100 79 110 108 121 34 58 123 125 125 44 34 107 58 123 92 34 109 111 117 110 116 80 97 116 104 92 34 58 92 34 47 101 116 99 47 99 101 112 104 47 107 101 121 114 105 110 103 45 115 116 111 114 101 47 92 34 125 34 58 123 34 46 34 58 123 125 44 34 102 58 109 111 117 110 116 80 97 116 104 34 58 123 125 44 34 102 58 110 97 109 101 34 58 123 125 44 34 102 58 114 101 97 100 79 110 108 121 34 58 123 125 125 44 34 107 58 123 92 34 
109 111 117 110 116 80 97 116 104 92 34 58 92 34 47 118 97 114 47 108 105 98 47 99 101 112 104 47 99 114 97 115 104 92 34 125 34 58 123 34 46 34 58 123 125 44 34 102 58 109 111 117 110 116 80 97 116 104 34 58 123 125 44 34 102 58 110 97 109 101 34 58 123 125 125 44 34 107 58 123 92 34 109 111 117 110 116 80 97 116 104 92 34 58 92 34 47 118 97 114 47 108 105 98 47 99 101 112 104 47 109 111 110 47 99 101 112 104 45 99 92 34 125 34 58 123 34 46 34 58 123 125 44 34 102 58 109 111 117 110 116 80 97 116 104 34 58 123 125 44 34 102 58 110 97 109 101 34 58 123 125 125 44 34 107 58 123 92 34 109 111 117 110 116 80 97 116 104 92 34 58 92 34 47 118 97 114 47 108 111 103 47 99 101 112 104 92 34 125 34 58 123 34 46 34 58 123 125 44 34 102 58 109 111 117 110 116 80 97 116 104 34 58 123 125 44 34 102 58 110 97 109 101 34 58 123 125 125 125 125 125 44 34 102 58 100 110 115 80 111 108 105 99 121 34 58 123 125 44 34 102 58 101 110 97 98 108 101 83 101 114 118 105 99 101 76 105 110 107 115 34 58 123 125 44 34 102 58 114 101 115 116 97 114 116 80 111 108 105 99 121 34 58 123 125 44 34 102 58 115 99 104 101 100 117 108 101 114 78 97 109 101 34 58 123 125 44 34 102 58 115 101 99 117 114 105 116 121 67 111 110 116 101 120 116 34 58 123 125 44 34 102 58 116 101 114 109 105 110 97 116 105 111 110 71 114 97 99 101 80 101 114 105 111 100 83 101 99 111 110 100 115 34 58 123 125 44 34 102 58 116 111 108 101 114 97 116 105 111 110 115 34 58 123 125 44 34 102 58 118 111 108 117 109 101 115 34 58 123 34 46 34 58 123 125 44 34 107 58 123 92 34 110 97 109 101 92 34 58 92 34 99 101 112 104 45 100 97 101 109 111 110 45 100 97 116 97 92 34 125 34 58 123 34 46 34 58 123 125 44 34 102 58 104 111 115 116 80 97 116 104 34 58 123 34 46 34 58 123 125 44 34 102 58 112 97 116 104 34 58 123 125 44 34 102 58 116 121 112 101 34 58 123 125 125 44 34 102 58 110 97 109 101 34 58 123 125 125 44 34 107 58 123 92 34 110 97 109 101 92 34 58 92 34 114 111 111 107 45 99 101 112 104 45 99 114 97 115 104 92 34 125 34 58 
123 34 46 34 58 123 125 44 34 102 58 104 111 115 116 80 97 116 104 34 58 123 34 46 34 58 123 125 44 34 102 58 112 97 116 104 34 58 123 125 44 34 102 58 116 121 112 101 34 58 123 125 125 44 34 102 58 110 97 109 101 34 58 123 125 125 44 34 107 58 123 92 34 110 97 109 101 92 34 58 92 34 114 111 111 107 45 99 101 112 104 45 108 111 103 92 34 125 34 58 123 34 46 34 58 123 125 44 34 102 58 104 111 115 116 80 97 116 104 34 58 123 34 46 34 58 123 125 44 34 102 58 112 97 116 104 34 58 123 125 44 34 102 58 116 121 112 101 34 58 123 125 125 44 34 102 58 110 97 109 101 34 58 123 125 125 44 34 107 58 123 92 34 110 97 109 101 92 34 58 92 34 114 111 111 107 45 99 101 112 104 45 109 111 110 115 45 107 101 121 114 105 110 103 92 34 125 34 58 123 34 46 34 58 123 125 44 34 102 58 110 97 109 101 34 58 123 125 44 34 102 58 115 101 99 114 101 116 34 58 123 34 46 34 58 123 125 44 34 102 58 100 101 102 97 117 108 116 77 111 100 101 34 58 123 125 44 34 102 58 115 101 99 114 101 116 78 97 109 101 34 58 123 125 125 125 44 34 107 58 123 92 34 110 97 109 101 92 34 58 92 34 114 111 111 107 45 99 111 110 102 105 103 45 111 118 101 114 114 105 100 101 92 34 125 34 58 123 34 46 34 58 123 125 44 34 102 58 99 111 110 102 105 103 77 97 112 34 58 123 34 46 34 58 123 125 44 34 102 58 100 101 102 97 117 108 116 77 111 100 101 34 58 123 125 44 34 102 58 105 116 101 109 115 34 58 123 125 44 34 102 58 110 97 109 101 34 58 123 125 125 44 34 102 58 110 97 109 101 34 58 123 125 125 125 125 125],}} {kubelet Update v1 2020-05-05 07:48:12 +0000 UTC FieldsV1 &FieldsV1{Raw:*[123 34 102 58 115 116 97 116 117 115 34 58 123 34 102 58 99 111 110 100 105 116 105 111 110 115 34 58 123 34 107 58 123 92 34 116 121 112 101 92 34 58 92 34 67 111 110 116 97 105 110 101 114 115 82 101 97 100 121 92 34 125 34 58 123 34 46 34 58 123 125 44 34 102 58 108 97 115 116 80 114 111 98 101 84 105 109 101 34 58 123 125 44 34 102 58 108 97 115 116 84 114 97 110 115 105 116 105 111 110 84 105 109 101 34 58 123 125 44 34 102 58 109 101 115 
115 97 103 101 34 58 123 125 44 34 102 58 114 101 97 115 111 110 34 58 123 125 44 34 102 58 115 116 97 116 117 115 34 58 123 125 44 34 102 58 116 121 112 101 34 58 123 125 125 44 34 107 58 123 92 34 116 121 112 101 92 34 58 92 34 73 110 105 116 105 97 108 105 122 101 100 92 34 125 34 58 123 34 46 34 58 123 125 44 34 102 58 108 97 115 116 80 114 111 98 101 84 105 109 101 34 58 123 125 44 34 102 58 108 97 115 116 84 114 97 110 115 105 116 105 111 110 84 105 109 101 34 58 123 125 44 34 102 58 115 116 97 116 117 115 34 58 123 125 44 34 102 58 116 121 112 101 34 58 123 125 125 44 34 107 58 123 92 34 116 121 112 101 92 34 58 92 34 82 101 97 100 121 92 34 125 34 58 123 34 46 34 58 123 125 44 34 102 58 108 97 115 116 80 114 111 98 101 84 105 109 101 34 58 123 125 44 34 102 58 108 97 115 116 84 114 97 110 115 105 116 105 111 110 84 105 109 101 34 58 123 125 44 34 102 58 109 101 115 115 97 103 101 34 58 123 125 44 34 102 58 114 101 97 115 111 110 34 58 123 125 44 34 102 58 115 116 97 116 117 115 34 58 123 125 44 34 102 58 116 121 112 101 34 58 123 125 125 125 44 34 102 58 99 111 110 116 97 105 110 101 114 83 116 97 116 117 115 101 115 34 58 123 125 44 34 102 58 104 111 115 116 73 80 34 58 123 125 44 34 102 58 115 116 97 114 116 84 105 109 101 34 58 123 125 125 
125],}}]},Spec:PodSpec{Volumes:[]Volume{Volume{Name:rook-config-override,VolumeSource:VolumeSource{HostPath:nil,EmptyDir:nil,GCEPersistentDisk:nil,AWSElasticBlockStore:nil,GitRepo:nil,Secret:nil,NFS:nil,ISCSI:nil,Glusterfs:nil,PersistentVolumeClaim:nil,RBD:nil,FlexVolume:nil,Cinder:nil,CephFS:nil,Flocker:nil,DownwardAPI:nil,FC:nil,AzureFile:nil,ConfigMap:&ConfigMapVolumeSource{LocalObjectReference:LocalObjectReference{Name:rook-config-override,},Items:[]KeyToPath{KeyToPath{Key:config,Path:ceph.conf,Mode:*292,},},DefaultMode:*420,Optional:nil,},VsphereVolume:nil,Quobyte:nil,AzureDisk:nil,PhotonPersistentDisk:nil,PortworxVolume:nil,ScaleIO:nil,Projected:nil,StorageOS:nil,CSI:nil,},},Volume{Name:rook-ceph-mons-keyring,VolumeSource:VolumeSource{HostPath:nil,EmptyDir:nil,GCEPersistentDisk:nil,AWSElasticBlockStore:nil,GitRepo:nil,Secret:&SecretVolumeSource{SecretName:rook-ceph-mons-keyring,Items:[]KeyToPath{},DefaultMode:*420,Optional:nil,},NFS:nil,ISCSI:nil,Glusterfs:nil,PersistentVolumeClaim:nil,RBD:nil,FlexVolume:nil,Cinder:nil,CephFS:nil,Flocker:nil,DownwardAPI:nil,FC:nil,AzureFile:nil,ConfigMap:nil,VsphereVolume:nil,Quobyte:nil,AzureDisk:nil,PhotonPersistentDisk:nil,PortworxVolume:nil,ScaleIO:nil,Projected:nil,StorageOS:nil,CSI:nil,},},Volume{Name:rook-ceph-log,VolumeSource:VolumeSource{HostPath:&HostPathVolumeSource{Path:/var/lib/rook/rook/log,Type:*,},EmptyDir:nil,GCEPersistentDisk:nil,AWSElasticBlockStore:nil,GitRepo:nil,Secret:nil,NFS:nil,ISCSI:nil,Glusterfs:nil,PersistentVolumeClaim:nil,RBD:nil,FlexVolume:nil,Cinder:nil,CephFS:nil,Flocker:nil,DownwardAPI:nil,FC:nil,AzureFile:nil,ConfigMap:nil,VsphereVolume:nil,Quobyte:nil,AzureDisk:nil,PhotonPersistentDisk:nil,PortworxVolume:nil,ScaleIO:nil,Projected:nil,StorageOS:nil,CSI:nil,},},Volume{Name:rook-ceph-crash,VolumeSource:VolumeSource{HostPath:&HostPathVolumeSource{Path:/var/lib/rook/rook/crash,Type:*,},EmptyDir:nil,GCEPersistentDisk:nil,AWSElasticBlockStore:nil,GitRepo:nil,Secret:nil,NFS:nil,ISCSI:nil,Glusterfs:n
il,PersistentVolumeClaim:nil,RBD:nil,FlexVolume:nil,Cinder:nil,CephFS:nil,Flocker:nil,DownwardAPI:nil,FC:nil,AzureFile:nil,ConfigMap:nil,VsphereVolume:nil,Quobyte:nil,AzureDisk:nil,PhotonPersistentDisk:nil,PortworxVolume:nil,ScaleIO:nil,Projected:nil,StorageOS:nil,CSI:nil,},},Volume{Name:ceph-daemon-data,VolumeSource:VolumeSource{HostPath:&HostPathVolumeSource{Path:/var/lib/rook/mon-c/data,Type:*,},EmptyDir:nil,GCEPersistentDisk:nil,AWSElasticBlockStore:nil,GitRepo:nil,Secret:nil,NFS:nil,ISCSI:nil,Glusterfs:nil,PersistentVolumeClaim:nil,RBD:nil,FlexVolume:nil,Cinder:nil,CephFS:nil,Flocker:nil,DownwardAPI:nil,FC:nil,AzureFile:nil,ConfigMap:nil,VsphereVolume:nil,Quobyte:nil,AzureDisk:nil,PhotonPersistentDisk:nil,PortworxVolume:nil,ScaleIO:nil,Projected:nil,StorageOS:nil,CSI:nil,},},Volume{Name:default-token-cdzd4,VolumeSource:VolumeSource{HostPath:nil,EmptyDir:nil,GCEPersistentDisk:nil,AWSElasticBlockStore:nil,GitRepo:nil,Secret:&SecretVolumeSource{SecretName:default-token-cdzd4,Items:[]KeyToPath{},DefaultMode:*420,Optional:nil,},NFS:nil,ISCSI:nil,Glusterfs:nil,PersistentVolumeClaim:nil,RBD:nil,FlexVolume:nil,Cinder:nil,CephFS:nil,Flocker:nil,DownwardAPI:nil,FC:nil,AzureFile:nil,ConfigMap:nil,VsphereVolume:nil,Quobyte:nil,AzureDisk:nil,PhotonPersistentDisk:nil,PortworxVolume:nil,ScaleIO:nil,Projected:nil,StorageOS:nil,CSI:nil,},},},Containers:[]Container{Container{Name:mon,Image:rook/ceph:v1.3.2,Command:[/tini],Args:[-- sleep 
3600],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:client,HostPort:0,ContainerPort:6789,Protocol:TCP,HostIP:,},},Env:[]EnvVar{EnvVar{Name:CONTAINER_IMAGE,Value:ceph/ceph:v14.2.8,ValueFrom:nil,},EnvVar{Name:POD_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:metadata.name,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},EnvVar{Name:POD_NAMESPACE,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:metadata.namespace,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},EnvVar{Name:NODE_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:spec.nodeName,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},EnvVar{Name:POD_MEMORY_LIMIT,Value:,ValueFrom:&EnvVarSource{FieldRef:nil,ResourceFieldRef:&ResourceFieldSelector{ContainerName:,Resource:limits.memory,Divisor:{{0 0} {<nil>} 0 DecimalSI},},ConfigMapKeyRef:nil,SecretKeyRef:nil,},},EnvVar{Name:POD_MEMORY_REQUEST,Value:,ValueFrom:&EnvVarSource{FieldRef:nil,ResourceFieldRef:&ResourceFieldSelector{ContainerName:,Resource:requests.memory,Divisor:{{0 0} {<nil>} 0 DecimalSI},},ConfigMapKeyRef:nil,SecretKeyRef:nil,},},EnvVar{Name:POD_CPU_LIMIT,Value:,ValueFrom:&EnvVarSource{FieldRef:nil,ResourceFieldRef:&ResourceFieldSelector{ContainerName:,Resource:limits.cpu,Divisor:{{1 0} {<nil>} 1 DecimalSI},},ConfigMapKeyRef:nil,SecretKeyRef:nil,},},EnvVar{Name:POD_CPU_REQUEST,Value:,ValueFrom:&EnvVarSource{FieldRef:nil,ResourceFieldRef:&ResourceFieldSelector{ContainerName:,Resource:requests.cpu,Divisor:{{0 0} {<nil>} 0 
DecimalSI},},ConfigMapKeyRef:nil,SecretKeyRef:nil,},},EnvVar{Name:ROOK_CEPH_MON_HOST,Value:,ValueFrom:&EnvVarSource{FieldRef:nil,ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:&SecretKeySelector{LocalObjectReference:LocalObjectReference{Name:rook-ceph-config,},Key:mon_host,Optional:nil,},},},EnvVar{Name:ROOK_CEPH_MON_INITIAL_MEMBERS,Value:,ValueFrom:&EnvVarSource{FieldRef:nil,ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:&SecretKeySelector{LocalObjectReference:LocalObjectReference{Name:rook-ceph-config,},Key:mon_initial_members,Optional:nil,},},},EnvVar{Name:ROOK_POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:rook-config-override,ReadOnly:true,MountPath:/etc/ceph,SubPath:,MountPropagation:nil,SubPathExpr:,},VolumeMount{Name:rook-ceph-mons-keyring,ReadOnly:true,MountPath:/etc/ceph/keyring-store/,SubPath:,MountPropagation:nil,SubPathExpr:,},VolumeMount{Name:rook-ceph-log,ReadOnly:false,MountPath:/var/log/ceph,SubPath:,MountPropagation:nil,SubPathExpr:,},VolumeMount{Name:rook-ceph-crash,ReadOnly:false,MountPath:/var/lib/ceph/crash,SubPath:,MountPropagation:nil,SubPathExpr:,},VolumeMount{Name:ceph-daemon-data,ReadOnly:false,MountPath:/var/lib/ceph/mon/ceph-c,SubPath:,MountPropagation:nil,SubPathExpr:,},VolumeMount{Name:default-token-cdzd4,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:nil,Privileged:*false,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,},Stdin:false,StdinO
nce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,},},RestartPolicy:Always,TerminationGracePeriodSeconds:*30,ActiveDeadlineSeconds:nil,DNSPolicy:ClusterFirst,NodeSelector:map[string]string{},ServiceAccountName:default,DeprecatedServiceAccount:default,NodeName:suraj-lk-cluster-pool-1-worker-2,HostNetwork:false,HostPID:false,HostIPC:false,SecurityContext:&PodSecurityContext{SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,SupplementalGroups:[],FSGroup:nil,RunAsGroup:nil,Sysctls:[]Sysctl{},WindowsOptions:nil,},ImagePullSecrets:[]LocalObjectReference{},Hostname:,Subdomain:,Affinity:&Affinity{NodeAffinity:&NodeAffinity{RequiredDuringSchedulingIgnoredDuringExecution:&NodeSelector{NodeSelectorTerms:[]NodeSelectorTerm{NodeSelectorTerm{MatchExpressions:[]NodeSelectorRequirement{NodeSelectorRequirement{Key:storage.lokomotive.io,Operator:Exists,Values:[],},},MatchFields:[]NodeSelectorRequirement{},},},},PreferredDuringSchedulingIgnoredDuringExecution:[]PreferredSchedulingTerm{},},PodAffinity:nil,PodAntiAffinity:&PodAntiAffinity{RequiredDuringSchedulingIgnoredDuringExecution:[]PodAffinityTerm{PodAffinityTerm{LabelSelector:&v1.LabelSelector{MatchLabels:map[string]string{app: rook-ceph-mon,},MatchExpressions:[]LabelSelectorRequirement{},},Namespaces:[],TopologyKey:kubernetes.io/hostname,},},PreferredDuringSchedulingIgnoredDuringExecution:[]WeightedPodAffinityTerm{},},},SchedulerName:default-scheduler,InitContainers:[]Container{},AutomountServiceAccountToken:nil,Tolerations:[]Toleration{Toleration{Key:storage.lokomotive.io,Operator:Equal,Value:ceph,Effect:NoSchedule,TolerationSeconds:nil,},Toleration{Key:node.kubernetes.io/not-ready,Operator:Exists,Value:,Effect:NoExecute,TolerationSeconds:*300,},Toleration{Key:node.kubernetes.io/unreachable,Operator:Exists,Value:,Effect:NoExecute,TolerationSeconds:*300,},},HostAliases:[]HostAlias{},PriorityClassName:,Priority:*0,DNSConfig:nil,ShareProcessNamespace:nil,ReadinessGates:[]PodReadinessGate{},RuntimeClassName:nil,EnableServiceLinks:*true,PreemptionPolicy:nil,Overhead:ResourceList{},TopologySpreadConstraints:[]TopologySpreadConstraint{},EphemeralContainers:[]EphemeralContainer{},},Status:PodStatus{Phase:Pending,Conditions:[]PodCondition{PodCondition{Type:Initialized,Status:True,LastProbeTime:0001-01-01 00:00:00 +0000 UTC,LastTransitionTime:2020-05-05 07:48:12 +0000 UTC,Reason:,Message:,},PodCondition{Type:Ready,Status:False,LastProbeTime:0001-01-01 00:00:00 +0000 UTC,LastTransitionTime:2020-05-05 07:48:12 +0000 UTC,Reason:ContainersNotReady,Message:containers with unready status: [mon],},PodCondition{Type:ContainersReady,Status:False,LastProbeTime:0001-01-01 00:00:00 +0000 UTC,LastTransitionTime:2020-05-05 07:48:12 +0000 UTC,Reason:ContainersNotReady,Message:containers with unready status: [mon],},PodCondition{Type:PodScheduled,Status:True,LastProbeTime:0001-01-01 00:00:00 +0000 UTC,LastTransitionTime:2020-05-05 07:48:12 +0000 UTC,Reason:,Message:,},},Message:,Reason:,HostIP:10.88.81.5,PodIP:,StartTime:2020-05-05 07:48:12 +0000 UTC,ContainerStatuses:[]ContainerStatus{ContainerStatus{Name:mon,State:ContainerState{Waiting:&ContainerStateWaiting{Reason:ContainerCreating,Message:,},Running:nil,Terminated:nil,},LastTerminationState:ContainerState{Waiting:nil,Running:nil,Terminated:nil,},Ready:false,RestartCount:0,Image:rook/ceph:v1.3.2,ImageID:,ContainerID:,Started:*false,},},QOSClass:BestEffort,InitContainerStatuses:[]ContainerStatus{},NominatedNodeName:,PodIPs:[]PodIP{},EphemeralContainerStatuses:[]ContainerStatus{},},}
2020-05-05 07:48:13.151 [INFO][21194] ipam_plugin.go 211: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="48990cd8c7c6ddad749e103281e01b4d0e7982aab91d951e59da3712ebd61833" HandleID="k8s-pod-network.48990cd8c7c6ddad749e103281e01b4d0e7982aab91d951e59da3712ebd61833" Workload="suraj--lk--cluster--pool--1--worker--2-k8s-rook--ceph--mon--c--canary--684d9878cf--twts6-eth0"
2020-05-05 07:48:13.173 [INFO][21194] ipam_plugin.go 223: Calico CNI IPAM handle=k8s-pod-network.48990cd8c7c6ddad749e103281e01b4d0e7982aab91d951e59da3712ebd61833 ContainerID="48990cd8c7c6ddad749e103281e01b4d0e7982aab91d951e59da3712ebd61833" HandleID="k8s-pod-network.48990cd8c7c6ddad749e103281e01b4d0e7982aab91d951e59da3712ebd61833" Workload="suraj--lk--cluster--pool--1--worker--2-k8s-rook--ceph--mon--c--canary--684d9878cf--twts6-eth0"
2020-05-05 07:48:13.174 [INFO][21194] ipam_plugin.go 233: Auto assigning IP ContainerID="48990cd8c7c6ddad749e103281e01b4d0e7982aab91d951e59da3712ebd61833" HandleID="k8s-pod-network.48990cd8c7c6ddad749e103281e01b4d0e7982aab91d951e59da3712ebd61833" Workload="suraj--lk--cluster--pool--1--worker--2-k8s-rook--ceph--mon--c--canary--684d9878cf--twts6-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc0004ab140), Attrs:map[string]string{"namespace":"rook", "node":"suraj-lk-cluster-pool-1-worker-2", "pod":"rook-ceph-mon-c-canary-684d9878cf-twts6"}, Hostname:"suraj-lk-cluster-pool-1-worker-2", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0}
2020-05-05 07:48:13.174 [INFO][21194] ipam.go 87: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'suraj-lk-cluster-pool-1-worker-2'
2020-05-05 07:48:13.182 [INFO][21194] ipam.go 313: Looking up existing affinities for host handle="k8s-pod-network.48990cd8c7c6ddad749e103281e01b4d0e7982aab91d951e59da3712ebd61833" host="suraj-lk-cluster-pool-1-worker-2"
2020-05-05 07:48:13.189 [INFO][21194] ipam.go 381: Trying affinity for 10.2.76.0/24 handle="k8s-pod-network.48990cd8c7c6ddad749e103281e01b4d0e7982aab91d951e59da3712ebd61833" host="suraj-lk-cluster-pool-1-worker-2"
2020-05-05 07:48:13.194 [INFO][21194] ipam.go 135: Attempting to load block cidr=10.2.76.0/24 host="suraj-lk-cluster-pool-1-worker-2"
2020-05-05 07:48:13.199 [INFO][21194] ipam.go 212: Affinity is confirmed and block has been loaded cidr=10.2.76.0/24 host="suraj-lk-cluster-pool-1-worker-2"
2020-05-05 07:48:13.199 [INFO][21194] ipam.go 804: Attempting to assign 1 addresses from block block=10.2.76.0/24 handle="k8s-pod-network.48990cd8c7c6ddad749e103281e01b4d0e7982aab91d951e59da3712ebd61833" host="suraj-lk-cluster-pool-1-worker-2"
2020-05-05 07:48:13.203 [INFO][21194] ipam.go 1265: Creating new handle: k8s-pod-network.48990cd8c7c6ddad749e103281e01b4d0e7982aab91d951e59da3712ebd61833
2020-05-05 07:48:13.209 [INFO][21194] ipam.go 827: Writing block in order to claim IPs block=10.2.76.0/24 handle="k8s-pod-network.48990cd8c7c6ddad749e103281e01b4d0e7982aab91d951e59da3712ebd61833" host="suraj-lk-cluster-pool-1-worker-2"
2020-05-05 07:48:13.220 [INFO][21194] ipam.go 840: Successfully claimed IPs: [10.2.76.4/24] block=10.2.76.0/24 handle="k8s-pod-network.48990cd8c7c6ddad749e103281e01b4d0e7982aab91d951e59da3712ebd61833" host="suraj-lk-cluster-pool-1-worker-2"
2020-05-05 07:48:13.221 [INFO][21194] ipam.go 413: Block '10.2.76.0/24' provided addresses: [10.2.76.4/24] handle="k8s-pod-network.48990cd8c7c6ddad749e103281e01b4d0e7982aab91d951e59da3712ebd61833" host="suraj-lk-cluster-pool-1-worker-2"
2020-05-05 07:48:13.225 [INFO][21194] ipam.go 569: Auto-assigned 1 out of 1 IPv4s: [10.2.76.4/24] handle="k8s-pod-network.48990cd8c7c6ddad749e103281e01b4d0e7982aab91d951e59da3712ebd61833" host="suraj-lk-cluster-pool-1-worker-2"
2020-05-05 07:48:13.225 [INFO][21194] ipam_plugin.go 235: Calico CNI IPAM assigned addresses IPv4=[10.2.76.4/24] IPv6=[] ContainerID="48990cd8c7c6ddad749e103281e01b4d0e7982aab91d951e59da3712ebd61833" HandleID="k8s-pod-network.48990cd8c7c6ddad749e103281e01b4d0e7982aab91d951e59da3712ebd61833" Workload="suraj--lk--cluster--pool--1--worker--2-k8s-rook--ceph--mon--c--canary--684d9878cf--twts6-eth0"
2020-05-05 07:48:13.225 [INFO][21194] ipam_plugin.go 261: IPAM Result ContainerID="48990cd8c7c6ddad749e103281e01b4d0e7982aab91d951e59da3712ebd61833" HandleID="k8s-pod-network.48990cd8c7c6ddad749e103281e01b4d0e7982aab91d951e59da3712ebd61833" Workload="suraj--lk--cluster--pool--1--worker--2-k8s-rook--ceph--mon--c--canary--684d9878cf--twts6-eth0" result.IPs=[]*current.IPConfig{(*current.IPConfig)(0xc00040a1e0)}
2020-05-05 07:48:13.226 [INFO][21178] k8s.go 358: Populated endpoint ContainerID="48990cd8c7c6ddad749e103281e01b4d0e7982aab91d951e59da3712ebd61833" Namespace="rook" Pod="rook-ceph-mon-c-canary-684d9878cf-twts6" WorkloadEndpoint="suraj--lk--cluster--pool--1--worker--2-k8s-rook--ceph--mon--c--canary--684d9878cf--twts6-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"suraj--lk--cluster--pool--1--worker--2-k8s-rook--ceph--mon--c--canary--684d9878cf--twts6-eth0", GenerateName:"rook-ceph-mon-c-canary-684d9878cf-", Namespace:"rook", SelfLink:"", UID:"d9da635b-1258-4907-8c10-5f125faa3d9b", ResourceVersion:"8682", Generation:0, CreationTimestamp:v1.Time{Time:time.Time{wall:0x0, ext:63724261692, loc:(*time.Location)(0x299c340)}}, DeletionTimestamp:(*v1.Time)(nil), DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app":"rook-ceph-mon", "ceph_daemon_id":"c", "mon":"c", "mon_canary":"true", "mon_cluster":"rook", "pod-template-hash":"684d9878cf", "projectcalico.org/namespace":"rook", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"default", "rook_cluster":"rook"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ClusterName:"", ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"suraj-lk-cluster-pool-1-worker-2", ContainerID:"", Pod:"rook-ceph-mon-c-canary-684d9878cf-twts6", Endpoint:"eth0", IPNetworks:[]string{"10.2.76.4/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.rook", "ksa.rook.default"}, InterfaceName:"cali6e7c0d49c1c", MAC:"", Ports:[]v3.EndpointPort{v3.EndpointPort{Name:"client", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x1a85}}}}
2020-05-05 07:48:13.226 [INFO][21178] k8s.go 359: Calico CNI using IPs: [10.2.76.4/32] ContainerID="48990cd8c7c6ddad749e103281e01b4d0e7982aab91d951e59da3712ebd61833" Namespace="rook" Pod="rook-ceph-mon-c-canary-684d9878cf-twts6" WorkloadEndpoint="suraj--lk--cluster--pool--1--worker--2-k8s-rook--ceph--mon--c--canary--684d9878cf--twts6-eth0"
2020-05-05 07:48:13.226 [INFO][21178] network_linux.go 76: Setting the host side veth name to cali6e7c0d49c1c ContainerID="48990cd8c7c6ddad749e103281e01b4d0e7982aab91d951e59da3712ebd61833" Namespace="rook" Pod="rook-ceph-mon-c-canary-684d9878cf-twts6" WorkloadEndpoint="suraj--lk--cluster--pool--1--worker--2-k8s-rook--ceph--mon--c--canary--684d9878cf--twts6-eth0"
2020-05-05 07:48:13.227 [INFO][21178] network_linux.go 396: Disabling IPv4 forwarding ContainerID="48990cd8c7c6ddad749e103281e01b4d0e7982aab91d951e59da3712ebd61833" Namespace="rook" Pod="rook-ceph-mon-c-canary-684d9878cf-twts6" WorkloadEndpoint="suraj--lk--cluster--pool--1--worker--2-k8s-rook--ceph--mon--c--canary--684d9878cf--twts6-eth0"
2020-05-05 07:48:13.242 [INFO][21178] k8s.go 385: Added Mac, interface name, and active container ID to endpoint ContainerID="48990cd8c7c6ddad749e103281e01b4d0e7982aab91d951e59da3712ebd61833" Namespace="rook" Pod="rook-ceph-mon-c-canary-684d9878cf-twts6" WorkloadEndpoint="suraj--lk--cluster--pool--1--worker--2-k8s-rook--ceph--mon--c--canary--684d9878cf--twts6-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"suraj--lk--cluster--pool--1--worker--2-k8s-rook--ceph--mon--c--canary--684d9878cf--twts6-eth0", GenerateName:"rook-ceph-mon-c-canary-684d9878cf-", Namespace:"rook", SelfLink:"", UID:"d9da635b-1258-4907-8c10-5f125faa3d9b", ResourceVersion:"8682", Generation:0, CreationTimestamp:v1.Time{Time:time.Time{wall:0x0, ext:63724261692, loc:(*time.Location)(0x299c340)}}, DeletionTimestamp:(*v1.Time)(nil), DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app":"rook-ceph-mon", "ceph_daemon_id":"c", "mon":"c", "mon_canary":"true", "mon_cluster":"rook", "pod-template-hash":"684d9878cf", "projectcalico.org/namespace":"rook", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"default", "rook_cluster":"rook"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ClusterName:"", ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"suraj-lk-cluster-pool-1-worker-2", ContainerID:"48990cd8c7c6ddad749e103281e01b4d0e7982aab91d951e59da3712ebd61833", Pod:"rook-ceph-mon-c-canary-684d9878cf-twts6", Endpoint:"eth0", IPNetworks:[]string{"10.2.76.4/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.rook", "ksa.rook.default"}, InterfaceName:"cali6e7c0d49c1c", MAC:"1e:72:1c:88:96:eb", Ports:[]v3.EndpointPort{v3.EndpointPort{Name:"client", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x1a85}}}}
2020-05-05 07:48:13.255 [INFO][21178] k8s.go 417: Wrote updated endpoint to datastore ContainerID="48990cd8c7c6ddad749e103281e01b4d0e7982aab91d951e59da3712ebd61833" Namespace="rook" Pod="rook-ceph-mon-c-canary-684d9878cf-twts6" WorkloadEndpoint="suraj--lk--cluster--pool--1--worker--2-k8s-rook--ceph--mon--c--canary--684d9878cf--twts6-eth0"
I0505 07:48:13.268424 2827 kubelet_resources.go:45] allocatable: map[cpu:{{32 0} {<nil>} 32 DecimalSI} ephemeral-storage:{{108249701812 0} {<nil>} 108249701812 DecimalSI} hugepages-1Gi:{{0 0} {<nil>} 0 DecimalSI} hugepages-2Mi:{{0 0} {<nil>} 0 DecimalSI} memory:{{135041626112 0} {<nil>} 131876588Ki BinarySI} pods:{{110 0} {<nil>} 110 DecimalSI}]
I0505 07:48:13.268564 2827 kubelet_resources.go:45] allocatable: map[cpu:{{32 0} {<nil>} 32 DecimalSI} ephemeral-storage:{{108249701812 0} {<nil>} 108249701812 DecimalSI} hugepages-1Gi:{{0 0} {<nil>} 0 DecimalSI} hugepages-2Mi:{{0 0} {<nil>} 0 DecimalSI} memory:{{135041626112 0} {<nil>} 131876588Ki BinarySI} pods:{{110 0} {<nil>} 110 DecimalSI}]
I0505 07:48:13.268694 2827 kubelet_resources.go:45] allocatable: map[cpu:{{32 0} {<nil>} 32 DecimalSI} ephemeral-storage:{{108249701812 0} {<nil>} 108249701812 DecimalSI} hugepages-1Gi:{{0 0} {<nil>} 0 DecimalSI} hugepages-2Mi:{{0 0} {<nil>} 0 DecimalSI} memory:{{135041626112 0} {<nil>} 131876588Ki BinarySI} pods:{{110 0} {<nil>} 110 DecimalSI}]
I0505 07:48:13.268781 2827 kubelet_resources.go:45] allocatable: map[cpu:{{32 0} {<nil>} 32 DecimalSI} ephemeral-storage:{{108249701812 0} {<nil>} 108249701812 DecimalSI} hugepages-1Gi:{{0 0} {<nil>} 0 DecimalSI} hugepages-2Mi:{{0 0} {<nil>} 0 DecimalSI} memory:{{135041626112 0} {<nil>} 131876588Ki BinarySI} pods:{{110 0} {<nil>} 110 DecimalSI}]
2020-05-05 07:48:21.102 [INFO][21585] plugin.go 496: Extracted identifiers ContainerID="48990cd8c7c6ddad749e103281e01b4d0e7982aab91d951e59da3712ebd61833" Node="suraj-lk-cluster-pool-1-worker-2" Orchestrator="k8s" WorkloadEndpoint="suraj--lk--cluster--pool--1--worker--2-k8s-rook--ceph--mon--c--canary--684d9878cf--twts6-eth0"
2020-05-05 07:48:21.111 [WARNING][21585] workloadendpoint.go 77: Operation Delete is not supported on WorkloadEndpoint type
2020-05-05 07:48:21.111 [INFO][21585] k8s.go 467: Endpoint deletion will be handled by Kubernetes deletion of the Pod. ContainerID="48990cd8c7c6ddad749e103281e01b4d0e7982aab91d951e59da3712ebd61833" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"suraj--lk--cluster--pool--1--worker--2-k8s-rook--ceph--mon--c--canary--684d9878cf--twts6-eth0", GenerateName:"rook-ceph-mon-c-canary-684d9878cf-", Namespace:"rook", SelfLink:"", UID:"d9da635b-1258-4907-8c10-5f125faa3d9b", ResourceVersion:"8820", Generation:0, CreationTimestamp:v1.Time{Time:time.Time{wall:0x0, ext:63724261692, loc:(*time.Location)(0x299c340)}}, DeletionTimestamp:(*v1.Time)(nil), DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app":"rook-ceph-mon", "ceph_daemon_id":"c", "mon":"c", "mon_canary":"true", "mon_cluster":"rook", "pod-template-hash":"684d9878cf", "projectcalico.org/namespace":"rook", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"default", "rook_cluster":"rook"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ClusterName:"", ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"suraj-lk-cluster-pool-1-worker-2", ContainerID:"", Pod:"rook-ceph-mon-c-canary-684d9878cf-twts6", Endpoint:"eth0", IPNetworks:[]string{"10.2.76.4/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.rook", "ksa.rook.default"}, InterfaceName:"cali6e7c0d49c1c", MAC:"", Ports:[]v3.EndpointPort{v3.EndpointPort{Name:"client", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x1a85}}}}
2020-05-05 07:48:21.111 [INFO][21585] k8s.go 474: Cleaning up netns ContainerID="48990cd8c7c6ddad749e103281e01b4d0e7982aab91d951e59da3712ebd61833"
2020-05-05 07:48:21.111 [INFO][21585] network_linux.go 450: Calico CNI deleting device in netns /proc/21119/ns/net ContainerID="48990cd8c7c6ddad749e103281e01b4d0e7982aab91d951e59da3712ebd61833"
2020-05-05 07:48:21.134 [INFO][21585] network_linux.go 467: Calico CNI deleted device in netns /proc/21119/ns/net ContainerID="48990cd8c7c6ddad749e103281e01b4d0e7982aab91d951e59da3712ebd61833"
2020-05-05 07:48:21.134 [INFO][21585] k8s.go 481: Releasing IP address(es) ContainerID="48990cd8c7c6ddad749e103281e01b4d0e7982aab91d951e59da3712ebd61833"
2020-05-05 07:48:21.134 [INFO][21585] utils.go 168: Calico CNI releasing IP address ContainerID="48990cd8c7c6ddad749e103281e01b4d0e7982aab91d951e59da3712ebd61833"
2020-05-05 07:48:21.151 [INFO][21600] ipam_plugin.go 302: Releasing address using handleID ContainerID="48990cd8c7c6ddad749e103281e01b4d0e7982aab91d951e59da3712ebd61833" HandleID="k8s-pod-network.48990cd8c7c6ddad749e103281e01b4d0e7982aab91d951e59da3712ebd61833" Workload="suraj--lk--cluster--pool--1--worker--2-k8s-rook--ceph--mon--c--canary--684d9878cf--twts6-eth0"
2020-05-05 07:48:21.152 [INFO][21600] ipam.go 1166: Releasing all IPs with handle 'k8s-pod-network.48990cd8c7c6ddad749e103281e01b4d0e7982aab91d951e59da3712ebd61833'
2020-05-05 07:48:21.217 [INFO][21600] ipam_plugin.go 314: Released address using handleID ContainerID="48990cd8c7c6ddad749e103281e01b4d0e7982aab91d951e59da3712ebd61833" HandleID="k8s-pod-network.48990cd8c7c6ddad749e103281e01b4d0e7982aab91d951e59da3712ebd61833" Workload="suraj--lk--cluster--pool--1--worker--2-k8s-rook--ceph--mon--c--canary--684d9878cf--twts6-eth0"
2020-05-05 07:48:21.217 [INFO][21600] ipam_plugin.go 323: Releasing address using workloadID ContainerID="48990cd8c7c6ddad749e103281e01b4d0e7982aab91d951e59da3712ebd61833" HandleID="k8s-pod-network.48990cd8c7c6ddad749e103281e01b4d0e7982aab91d951e59da3712ebd61833" Workload="suraj--lk--cluster--pool--1--worker--2-k8s-rook--ceph--mon--c--canary--684d9878cf--twts6-eth0"
2020-05-05 07:48:21.217 [INFO][21600] ipam.go 1166: Releasing all IPs with handle 'rook.rook-ceph-mon-c-canary-684d9878cf-twts6'
2020-05-05 07:48:21.221 [INFO][21585] k8s.go 487: Teardown processing complete. ContainerID="48990cd8c7c6ddad749e103281e01b4d0e7982aab91d951e59da3712ebd61833"
I0505 07:48:21.774442 2827 topology_manager.go:219] [topologymanager] RemoveContainer - Container ID: d064f5047741e0096e9eb130328f8dbf6d7ea914433b4b8abb91670e8889ae37
I0505 07:48:21.791319 2827 topology_manager.go:219] [topologymanager] RemoveContainer - Container ID: d064f5047741e0096e9eb130328f8dbf6d7ea914433b4b8abb91670e8889ae37
E0505 07:48:21.791850 2827 remote_runtime.go:295] ContainerStatus "d064f5047741e0096e9eb130328f8dbf6d7ea914433b4b8abb91670e8889ae37" from runtime service failed: rpc error: code = Unknown desc = Error: No such container: d064f5047741e0096e9eb130328f8dbf6d7ea914433b4b8abb91670e8889ae37
I0505 07:48:21.890550 2827 reconciler.go:196] operationExecutor.UnmountVolume started for volume "ceph-daemon-data" (UniqueName: "kubernetes.io/host-path/d9da635b-1258-4907-8c10-5f125faa3d9b-ceph-daemon-data") pod "d9da635b-1258-4907-8c10-5f125faa3d9b" (UID: "d9da635b-1258-4907-8c10-5f125faa3d9b")
I0505 07:48:21.890617 2827 reconciler.go:196] operationExecutor.UnmountVolume started for volume "rook-config-override" (UniqueName: "kubernetes.io/configmap/d9da635b-1258-4907-8c10-5f125faa3d9b-rook-config-override") pod "d9da635b-1258-4907-8c10-5f125faa3d9b" (UID: "d9da635b-1258-4907-8c10-5f125faa3d9b")
I0505 07:48:21.890705 2827 reconciler.go:196] operationExecutor.UnmountVolume started for volume "default-token-cdzd4" (UniqueName: "kubernetes.io/secret/d9da635b-1258-4907-8c10-5f125faa3d9b-default-token-cdzd4") pod "d9da635b-1258-4907-8c10-5f125faa3d9b" (UID: "d9da635b-1258-4907-8c10-5f125faa3d9b")
I0505 07:48:21.890714 2827 operation_generator.go:782] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/d9da635b-1258-4907-8c10-5f125faa3d9b-ceph-daemon-data" (OuterVolumeSpecName: "ceph-daemon-data") pod "d9da635b-1258-4907-8c10-5f125faa3d9b" (UID: "d9da635b-1258-4907-8c10-5f125faa3d9b"). InnerVolumeSpecName "ceph-daemon-data". PluginName "kubernetes.io/host-path", VolumeGidValue ""
I0505 07:48:21.890755 2827 reconciler.go:196] operationExecutor.UnmountVolume started for volume "rook-ceph-crash" (UniqueName: "kubernetes.io/host-path/d9da635b-1258-4907-8c10-5f125faa3d9b-rook-ceph-crash") pod "d9da635b-1258-4907-8c10-5f125faa3d9b" (UID: "d9da635b-1258-4907-8c10-5f125faa3d9b")
I0505 07:48:21.890812 2827 reconciler.go:196] operationExecutor.UnmountVolume started for volume "rook-ceph-log" (UniqueName: "kubernetes.io/host-path/d9da635b-1258-4907-8c10-5f125faa3d9b-rook-ceph-log") pod "d9da635b-1258-4907-8c10-5f125faa3d9b" (UID: "d9da635b-1258-4907-8c10-5f125faa3d9b")
I0505 07:48:21.890825 2827 operation_generator.go:782] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/d9da635b-1258-4907-8c10-5f125faa3d9b-rook-ceph-crash" (OuterVolumeSpecName: "rook-ceph-crash") pod "d9da635b-1258-4907-8c10-5f125faa3d9b" (UID: "d9da635b-1258-4907-8c10-5f125faa3d9b"). InnerVolumeSpecName "rook-ceph-crash". PluginName "kubernetes.io/host-path", VolumeGidValue ""
I0505 07:48:21.890849 2827 reconciler.go:196] operationExecutor.UnmountVolume started for volume "rook-ceph-mons-keyring" (UniqueName: "kubernetes.io/secret/d9da635b-1258-4907-8c10-5f125faa3d9b-rook-ceph-mons-keyring") pod "d9da635b-1258-4907-8c10-5f125faa3d9b" (UID: "d9da635b-1258-4907-8c10-5f125faa3d9b")
I0505 07:48:21.890883 2827 operation_generator.go:782] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/d9da635b-1258-4907-8c10-5f125faa3d9b-rook-ceph-log" (OuterVolumeSpecName: "rook-ceph-log") pod "d9da635b-1258-4907-8c10-5f125faa3d9b" (UID: "d9da635b-1258-4907-8c10-5f125faa3d9b"). InnerVolumeSpecName "rook-ceph-log". PluginName "kubernetes.io/host-path", VolumeGidValue ""
W0505 07:48:21.890908 2827 empty_dir.go:453] Warning: Failed to clear quota on /var/lib/kubelet/pods/d9da635b-1258-4907-8c10-5f125faa3d9b/volumes/kubernetes.io~configmap/rook-config-override: ClearQuota called, but quotas disabled
I0505 07:48:21.890930 2827 reconciler.go:319] Volume detached for volume "rook-ceph-crash" (UniqueName: "kubernetes.io/host-path/d9da635b-1258-4907-8c10-5f125faa3d9b-rook-ceph-crash") on node "suraj-lk-cluster-pool-1-worker-2" DevicePath ""
I0505 07:48:21.890944 2827 reconciler.go:319] Volume detached for volume "ceph-daemon-data" (UniqueName: "kubernetes.io/host-path/d9da635b-1258-4907-8c10-5f125faa3d9b-ceph-daemon-data") on node "suraj-lk-cluster-pool-1-worker-2" DevicePath ""
I0505 07:48:21.891258 2827 operation_generator.go:782] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d9da635b-1258-4907-8c10-5f125faa3d9b-rook-config-override" (OuterVolumeSpecName: "rook-config-override") pod "d9da635b-1258-4907-8c10-5f125faa3d9b" (UID: "d9da635b-1258-4907-8c10-5f125faa3d9b"). InnerVolumeSpecName "rook-config-override". PluginName "kubernetes.io/configmap", VolumeGidValue ""
I0505 07:48:21.904090 2827 operation_generator.go:782] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d9da635b-1258-4907-8c10-5f125faa3d9b-rook-ceph-mons-keyring" (OuterVolumeSpecName: "rook-ceph-mons-keyring") pod "d9da635b-1258-4907-8c10-5f125faa3d9b" (UID: "d9da635b-1258-4907-8c10-5f125faa3d9b"). InnerVolumeSpecName "rook-ceph-mons-keyring". PluginName "kubernetes.io/secret", VolumeGidValue ""
I0505 07:48:21.906019 2827 operation_generator.go:782] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d9da635b-1258-4907-8c10-5f125faa3d9b-default-token-cdzd4" (OuterVolumeSpecName: "default-token-cdzd4") pod "d9da635b-1258-4907-8c10-5f125faa3d9b" (UID: "d9da635b-1258-4907-8c10-5f125faa3d9b"). InnerVolumeSpecName "default-token-cdzd4". PluginName "kubernetes.io/secret", VolumeGidValue ""
I0505 07:48:21.991294 2827 reconciler.go:319] Volume detached for volume "rook-ceph-log" (UniqueName: "kubernetes.io/host-path/d9da635b-1258-4907-8c10-5f125faa3d9b-rook-ceph-log") on node "suraj-lk-cluster-pool-1-worker-2" DevicePath ""
I0505 07:48:21.991345 2827 reconciler.go:319] Volume detached for volume "rook-ceph-mons-keyring" (UniqueName: "kubernetes.io/secret/d9da635b-1258-4907-8c10-5f125faa3d9b-rook-ceph-mons-keyring") on node "suraj-lk-cluster-pool-1-worker-2" DevicePath ""
I0505 07:48:21.991355 2827 reconciler.go:319] Volume detached for volume "rook-config-override" (UniqueName: "kubernetes.io/configmap/d9da635b-1258-4907-8c10-5f125faa3d9b-rook-config-override") on node "suraj-lk-cluster-pool-1-worker-2" DevicePath ""
I0505 07:48:21.991369 2827 reconciler.go:319] Volume detached for volume "default-token-cdzd4" (UniqueName: "kubernetes.io/secret/d9da635b-1258-4907-8c10-5f125faa3d9b-default-token-cdzd4") on node "suraj-lk-cluster-pool-1-worker-2" DevicePath ""
I0505 07:48:59.030302 2827 topology_manager.go:233] [topologymanager] Topology Admit Handler
I0505 07:48:59.091964 2827 reconciler.go:224] operationExecutor.VerifyControllerAttachedVolume started for volume "rook-ceph-mons-keyring" (UniqueName: "kubernetes.io/secret/d355b8e5-e08a-42e9-8b5a-32bd3229c666-rook-ceph-mons-keyring") pod "rook-ceph-mon-c-cf9d698b5-g88b9" (UID: "d355b8e5-e08a-42e9-8b5a-32bd3229c666")
I0505 07:48:59.092067 2827 reconciler.go:224] operationExecutor.VerifyControllerAttachedVolume started for volume "rook-config-override" (UniqueName: "kubernetes.io/configmap/d355b8e5-e08a-42e9-8b5a-32bd3229c666-rook-config-override") pod "rook-ceph-mon-c-cf9d698b5-g88b9" (UID: "d355b8e5-e08a-42e9-8b5a-32bd3229c666")
I0505 07:48:59.092147 2827 reconciler.go:224] operationExecutor.VerifyControllerAttachedVolume started for volume "ceph-daemon-data" (UniqueName: "kubernetes.io/host-path/d355b8e5-e08a-42e9-8b5a-32bd3229c666-ceph-daemon-data") pod "rook-ceph-mon-c-cf9d698b5-g88b9" (UID: "d355b8e5-e08a-42e9-8b5a-32bd3229c666")
I0505 07:48:59.092197 2827 reconciler.go:224] operationExecutor.VerifyControllerAttachedVolume started for volume "default-token-cdzd4" (UniqueName: "kubernetes.io/secret/d355b8e5-e08a-42e9-8b5a-32bd3229c666-default-token-cdzd4") pod "rook-ceph-mon-c-cf9d698b5-g88b9" (UID: "d355b8e5-e08a-42e9-8b5a-32bd3229c666")
I0505 07:48:59.092302 2827 reconciler.go:224] operationExecutor.VerifyControllerAttachedVolume started for volume "rook-ceph-crash" (UniqueName: "kubernetes.io/host-path/d355b8e5-e08a-42e9-8b5a-32bd3229c666-rook-ceph-crash") pod "rook-ceph-mon-c-cf9d698b5-g88b9" (UID: "d355b8e5-e08a-42e9-8b5a-32bd3229c666")
I0505 07:48:59.092345 2827 reconciler.go:224] operationExecutor.VerifyControllerAttachedVolume started for volume "rook-ceph-log" (UniqueName: "kubernetes.io/host-path/d355b8e5-e08a-42e9-8b5a-32bd3229c666-rook-ceph-log") pod "rook-ceph-mon-c-cf9d698b5-g88b9" (UID: "d355b8e5-e08a-42e9-8b5a-32bd3229c666")
I0505 07:48:59.111916 2827 topology_manager.go:233] [topologymanager] Topology Admit Handler
I0505 07:48:59.192614 2827 reconciler.go:224] operationExecutor.VerifyControllerAttachedVolume started for volume "rook-ceph-crash-collector-keyring" (UniqueName: "kubernetes.io/secret/889e0bea-1099-49a7-a960-568cc6f0298a-rook-ceph-crash-collector-keyring") pod "rook-ceph-crashcollector-suraj-lk-cluster-pool-1-worker-2-j6b8l" (UID: "889e0bea-1099-49a7-a960-568cc6f0298a")
I0505 07:48:59.192713 2827 reconciler.go:224] operationExecutor.VerifyControllerAttachedVolume started for volume "rook-config-override" (UniqueName: "kubernetes.io/configmap/889e0bea-1099-49a7-a960-568cc6f0298a-rook-config-override") pod "rook-ceph-crashcollector-suraj-lk-cluster-pool-1-worker-2-j6b8l" (UID: "889e0bea-1099-49a7-a960-568cc6f0298a")
I0505 07:48:59.192786 2827 reconciler.go:224] operationExecutor.VerifyControllerAttachedVolume started for volume "default-token-cdzd4" (UniqueName: "kubernetes.io/secret/889e0bea-1099-49a7-a960-568cc6f0298a-default-token-cdzd4") pod "rook-ceph-crashcollector-suraj-lk-cluster-pool-1-worker-2-j6b8l" (UID: "889e0bea-1099-49a7-a960-568cc6f0298a")
I0505 07:48:59.192812 2827 reconciler.go:224] operationExecutor.VerifyControllerAttachedVolume started for volume "rook-ceph-log" (UniqueName: "kubernetes.io/host-path/889e0bea-1099-49a7-a960-568cc6f0298a-rook-ceph-log") pod "rook-ceph-crashcollector-suraj-lk-cluster-pool-1-worker-2-j6b8l" (UID: "889e0bea-1099-49a7-a960-568cc6f0298a")
I0505 07:48:59.192930 2827 reconciler.go:224] operationExecutor.VerifyControllerAttachedVolume started for volume "rook-ceph-crash" (UniqueName: "kubernetes.io/host-path/889e0bea-1099-49a7-a960-568cc6f0298a-rook-ceph-crash") pod "rook-ceph-crashcollector-suraj-lk-cluster-pool-1-worker-2-j6b8l" (UID: "889e0bea-1099-49a7-a960-568cc6f0298a")
E0505 07:48:59.294142 2827 secret.go:195] Couldn't get secret rook/rook-ceph-crash-collector-keyring: secret "rook-ceph-crash-collector-keyring" not found
E0505 07:48:59.294260 2827 nestedpendingoperations.go:301] Operation for "{volumeName:kubernetes.io/secret/889e0bea-1099-49a7-a960-568cc6f0298a-rook-ceph-crash-collector-keyring podName:889e0bea-1099-49a7-a960-568cc6f0298a nodeName:}" failed. No retries permitted until 2020-05-05 07:48:59.794227586 +0000 UTC m=+2613.859801762 (durationBeforeRetry 500ms). Error: "MountVolume.SetUp failed for volume \"rook-ceph-crash-collector-keyring\" (UniqueName: \"kubernetes.io/secret/889e0bea-1099-49a7-a960-568cc6f0298a-rook-ceph-crash-collector-keyring\") pod \"rook-ceph-crashcollector-suraj-lk-cluster-pool-1-worker-2-j6b8l\" (UID: \"889e0bea-1099-49a7-a960-568cc6f0298a\") : secret \"rook-ceph-crash-collector-keyring\" not found"
E0505 07:48:59.795776 2827 secret.go:195] Couldn't get secret rook/rook-ceph-crash-collector-keyring: secret "rook-ceph-crash-collector-keyring" not found
E0505 07:48:59.795936 2827 nestedpendingoperations.go:301] Operation for "{volumeName:kubernetes.io/secret/889e0bea-1099-49a7-a960-568cc6f0298a-rook-ceph-crash-collector-keyring podName:889e0bea-1099-49a7-a960-568cc6f0298a nodeName:}" failed. No retries permitted until 2020-05-05 07:49:00.795881953 +0000 UTC m=+2614.861456118 (durationBeforeRetry 1s). Error: "MountVolume.SetUp failed for volume \"rook-ceph-crash-collector-keyring\" (UniqueName: \"kubernetes.io/secret/889e0bea-1099-49a7-a960-568cc6f0298a-rook-ceph-crash-collector-keyring\") pod \"rook-ceph-crashcollector-suraj-lk-cluster-pool-1-worker-2-j6b8l\" (UID: \"889e0bea-1099-49a7-a960-568cc6f0298a\") : secret \"rook-ceph-crash-collector-keyring\" not found"
2020-05-05 07:48:59.843 [INFO][22729] plugin.go 129: Extracted identifiers EndpointIDs=&utils.WEPIdentifiers{Namespace:"rook", WEPName:"", WorkloadEndpointIdentifiers:names.WorkloadEndpointIdentifiers{Node:"suraj-lk-cluster-pool-1-worker-2", Orchestrator:"k8s", Endpoint:"eth0", Workload:"", Pod:"rook-ceph-mon-c-cf9d698b5-g88b9", ContainerID:"512890356b8dd690e46b6d9f68a50fac3b04a96ca2024bb48da5f608ca27ea49"}}
2020-05-05 07:48:59.875 [INFO][22729] plugin.go 220: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {suraj--lk--cluster--pool--1--worker--2-k8s-rook--ceph--mon--c--cf9d698b5--g88b9-eth0 rook-ceph-mon-c-cf9d698b5- rook d355b8e5-e08a-42e9-8b5a-32bd3229c666 9182 0 2020-05-05 07:48:59 +0000 UTC <nil> <nil> map[app:rook-ceph-mon ceph_daemon_id:c mon:c mon_cluster:rook pod-template-hash:cf9d698b5 projectcalico.org/namespace:rook projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:default rook_cluster:rook] map[] [] [] []} {k8s suraj-lk-cluster-pool-1-worker-2 rook-ceph-mon-c-cf9d698b5-g88b9 eth0 [] [] [kns.rook ksa.rook.default] calif3700263369 [{client TCP 6789}]}} ContainerID="512890356b8dd690e46b6d9f68a50fac3b04a96ca2024bb48da5f608ca27ea49" Namespace="rook" Pod="rook-ceph-mon-c-cf9d698b5-g88b9" WorkloadEndpoint="suraj--lk--cluster--pool--1--worker--2-k8s-rook--ceph--mon--c--cf9d698b5--g88b9-"
2020-05-05 07:48:59.875 [INFO][22729] k8s.go 60: Extracted identifiers for CmdAddK8s ContainerID="512890356b8dd690e46b6d9f68a50fac3b04a96ca2024bb48da5f608ca27ea49" Namespace="rook" Pod="rook-ceph-mon-c-cf9d698b5-g88b9" WorkloadEndpoint="suraj--lk--cluster--pool--1--worker--2-k8s-rook--ceph--mon--c--cf9d698b5--g88b9-eth0"
2020-05-05 07:48:59.881 [INFO][22729] k8s.go 775: namespace info &Namespace{ObjectMeta:{rook /api/v1/namespaces/rook 346082b0-52fc-4520-8796-5afff38e4f6c 7478 0 2020-05-05 07:45:13 +0000 UTC <nil> <nil> map[] map[] [] [] [{lokoctl Update v1 2020-05-05 07:45:13 +0000 UTC FieldsV1 FieldsV1{Raw:*[123 34 102 58 115 116 97 116 117 115 34 58 123 34 102 58 112 104 97 115 101 34 58 123 125 125 125],}}]},Spec:NamespaceSpec{Finalizers:[kubernetes],},Status:NamespaceStatus{Phase:Active,Conditions:[]NamespaceCondition{},},}
2020-05-05 07:48:59.886 [INFO][22729] k8s.go 784: pod info &Pod{ObjectMeta:{rook-ceph-mon-c-cf9d698b5-g88b9 rook-ceph-mon-c-cf9d698b5- rook /api/v1/namespaces/rook/pods/rook-ceph-mon-c-cf9d698b5-g88b9 d355b8e5-e08a-42e9-8b5a-32bd3229c666 9182 0 2020-05-05 07:48:59 +0000 UTC <nil> <nil> map[app:rook-ceph-mon ceph_daemon_id:c mon:c mon_cluster:rook pod-template-hash:cf9d698b5 rook_cluster:rook] map[kubernetes.io/psp:00-rook-ceph-operator seccomp.security.alpha.kubernetes.io/pod:runtime/default] [{apps/v1 ReplicaSet rook-ceph-mon-c-cf9d698b5 2eb016ac-00f1-49f6-b1e2-74047dcf9aa0 0xc0003d83c7 0xc0003d83c8}] [] [{kube-controller-manager Update v1 2020-05-05 07:48:59 +0000 UTC FieldsV1 FieldsV1{Raw:*[123 34 102 58 109 101 116 97 100 97 116 97 34 58 123 34 102 58 103 101 110 101 114 97 116 101 78 97 109 101 34 58 123 125 44 34 102 58 108 97 98 101 108 115 34 58 123 34 46 34 58 123 125 44 34 102 58 97 112 112 34 58 123 125 44 34 102 58 99 101 112 104 95 100 97 101 109 111 110 95 105 100 34 58 123 125 44 34 102 58 109 111 110 34 58 123 125 44 34 102 58 109 111 110 95 99 108 117 115 116 101 114 34 58 123 125 44 34 102 58 112 111 100 45 116 101 109 112 108 97 116 101 45 104 97 115 104 34 58 123 125 44 34 102 58 114 111 111 107 95 99 108 117 115 116 101 114 34 58 123 125 125 44 34 102 58 111 119 110 101 114 82 101 102 101 114 101 110 99 101 115 34 58 123 34 46 34 58 123 125 44 34 107 58 123 92 34 117 105 100 92 34 58 92 34 50 101 98 48 49 54 97 99 45 48 48 102 49 45 52 57 102 54 45 98 49 101 50 45 55 52 48 52 55 100 99 102 57 97 97 48 92 34 125 34 58 123 34 46 34 58 123 125 44 34 102 58 97 112 105 86 101 114 115 105 111 110 34 58 123 125 44 34 102 58 98 108 111 99 107 79 119 110 101 114 68 101 108 101 116 105 111 110 34 58 123 125 44 34 102 58 99 111 110 116 114 111 108 108 101 114 34 58 123 125 44 34 102 58 107 105 110 100 34 58 123 125 44 34 102 58 110 97 109 101 34 58 123 125 44 34 102 58 117 105 100 34 58 123 125 125 125 125 44 34 102 58 115 112 101 99 34 58 123 34 102 58 
97 102 102 105 110 105 116 121 34 58 123 34 46 34 58 123 125 44 34 102 58 110 111 100 101 65 102 102 105 110 105 116 121 34 58 123 34 46 34 58 123 125 44 34 102 58 114 101 113 117 105 114 101 100 68 117 114 105 110 103 83 99 104 101 100 117 108 105 110 103 73 103 110 111 114 101 100 68 117 114 105 110 103 69 120 101 99 117 116 105 111 110 34 58 123 34 46 34 58 123 125 44 34 102 58 110 111 100 101 83 101 108 101 99 116 111 114 84 101 114 109 115 34 58 123 125 125 125 125 44 34 102 58 99 111 110 116 97 105 110 101 114 115 34 58 123 34 107 58 123 92 34 110 97 109 101 92 34 58 92 34 109 111 110 92 34 125 34 58 123 34 46 34 58 123 125 44 34 102 58 97 114 103 115 34 58 123 125 44 34 102 58 99 111 109 109 97 110 100 34 58 123 125 44 34 102 58 101 110 118 34 58 123 34 46 34 58 123 125 44 34 107 58 123 92 34 110 97 109 101 92 34 58 92 34 67 79 78 84 65 73 78 69 82 95 73 77 65 71 69 92 34 125 34 58 123 34 46 34 58 123 125 44 34 102 58 110 97 109 101 34 58 123 125 44 34 102 58 118 97 108 117 101 34 58 123 125 125 44 34 107 58 123 92 34 110 97 109 101 92 34 58 92 34 78 79 68 69 95 78 65 77 69 92 34 125 34 58 123 34 46 34 58 123 125 44 34 102 58 110 97 109 101 34 58 123 125 44 34 102 58 118 97 108 117 101 70 114 111 109 34 58 123 34 46 34 58 123 125 44 34 102 58 102 105 101 108 100 82 101 102 34 58 123 34 46 34 58 123 125 44 34 102 58 97 112 105 86 101 114 115 105 111 110 34 58 123 125 44 34 102 58 102 105 101 108 100 80 97 116 104 34 58 123 125 125 125 125 44 34 107 58 123 92 34 110 97 109 101 92 34 58 92 34 80 79 68 95 67 80 85 95 76 73 77 73 84 92 34 125 34 58 123 34 46 34 58 123 125 44 34 102 58 110 97 109 101 34 58 123 125 44 34 102 58 118 97 108 117 101 70 114 111 109 34 58 123 34 46 34 58 123 125 44 34 102 58 114 101 115 111 117 114 99 101 70 105 101 108 100 82 101 102 34 58 123 34 46 34 58 123 125 44 34 102 58 100 105 118 105 115 111 114 34 58 123 125 44 34 102 58 114 101 115 111 117 114 99 101 34 58 123 125 125 125 125 44 34 107 58 123 92 34 110 97 109 101 92 34 58 92 
34 80 79 68 95 67 80 85 95 82 69 81 85 69 83 84 92 34 125 34 58 123 34 46 34 58 123 125 44 34 102 58 110 97 109 101 34 58 123 125 44 34 102 58 118 97 108 117 101 70 114 111 109 34 58 123 34 46 34 58 123 125 44 34 102 58 114 101 115 111 117 114 99 101 70 105 101 108 100 82 101 102 34 58 123 34 46 34 58 123 125 44 34 102 58 100 105 118 105 115 111 114 34 58 123 125 44 34 102 58 114 101 115 111 117 114 99 101 34 58 123 125 125 125 125 44 34 107 58 123 92 34 110 97 109 101 92 34 58 92 34 80 79 68 95 77 69 77 79 82 89 95 76 73 77 73 84 92 34 125 34 58 123 34 46 34 58 123 125 44 34 102 58 110 97 109 101 34 58 123 125 44 34 102 58 118 97 108 117 101 70 114 111 109 34 58 123 34 46 34 58 123 125 44 34 102 58 114 101 115 111 117 114 99 101 70 105 101 108 100 82 101 102 34 58 123 34 46 34 58 123 125 44 34 102 58 100 105 118 105 115 111 114 34 58 123 125 44 34 102 58 114 101 115 111 117 114 99 101 34 58 123 125 125 125 125 44 34 107 58 123 92 34 110 97 109 101 92 34 58 92 34 80 79 68 95 77 69 77 79 82 89 95 82 69 81 85 69 83 84 92 34 125 34 58 123 34 46 34 58 123 125 44 34 102 58 110 97 109 101 34 58 123 125 44 34 102 58 118 97 108 117 101 70 114 111 109 34 58 123 34 46 34 58 123 125 44 34 102 58 114 101 115 111 117 114 99 101 70 105 101 108 100 82 101 102 34 58 123 34 46 34 58 123 125 44 34 102 58 100 105 118 105 115 111 114 34 58 123 125 44 34 102 58 114 101 115 111 117 114 99 101 34 58 123 125 125 125 125 44 34 107 58 123 92 34 110 97 109 101 92 34 58 92 34 80 79 68 95 78 65 77 69 92 34 125 34 58 123 34 46 34 58 123 125 44 34 102 58 110 97 109 101 34 58 123 125 44 34 102 58 118 97 108 117 101 70 114 111 109 34 58 123 34 46 34 58 123 125 44 34 102 58 102 105 101 108 100 82 101 102 34 58 123 34 46 34 58 123 125 44 34 102 58 97 112 105 86 101 114 115 105 111 110 34 58 123 125 44 34 102 58 102 105 101 108 100 80 97 116 104 34 58 123 125 125 125 125 44 34 107 58 123 92 34 110 97 109 101 92 34 58 92 34 80 79 68 95 78 65 77 69 83 80 65 67 69 92 34 125 34 58 123 34 46 34 58 123 125 
44 34 102 58 110 97 109 101 34 58 123 125 44 34 102 58 118 97 108 117 101 70 114 111 109 34 58 123 34 46 34 58 123 125 44 34 102 58 102 105 101 108 100 82 101 102 34 58 123 34 46 34 58 123 125 44 34 102 58 97 112 105 86 101 114 115 105 111 110 34 58 123 125 44 34 102 58 102 105 101 108 100 80 97 116 104 34 58 123 125 125 125 125 44 34 107 58 123 92 34 110 97 109 101 92 34 58 92 34 82 79 79 75 95 67 69 80 72 95 77 79 78 95 72 79 83 84 92 34 125 34 58 123 34 46 34 58 123 125 44 34 102 58 110 97 109 101 34 58 123 125 44 34 102 58 118 97 108 117 101 70 114 111 109 34 58 123 34 46 34 58 123 125 44 34 102 58 115 101 99 114 101 116 75 101 121 82 101 102 34 58 123 34 46 34 58 123 125 44 34 102 58 107 101 121 34 58 123 125 44 34 102 58 110 97 109 101 34 58 123 125 125 125 125 44 34 107 58 123 92 34 110 97 109 101 92 34 58 92 34 82 79 79 75 95 67 69 80 72 95 77 79 78 95 73 78 73 84 73 65 76 95 77 69 77 66 69 82 83 92 34 125 34 58 123 34 46 34 58 123 125 44 34 102 58 110 97 109 101 34 58 123 125 44 34 102 58 118 97 108 117 101 70 114 111 109 34 58 123 34 46 34 58 123 125 44 34 102 58 115 101 99 114 101 116 75 101 121 82 101 102 34 58 123 34 46 34 58 123 125 44 34 102 58 107 101 121 34 58 123 125 44 34 102 58 110 97 109 101 34 58 123 125 125 125 125 44 34 107 58 123 92 34 110 97 109 101 92 34 58 92 34 82 79 79 75 95 80 79 68 95 73 80 92 34 125 34 58 123 34 46 34 58 123 125 44 34 102 58 110 97 109 101 34 58 123 125 44 34 102 58 118 97 108 117 101 70 114 111 109 34 58 123 34 46 34 58 123 125 44 34 102 58 102 105 101 108 100 82 101 102 34 58 123 34 46 34 58 123 125 44 34 102 58 97 112 105 86 101 114 115 105 111 110 34 58 123 125 44 34 102 58 102 105 101 108 100 80 97 116 104 34 58 123 125 125 125 125 125 44 34 102 58 105 109 97 103 101 34 58 123 125 44 34 102 58 105 109 97 103 101 80 117 108 108 80 111 108 105 99 121 34 58 123 125 44 34 102 58 108 105 118 101 110 101 115 115 80 114 111 98 101 34 58 123 34 46 34 58 123 125 44 34 102 58 101 120 101 99 34 58 123 34 46 34 58 123 125 
44 34 102 58 99 111 109 109 97 110 100 34 58 123 125 125 44 34 102 58 102 97 105 108 117 114 101 84 104 114 101 115 104 111 108 100 34 58 123 125 44 34 102 58 105 110 105 116 105 97 108 68 101 108 97 121 83 101 99 111 110 100 115 34 58 123 125 44 34 102 58 112 101 114 105 111 100 83 101 99 111 110 100 115 34 58 123 125 44 34 102 58 115 117 99 99 101 115 115 84 104 114 101 115 104 111 108 100 34 58 123 125 44 34 102 58 116 105 109 101 111 117 116 83 101 99 111 110 100 115 34 58 123 125 125 44 34 102 58 110 97 109 101 34 58 123 125 44 34 102 58 112 111 114 116 115 34 58 123 34 46 34 58 123 125 44 34 107 58 123 92 34 99 111 110 116 97 105 110 101 114 80 111 114 116 92 34 58 54 55 56 57 44 92 34 112 114 111 116 111 99 111 108 92 34 58 92 34 84 67 80 92 34 125 34 58 123 34 46 34 58 123 125 44 34 102 58 99 111 110 116 97 105 110 101 114 80 111 114 116 34 58 123 125 44 34 102 58 110 97 109 101 34 58 123 125 44 34 102 58 112 114 111 116 111 99 111 108 34 58 123 125 125 125 44 34 102 58 114 101 115 111 117 114 99 101 115 34 58 123 125 44 34 102 58 115 101 99 117 114 105 116 121 67 111 110 116 101 120 116 34 58 123 34 46 34 58 123 125 44 34 102 58 112 114 105 118 105 108 101 103 101 100 34 58 123 125 125 44 34 102 58 116 101 114 109 105 110 97 116 105 111 110 77 101 115 115 97 103 101 80 97 116 104 34 58 123 125 44 34 102 58 116 101 114 109 105 110 97 116 105 111 110 77 101 115 115 97 103 101 80 111 108 105 99 121 34 58 123 125 44 34 102 58 118 111 108 117 109 101 77 111 117 110 116 115 34 58 123 34 46 34 58 123 125 44 34 107 58 123 92 34 109 111 117 110 116 80 97 116 104 92 34 58 92 34 47 101 116 99 47 99 101 112 104 92 34 125 34 58 123 34 46 34 58 123 125 44 34 102 58 109 111 117 110 116 80 97 116 104 34 58 123 125 44 34 102 58 110 97 109 101 34 58 123 125 44 34 102 58 114 101 97 100 79 110 108 121 34 58 123 125 125 44 34 107 58 123 92 34 109 111 117 110 116 80 97 116 104 92 34 58 92 34 47 101 116 99 47 99 101 112 104 47 107 101 121 114 105 110 103 45 115 116 111 114 101 
47 92 34 125 34 58 123 34 46 34 58 123 125 44 34 102 58 109 111 117 110 116 80 97 116 104 34 58 123 125 44 34 102 58 110 97 109 101 34 58 123 125 44 34 102 58 114 101 97 100 79 110 108 121 34 58 123 125 125 44 34 107 58 123 92 34 109 111 117 110 116 80 97 116 104 92 34 58 92 34 47 118 97 114 47 108 105 98 47 99 101 112 104 47 99 114 97 115 104 92 34 125 34 58 123 34 46 34 58 123 125 44 34 102 58 109 111 117 110 116 80 97 116 104 34 58 123 125 44 34 102 58 110 97 109 101 34 58 123 125 125 44 34 107 58 123 92 34 109 111 117 110 116 80 97 116 104 92 34 58 92 34 47 118 97 114 47 108 105 98 47 99 101 112 104 47 109 111 110 47 99 101 112 104 45 99 92 34 125 34 58 123 34 46 34 58 123 125 44 34 102 58 109 111 117 110 116 80 97 116 104 34 58 123 125 44 34 102 58 110 97 109 101 34 58 123 125 125 44 34 107 58 123 92 34 109 111 117 110 116 80 97 116 104 92 34 58 92 34 47 118 97 114 47 108 111 103 47 99 101 112 104 92 34 125 34 58 123 34 46 34 58 123 125 44 34 102 58 109 111 117 110 116 80 97 116 104 34 58 123 125 44 34 102 58 110 97 109 101 34 58 123 125 125 125 125 125 44 34 102 58 100 110 115 80 111 108 105 99 121 34 58 123 125 44 34 102 58 101 110 97 98 108 101 83 101 114 118 105 99 101 76 105 110 107 115 34 58 123 125 44 34 102 58 105 110 105 116 67 111 110 116 97 105 110 101 114 115 34 58 123 34 46 34 58 123 125 44 34 107 58 123 92 34 110 97 109 101 92 34 58 92 34 99 104 111 119 110 45 99 111 110 116 97 105 110 101 114 45 100 97 116 97 45 100 105 114 92 34 125 34 58 123 34 46 34 58 123 125 44 34 102 58 97 114 103 115 34 58 123 125 44 34 102 58 99 111 109 109 97 110 100 34 58 123 125 44 34 102 58 105 109 97 103 101 34 58 123 125 44 34 102 58 105 109 97 103 101 80 117 108 108 80 111 108 105 99 121 34 58 123 125 44 34 102 58 110 97 109 101 34 58 123 125 44 34 102 58 114 101 115 111 117 114 99 101 115 34 58 123 125 44 34 102 58 115 101 99 117 114 105 116 121 67 111 110 116 101 120 116 34 58 123 34 46 34 58 123 125 44 34 102 58 112 114 105 118 105 108 101 103 101 100 34 58 123 
125 125 44 34 102 58 116 101 114 109 105 110 97 116 105 111 110 77 101 115 115 97 103 101 80 97 116 104 34 58 123 125 44 34 102 58 116 101 114 109 105 110 97 116 105 111 110 77 101 115 115 97 103 101 80 111 108 105 99 121 34 58 123 125 44 34 102 58 118 111 108 117 109 101 77 111 117 110 116 115 34 58 123 34 46 34 58 123 125 44 34 107 58 123 92 34 109 111 117 110 116 80 97 116 104 92 34 58 92 34 47 101 116 99 47 99 101 112 104 92 34 125 34 58 123 34 46 34 58 123 125 44 34 102 58 109 111 117 110 116 80 97 116 104 34 58 123 125 44 34 102 58 110 97 109 101 34 58 123 125 44 34 102 58 114 101 97 100 79 110 108 121 34 58 123 125 125 44 34 107 58 123 92 34 109 111 117 110 116 80 97 116 104 92 34 58 92 34 47 101 116 99 47 99 101 112 104 47 107 101 121 114 105 110 103 45 115 116 111 114 101 47 92 34 125 34 58 123 34 46 34 58 123 125 44 34 102 58 109 111 117 110 116 80 97 116 104 34 58 123 125 44 34 102 58 110 97 109 101 34 58 123 125 44 34 102 58 114 101 97 100 79 110 108 121 34 58 123 125 125 44 34 107 58 123 92 34 109 111 117 110 116 80 97 116 104 92 34 58 92 34 47 118 97 114 47 108 105 98 47 99 101 112 104 47 99 114 97 115 104 92 34 125 34 58 123 34 46 34 58 123 125 44 34 102 58 109 111 117 110 116 80 97 116 104 34 58 123 125 44 34 102 58 110 97 109 101 34 58 123 125 125 44 34 107 58 123 92 34 109 111 117 110 116 80 97 116 104 92 34 58 92 34 47 118 97 114 47 108 105 98 47 99 101 112 104 47 109 111 110 47 99 101 112 104 45 99 92 34 125 34 58 123 34 46 34 58 123 125 44 34 102 58 109 111 117 110 116 80 97 116 104 34 58 123 125 44 34 102 58 110 97 109 101 34 58 123 125 125 44 34 107 58 123 92 34 109 111 117 110 116 80 97 116 104 92 34 58 92 34 47 118 97 114 47 108 111 103 47 99 101 112 104 92 34 125 34 58 123 34 46 34 58 123 125 44 34 102 58 109 111 117 110 116 80 97 116 104 34 58 123 125 44 34 102 58 110 97 109 101 34 58 123 125 125 125 125 44 34 107 58 123 92 34 110 97 109 101 92 34 58 92 34 105 110 105 116 45 109 111 110 45 102 115 92 34 125 34 58 123 34 46 34 58 123 125 
44 34 102 58 97 114 103 115 34 58 123 125 44 34 102 58 99 111 109 109 97 110 100 34 58 123 125 44 34 102 58 101 110 118 34 58 123 34 46 34 58 123 125 44 34 107 58 123 92 34 110 97 109 101 92 34 58 92 34 67 79 78 84 65 73 78 69 82 95 73 77 65 71 69 92 34 125 34 58 123 34 46 34 58 123 125 44 34 102 58 110 97 109 101 34 58 123 125 44 34 102 58 118 97 108 117 101 34 58 123 125 125 44 34 107 58 123 92 34 110 97 109 101 92 34 58 92 34 78 79 68 69 95 78 65 77 69 92 34 125 34 58 123 34 46 34 58 123 125 44 34 102 58 110 97 109 101 34 58 123 125 44 34 102 58 118 97 108 117 101 70 114 111 109 34 58 123 34 46 34 58 123 125 44 34 102 58 102 105 101 108 100 82 101 102 34 58 123 34 46 34 58 123 125 44 34 102 58 97 112 105 86 101 114 115 105 111 110 34 58 123 125 44 34 102 58 102 105 101 108 100 80 97 116 104 34 58 123 125 125 125 125 44 34 107 58 123 92 34 110 97 109 101 92 34 58 92 34 80 79 68 95 67 80 85 95 76 73 77 73 84 92 34 125 34 58 123 34 46 34 58 123 125 44 34 102 58 110 97 109 101 34 58 123 125 44 34 102 58 118 97 108 117 101 70 114 111 109 34 58 123 34 46 34 58 123 125 44 34 102 58 114 101 115 111 117 114 99 101 70 105 101 108 100 82 101 102 34 58 123 34 46 34 58 123 125 44 34 102 58 100 105 118 105 115 111 114 34 58 123 125 44 34 102 58 114 101 115 111 117 114 99 101 34 58 123 125 125 125 125 44 34 107 58 123 92 34 110 97 109 101 92 34 58 92 34 80 79 68 95 67 80 85 95 82 69 81 85 69 83 84 92 34 125 34 58 123 34 46 34 58 123 125 44 34 102 58 110 97 109 101 34 58 123 125 44 34 102 58 118 97 108 117 101 70 114 111 109 34 58 123 34 46 34 58 123 125 44 34 102 58 114 101 115 111 117 114 99 101 70 105 101 108 100 82 101 102 34 58 123 34 46 34 58 123 125 44 34 102 58 100 105 118 105 115 111 114 34 58 123 125 44 34 102 58 114 101 115 111 117 114 99 101 34 58 123 125 125 125 125 44 34 107 58 123 92 34 110 97 109 101 92 34 58 92 34 80 79 68 95 77 69 77 79 82 89 95 76 73 77 73 84 92 34 125 34 58 123 34 46 34 58 123 125 44 34 102 58 110 97 109 101 34 58 123 125 44 34 102 58 118 97 
108 117 101 70 114 111 109 34 58 123 34 46 34 58 123 125 44 34 102 58 114 101 115 111 117 114 99 101 70 105 101 108 100 82 101 102 34 58 123 34 46 34 58 123 125 44 34 102 58 100 105 118 105 115 111 114 34 58 123 125 44 34 102 58 114 101 115 111 117 114 99 101 34 58 123 125 125 125 125 44 34 107 58 123 92 34 110 97 109 101 92 34 58 92 34 80 79 68 95 77 69 77 79 82 89 95 82 69 81 85 69 83 84 92 34 125 34 58 123 34 46 34 58 123 125 44 34 102 58 110 97 109 101 34 58 123 125 44 34 102 58 118 97 108 117 101 70 114 111 109 34 58 123 34 46 34 58 123 125 44 34 102 58 114 101 115 111 117 114 99 101 70 105 101 108 100 82 101 102 34 58 123 34 46 34 58 123 125 44 34 102 58 100 105 118 105 115 111 114 34 58 123 125 44 34 102 58 114 101 115 111 117 114 99 101 34 58 123 125 125 125 125 44 34 107 58 123 92 34 110 97 109 101 92 34 58 92 34 80 79 68 95 78 65 77 69 92 34 125 34 58 123 34 46 34 58 123 125 44 34 102 58 110 97 109 101 34 58 123 125 44 34 102 58 118 97 108 117 101 70 114 111 109 34 58 123 34 46 34 58 123 125 44 34 102 58 102 105 101 108 100 82 101 102 34 58 123 34 46 34 58 123 125 44 34 102 58 97 112 105 86 101 114 115 105 111 110 34 58 123 125 44 34 102 58 102 105 101 108 100 80 97 116 104 34 58 123 125 125 125 125 44 34 107 58 123 92 34 110 97 109 101 92 34 58 92 34 80 79 68 95 78 65 77 69 83 80 65 67 69 92 34 125 34 58 123 34 46 34 58 123 125 44 34 102 58 110 97 109 101 34 58 123 125 44 34 102 58 118 97 108 117 101 70 114 111 109 34 58 123 34 46 34 58 123 125 44 34 102 58 102 105 101 108 100 82 101 102 34 58 123 34 46 34 58 123 125 44 34 102 58 97 112 105 86 101 114 115 105 111 110 34 58 123 125 44 34 102 58 102 105 101 108 100 80 97 116 104 34 58 123 125 125 125 125 44 34 107 58 123 92 34 110 97 109 101 92 34 58 92 34 82 79 79 75 95 67 69 80 72 95 77 79 78 95 72 79 83 84 92 34 125 34 58 123 34 46 34 58 123 125 44 34 102 58 110 97 109 101 34 58 123 125 44 34 102 58 118 97 108 117 101 70 114 111 109 34 58 123 34 46 34 58 123 125 44 34 102 58 115 101 99 114 101 116 75 
101 121 82 101 102 34 58 123 34 46 34 58 123 125 44 34 102 58 107 101 121 34 58 123 125 44 34 102 58 110 97 109 101 34 58 123 125 125 125 125 44 34 107 58 123 92 34 110 97 109 101 92 34 58 92 34 82 79 79 75 95 67 69 80 72 95 77 79 78 95 73 78 73 84 73 65 76 95 77 69 77 66 69 82 83 92 34 125 34 58 123 34 46 34 58 123 125 44 34 102 58 110 97 109 101 34 58 123 125 44 34 102 58 118 97 108 117 101 70 114 111 109 34 58 123 34 46 34 58 123 125 44 34 102 58 115 101 99 114 101 116 75 101 121 82 101 102 34 58 123 34 46 34 58 123 125 44 34 102 58 107 101 121 34 58 123 125 44 34 102 58 110 97 109 101 34 58 123 125 125 125 125 125 44 34 102 58 105 109 97 103 101 34 58 123 125 44 34 102 58 105 109 97 103 101 80 117 108 108 80 111 108 105 99 121 34 58 123 125 44 34 102 58 110 97 109 101 34 58 123 125 44 34 102 58 114 101 115 111 117 114 99 101 115 34 58 123 125 44 34 102 58 115 101 99 117 114 105 116 121 67 111 110 116 101 120 116 34 58 123 34 46 34 58 123 125 44 34 102 58 112 114 105 118 105 108 101 103 101 100 34 58 123 125 125 44 34 102 58 116 101 114 109 105 110 97 116 105 111 110 77 101 115 115 97 103 101 80 97 116 104 34 58 123 125 44 34 102 58 116 101 114 109 105 110 97 116 105 111 110 77 101 115 115 97 103 101 80 111 108 105 99 121 34 58 123 125 44 34 102 58 118 111 108 117 109 101 77 111 117 110 116 115 34 58 123 34 46 34 58 123 125 44 34 107 58 123 92 34 109 111 117 110 116 80 97 116 104 92 34 58 92 34 47 101 116 99 47 99 101 112 104 92 34 125 34 58 123 34 46 34 58 123 125 44 34 102 58 109 111 117 110 116 80 97 116 104 34 58 123 125 44 34 102 58 110 97 109 101 34 58 123 125 44 34 102 58 114 101 97 100 79 110 108 121 34 58 123 125 125 44 34 107 58 123 92 34 109 111 117 110 116 80 97 116 104 92 34 58 92 34 47 101 116 99 47 99 101 112 104 47 107 101 121 114 105 110 103 45 115 116 111 114 101 47 92 34 125 34 58 123 34 46 34 58 123 125 44 34 102 58 109 111 117 110 116 80 97 116 104 34 58 123 125 44 34 102 58 110 97 109 101 34 58 123 125 44 34 102 58 114 101 97 100 79 110 108 
121 34 58 123 125 125 44 34 107 58 123 92 34 109 111 117 110 116 80 97 116 104 92 34 58 92 34 47 118 97 114 47 108 105 98 47 99 101 112 104 47 99 114 97 115 104 92 34 125 34 58 123 34 46 34 58 123 125 44 34 102 58 109 111 117 110 116 80 97 116 104 34 58 123 125 44 34 102 58 110 97 109 101 34 58 123 125 125 44 34 107 58 123 92 34 109 111 117 110 116 80 97 116 104 92 34 58 92 34 47 118 97 114 47 108 105 98 47 99 101 112 104 47 109 111 110 47 99 101 112 104 45 99 92 34 125 34 58 123 34 46 34 58 123 125 44 34 102 58 109 111 117 110 116 80 97 116 104 34 58 123 125 44 34 102 58 110 97 109 101 34 58 123 125 125 44 34 107 58 123 92 34 109 111 117 110 116 80 97 116 104 92 34 58 92 34 47 118 97 114 47 108 111 103 47 99 101 112 104 92 34 125 34 58 123 34 46 34 58 123 125 44 34 102 58 109 111 117 110 116 80 97 116 104 34 58 123 125 44 34 102 58 110 97 109 101 34 58 123 125 125 125 125 125 44 34 102 58 110 111 100 101 83 101 108 101 99 116 111 114 34 58 123 34 46 34 58 123 125 44 34 102 58 107 117 98 101 114 110 101 116 101 115 46 105 111 47 104 111 115 116 110 97 109 101 34 58 123 125 125 44 34 102 58 114 101 115 116 97 114 116 80 111 108 105 99 121 34 58 123 125 44 34 102 58 115 99 104 101 100 117 108 101 114 78 97 109 101 34 58 123 125 44 34 102 58 115 101 99 117 114 105 116 121 67 111 110 116 101 120 116 34 58 123 125 44 34 102 58 116 101 114 109 105 110 97 116 105 111 110 71 114 97 99 101 80 101 114 105 111 100 83 101 99 111 110 100 115 34 58 123 125 44 34 102 58 116 111 108 101 114 97 116 105 111 110 115 34 58 123 125 44 34 102 58 118 111 108 117 109 101 115 34 58 123 34 46 34 58 123 125 44 34 107 58 123 92 34 110 97 109 101 92 34 58 92 34 99 101 112 104 45 100 97 101 109 111 110 45 100 97 116 97 92 34 125 34 58 123 34 46 34 58 123 125 44 34 102 58 104 111 115 116 80 97 116 104 34 58 123 34 46 34 58 123 125 44 34 102 58 112 97 116 104 34 58 123 125 44 34 102 58 116 121 112 101 34 58 123 125 125 44 34 102 58 110 97 109 101 34 58 123 125 125 44 34 107 58 123 92 34 110 97 
109 101 92 34 58 92 34 114 111 111 107 45 99 101 112 104 45 99 114 97 115 104 92 34 125 34 58 123 34 46 34 58 123 125 44 34 102 58 104 111 115 116 80 97 116 104 34 58 123 34 46 34 58 123 125 44 34 102 58 112 97 116 104 34 58 123 125 44 34 102 58 116 121 112 101 34 58 123 125 125 44 34 102 58 110 97 109 101 34 58 123 125 125 44 34 107 58 123 92 34 110 97 109 101 92 34 58 92 34 114 111 111 107 45 99 101 112 104 45 108 111 103 92 34 125 34 58 123 34 46 34 58 123 125 44 34 102 58 104 111 115 116 80 97 116 104 34 58 123 34 46 34 58 123 125 44 34 102 58 112 97 116 104 34 58 123 125 44 34 102 58 116 121 112 101 34 58 123 125 125 44 34 102 58 110 97 109 101 34 58 123 125 125 44 34 107 58 123 92 34 110 97 109 101 92 34 58 92 34 114 111 111 107 45 99 101 112 104 45 109 111 110 115 45 107 101 121 114 105 110 103 92 34 125 34 58 123 34 46 34 58 123 125 44 34 102 58 110 97 109 101 34 58 123 125 44 34 102 58 115 101 99 114 101 116 34 58 123 34 46 34 58 123 125 44 34 102 58 100 101 102 97 117 108 116 77 111 100 101 34 58 123 125 44 34 102 58 115 101 99 114 101 116 78 97 109 101 34 58 123 125 125 125 44 34 107 58 123 92 34 110 97 109 101 92 34 58 92 34 114 111 111 107 45 99 111 110 102 105 103 45 111 118 101 114 114 105 100 101 92 34 125 34 58 123 34 46 34 58 123 125 44 34 102 58 99 111 110 102 105 103 77 97 112 34 58 123 34 46 34 58 123 125 44 34 102 58 100 101 102 97 117 108 116 77 111 100 101 34 58 123 125 44 34 102 58 105 116 101 109 115 34 58 123 125 44 34 102 58 110 97 109 101 34 58 123 125 125 44 34 102 58 110 97 109 101 34 58 123 125 125 125 125 125],}} {kubelet Update v1 2020-05-05 07:48:59 +0000 UTC FieldsV1 &FieldsV1{Raw:*[123 34 102 58 115 116 97 116 117 115 34 58 123 34 102 58 99 111 110 100 105 116 105 111 110 115 34 58 123 34 107 58 123 92 34 116 121 112 101 92 34 58 92 34 67 111 110 116 97 105 110 101 114 115 82 101 97 100 121 92 34 125 34 58 123 34 46 34 58 123 125 44 34 102 58 108 97 115 116 80 114 111 98 101 84 105 109 101 34 58 123 125 44 34 102 58 108 97 115 
116 84 114 97 110 115 105 116 105 111 110 84 105 109 101 34 58 123 125 44 34 102 58 109 101 115 115 97 103 101 34 58 123 125 44 34 102 58 114 101 97 115 111 110 34 58 123 125 44 34 102 58 115 116 97 116 117 115 34 58 123 125 44 34 102 58 116 121 112 101 34 58 123 125 125 44 34 107 58 123 92 34 116 121 112 101 92 34 58 92 34 73 110 105 116 105 97 108 105 122 101 100 92 34 125 34 58 123 34 46 34 58 123 125 44 34 102 58 108 97 115 116 80 114 111 98 101 84 105 109 101 34 58 123 125 44 34 102 58 108 97 115 116 84 114 97 110 115 105 116 105 111 110 84 105 109 101 34 58 123 125 44 34 102 58 109 101 115 115 97 103 101 34 58 123 125 44 34 102 58 114 101 97 115 111 110 34 58 123 125 44 34 102 58 115 116 97 116 117 115 34 58 123 125 44 34 102 58 116 121 112 101 34 58 123 125 125 44 34 107 58 123 92 34 116 121 112 101 92 34 58 92 34 82 101 97 100 121 92 34 125 34 58 123 34 46 34 58 123 125 44 34 102 58 108 97 115 116 80 114 111 98 101 84 105 109 101 34 58 123 125 44 34 102 58 108 97 115 116 84 114 97 110 115 105 116 105 111 110 84 105 109 101 34 58 123 125 44 34 102 58 109 101 115 115 97 103 101 34 58 123 125 44 34 102 58 114 101 97 115 111 110 34 58 123 125 44 34 102 58 115 116 97 116 117 115 34 58 123 125 44 34 102 58 116 121 112 101 34 58 123 125 125 125 44 34 102 58 99 111 110 116 97 105 110 101 114 83 116 97 116 117 115 101 115 34 58 123 125 44 34 102 58 104 111 115 116 73 80 34 58 123 125 44 34 102 58 105 110 105 116 67 111 110 116 97 105 110 101 114 83 116 97 116 117 115 101 115 34 58 123 125 44 34 102 58 115 116 97 114 116 84 105 109 101 34 58 123 125 125 
125],}}]},Spec:PodSpec{Volumes:[]Volume{Volume{Name:rook-config-override,VolumeSource:VolumeSource{HostPath:nil,EmptyDir:nil,GCEPersistentDisk:nil,AWSElasticBlockStore:nil,GitRepo:nil,Secret:nil,NFS:nil,ISCSI:nil,Glusterfs:nil,PersistentVolumeClaim:nil,RBD:nil,FlexVolume:nil,Cinder:nil,CephFS:nil,Flocker:nil,DownwardAPI:nil,FC:nil,AzureFile:nil,ConfigMap:&ConfigMapVolumeSource{LocalObjectReference:LocalObjectReference{Name:rook-config-override,},Items:[]KeyToPath{KeyToPath{Key:config,Path:ceph.conf,Mode:*292,},},DefaultMode:*420,Optional:nil,},VsphereVolume:nil,Quobyte:nil,AzureDisk:nil,PhotonPersistentDisk:nil,PortworxVolume:nil,ScaleIO:nil,Projected:nil,StorageOS:nil,CSI:nil,},},Volume{Name:rook-ceph-mons-keyring,VolumeSource:VolumeSource{HostPath:nil,EmptyDir:nil,GCEPersistentDisk:nil,AWSElasticBlockStore:nil,GitRepo:nil,Secret:&SecretVolumeSource{SecretName:rook-ceph-mons-keyring,Items:[]KeyToPath{},DefaultMode:*420,Optional:nil,},NFS:nil,ISCSI:nil,Glusterfs:nil,PersistentVolumeClaim:nil,RBD:nil,FlexVolume:nil,Cinder:nil,CephFS:nil,Flocker:nil,DownwardAPI:nil,FC:nil,AzureFile:nil,ConfigMap:nil,VsphereVolume:nil,Quobyte:nil,AzureDisk:nil,PhotonPersistentDisk:nil,PortworxVolume:nil,ScaleIO:nil,Projected:nil,StorageOS:nil,CSI:nil,},},Volume{Name:rook-ceph-log,VolumeSource:VolumeSource{HostPath:&HostPathVolumeSource{Path:/var/lib/rook/rook/log,Type:*,},EmptyDir:nil,GCEPersistentDisk:nil,AWSElasticBlockStore:nil,GitRepo:nil,Secret:nil,NFS:nil,ISCSI:nil,Glusterfs:nil,PersistentVolumeClaim:nil,RBD:nil,FlexVolume:nil,Cinder:nil,CephFS:nil,Flocker:nil,DownwardAPI:nil,FC:nil,AzureFile:nil,ConfigMap:nil,VsphereVolume:nil,Quobyte:nil,AzureDisk:nil,PhotonPersistentDisk:nil,PortworxVolume:nil,ScaleIO:nil,Projected:nil,StorageOS:nil,CSI:nil,},},Volume{Name:rook-ceph-crash,VolumeSource:VolumeSource{HostPath:&HostPathVolumeSource{Path:/var/lib/rook/rook/crash,Type:*,},EmptyDir:nil,GCEPersistentDisk:nil,AWSElasticBlockStore:nil,GitRepo:nil,Secret:nil,NFS:nil,ISCSI:nil,Glusterfs:n
il,PersistentVolumeClaim:nil,RBD:nil,FlexVolume:nil,Cinder:nil,CephFS:nil,Flocker:nil,DownwardAPI:nil,FC:nil,AzureFile:nil,ConfigMap:nil,VsphereVolume:nil,Quobyte:nil,AzureDisk:nil,PhotonPersistentDisk:nil,PortworxVolume:nil,ScaleIO:nil,Projected:nil,StorageOS:nil,CSI:nil,},},Volume{Name:ceph-daemon-data,VolumeSource:VolumeSource{HostPath:&HostPathVolumeSource{Path:/var/lib/rook/mon-c/data,Type:*,},EmptyDir:nil,GCEPersistentDisk:nil,AWSElasticBlockStore:nil,GitRepo:nil,Secret:nil,NFS:nil,ISCSI:nil,Glusterfs:nil,PersistentVolumeClaim:nil,RBD:nil,FlexVolume:nil,Cinder:nil,CephFS:nil,Flocker:nil,DownwardAPI:nil,FC:nil,AzureFile:nil,ConfigMap:nil,VsphereVolume:nil,Quobyte:nil,AzureDisk:nil,PhotonPersistentDisk:nil,PortworxVolume:nil,ScaleIO:nil,Projected:nil,StorageOS:nil,CSI:nil,},},Volume{Name:default-token-cdzd4,VolumeSource:VolumeSource{HostPath:nil,EmptyDir:nil,GCEPersistentDisk:nil,AWSElasticBlockStore:nil,GitRepo:nil,Secret:&SecretVolumeSource{SecretName:default-token-cdzd4,Items:[]KeyToPath{},DefaultMode:*420,Optional:nil,},NFS:nil,ISCSI:nil,Glusterfs:nil,PersistentVolumeClaim:nil,RBD:nil,FlexVolume:nil,Cinder:nil,CephFS:nil,Flocker:nil,DownwardAPI:nil,FC:nil,AzureFile:nil,ConfigMap:nil,VsphereVolume:nil,Quobyte:nil,AzureDisk:nil,PhotonPersistentDisk:nil,PortworxVolume:nil,ScaleIO:nil,Projected:nil,StorageOS:nil,CSI:nil,},},},Containers:[]Container{Container{Name:mon,Image:ceph/ceph:v14.2.8,Command:[ceph-mon],Args:[--fsid=504a40d4-a2d3-4492-a37d-27a58464958f --keyring=/etc/ceph/keyring-store/keyring --log-to-stderr=true --err-to-stderr=true --mon-cluster-log-to-stderr=true --log-stderr-prefix=debug --default-log-to-file=false --default-mon-cluster-log-to-file=false --mon-host=$(ROOK_CEPH_MON_HOST) --mon-initial-members=$(ROOK_CEPH_MON_INITIAL_MEMBERS) --id=c --setuser=ceph --setgroup=ceph --foreground --public-addr=10.3.44.127 --setuser-match-path=/var/lib/ceph/mon/ceph-c/store.db 
--public-bind-addr=$(ROOK_POD_IP)],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:client,HostPort:0,ContainerPort:6789,Protocol:TCP,HostIP:,},},Env:[]EnvVar{EnvVar{Name:CONTAINER_IMAGE,Value:ceph/ceph:v14.2.8,ValueFrom:nil,},EnvVar{Name:POD_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:metadata.name,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},EnvVar{Name:POD_NAMESPACE,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:metadata.namespace,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},EnvVar{Name:NODE_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:spec.nodeName,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},EnvVar{Name:POD_MEMORY_LIMIT,Value:,ValueFrom:&EnvVarSource{FieldRef:nil,ResourceFieldRef:&ResourceFieldSelector{ContainerName:,Resource:limits.memory,Divisor:{{0 0} {<nil>} 0 DecimalSI},},ConfigMapKeyRef:nil,SecretKeyRef:nil,},},EnvVar{Name:POD_MEMORY_REQUEST,Value:,ValueFrom:&EnvVarSource{FieldRef:nil,ResourceFieldRef:&ResourceFieldSelector{ContainerName:,Resource:requests.memory,Divisor:{{0 0} {<nil>} 0 DecimalSI},},ConfigMapKeyRef:nil,SecretKeyRef:nil,},},EnvVar{Name:POD_CPU_LIMIT,Value:,ValueFrom:&EnvVarSource{FieldRef:nil,ResourceFieldRef:&ResourceFieldSelector{ContainerName:,Resource:limits.cpu,Divisor:{{1 0} {<nil>} 1 DecimalSI},},ConfigMapKeyRef:nil,SecretKeyRef:nil,},},EnvVar{Name:POD_CPU_REQUEST,Value:,ValueFrom:&EnvVarSource{FieldRef:nil,ResourceFieldRef:&ResourceFieldSelector{ContainerName:,Resource:requests.cpu,Divisor:{{0 0} {<nil>} 0 
DecimalSI},},ConfigMapKeyRef:nil,SecretKeyRef:nil,},},EnvVar{Name:ROOK_CEPH_MON_HOST,Value:,ValueFrom:&EnvVarSource{FieldRef:nil,ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:&SecretKeySelector{LocalObjectReference:LocalObjectReference{Name:rook-ceph-config,},Key:mon_host,Optional:nil,},},},EnvVar{Name:ROOK_CEPH_MON_INITIAL_MEMBERS,Value:,ValueFrom:&EnvVarSource{FieldRef:nil,ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:&SecretKeySelector{LocalObjectReference:LocalObjectReference{Name:rook-ceph-config,},Key:mon_initial_members,Optional:nil,},},},EnvVar{Name:ROOK_POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:rook-config-override,ReadOnly:true,MountPath:/etc/ceph,SubPath:,MountPropagation:nil,SubPathExpr:,},VolumeMount{Name:rook-ceph-mons-keyring,ReadOnly:true,MountPath:/etc/ceph/keyring-store/,SubPath:,MountPropagation:nil,SubPathExpr:,},VolumeMount{Name:rook-ceph-log,ReadOnly:false,MountPath:/var/log/ceph,SubPath:,MountPropagation:nil,SubPathExpr:,},VolumeMount{Name:rook-ceph-crash,ReadOnly:false,MountPath:/var/lib/ceph/crash,SubPath:,MountPropagation:nil,SubPathExpr:,},VolumeMount{Name:ceph-daemon-data,ReadOnly:false,MountPath:/var/lib/ceph/mon/ceph-c,SubPath:,MountPropagation:nil,SubPathExpr:,},VolumeMount{Name:default-token-cdzd4,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,},},LivenessProbe:&Probe{Handler:Handler{Exec:&ExecAction{Command:[env -i sh -c ceph --admin-daemon /run/ceph/ceph-mon.c.asok 
mon_status],},HTTPGet:nil,TCPSocket:nil,},InitialDelaySeconds:10,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,},ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:nil,Privileged:*false,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,},},RestartPolicy:Always,TerminationGracePeriodSeconds:*30,ActiveDeadlineSeconds:nil,DNSPolicy:ClusterFirst,NodeSelector:map[string]string{kubernetes.io/hostname: suraj-lk-cluster-pool-1-worker-2,},ServiceAccountName:default,DeprecatedServiceAccount:default,NodeName:suraj-lk-cluster-pool-1-worker-2,HostNetwork:false,HostPID:false,HostIPC:false,SecurityContext:&PodSecurityContext{SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,SupplementalGroups:[],FSGroup:nil,RunAsGroup:nil,Sysctls:[]Sysctl{},WindowsOptions:nil,},ImagePullSecrets:[]LocalObjectReference{},Hostname:,Subdomain:,Affinity:&Affinity{NodeAffinity:&NodeAffinity{RequiredDuringSchedulingIgnoredDuringExecution:&NodeSelector{NodeSelectorTerms:[]NodeSelectorTerm{NodeSelectorTerm{MatchExpressions:[]NodeSelectorRequirement{NodeSelectorRequirement{Key:storage.lokomotive.io,Operator:Exists,Values:[],},},MatchFields:[]NodeSelectorRequirement{},},},},PreferredDuringSchedulingIgnoredDuringExecution:[]PreferredSchedulingTerm{},},PodAffinity:nil,PodAntiAffinity:nil,},SchedulerName:default-scheduler,InitContainers:[]Container{Container{Name:chown-container-data-dir,Image:ceph/ceph:v14.2.8,Command:[chown],Args:[--verbose --recursive ceph:ceph /var/log/ceph /var/lib/ceph/crash 
/var/lib/ceph/mon/ceph-c],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:rook-config-override,ReadOnly:true,MountPath:/etc/ceph,SubPath:,MountPropagation:nil,SubPathExpr:,},VolumeMount{Name:rook-ceph-mons-keyring,ReadOnly:true,MountPath:/etc/ceph/keyring-store/,SubPath:,MountPropagation:nil,SubPathExpr:,},VolumeMount{Name:rook-ceph-log,ReadOnly:false,MountPath:/var/log/ceph,SubPath:,MountPropagation:nil,SubPathExpr:,},VolumeMount{Name:rook-ceph-crash,ReadOnly:false,MountPath:/var/lib/ceph/crash,SubPath:,MountPropagation:nil,SubPathExpr:,},VolumeMount{Name:ceph-daemon-data,ReadOnly:false,MountPath:/var/lib/ceph/mon/ceph-c,SubPath:,MountPropagation:nil,SubPathExpr:,},VolumeMount{Name:default-token-cdzd4,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:nil,Privileged:*false,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,},Container{Name:init-mon-fs,Image:ceph/ceph:v14.2.8,Command:[ceph-mon],Args:[--fsid=504a40d4-a2d3-4492-a37d-27a58464958f --keyring=/etc/ceph/keyring-store/keyring --log-to-stderr=true --err-to-stderr=true --mon-cluster-log-to-stderr=true --log-stderr-prefix=debug --default-log-to-file=false --default-mon-cluster-log-to-file=false --mon-host=$(ROOK_CEPH_MON_HOST) --mon-initial-members=$(ROOK_CEPH_MON_INITIAL_MEMBERS) --id=c --setuser=ceph --setgroup=ceph --public-addr=10.3.44.127 
--mkfs],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONTAINER_IMAGE,Value:ceph/ceph:v14.2.8,ValueFrom:nil,},EnvVar{Name:POD_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:metadata.name,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},EnvVar{Name:POD_NAMESPACE,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:metadata.namespace,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},EnvVar{Name:NODE_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:spec.nodeName,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},EnvVar{Name:POD_MEMORY_LIMIT,Value:,ValueFrom:&EnvVarSource{FieldRef:nil,ResourceFieldRef:&ResourceFieldSelector{ContainerName:,Resource:limits.memory,Divisor:{{0 0} {<nil>} 0 DecimalSI},},ConfigMapKeyRef:nil,SecretKeyRef:nil,},},EnvVar{Name:POD_MEMORY_REQUEST,Value:,ValueFrom:&EnvVarSource{FieldRef:nil,ResourceFieldRef:&ResourceFieldSelector{ContainerName:,Resource:requests.memory,Divisor:{{0 0} {<nil>} 0 DecimalSI},},ConfigMapKeyRef:nil,SecretKeyRef:nil,},},EnvVar{Name:POD_CPU_LIMIT,Value:,ValueFrom:&EnvVarSource{FieldRef:nil,ResourceFieldRef:&ResourceFieldSelector{ContainerName:,Resource:limits.cpu,Divisor:{{1 0} {<nil>} 1 DecimalSI},},ConfigMapKeyRef:nil,SecretKeyRef:nil,},},EnvVar{Name:POD_CPU_REQUEST,Value:,ValueFrom:&EnvVarSource{FieldRef:nil,ResourceFieldRef:&ResourceFieldSelector{ContainerName:,Resource:requests.cpu,Divisor:{{0 0} {<nil>} 0 
DecimalSI},},ConfigMapKeyRef:nil,SecretKeyRef:nil,},},EnvVar{Name:ROOK_CEPH_MON_HOST,Value:,ValueFrom:&EnvVarSource{FieldRef:nil,ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:&SecretKeySelector{LocalObjectReference:LocalObjectReference{Name:rook-ceph-config,},Key:mon_host,Optional:nil,},},},EnvVar{Name:ROOK_CEPH_MON_INITIAL_MEMBERS,Value:,ValueFrom:&EnvVarSource{FieldRef:nil,ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:&SecretKeySelector{LocalObjectReference:LocalObjectReference{Name:rook-ceph-config,},Key:mon_initial_members,Optional:nil,},},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:rook-config-override,ReadOnly:true,MountPath:/etc/ceph,SubPath:,MountPropagation:nil,SubPathExpr:,},VolumeMount{Name:rook-ceph-mons-keyring,ReadOnly:true,MountPath:/etc/ceph/keyring-store/,SubPath:,MountPropagation:nil,SubPathExpr:,},VolumeMount{Name:rook-ceph-log,ReadOnly:false,MountPath:/var/log/ceph,SubPath:,MountPropagation:nil,SubPathExpr:,},VolumeMount{Name:rook-ceph-crash,ReadOnly:false,MountPath:/var/lib/ceph/crash,SubPath:,MountPropagation:nil,SubPathExpr:,},VolumeMount{Name:ceph-daemon-data,ReadOnly:false,MountPath:/var/lib/ceph/mon/ceph-c,SubPath:,MountPropagation:nil,SubPathExpr:,},VolumeMount{Name:default-token-cdzd4,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:nil,Privileged:*false,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,},},AutomountServiceAccountToken:nil,Tolerations:[]Toleration{
Toleration{Key:storage.lokomotive.io,Operator:Equal,Value:ceph,Effect:NoSchedule,TolerationSeconds:nil,},Toleration{Key:node.kubernetes.io/not-ready,Operator:Exists,Value:,Effect:NoExecute,TolerationSeconds:*300,},Toleration{Key:node.kubernetes.io/unreachable,Operator:Exists,Value:,Effect:NoExecute,TolerationSeconds:*300,},},HostAliases:[]HostAlias{},PriorityClassName:,Priority:*0,DNSConfig:nil,ShareProcessNamespace:nil,ReadinessGates:[]PodReadinessGate{},RuntimeClassName:nil,EnableServiceLinks:*true,PreemptionPolicy:nil,Overhead:ResourceList{},TopologySpreadConstraints:[]TopologySpreadConstraint{},EphemeralContainers:[]EphemeralContainer{},},Status:PodStatus{Phase:Pending,Conditions:[]PodCondition{PodCondition{Type:Initialized,Status:False,LastProbeTime:0001-01-01 00:00:00 +0000 UTC,LastTransitionTime:2020-05-05 07:48:59 +0000 UTC,Reason:ContainersNotInitialized,Message:containers with incomplete status: [chown-container-data-dir init-mon-fs],},PodCondition{Type:Ready,Status:False,LastProbeTime:0001-01-01 00:00:00 +0000 UTC,LastTransitionTime:2020-05-05 07:48:59 +0000 UTC,Reason:ContainersNotReady,Message:containers with unready status: [mon],},PodCondition{Type:ContainersReady,Status:False,LastProbeTime:0001-01-01 00:00:00 +0000 UTC,LastTransitionTime:2020-05-05 07:48:59 +0000 UTC,Reason:ContainersNotReady,Message:containers with unready status: [mon],},PodCondition{Type:PodScheduled,Status:True,LastProbeTime:0001-01-01 00:00:00 +0000 UTC,LastTransitionTime:2020-05-05 07:48:59 +0000 UTC,Reason:,Message:,},},Message:,Reason:,HostIP:10.88.81.5,PodIP:,StartTime:2020-05-05 07:48:59 +0000 
UTC,ContainerStatuses:[]ContainerStatus{ContainerStatus{Name:mon,State:ContainerState{Waiting:&ContainerStateWaiting{Reason:PodInitializing,Message:,},Running:nil,Terminated:nil,},LastTerminationState:ContainerState{Waiting:nil,Running:nil,Terminated:nil,},Ready:false,RestartCount:0,Image:ceph/ceph:v14.2.8,ImageID:,ContainerID:,Started:*false,},},QOSClass:BestEffort,InitContainerStatuses:[]ContainerStatus{ContainerStatus{Name:chown-container-data-dir,State:ContainerState{Waiting:&ContainerStateWaiting{Reason:PodInitializing,Message:,},Running:nil,Terminated:nil,},LastTerminationState:ContainerState{Waiting:nil,Running:nil,Terminated:nil,},Ready:false,RestartCount:0,Image:ceph/ceph:v14.2.8,ImageID:,ContainerID:,Started:nil,},ContainerStatus{Name:init-mon-fs,State:ContainerState{Waiting:&ContainerStateWaiting{Reason:PodInitializing,Message:,},Running:nil,Terminated:nil,},LastTerminationState:ContainerState{Waiting:nil,Running:nil,Terminated:nil,},Ready:false,RestartCount:0,Image:ceph/ceph:v14.2.8,ImageID:,ContainerID:,Started:nil,},},NominatedNodeName:,PodIPs:[]PodIP{},EphemeralContainerStatuses:[]ContainerStatus{},},}
2020-05-05 07:48:59.901 [INFO][22745] ipam_plugin.go 211: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="512890356b8dd690e46b6d9f68a50fac3b04a96ca2024bb48da5f608ca27ea49" HandleID="k8s-pod-network.512890356b8dd690e46b6d9f68a50fac3b04a96ca2024bb48da5f608ca27ea49" Workload="suraj--lk--cluster--pool--1--worker--2-k8s-rook--ceph--mon--c--cf9d698b5--g88b9-eth0"
2020-05-05 07:48:59.924 [INFO][22745] ipam_plugin.go 223: Calico CNI IPAM handle=k8s-pod-network.512890356b8dd690e46b6d9f68a50fac3b04a96ca2024bb48da5f608ca27ea49 ContainerID="512890356b8dd690e46b6d9f68a50fac3b04a96ca2024bb48da5f608ca27ea49" HandleID="k8s-pod-network.512890356b8dd690e46b6d9f68a50fac3b04a96ca2024bb48da5f608ca27ea49" Workload="suraj--lk--cluster--pool--1--worker--2-k8s-rook--ceph--mon--c--cf9d698b5--g88b9-eth0"
2020-05-05 07:48:59.924 [INFO][22745] ipam_plugin.go 233: Auto assigning IP ContainerID="512890356b8dd690e46b6d9f68a50fac3b04a96ca2024bb48da5f608ca27ea49" HandleID="k8s-pod-network.512890356b8dd690e46b6d9f68a50fac3b04a96ca2024bb48da5f608ca27ea49" Workload="suraj--lk--cluster--pool--1--worker--2-k8s-rook--ceph--mon--c--cf9d698b5--g88b9-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc0002b6d50), Attrs:map[string]string{"namespace":"rook", "node":"suraj-lk-cluster-pool-1-worker-2", "pod":"rook-ceph-mon-c-cf9d698b5-g88b9"}, Hostname:"suraj-lk-cluster-pool-1-worker-2", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0}
2020-05-05 07:48:59.924 [INFO][22745] ipam.go 87: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'suraj-lk-cluster-pool-1-worker-2'
2020-05-05 07:48:59.933 [INFO][22745] ipam.go 313: Looking up existing affinities for host handle="k8s-pod-network.512890356b8dd690e46b6d9f68a50fac3b04a96ca2024bb48da5f608ca27ea49" host="suraj-lk-cluster-pool-1-worker-2"
2020-05-05 07:48:59.940 [INFO][22745] ipam.go 381: Trying affinity for 10.2.76.0/24 handle="k8s-pod-network.512890356b8dd690e46b6d9f68a50fac3b04a96ca2024bb48da5f608ca27ea49" host="suraj-lk-cluster-pool-1-worker-2"
2020-05-05 07:48:59.944 [INFO][22745] ipam.go 135: Attempting to load block cidr=10.2.76.0/24 host="suraj-lk-cluster-pool-1-worker-2"
2020-05-05 07:48:59.949 [INFO][22745] ipam.go 212: Affinity is confirmed and block has been loaded cidr=10.2.76.0/24 host="suraj-lk-cluster-pool-1-worker-2"
2020-05-05 07:48:59.949 [INFO][22745] ipam.go 804: Attempting to assign 1 addresses from block block=10.2.76.0/24 handle="k8s-pod-network.512890356b8dd690e46b6d9f68a50fac3b04a96ca2024bb48da5f608ca27ea49" host="suraj-lk-cluster-pool-1-worker-2"
2020-05-05 07:48:59.953 [INFO][22745] ipam.go 1265: Creating new handle: k8s-pod-network.512890356b8dd690e46b6d9f68a50fac3b04a96ca2024bb48da5f608ca27ea49
2020-05-05 07:48:59.959 [INFO][22745] ipam.go 827: Writing block in order to claim IPs block=10.2.76.0/24 handle="k8s-pod-network.512890356b8dd690e46b6d9f68a50fac3b04a96ca2024bb48da5f608ca27ea49" host="suraj-lk-cluster-pool-1-worker-2"
2020-05-05 07:48:59.969 [INFO][22745] ipam.go 840: Successfully claimed IPs: [10.2.76.5/24] block=10.2.76.0/24 handle="k8s-pod-network.512890356b8dd690e46b6d9f68a50fac3b04a96ca2024bb48da5f608ca27ea49" host="suraj-lk-cluster-pool-1-worker-2"
2020-05-05 07:48:59.969 [INFO][22745] ipam.go 413: Block '10.2.76.0/24' provided addresses: [10.2.76.5/24] handle="k8s-pod-network.512890356b8dd690e46b6d9f68a50fac3b04a96ca2024bb48da5f608ca27ea49" host="suraj-lk-cluster-pool-1-worker-2"
2020-05-05 07:48:59.972 [INFO][22745] ipam.go 569: Auto-assigned 1 out of 1 IPv4s: [10.2.76.5/24] handle="k8s-pod-network.512890356b8dd690e46b6d9f68a50fac3b04a96ca2024bb48da5f608ca27ea49" host="suraj-lk-cluster-pool-1-worker-2"
2020-05-05 07:48:59.972 [INFO][22745] ipam_plugin.go 235: Calico CNI IPAM assigned addresses IPv4=[10.2.76.5/24] IPv6=[] ContainerID="512890356b8dd690e46b6d9f68a50fac3b04a96ca2024bb48da5f608ca27ea49" HandleID="k8s-pod-network.512890356b8dd690e46b6d9f68a50fac3b04a96ca2024bb48da5f608ca27ea49" Workload="suraj--lk--cluster--pool--1--worker--2-k8s-rook--ceph--mon--c--cf9d698b5--g88b9-eth0"
2020-05-05 07:48:59.972 [INFO][22745] ipam_plugin.go 261: IPAM Result ContainerID="512890356b8dd690e46b6d9f68a50fac3b04a96ca2024bb48da5f608ca27ea49" HandleID="k8s-pod-network.512890356b8dd690e46b6d9f68a50fac3b04a96ca2024bb48da5f608ca27ea49" Workload="suraj--lk--cluster--pool--1--worker--2-k8s-rook--ceph--mon--c--cf9d698b5--g88b9-eth0" result.IPs=[]*current.IPConfig{(*current.IPConfig)(0xc000108a80)}
2020-05-05 07:48:59.973 [INFO][22729] k8s.go 358: Populated endpoint ContainerID="512890356b8dd690e46b6d9f68a50fac3b04a96ca2024bb48da5f608ca27ea49" Namespace="rook" Pod="rook-ceph-mon-c-cf9d698b5-g88b9" WorkloadEndpoint="suraj--lk--cluster--pool--1--worker--2-k8s-rook--ceph--mon--c--cf9d698b5--g88b9-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"suraj--lk--cluster--pool--1--worker--2-k8s-rook--ceph--mon--c--cf9d698b5--g88b9-eth0", GenerateName:"rook-ceph-mon-c-cf9d698b5-", Namespace:"rook", SelfLink:"", UID:"d355b8e5-e08a-42e9-8b5a-32bd3229c666", ResourceVersion:"9182", Generation:0, CreationTimestamp:v1.Time{Time:time.Time{wall:0x0, ext:63724261739, loc:(*time.Location)(0x299c340)}}, DeletionTimestamp:(*v1.Time)(nil), DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app":"rook-ceph-mon", "ceph_daemon_id":"c", "mon":"c", "mon_cluster":"rook", "pod-template-hash":"cf9d698b5", "projectcalico.org/namespace":"rook", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"default", "rook_cluster":"rook"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ClusterName:"", ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"suraj-lk-cluster-pool-1-worker-2", ContainerID:"", Pod:"rook-ceph-mon-c-cf9d698b5-g88b9", Endpoint:"eth0", IPNetworks:[]string{"10.2.76.5/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.rook", "ksa.rook.default"}, InterfaceName:"calif3700263369", MAC:"", Ports:[]v3.EndpointPort{v3.EndpointPort{Name:"client", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x1a85}}}}
2020-05-05 07:48:59.974 [INFO][22729] k8s.go 359: Calico CNI using IPs: [10.2.76.5/32] ContainerID="512890356b8dd690e46b6d9f68a50fac3b04a96ca2024bb48da5f608ca27ea49" Namespace="rook" Pod="rook-ceph-mon-c-cf9d698b5-g88b9" WorkloadEndpoint="suraj--lk--cluster--pool--1--worker--2-k8s-rook--ceph--mon--c--cf9d698b5--g88b9-eth0"
2020-05-05 07:48:59.974 [INFO][22729] network_linux.go 76: Setting the host side veth name to calif3700263369 ContainerID="512890356b8dd690e46b6d9f68a50fac3b04a96ca2024bb48da5f608ca27ea49" Namespace="rook" Pod="rook-ceph-mon-c-cf9d698b5-g88b9" WorkloadEndpoint="suraj--lk--cluster--pool--1--worker--2-k8s-rook--ceph--mon--c--cf9d698b5--g88b9-eth0"
2020-05-05 07:48:59.975 [INFO][22729] network_linux.go 396: Disabling IPv4 forwarding ContainerID="512890356b8dd690e46b6d9f68a50fac3b04a96ca2024bb48da5f608ca27ea49" Namespace="rook" Pod="rook-ceph-mon-c-cf9d698b5-g88b9" WorkloadEndpoint="suraj--lk--cluster--pool--1--worker--2-k8s-rook--ceph--mon--c--cf9d698b5--g88b9-eth0"
2020-05-05 07:48:59.986 [INFO][22729] k8s.go 385: Added Mac, interface name, and active container ID to endpoint ContainerID="512890356b8dd690e46b6d9f68a50fac3b04a96ca2024bb48da5f608ca27ea49" Namespace="rook" Pod="rook-ceph-mon-c-cf9d698b5-g88b9" WorkloadEndpoint="suraj--lk--cluster--pool--1--worker--2-k8s-rook--ceph--mon--c--cf9d698b5--g88b9-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"suraj--lk--cluster--pool--1--worker--2-k8s-rook--ceph--mon--c--cf9d698b5--g88b9-eth0", GenerateName:"rook-ceph-mon-c-cf9d698b5-", Namespace:"rook", SelfLink:"", UID:"d355b8e5-e08a-42e9-8b5a-32bd3229c666", ResourceVersion:"9182", Generation:0, CreationTimestamp:v1.Time{Time:time.Time{wall:0x0, ext:63724261739, loc:(*time.Location)(0x299c340)}}, DeletionTimestamp:(*v1.Time)(nil), DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app":"rook-ceph-mon", "ceph_daemon_id":"c", "mon":"c", "mon_cluster":"rook", "pod-template-hash":"cf9d698b5", "projectcalico.org/namespace":"rook", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"default", "rook_cluster":"rook"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ClusterName:"", ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"suraj-lk-cluster-pool-1-worker-2", ContainerID:"512890356b8dd690e46b6d9f68a50fac3b04a96ca2024bb48da5f608ca27ea49", Pod:"rook-ceph-mon-c-cf9d698b5-g88b9", Endpoint:"eth0", IPNetworks:[]string{"10.2.76.5/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.rook", "ksa.rook.default"}, InterfaceName:"calif3700263369", MAC:"1e:9e:18:71:4b:cb", Ports:[]v3.EndpointPort{v3.EndpointPort{Name:"client", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x1a85}}}}
2020-05-05 07:49:00.003 [INFO][22729] k8s.go 417: Wrote updated endpoint to datastore ContainerID="512890356b8dd690e46b6d9f68a50fac3b04a96ca2024bb48da5f608ca27ea49" Namespace="rook" Pod="rook-ceph-mon-c-cf9d698b5-g88b9" WorkloadEndpoint="suraj--lk--cluster--pool--1--worker--2-k8s-rook--ceph--mon--c--cf9d698b5--g88b9-eth0"
E0505 07:49:00.799586 2827 secret.go:195] Couldn't get secret rook/rook-ceph-crash-collector-keyring: secret "rook-ceph-crash-collector-keyring" not found
E0505 07:49:00.799702 2827 nestedpendingoperations.go:301] Operation for "{volumeName:kubernetes.io/secret/889e0bea-1099-49a7-a960-568cc6f0298a-rook-ceph-crash-collector-keyring podName:889e0bea-1099-49a7-a960-568cc6f0298a nodeName:}" failed. No retries permitted until 2020-05-05 07:49:02.799657265 +0000 UTC m=+2616.865231423 (durationBeforeRetry 2s). Error: "MountVolume.SetUp failed for volume \"rook-ceph-crash-collector-keyring\" (UniqueName: \"kubernetes.io/secret/889e0bea-1099-49a7-a960-568cc6f0298a-rook-ceph-crash-collector-keyring\") pod \"rook-ceph-crashcollector-suraj-lk-cluster-pool-1-worker-2-j6b8l\" (UID: \"889e0bea-1099-49a7-a960-568cc6f0298a\") : secret \"rook-ceph-crash-collector-keyring\" not found"
I0505 07:49:02.250428 2827 kubelet_resources.go:45] allocatable: map[cpu:{{32 0} {<nil>} 32 DecimalSI} ephemeral-storage:{{108249701812 0} {<nil>} 108249701812 DecimalSI} hugepages-1Gi:{{0 0} {<nil>} 0 DecimalSI} hugepages-2Mi:{{0 0} {<nil>} 0 DecimalSI} memory:{{135041626112 0} {<nil>} 131876588Ki BinarySI} pods:{{110 0} {<nil>} 110 DecimalSI}]
I0505 07:49:02.250534 2827 kubelet_resources.go:45] allocatable: map[cpu:{{32 0} {<nil>} 32 DecimalSI} ephemeral-storage:{{108249701812 0} {<nil>} 108249701812 DecimalSI} hugepages-1Gi:{{0 0} {<nil>} 0 DecimalSI} hugepages-2Mi:{{0 0} {<nil>} 0 DecimalSI} memory:{{135041626112 0} {<nil>} 131876588Ki BinarySI} pods:{{110 0} {<nil>} 110 DecimalSI}]
I0505 07:49:02.250594 2827 kubelet_resources.go:45] allocatable: map[cpu:{{32 0} {<nil>} 32 DecimalSI} ephemeral-storage:{{108249701812 0} {<nil>} 108249701812 DecimalSI} hugepages-1Gi:{{0 0} {<nil>} 0 DecimalSI} hugepages-2Mi:{{0 0} {<nil>} 0 DecimalSI} memory:{{135041626112 0} {<nil>} 131876588Ki BinarySI} pods:{{110 0} {<nil>} 110 DecimalSI}]
I0505 07:49:02.250634 2827 kubelet_resources.go:45] allocatable: map[cpu:{{32 0} {<nil>} 32 DecimalSI} ephemeral-storage:{{108249701812 0} {<nil>} 108249701812 DecimalSI} hugepages-1Gi:{{0 0} {<nil>} 0 DecimalSI} hugepages-2Mi:{{0 0} {<nil>} 0 DecimalSI} memory:{{135041626112 0} {<nil>} 131876588Ki BinarySI} pods:{{110 0} {<nil>} 110 DecimalSI}]
E0505 07:49:02.806318 2827 secret.go:195] Couldn't get secret rook/rook-ceph-crash-collector-keyring: secret "rook-ceph-crash-collector-keyring" not found
E0505 07:49:02.806414 2827 nestedpendingoperations.go:301] Operation for "{volumeName:kubernetes.io/secret/889e0bea-1099-49a7-a960-568cc6f0298a-rook-ceph-crash-collector-keyring podName:889e0bea-1099-49a7-a960-568cc6f0298a nodeName:}" failed. No retries permitted until 2020-05-05 07:49:06.806382639 +0000 UTC m=+2620.871956811 (durationBeforeRetry 4s). Error: "MountVolume.SetUp failed for volume \"rook-ceph-crash-collector-keyring\" (UniqueName: \"kubernetes.io/secret/889e0bea-1099-49a7-a960-568cc6f0298a-rook-ceph-crash-collector-keyring\") pod \"rook-ceph-crashcollector-suraj-lk-cluster-pool-1-worker-2-j6b8l\" (UID: \"889e0bea-1099-49a7-a960-568cc6f0298a\") : secret \"rook-ceph-crash-collector-keyring\" not found"
I0505 07:49:03.276381 2827 kubelet_resources.go:45] allocatable: map[cpu:{{32 0} {<nil>} 32 DecimalSI} ephemeral-storage:{{108249701812 0} {<nil>} 108249701812 DecimalSI} hugepages-1Gi:{{0 0} {<nil>} 0 DecimalSI} hugepages-2Mi:{{0 0} {<nil>} 0 DecimalSI} memory:{{135041626112 0} {<nil>} 131876588Ki BinarySI} pods:{{110 0} {<nil>} 110 DecimalSI}]
I0505 07:49:03.276500 2827 kubelet_resources.go:45] allocatable: map[cpu:{{32 0} {<nil>} 32 DecimalSI} ephemeral-storage:{{108249701812 0} {<nil>} 108249701812 DecimalSI} hugepages-1Gi:{{0 0} {<nil>} 0 DecimalSI} hugepages-2Mi:{{0 0} {<nil>} 0 DecimalSI} memory:{{135041626112 0} {<nil>} 131876588Ki BinarySI} pods:{{110 0} {<nil>} 110 DecimalSI}]
I0505 07:49:03.276566 2827 kubelet_resources.go:45] allocatable: map[cpu:{{32 0} {<nil>} 32 DecimalSI} ephemeral-storage:{{108249701812 0} {<nil>} 108249701812 DecimalSI} hugepages-1Gi:{{0 0} {<nil>} 0 DecimalSI} hugepages-2Mi:{{0 0} {<nil>} 0 DecimalSI} memory:{{135041626112 0} {<nil>} 131876588Ki BinarySI} pods:{{110 0} {<nil>} 110 DecimalSI}]
I0505 07:49:03.276615 2827 kubelet_resources.go:45] allocatable: map[cpu:{{32 0} {<nil>} 32 DecimalSI} ephemeral-storage:{{108249701812 0} {<nil>} 108249701812 DecimalSI} hugepages-1Gi:{{0 0} {<nil>} 0 DecimalSI} hugepages-2Mi:{{0 0} {<nil>} 0 DecimalSI} memory:{{135041626112 0} {<nil>} 131876588Ki BinarySI} pods:{{110 0} {<nil>} 110 DecimalSI}]
E0505 07:49:06.821936 2827 secret.go:195] Couldn't get secret rook/rook-ceph-crash-collector-keyring: secret "rook-ceph-crash-collector-keyring" not found
E0505 07:49:06.822074 2827 nestedpendingoperations.go:301] Operation for "{volumeName:kubernetes.io/secret/889e0bea-1099-49a7-a960-568cc6f0298a-rook-ceph-crash-collector-keyring podName:889e0bea-1099-49a7-a960-568cc6f0298a nodeName:}" failed. No retries permitted until 2020-05-05 07:49:14.822025146 +0000 UTC m=+2628.887599313 (durationBeforeRetry 8s). Error: "MountVolume.SetUp failed for volume \"rook-ceph-crash-collector-keyring\" (UniqueName: \"kubernetes.io/secret/889e0bea-1099-49a7-a960-568cc6f0298a-rook-ceph-crash-collector-keyring\") pod \"rook-ceph-crashcollector-suraj-lk-cluster-pool-1-worker-2-j6b8l\" (UID: \"889e0bea-1099-49a7-a960-568cc6f0298a\") : secret \"rook-ceph-crash-collector-keyring\" not found"
E0505 07:49:14.852184 2827 secret.go:195] Couldn't get secret rook/rook-ceph-crash-collector-keyring: secret "rook-ceph-crash-collector-keyring" not found
E0505 07:49:14.852303 2827 nestedpendingoperations.go:301] Operation for "{volumeName:kubernetes.io/secret/889e0bea-1099-49a7-a960-568cc6f0298a-rook-ceph-crash-collector-keyring podName:889e0bea-1099-49a7-a960-568cc6f0298a nodeName:}" failed. No retries permitted until 2020-05-05 07:49:30.852248958 +0000 UTC m=+2644.917823122 (durationBeforeRetry 16s). Error: "MountVolume.SetUp failed for volume \"rook-ceph-crash-collector-keyring\" (UniqueName: \"kubernetes.io/secret/889e0bea-1099-49a7-a960-568cc6f0298a-rook-ceph-crash-collector-keyring\") pod \"rook-ceph-crashcollector-suraj-lk-cluster-pool-1-worker-2-j6b8l\" (UID: \"889e0bea-1099-49a7-a960-568cc6f0298a\") : secret \"rook-ceph-crash-collector-keyring\" not found"
I0505 07:49:27.284185 2827 topology_manager.go:233] [topologymanager] Topology Admit Handler
I0505 07:49:27.397379 2827 reconciler.go:224] operationExecutor.VerifyControllerAttachedVolume started for volume "ceph-conf-emptydir" (UniqueName: "kubernetes.io/empty-dir/08dfe7e7-7320-49db-816e-d258fc45f440-ceph-conf-emptydir") pod "rook-ceph-osd-prepare-suraj-lk-cluster-pool-1-worker-2-mcldx" (UID: "08dfe7e7-7320-49db-816e-d258fc45f440")
I0505 07:49:27.397471 2827 reconciler.go:224] operationExecutor.VerifyControllerAttachedVolume started for volume "devices" (UniqueName: "kubernetes.io/host-path/08dfe7e7-7320-49db-816e-d258fc45f440-devices") pod "rook-ceph-osd-prepare-suraj-lk-cluster-pool-1-worker-2-mcldx" (UID: "08dfe7e7-7320-49db-816e-d258fc45f440")
I0505 07:49:27.397531 2827 reconciler.go:224] operationExecutor.VerifyControllerAttachedVolume started for volume "rook-data" (UniqueName: "kubernetes.io/host-path/08dfe7e7-7320-49db-816e-d258fc45f440-rook-data") pod "rook-ceph-osd-prepare-suraj-lk-cluster-pool-1-worker-2-mcldx" (UID: "08dfe7e7-7320-49db-816e-d258fc45f440")
I0505 07:49:27.397572 2827 reconciler.go:224] operationExecutor.VerifyControllerAttachedVolume started for volume "rook-ceph-crash" (UniqueName: "kubernetes.io/host-path/08dfe7e7-7320-49db-816e-d258fc45f440-rook-ceph-crash") pod "rook-ceph-osd-prepare-suraj-lk-cluster-pool-1-worker-2-mcldx" (UID: "08dfe7e7-7320-49db-816e-d258fc45f440")
I0505 07:49:27.397604 2827 reconciler.go:224] operationExecutor.VerifyControllerAttachedVolume started for volume "rook-binaries" (UniqueName: "kubernetes.io/empty-dir/08dfe7e7-7320-49db-816e-d258fc45f440-rook-binaries") pod "rook-ceph-osd-prepare-suraj-lk-cluster-pool-1-worker-2-mcldx" (UID: "08dfe7e7-7320-49db-816e-d258fc45f440")
I0505 07:49:27.397633 2827 reconciler.go:224] operationExecutor.VerifyControllerAttachedVolume started for volume "rook-ceph-log" (UniqueName: "kubernetes.io/host-path/08dfe7e7-7320-49db-816e-d258fc45f440-rook-ceph-log") pod "rook-ceph-osd-prepare-suraj-lk-cluster-pool-1-worker-2-mcldx" (UID: "08dfe7e7-7320-49db-816e-d258fc45f440")
I0505 07:49:27.397769 2827 reconciler.go:224] operationExecutor.VerifyControllerAttachedVolume started for volume "rook-ceph-osd-token-4dtzj" (UniqueName: "kubernetes.io/secret/08dfe7e7-7320-49db-816e-d258fc45f440-rook-ceph-osd-token-4dtzj") pod "rook-ceph-osd-prepare-suraj-lk-cluster-pool-1-worker-2-mcldx" (UID: "08dfe7e7-7320-49db-816e-d258fc45f440")
I0505 07:49:27.397821 2827 reconciler.go:224] operationExecutor.VerifyControllerAttachedVolume started for volume "udev" (UniqueName: "kubernetes.io/host-path/08dfe7e7-7320-49db-816e-d258fc45f440-udev") pod "rook-ceph-osd-prepare-suraj-lk-cluster-pool-1-worker-2-mcldx" (UID: "08dfe7e7-7320-49db-816e-d258fc45f440")
2020-05-05 07:49:28.125 [INFO][23791] plugin.go 129: Extracted identifiers EndpointIDs=&utils.WEPIdentifiers{Namespace:"rook", WEPName:"", WorkloadEndpointIdentifiers:names.WorkloadEndpointIdentifiers{Node:"suraj-lk-cluster-pool-1-worker-2", Orchestrator:"k8s", Endpoint:"eth0", Workload:"", Pod:"rook-ceph-osd-prepare-suraj-lk-cluster-pool-1-worker-2-mcldx", ContainerID:"741a4b73215d1b9322d1e24280939c2746a988170cf9392505206cf0295ba57b"}}
2020-05-05 07:49:28.155 [INFO][23791] plugin.go 220: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {suraj--lk--cluster--pool--1--worker--2-k8s-rook--ceph--osd--prepare--suraj--lk--cluster--pool--1--worker--2--mcldx-eth0 rook-ceph-osd-prepare-suraj-lk-cluster-pool-1-worker-2- rook 08dfe7e7-7320-49db-816e-d258fc45f440 9442 0 2020-05-05 07:49:27 +0000 UTC <nil> <nil> map[app:rook-ceph-osd-prepare ceph.rook.io/pvc: controller-uid:f36a8935-9397-45ca-8348-ce3192e179a5 job-name:rook-ceph-osd-prepare-suraj-lk-cluster-pool-1-worker-2 projectcalico.org/namespace:rook projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:rook-ceph-osd rook_cluster:rook] map[] [] [] []} {k8s suraj-lk-cluster-pool-1-worker-2 rook-ceph-osd-prepare-suraj-lk-cluster-pool-1-worker-2-mcldx eth0 [] [] [kns.rook ksa.rook.rook-ceph-osd] cali50ef90ee42d []}} ContainerID="741a4b73215d1b9322d1e24280939c2746a988170cf9392505206cf0295ba57b" Namespace="rook" Pod="rook-ceph-osd-prepare-suraj-lk-cluster-pool-1-worker-2-mcldx" WorkloadEndpoint="suraj--lk--cluster--pool--1--worker--2-k8s-rook--ceph--osd--prepare--suraj--lk--cluster--pool--1--worker--2--mcldx-"
2020-05-05 07:49:28.155 [INFO][23791] k8s.go 60: Extracted identifiers for CmdAddK8s ContainerID="741a4b73215d1b9322d1e24280939c2746a988170cf9392505206cf0295ba57b" Namespace="rook" Pod="rook-ceph-osd-prepare-suraj-lk-cluster-pool-1-worker-2-mcldx" WorkloadEndpoint="suraj--lk--cluster--pool--1--worker--2-k8s-rook--ceph--osd--prepare--suraj--lk--cluster--pool--1--worker--2--mcldx-eth0"
2020-05-05 07:49:28.160 [INFO][23791] k8s.go 775: namespace info &Namespace{ObjectMeta:{rook /api/v1/namespaces/rook 346082b0-52fc-4520-8796-5afff38e4f6c 7478 0 2020-05-05 07:45:13 +0000 UTC <nil> <nil> map[] map[] [] [] [{lokoctl Update v1 2020-05-05 07:45:13 +0000 UTC FieldsV1 FieldsV1{Raw:*[123 34 102 58 115 116 97 116 117 115 34 58 123 34 102 58 112 104 97 115 101 34 58 123 125 125 125],}}]},Spec:NamespaceSpec{Finalizers:[kubernetes],},Status:NamespaceStatus{Phase:Active,Conditions:[]NamespaceCondition{},},}
2020-05-05 07:49:28.165 [INFO][23791] k8s.go 784: pod info &Pod{ObjectMeta:{rook-ceph-osd-prepare-suraj-lk-cluster-pool-1-worker-2-mcldx rook-ceph-osd-prepare-suraj-lk-cluster-pool-1-worker-2- rook /api/v1/namespaces/rook/pods/rook-ceph-osd-prepare-suraj-lk-cluster-pool-1-worker-2-mcldx 08dfe7e7-7320-49db-816e-d258fc45f440 9442 0 2020-05-05 07:49:27 +0000 UTC <nil> <nil> map[app:rook-ceph-osd-prepare ceph.rook.io/pvc: controller-uid:f36a8935-9397-45ca-8348-ce3192e179a5 job-name:rook-ceph-osd-prepare-suraj-lk-cluster-pool-1-worker-2 rook_cluster:rook] map[kubernetes.io/psp:00-rook-ceph-operator seccomp.security.alpha.kubernetes.io/pod:runtime/default] [{batch/v1 Job rook-ceph-osd-prepare-suraj-lk-cluster-pool-1-worker-2 f36a8935-9397-45ca-8348-ce3192e179a5 0xc00055c12b 0xc00055c12c}] [] [{kube-controller-manager Update v1 2020-05-05 07:49:27 +0000 UTC FieldsV1 FieldsV1{Raw:*[123 34 102 58 109 101 116 97 100 97 116 97 34 58 123 34 102 58 103 101 110 101 114 97 116 101 78 97 109 101 34 58 123 125 44 34 102 58 108 97 98 101 108 115 34 58 123 34 46 34 58 123 125 44 34 102 58 97 112 112 34 58 123 125 44 34 102 58 99 101 112 104 46 114 111 111 107 46 105 111 47 112 118 99 34 58 123 125 44 34 102 58 99 111 110 116 114 111 108 108 101 114 45 117 105 100 34 58 123 125 44 34 102 58 106 111 98 45 110 97 109 101 34 58 123 125 44 34 102 58 114 111 111 107 95 99 108 117 115 116 101 114 34 58 123 125 125 44 34 102 58 111 119 110 101 114 82 101 102 101 114 101 110 99 101 115 34 58 123 34 46 34 58 123 125 44 34 107 58 123 92 34 117 105 100 92 34 58 92 34 102 51 54 97 56 57 51 53 45 57 51 57 55 45 52 53 99 97 45 56 51 52 56 45 99 101 51 49 57 50 101 49 55 57 97 53 92 34 125 34 58 123 34 46 34 58 123 125 44 34 102 58 97 112 105 86 101 114 115 105 111 110 34 58 123 125 44 34 102 58 98 108 111 99 107 79 119 110 101 114 68 101 108 101 116 105 111 110 34 58 123 125 44 34 102 58 99 111 110 116 114 111 108 108 101 114 34 58 123 125 44 34 102 58 107 105 110 100 34 58 123 125 44 34 102 58 110 
97 109 101 34 58 123 125 44 34 102 58 117 105 100 34 58 123 125 125 125 125 44 34 102 58 115 112 101 99 34 58 123 34 102 58 97 102 102 105 110 105 116 121 34 58 123 34 46 34 58 123 125 44 34 102 58 110 111 100 101 65 102 102 105 110 105 116 121 34 58 123 34 46 34 58 123 125 44 34 102 58 114 101 113 117 105 114 101 100 68 117 114 105 110 103 83 99 104 101 100 117 108 105 110 103 73 103 110 111 114 101 100 68 117 114 105 110 103 69 120 101 99 117 116 105 111 110 34 58 123 34 46 34 58 123 125 44 34 102 58 110 111 100 101 83 101 108 101 99 116 111 114 84 101 114 109 115 34 58 123 125 125 125 125 44 34 102 58 99 111 110 116 97 105 110 101 114 115 34 58 123 34 107 58 123 92 34 110 97 109 101 92 34 58 92 34 112 114 111 118 105 115 105 111 110 92 34 125 34 58 123 34 46 34 58 123 125 44 34 102 58 97 114 103 115 34 58 123 125 44 34 102 58 99 111 109 109 97 110 100 34 58 123 125 44 34 102 58 101 110 118 34 58 123 34 46 34 58 123 125 44 34 107 58 123 92 34 110 97 109 101 92 34 58 92 34 67 69 80 72 95 86 79 76 85 77 69 95 68 69 66 85 71 92 34 125 34 58 123 34 46 34 58 123 125 44 34 102 58 110 97 109 101 34 58 123 125 44 34 102 58 118 97 108 117 101 34 58 123 125 125 44 34 107 58 123 92 34 110 97 109 101 92 34 58 92 34 67 69 80 72 95 86 79 76 85 77 69 95 83 75 73 80 95 82 69 83 84 79 82 69 67 79 78 92 34 125 34 58 123 34 46 34 58 123 125 44 34 102 58 110 97 109 101 34 58 123 125 44 34 102 58 118 97 108 117 101 34 58 123 125 125 44 34 107 58 123 92 34 110 97 109 101 92 34 58 92 34 68 77 95 68 73 83 65 66 76 69 95 85 68 69 86 92 34 125 34 58 123 34 46 34 58 123 125 44 34 102 58 110 97 109 101 34 58 123 125 44 34 102 58 118 97 108 117 101 34 58 123 125 125 44 34 107 58 123 92 34 110 97 109 101 92 34 58 92 34 78 79 68 69 95 78 65 77 69 92 34 125 34 58 123 34 46 34 58 123 125 44 34 102 58 110 97 109 101 34 58 123 125 44 34 102 58 118 97 108 117 101 70 114 111 109 34 58 123 34 46 34 58 123 125 44 34 102 58 102 105 101 108 100 82 101 102 34 58 123 34 46 34 58 123 125 44 34 102 58 97 
112 105 86 101 114 115 105 111 110 34 58 123 125 44 34 102 58 102 105 101 108 100 80 97 116 104 34 58 123 125 125 125 125 44 34 107 58 123 92 34 110 97 109 101 92 34 58 92 34 82 79 79 75 95 65 68 77 73 78 95 83 69 67 82 69 84 92 34 125 34 58 123 34 46 34 58 123 125 44 34 102 58 110 97 109 101 34 58 123 125 44 34 102 58 118 97 108 117 101 70 114 111 109 34 58 123 34 46 34 58 123 125 44 34 102 58 115 101 99 114 101 116 75 101 121 82 101 102 34 58 123 34 46 34 58 123 125 44 34 102 58 107 101 121 34 58 123 125 44 34 102 58 110 97 109 101 34 58 123 125 125 125 125 44 34 107 58 123 92 34 110 97 109 101 92 34 58 92 34 82 79 79 75 95 67 69 80 72 95 67 79 78 70 73 71 95 79 86 69 82 82 73 68 69 92 34 125 34 58 123 34 46 34 58 123 125 44 34 102 58 110 97 109 101 34 58 123 125 44 34 102 58 118 97 108 117 101 34 58 123 125 125 44 34 107 58 123 92 34 110 97 109 101 92 34 58 92 34 82 79 79 75 95 67 69 80 72 95 86 69 82 83 73 79 78 92 34 125 34 58 123 34 46 34 58 123 125 44 34 102 58 110 97 109 101 34 58 123 125 44 34 102 58 118 97 108 117 101 34 58 123 125 125 44 34 107 58 123 92 34 110 97 109 101 92 34 58 92 34 82 79 79 75 95 67 76 85 83 84 69 82 95 73 68 92 34 125 34 58 123 34 46 34 58 123 125 44 34 102 58 110 97 109 101 34 58 123 125 44 34 102 58 118 97 108 117 101 34 58 123 125 125 44 34 107 58 123 92 34 110 97 109 101 92 34 58 92 34 82 79 79 75 95 67 76 85 83 84 69 82 95 78 65 77 69 92 34 125 34 58 123 34 46 34 58 123 125 44 34 102 58 110 97 109 101 34 58 123 125 44 34 102 58 118 97 108 117 101 34 58 123 125 125 44 34 107 58 123 92 34 110 97 109 101 92 34 58 92 34 82 79 79 75 95 67 79 78 70 73 71 95 68 73 82 92 34 125 34 58 123 34 46 34 58 123 125 44 34 102 58 110 97 109 101 34 58 123 125 44 34 102 58 118 97 108 117 101 34 58 123 125 125 44 34 107 58 123 92 34 110 97 109 101 92 34 58 92 34 82 79 79 75 95 67 82 85 83 72 77 65 80 95 72 79 83 84 78 65 77 69 92 34 125 34 58 123 34 46 34 58 123 125 44 34 102 58 110 97 109 101 34 58 123 125 44 34 102 58 118 97 108 117 101 34 58 
123 125 125 44 34 107 58 123 92 34 110 97 109 101 92 34 58 92 34 82 79 79 75 95 68 65 84 65 95 68 69 86 73 67 69 95 70 73 76 84 69 82 92 34 125 34 58 123 34 46 34 58 123 125 44 34 102 58 110 97 109 101 34 58 123 125 44 34 102 58 118 97 108 117 101 34 58 123 125 125 44 34 107 58 123 92 34 110 97 109 101 92 34 58 92 34 82 79 79 75 95 70 83 73 68 92 34 125 34 58 123 34 46 34 58 123 125 44 34 102 58 110 97 109 101 34 58 123 125 44 34 102 58 118 97 108 117 101 70 114 111 109 34 58 123 34 46 34 58 123 125 44 34 102 58 115 101 99 114 101 116 75 101 121 82 101 102 34 58 123 34 46 34 58 123 125 44 34 102 58 107 101 121 34 58 123 125 44 34 102 58 110 97 109 101 34 58 123 125 125 125 125 44 34 107 58 123 92 34 110 97 109 101 92 34 58 92 34 82 79 79 75 95 76 79 71 95 76 69 86 69 76 92 34 125 34 58 123 34 46 34 58 123 125 44 34 102 58 110 97 109 101 34 58 123 125 44 34 102 58 118 97 108 117 101 34 58 123 125 125 44 34 107 58 123 92 34 110 97 109 101 92 34 58 92 34 82 79 79 75 95 77 79 78 95 69 78 68 80 79 73 78 84 83 92 34 125 34 58 123 34 46 34 58 123 125 44 34 102 58 110 97 109 101 34 58 123 125 44 34 102 58 118 97 108 117 101 70 114 111 109 34 58 123 34 46 34 58 123 125 44 34 102 58 99 111 110 102 105 103 77 97 112 75 101 121 82 101 102 34 58 123 34 46 34 58 123 125 44 34 102 58 107 101 121 34 58 123 125 44 34 102 58 110 97 109 101 34 58 123 125 125 125 125 44 34 107 58 123 92 34 110 97 109 101 92 34 58 92 34 82 79 79 75 95 77 79 78 95 83 69 67 82 69 84 92 34 125 34 58 123 34 46 34 58 123 125 44 34 102 58 110 97 109 101 34 58 123 125 44 34 102 58 118 97 108 117 101 70 114 111 109 34 58 123 34 46 34 58 123 125 44 34 102 58 115 101 99 114 101 116 75 101 121 82 101 102 34 58 123 34 46 34 58 123 125 44 34 102 58 107 101 121 34 58 123 125 44 34 102 58 110 97 109 101 34 58 123 125 125 125 125 44 34 107 58 123 92 34 110 97 109 101 92 34 58 92 34 82 79 79 75 95 78 79 68 69 95 78 65 77 69 92 34 125 34 58 123 34 46 34 58 123 125 44 34 102 58 110 97 109 101 34 58 123 125 44 34 102 58 
118 97 108 117 101 34 58 123 125 125 44 34 107 58 123 92 34 110 97 109 101 92 34 58 92 34 82 79 79 75 95 79 83 68 83 95 80 69 82 95 68 69 86 73 67 69 92 34 125 34 58 123 34 46 34 58 123 125 44 34 102 58 110 97 109 101 34 58 123 125 44 34 102 58 118 97 108 117 101 34 58 123 125 125 44 34 107 58 123 92 34 110 97 109 101 92 34 58 92 34 82 79 79 75 95 80 82 73 86 65 84 69 95 73 80 92 34 125 34 58 123 34 46 34 58 123 125 44 34 102 58 110 97 109 101 34 58 123 125 44 34 102 58 118 97 108 117 101 70 114 111 109 34 58 123 34 46 34 58 123 125 44 34 102 58 102 105 101 108 100 82 101 102 34 58 123 34 46 34 58 123 125 44 34 102 58 97 112 105 86 101 114 115 105 111 110 34 58 123 125 44 34 102 58 102 105 101 108 100 80 97 116 104 34 58 123 125 125 125 125 44 34 107 58 123 92 34 110 97 109 101 92 34 58 92 34 82 79 79 75 95 80 85 66 76 73 67 95 73 80 92 34 125 34 58 123 34 46 34 58 123 125 44 34 102 58 110 97 109 101 34 58 123 125 44 34 102 58 118 97 108 117 101 70 114 111 109 34 58 123 34 46 34 58 123 125 44 34 102 58 102 105 101 108 100 82 101 102 34 58 123 34 46 34 58 123 125 44 34 102 58 97 112 105 86 101 114 115 105 111 110 34 58 123 125 44 34 102 58 102 105 101 108 100 80 97 116 104 34 58 123 125 125 125 125 125 44 34 102 58 105 109 97 103 101 34 58 123 125 44 34 102 58 105 109 97 103 101 80 117 108 108 80 111 108 105 99 121 34 58 123 125 44 34 102 58 110 97 109 101 34 58 123 125 44 34 102 58 114 101 115 111 117 114 99 101 115 34 58 123 125 44 34 102 58 115 101 99 117 114 105 116 121 67 111 110 116 101 120 116 34 58 123 34 46 34 58 123 125 44 34 102 58 112 114 105 118 105 108 101 103 101 100 34 58 123 125 44 34 102 58 114 101 97 100 79 110 108 121 82 111 111 116 70 105 108 101 115 121 115 116 101 109 34 58 123 125 44 34 102 58 114 117 110 65 115 78 111 110 82 111 111 116 34 58 123 125 44 34 102 58 114 117 110 65 115 85 115 101 114 34 58 123 125 125 44 34 102 58 116 101 114 109 105 110 97 116 105 111 110 77 101 115 115 97 103 101 80 97 116 104 34 58 123 125 44 34 102 58 116 
101 114 109 105 110 97 116 105 111 110 77 101 115 115 97 103 101 80 111 108 105 99 121 34 58 123 125 44 34 102 58 118 111 108 117 109 101 77 111 117 110 116 115 34 58 123 34 46 34 58 123 125 44 34 107 58 123 92 34 109 111 117 110 116 80 97 116 104 92 34 58 92 34 47 100 101 118 92 34 125 34 58 123 34 46 34 58 123 125 44 34 102 58 109 111 117 110 116 80 97 116 104 34 58 123 125 44 34 102 58 110 97 109 101 34 58 123 125 125 44 34 107 58 123 92 34 109 111 117 110 116 80 97 116 104 92 34 58 92 34 47 101 116 99 47 99 101 112 104 92 34 125 34 58 123 34 46 34 58 123 125 44 34 102 58 109 111 117 110 116 80 97 116 104 34 58 123 125 44 34 102 58 110 97 109 101 34 58 123 125 125 44 34 107 58 123 92 34 109 111 117 110 116 80 97 116 104 92 34 58 92 34 47 114 111 111 107 92 34 125 34 58 123 34 46 34 58 123 125 44 34 102 58 109 111 117 110 116 80 97 116 104 34 58 123 125 44 34 102 58 110 97 109 101 34 58 123 125 125 44 34 107 58 123 92 34 109 111 117 110 116 80 97 116 104 92 34 58 92 34 47 114 117 110 47 117 100 101 118 92 34 125 34 58 123 34 46 34 58 123 125 44 34 102 58 109 111 117 110 116 80 97 116 104 34 58 123 125 44 34 102 58 110 97 109 101 34 58 123 125 125 44 34 107 58 123 92 34 109 111 117 110 116 80 97 116 104 92 34 58 92 34 47 118 97 114 47 108 105 98 47 99 101 112 104 47 99 114 97 115 104 92 34 125 34 58 123 34 46 34 58 123 125 44 34 102 58 109 111 117 110 116 80 97 116 104 34 58 123 125 44 34 102 58 110 97 109 101 34 58 123 125 125 44 34 107 58 123 92 34 109 111 117 110 116 80 97 116 104 92 34 58 92 34 47 118 97 114 47 108 105 98 47 114 111 111 107 92 34 125 34 58 123 34 46 34 58 123 125 44 34 102 58 109 111 117 110 116 80 97 116 104 34 58 123 125 44 34 102 58 110 97 109 101 34 58 123 125 125 44 34 107 58 123 92 34 109 111 117 110 116 80 97 116 104 92 34 58 92 34 47 118 97 114 47 108 111 103 47 99 101 112 104 92 34 125 34 58 123 34 46 34 58 123 125 44 34 102 58 109 111 117 110 116 80 97 116 104 34 58 123 125 44 34 102 58 110 97 109 101 34 58 123 125 125 125 125 125 44 
34 102 58 100 110 115 80 111 108 105 99 121 34 58 123 125 44 34 102 58 101 110 97 98 108 101 83 101 114 118 105 99 101 76 105 110 107 115 34 58 123 125 44 34 102 58 105 110 105 116 67 111 110 116 97 105 110 101 114 115 34 58 123 34 46 34 58 123 125 44 34 107 58 123 92 34 110 97 109 101 92 34 58 92 34 99 111 112 121 45 98 105 110 115 92 34 125 34 58 123 34 46 34 58 123 125 44 34 102 58 97 114 103 115 34 58 123 125 44 34 102 58 105 109 97 103 101 34 58 123 125 44 34 102 58 105 109 97 103 101 80 117 108 108 80 111 108 105 99 121 34 58 123 125 44 34 102 58 110 97 109 101 34 58 123 125 44 34 102 58 114 101 115 111 117 114 99 101 115 34 58 123 125 44 34 102 58 116 101 114 109 105 110 97 116 105 111 110 77 101 115 115 97 103 101 80 97 116 104 34 58 123 125 44 34 102 58 116 101 114 109 105 110 97 116 105 111 110 77 101 115 115 97 103 101 80 111 108 105 99 121 34 58 123 125 44 34 102 58 118 111 108 117 109 101 77 111 117 110 116 115 34 58 123 34 46 34 58 123 125 44 34 107 58 123 92 34 109 111 117 110 116 80 97 116 104 92 34 58 92 34 47 114 111 111 107 92 34 125 34 58 123 34 46 34 58 123 125 44 34 102 58 109 111 117 110 116 80 97 116 104 34 58 123 125 44 34 102 58 110 97 109 101 34 58 123 125 125 125 125 125 44 34 102 58 110 111 100 101 83 101 108 101 99 116 111 114 34 58 123 34 46 34 58 123 125 44 34 102 58 107 117 98 101 114 110 101 116 101 115 46 105 111 47 104 111 115 116 110 97 109 101 34 58 123 125 125 44 34 102 58 114 101 115 116 97 114 116 80 111 108 105 99 121 34 58 123 125 44 34 102 58 115 99 104 101 100 117 108 101 114 78 97 109 101 34 58 123 125 44 34 102 58 115 101 99 117 114 105 116 121 67 111 110 116 101 120 116 34 58 123 125 44 34 102 58 115 101 114 118 105 99 101 65 99 99 111 117 110 116 34 58 123 125 44 34 102 58 115 101 114 118 105 99 101 65 99 99 111 117 110 116 78 97 109 101 34 58 123 125 44 34 102 58 116 101 114 109 105 110 97 116 105 111 110 71 114 97 99 101 80 101 114 105 111 100 83 101 99 111 110 100 115 34 58 123 125 44 34 102 58 116 111 108 101 114 
97 116 105 111 110 115 34 58 123 125 44 34 102 58 118 111 108 117 109 101 115 34 58 123 34 46 34 58 123 125 44 34 107 58 123 92 34 110 97 109 101 92 34 58 92 34 99 101 112 104 45 99 111 110 102 45 101 109 112 116 121 100 105 114 92 34 125 34 58 123 34 46 34 58 123 125 44 34 102 58 101 109 112 116 121 68 105 114 34 58 123 125 44 34 102 58 110 97 109 101 34 58 123 125 125 44 34 107 58 123 92 34 110 97 109 101 92 34 58 92 34 100 101 118 105 99 101 115 92 34 125 34 58 123 34 46 34 58 123 125 44 34 102 58 104 111 115 116 80 97 116 104 34 58 123 34 46 34 58 123 125 44 34 102 58 112 97 116 104 34 58 123 125 44 34 102 58 116 121 112 101 34 58 123 125 125 44 34 102 58 110 97 109 101 34 58 123 125 125 44 34 107 58 123 92 34 110 97 109 101 92 34 58 92 34 114 111 111 107 45 98 105 110 97 114 105 101 115 92 34 125 34 58 123 34 46 34 58 123 125 44 34 102 58 101 109 112 116 121 68 105 114 34 58 123 125 44 34 102 58 110 97 109 101 34 58 123 125 125 44 34 107 58 123 92 34 110 97 109 101 92 34 58 92 34 114 111 111 107 45 99 101 112 104 45 99 114 97 115 104 92 34 125 34 58 123 34 46 34 58 123 125 44 34 102 58 104 111 115 116 80 97 116 104 34 58 123 34 46 34 58 123 125 44 34 102 58 112 97 116 104 34 58 123 125 44 34 102 58 116 121 112 101 34 58 123 125 125 44 34 102 58 110 97 109 101 34 58 123 125 125 44 34 107 58 123 92 34 110 97 109 101 92 34 58 92 34 114 111 111 107 45 99 101 112 104 45 108 111 103 92 34 125 34 58 123 34 46 34 58 123 125 44 34 102 58 104 111 115 116 80 97 116 104 34 58 123 34 46 34 58 123 125 44 34 102 58 112 97 116 104 34 58 123 125 44 34 102 58 116 121 112 101 34 58 123 125 125 44 34 102 58 110 97 109 101 34 58 123 125 125 44 34 107 58 123 92 34 110 97 109 101 92 34 58 92 34 114 111 111 107 45 100 97 116 97 92 34 125 34 58 123 34 46 34 58 123 125 44 34 102 58 104 111 115 116 80 97 116 104 34 58 123 34 46 34 58 123 125 44 34 102 58 112 97 116 104 34 58 123 125 44 34 102 58 116 121 112 101 34 58 123 125 125 44 34 102 58 110 97 109 101 34 58 123 125 125 44 34 107 58 
123 92 34 110 97 109 101 92 34 58 92 34 117 100 101 118 92 34 125 34 58 123 34 46 34 58 123 125 44 34 102 58 104 111 115 116 80 97 116 104 34 58 123 34 46 34 58 123 125 44 34 102 58 112 97 116 104 34 58 123 125 44 34 102 58 116 121 112 101 34 58 123 125 125 44 34 102 58 110 97 109 101 34 58 123 125 125 125 125 125],}} {kubelet Update v1 2020-05-05 07:49:27 +0000 UTC FieldsV1 &FieldsV1{Raw:*[123 34 102 58 115 116 97 116 117 115 34 58 123 34 102 58 99 111 110 100 105 116 105 111 110 115 34 58 123 34 107 58 123 92 34 116 121 112 101 92 34 58 92 34 67 111 110 116 97 105 110 101 114 115 82 101 97 100 121 92 34 125 34 58 123 34 46 34 58 123 125 44 34 102 58 108 97 115 116 80 114 111 98 101 84 105 109 101 34 58 123 125 44 34 102 58 108 97 115 116 84 114 97 110 115 105 116 105 111 110 84 105 109 101 34 58 123 125 44 34 102 58 109 101 115 115 97 103 101 34 58 123 125 44 34 102 58 114 101 97 115 111 110 34 58 123 125 44 34 102 58 115 116 97 116 117 115 34 58 123 125 44 34 102 58 116 121 112 101 34 58 123 125 125 44 34 107 58 123 92 34 116 121 112 101 92 34 58 92 34 73 110 105 116 105 97 108 105 122 101 100 92 34 125 34 58 123 34 46 34 58 123 125 44 34 102 58 108 97 115 116 80 114 111 98 101 84 105 109 101 34 58 123 125 44 34 102 58 108 97 115 116 84 114 97 110 115 105 116 105 111 110 84 105 109 101 34 58 123 125 44 34 102 58 109 101 115 115 97 103 101 34 58 123 125 44 34 102 58 114 101 97 115 111 110 34 58 123 125 44 34 102 58 115 116 97 116 117 115 34 58 123 125 44 34 102 58 116 121 112 101 34 58 123 125 125 44 34 107 58 123 92 34 116 121 112 101 92 34 58 92 34 82 101 97 100 121 92 34 125 34 58 123 34 46 34 58 123 125 44 34 102 58 108 97 115 116 80 114 111 98 101 84 105 109 101 34 58 123 125 44 34 102 58 108 97 115 116 84 114 97 110 115 105 116 105 111 110 84 105 109 101 34 58 123 125 44 34 102 58 109 101 115 115 97 103 101 34 58 123 125 44 34 102 58 114 101 97 115 111 110 34 58 123 125 44 34 102 58 115 116 97 116 117 115 34 58 123 125 44 34 102 58 116 121 112 101 34 58 123 
125 125 125 44 34 102 58 99 111 110 116 97 105 110 101 114 83 116 97 116 117 115 101 115 34 58 123 125 44 34 102 58 104 111 115 116 73 80 34 58 123 125 44 34 102 58 105 110 105 116 67 111 110 116 97 105 110 101 114 83 116 97 116 117 115 101 115 34 58 123 125 44 34 102 58 115 116 97 114 116 84 105 109 101 34 58 123 125 125 125],}}]},Spec:PodSpec{Volumes:[]Volume{Volume{Name:rook-data,VolumeSource:VolumeSource{HostPath:&HostPathVolumeSource{Path:/var/lib/rook,Type:*,},EmptyDir:nil,GCEPersistentDisk:nil,AWSElasticBlockStore:nil,GitRepo:nil,Secret:nil,NFS:nil,ISCSI:nil,Glusterfs:nil,PersistentVolumeClaim:nil,RBD:nil,FlexVolume:nil,Cinder:nil,CephFS:nil,Flocker:nil,DownwardAPI:nil,FC:nil,AzureFile:nil,ConfigMap:nil,VsphereVolume:nil,Quobyte:nil,AzureDisk:nil,PhotonPersistentDisk:nil,PortworxVolume:nil,ScaleIO:nil,Projected:nil,StorageOS:nil,CSI:nil,},},Volume{Name:ceph-conf-emptydir,VolumeSource:VolumeSource{HostPath:nil,EmptyDir:&EmptyDirVolumeSource{Medium:,SizeLimit:<nil>,},GCEPersistentDisk:nil,AWSElasticBlockStore:nil,GitRepo:nil,Secret:nil,NFS:nil,ISCSI:nil,Glusterfs:nil,PersistentVolumeClaim:nil,RBD:nil,FlexVolume:nil,Cinder:nil,CephFS:nil,Flocker:nil,DownwardAPI:nil,FC:nil,AzureFile:nil,ConfigMap:nil,VsphereVolume:nil,Quobyte:nil,AzureDisk:nil,PhotonPersistentDisk:nil,PortworxVolume:nil,ScaleIO:nil,Projected:nil,StorageOS:nil,CSI:nil,},},Volume{Name:rook-ceph-log,VolumeSource:VolumeSource{HostPath:&HostPathVolumeSource{Path:/var/lib/rook/rook/log,Type:*,},EmptyDir:nil,GCEPersistentDisk:nil,AWSElasticBlockStore:nil,GitRepo:nil,Secret:nil,NFS:nil,ISCSI:nil,Glusterfs:nil,PersistentVolumeClaim:nil,RBD:nil,FlexVolume:nil,Cinder:nil,CephFS:nil,Flocker:nil,DownwardAPI:nil,FC:nil,AzureFile:nil,ConfigMap:nil,VsphereVolume:nil,Quobyte:nil,AzureDisk:nil,PhotonPersistentDisk:nil,PortworxVolume:nil,ScaleIO:nil,Projected:nil,StorageOS:nil,CSI:nil,},},Volume{Name:rook-ceph-crash,VolumeSource:VolumeSource{HostPath:&HostPathVolumeSource{Path:/var/lib/rook/rook/crash,Type:*,},Empt
yDir:nil,GCEPersistentDisk:nil,AWSElasticBlockStore:nil,GitRepo:nil,Secret:nil,NFS:nil,ISCSI:nil,Glusterfs:nil,PersistentVolumeClaim:nil,RBD:nil,FlexVolume:nil,Cinder:nil,CephFS:nil,Flocker:nil,DownwardAPI:nil,FC:nil,AzureFile:nil,ConfigMap:nil,VsphereVolume:nil,Quobyte:nil,AzureDisk:nil,PhotonPersistentDisk:nil,PortworxVolume:nil,ScaleIO:nil,Projected:nil,StorageOS:nil,CSI:nil,},},Volume{Name:rook-binaries,VolumeSource:VolumeSource{HostPath:nil,EmptyDir:&EmptyDirVolumeSource{Medium:,SizeLimit:<nil>,},GCEPersistentDisk:nil,AWSElasticBlockStore:nil,GitRepo:nil,Secret:nil,NFS:nil,ISCSI:nil,Glusterfs:nil,PersistentVolumeClaim:nil,RBD:nil,FlexVolume:nil,Cinder:nil,CephFS:nil,Flocker:nil,DownwardAPI:nil,FC:nil,AzureFile:nil,ConfigMap:nil,VsphereVolume:nil,Quobyte:nil,AzureDisk:nil,PhotonPersistentDisk:nil,PortworxVolume:nil,ScaleIO:nil,Projected:nil,StorageOS:nil,CSI:nil,},},Volume{Name:devices,VolumeSource:VolumeSource{HostPath:&HostPathVolumeSource{Path:/dev,Type:*,},EmptyDir:nil,GCEPersistentDisk:nil,AWSElasticBlockStore:nil,GitRepo:nil,Secret:nil,NFS:nil,ISCSI:nil,Glusterfs:nil,PersistentVolumeClaim:nil,RBD:nil,FlexVolume:nil,Cinder:nil,CephFS:nil,Flocker:nil,DownwardAPI:nil,FC:nil,AzureFile:nil,ConfigMap:nil,VsphereVolume:nil,Quobyte:nil,AzureDisk:nil,PhotonPersistentDisk:nil,PortworxVolume:nil,ScaleIO:nil,Projected:nil,StorageOS:nil,CSI:nil,},},Volume{Name:udev,VolumeSource:VolumeSource{HostPath:&HostPathVolumeSource{Path:/run/udev,Type:*,},EmptyDir:nil,GCEPersistentDisk:nil,AWSElasticBlockStore:nil,GitRepo:nil,Secret:nil,NFS:nil,ISCSI:nil,Glusterfs:nil,PersistentVolumeClaim:nil,RBD:nil,FlexVolume:nil,Cinder:nil,CephFS:nil,Flocker:nil,DownwardAPI:nil,FC:nil,AzureFile:nil,ConfigMap:nil,VsphereVolume:nil,Quobyte:nil,AzureDisk:nil,PhotonPersistentDisk:nil,PortworxVolume:nil,ScaleIO:nil,Projected:nil,StorageOS:nil,CSI:nil,},},Volume{Name:rook-ceph-osd-token-4dtzj,VolumeSource:VolumeSource{HostPath:nil,EmptyDir:nil,GCEPersistentDisk:nil,AWSElasticBlockStore:nil,GitRepo:
nil,Secret:&SecretVolumeSource{SecretName:rook-ceph-osd-token-4dtzj,Items:[]KeyToPath{},DefaultMode:*420,Optional:nil,},NFS:nil,ISCSI:nil,Glusterfs:nil,PersistentVolumeClaim:nil,RBD:nil,FlexVolume:nil,Cinder:nil,CephFS:nil,Flocker:nil,DownwardAPI:nil,FC:nil,AzureFile:nil,ConfigMap:nil,VsphereVolume:nil,Quobyte:nil,AzureDisk:nil,PhotonPersistentDisk:nil,PortworxVolume:nil,ScaleIO:nil,Projected:nil,StorageOS:nil,CSI:nil,},},},Containers:[]Container{Container{Name:provision,Image:ceph/ceph:v14.2.8,Command:[/rook/tini],Args:[-- /rook/rook ceph osd provision],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:ROOK_NODE_NAME,Value:suraj-lk-cluster-pool-1-worker-2,ValueFrom:nil,},EnvVar{Name:ROOK_CLUSTER_ID,Value:77795d4f-fb40-4886-bc74-cb2065f0c61f,ValueFrom:nil,},EnvVar{Name:ROOK_PRIVATE_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},EnvVar{Name:ROOK_PUBLIC_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},EnvVar{Name:ROOK_CLUSTER_NAME,Value:rook,ValueFrom:nil,},EnvVar{Name:ROOK_MON_ENDPOINTS,Value:,ValueFrom:&EnvVarSource{FieldRef:nil,ResourceFieldRef:nil,ConfigMapKeyRef:&ConfigMapKeySelector{LocalObjectReference:LocalObjectReference{Name:rook-ceph-mon-endpoints,},Key:data,Optional:nil,},SecretKeyRef:nil,},},EnvVar{Name:ROOK_MON_SECRET,Value:,ValueFrom:&EnvVarSource{FieldRef:nil,ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:&SecretKeySelector{LocalObjectReference:LocalObjectReference{Name:rook-ceph-mon,},Key:mon-secret,Optional:nil,},},},EnvVar{Name:ROOK_ADMIN_SECRET,Value:,ValueFrom:&EnvVarSource{FieldRef:nil,ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:&SecretKeySelector{LocalObjectReference:LocalObjectReference{Name:rook-ceph-mon,},Key:admin-secret,Optional:nil,},},},EnvVar{Name:ROOK_CONFIG_DIR,Value:/var/lib/ro
ok,ValueFrom:nil,},EnvVar{Name:ROOK_CEPH_CONFIG_OVERRIDE,Value:/etc/rook/config/override.conf,ValueFrom:nil,},EnvVar{Name:ROOK_FSID,Value:,ValueFrom:&EnvVarSource{FieldRef:nil,ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:&SecretKeySelector{LocalObjectReference:LocalObjectReference{Name:rook-ceph-mon,},Key:fsid,Optional:nil,},},},EnvVar{Name:NODE_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:spec.nodeName,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},EnvVar{Name:ROOK_CRUSHMAP_HOSTNAME,Value:suraj-lk-cluster-pool-1-worker-2,ValueFrom:nil,},EnvVar{Name:CEPH_VOLUME_DEBUG,Value:1,ValueFrom:nil,},EnvVar{Name:CEPH_VOLUME_SKIP_RESTORECON,Value:1,ValueFrom:nil,},EnvVar{Name:DM_DISABLE_UDEV,Value:1,ValueFrom:nil,},EnvVar{Name:ROOK_OSDS_PER_DEVICE,Value:1,ValueFrom:nil,},EnvVar{Name:ROOK_LOG_LEVEL,Value:DEBUG,ValueFrom:nil,},EnvVar{Name:ROOK_DATA_DEVICE_FILTER,Value:all,ValueFrom:nil,},EnvVar{Name:ROOK_CEPH_VERSION,Value:ceph version 14.2.8-0 
nautilus,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:rook-data,ReadOnly:false,MountPath:/var/lib/rook,SubPath:,MountPropagation:nil,SubPathExpr:,},VolumeMount{Name:ceph-conf-emptydir,ReadOnly:false,MountPath:/etc/ceph,SubPath:,MountPropagation:nil,SubPathExpr:,},VolumeMount{Name:rook-ceph-log,ReadOnly:false,MountPath:/var/log/ceph,SubPath:,MountPropagation:nil,SubPathExpr:,},VolumeMount{Name:rook-ceph-crash,ReadOnly:false,MountPath:/var/lib/ceph/crash,SubPath:,MountPropagation:nil,SubPathExpr:,},VolumeMount{Name:devices,ReadOnly:false,MountPath:/dev,SubPath:,MountPropagation:nil,SubPathExpr:,},VolumeMount{Name:udev,ReadOnly:false,MountPath:/run/udev,SubPath:,MountPropagation:nil,SubPathExpr:,},VolumeMount{Name:rook-binaries,ReadOnly:false,MountPath:/rook,SubPath:,MountPropagation:nil,SubPathExpr:,},VolumeMount{Name:rook-ceph-osd-token-4dtzj,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:nil,Privileged:*true,SELinuxOptions:nil,RunAsUser:*0,RunAsNonRoot:*false,ReadOnlyRootFilesystem:*false,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,},},RestartPolicy:OnFailure,TerminationGracePeriodSeconds:*30,ActiveDeadlineSeconds:nil,DNSPolicy:ClusterFirst,NodeSelector:map[string]string{kubernetes.io/hostname: 
suraj-lk-cluster-pool-1-worker-2,},ServiceAccountName:rook-ceph-osd,DeprecatedServiceAccount:rook-ceph-osd,NodeName:suraj-lk-cluster-pool-1-worker-2,HostNetwork:false,HostPID:false,HostIPC:false,SecurityContext:&PodSecurityContext{SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,SupplementalGroups:[],FSGroup:nil,RunAsGroup:nil,Sysctls:[]Sysctl{},WindowsOptions:nil,},ImagePullSecrets:[]LocalObjectReference{},Hostname:,Subdomain:,Affinity:&Affinity{NodeAffinity:&NodeAffinity{RequiredDuringSchedulingIgnoredDuringExecution:&NodeSelector{NodeSelectorTerms:[]NodeSelectorTerm{NodeSelectorTerm{MatchExpressions:[]NodeSelectorRequirement{NodeSelectorRequirement{Key:storage.lokomotive.io,Operator:Exists,Values:[],},},MatchFields:[]NodeSelectorRequirement{},},},},PreferredDuringSchedulingIgnoredDuringExecution:[]PreferredSchedulingTerm{},},PodAffinity:nil,PodAntiAffinity:nil,},SchedulerName:default-scheduler,InitContainers:[]Container{Container{Name:copy-bins,Image:rook/ceph:v1.3.2,Command:[],Args:[copy-binaries --copy-to-dir 
/rook],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:rook-binaries,ReadOnly:false,MountPath:/rook,SubPath:,MountPropagation:nil,SubPathExpr:,},VolumeMount{Name:rook-ceph-osd-token-4dtzj,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:nil,Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,},},AutomountServiceAccountToken:nil,Tolerations:[]Toleration{Toleration{Key:storage.lokomotive.io,Operator:Equal,Value:ceph,Effect:NoSchedule,TolerationSeconds:nil,},Toleration{Key:node.kubernetes.io/not-ready,Operator:Exists,Value:,Effect:NoExecute,TolerationSeconds:*300,},Toleration{Key:node.kubernetes.io/unreachable,Operator:Exists,Value:,Effect:NoExecute,TolerationSeconds:*300,},},HostAliases:[]HostAlias{},PriorityClassName:,Priority:*0,DNSConfig:nil,ShareProcessNamespace:nil,ReadinessGates:[]PodReadinessGate{},RuntimeClassName:nil,EnableServiceLinks:*true,PreemptionPolicy:nil,Overhead:ResourceList{},TopologySpreadConstraints:[]TopologySpreadConstraint{},EphemeralContainers:[]EphemeralContainer{},},Status:PodStatus{Phase:Pending,Conditions:[]PodCondition{PodCondition{Type:Initialized,Status:False,LastProbeTime:0001-01-01 00:00:00 +0000 UTC,LastTransitionTime:2020-05-05 07:49:27 +0000 UTC,Reason:ContainersNotInitialized,Message:containers with incomplete status: [copy-bins],},PodCondition{Type:Ready,Status:False,LastProbeTime:0001-01-01 00:00:00 +0000 UTC,LastTransitionTime:2020-05-05 07:49:27 +0000 UTC,Reason:ContainersNotReady,Message:containers with unready status: [provision],},PodCondition{Type:ContainersReady,Status:False,LastProbeTime:0001-01-01 
00:00:00 +0000 UTC,LastTransitionTime:2020-05-05 07:49:27 +0000 UTC,Reason:ContainersNotReady,Message:containers with unready status: [provision],},PodCondition{Type:PodScheduled,Status:True,LastProbeTime:0001-01-01 00:00:00 +0000 UTC,LastTransitionTime:2020-05-05 07:49:27 +0000 UTC,Reason:,Message:,},},Message:,Reason:,HostIP:10.88.81.5,PodIP:,StartTime:2020-05-05 07:49:27 +0000 UTC,ContainerStatuses:[]ContainerStatus{ContainerStatus{Name:provision,State:ContainerState{Waiting:&ContainerStateWaiting{Reason:PodInitializing,Message:,},Running:nil,Terminated:nil,},LastTerminationState:ContainerState{Waiting:nil,Running:nil,Terminated:nil,},Ready:false,RestartCount:0,Image:ceph/ceph:v14.2.8,ImageID:,ContainerID:,Started:*false,},},QOSClass:BestEffort,InitContainerStatuses:[]ContainerStatus{ContainerStatus{Name:copy-bins,State:ContainerState{Waiting:&ContainerStateWaiting{Reason:PodInitializing,Message:,},Running:nil,Terminated:nil,},LastTerminationState:ContainerState{Waiting:nil,Running:nil,Terminated:nil,},Ready:false,RestartCount:0,Image:rook/ceph:v1.3.2,ImageID:,ContainerID:,Started:nil,},},NominatedNodeName:,PodIPs:[]PodIP{},EphemeralContainerStatuses:[]ContainerStatus{},},}
2020-05-05 07:49:28.180 [INFO][23806] ipam_plugin.go 211: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="741a4b73215d1b9322d1e24280939c2746a988170cf9392505206cf0295ba57b" HandleID="k8s-pod-network.741a4b73215d1b9322d1e24280939c2746a988170cf9392505206cf0295ba57b" Workload="suraj--lk--cluster--pool--1--worker--2-k8s-rook--ceph--osd--prepare--suraj--lk--cluster--pool--1--worker--2--mcldx-eth0"
2020-05-05 07:49:28.203 [INFO][23806] ipam_plugin.go 223: Calico CNI IPAM handle=k8s-pod-network.741a4b73215d1b9322d1e24280939c2746a988170cf9392505206cf0295ba57b ContainerID="741a4b73215d1b9322d1e24280939c2746a988170cf9392505206cf0295ba57b" HandleID="k8s-pod-network.741a4b73215d1b9322d1e24280939c2746a988170cf9392505206cf0295ba57b" Workload="suraj--lk--cluster--pool--1--worker--2-k8s-rook--ceph--osd--prepare--suraj--lk--cluster--pool--1--worker--2--mcldx-eth0"
2020-05-05 07:49:28.203 [INFO][23806] ipam_plugin.go 233: Auto assigning IP ContainerID="741a4b73215d1b9322d1e24280939c2746a988170cf9392505206cf0295ba57b" HandleID="k8s-pod-network.741a4b73215d1b9322d1e24280939c2746a988170cf9392505206cf0295ba57b" Workload="suraj--lk--cluster--pool--1--worker--2-k8s-rook--ceph--osd--prepare--suraj--lk--cluster--pool--1--worker--2--mcldx-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc000022d40), Attrs:map[string]string{"namespace":"rook", "node":"suraj-lk-cluster-pool-1-worker-2", "pod":"rook-ceph-osd-prepare-suraj-lk-cluster-pool-1-worker-2-mcldx"}, Hostname:"suraj-lk-cluster-pool-1-worker-2", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0}
2020-05-05 07:49:28.203 [INFO][23806] ipam.go 87: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'suraj-lk-cluster-pool-1-worker-2'
2020-05-05 07:49:28.212 [INFO][23806] ipam.go 313: Looking up existing affinities for host handle="k8s-pod-network.741a4b73215d1b9322d1e24280939c2746a988170cf9392505206cf0295ba57b" host="suraj-lk-cluster-pool-1-worker-2"
2020-05-05 07:49:28.219 [INFO][23806] ipam.go 381: Trying affinity for 10.2.76.0/24 handle="k8s-pod-network.741a4b73215d1b9322d1e24280939c2746a988170cf9392505206cf0295ba57b" host="suraj-lk-cluster-pool-1-worker-2"
2020-05-05 07:49:28.223 [INFO][23806] ipam.go 135: Attempting to load block cidr=10.2.76.0/24 host="suraj-lk-cluster-pool-1-worker-2"
2020-05-05 07:49:28.229 [INFO][23806] ipam.go 212: Affinity is confirmed and block has been loaded cidr=10.2.76.0/24 host="suraj-lk-cluster-pool-1-worker-2"
2020-05-05 07:49:28.229 [INFO][23806] ipam.go 804: Attempting to assign 1 addresses from block block=10.2.76.0/24 handle="k8s-pod-network.741a4b73215d1b9322d1e24280939c2746a988170cf9392505206cf0295ba57b" host="suraj-lk-cluster-pool-1-worker-2"
2020-05-05 07:49:28.232 [INFO][23806] ipam.go 1265: Creating new handle: k8s-pod-network.741a4b73215d1b9322d1e24280939c2746a988170cf9392505206cf0295ba57b
2020-05-05 07:49:28.239 [INFO][23806] ipam.go 827: Writing block in order to claim IPs block=10.2.76.0/24 handle="k8s-pod-network.741a4b73215d1b9322d1e24280939c2746a988170cf9392505206cf0295ba57b" host="suraj-lk-cluster-pool-1-worker-2"
2020-05-05 07:49:28.253 [INFO][23806] ipam.go 840: Successfully claimed IPs: [10.2.76.6/24] block=10.2.76.0/24 handle="k8s-pod-network.741a4b73215d1b9322d1e24280939c2746a988170cf9392505206cf0295ba57b" host="suraj-lk-cluster-pool-1-worker-2"
2020-05-05 07:49:28.253 [INFO][23806] ipam.go 413: Block '10.2.76.0/24' provided addresses: [10.2.76.6/24] handle="k8s-pod-network.741a4b73215d1b9322d1e24280939c2746a988170cf9392505206cf0295ba57b" host="suraj-lk-cluster-pool-1-worker-2"
2020-05-05 07:49:28.256 [INFO][23806] ipam.go 569: Auto-assigned 1 out of 1 IPv4s: [10.2.76.6/24] handle="k8s-pod-network.741a4b73215d1b9322d1e24280939c2746a988170cf9392505206cf0295ba57b" host="suraj-lk-cluster-pool-1-worker-2"
2020-05-05 07:49:28.256 [INFO][23806] ipam_plugin.go 235: Calico CNI IPAM assigned addresses IPv4=[10.2.76.6/24] IPv6=[] ContainerID="741a4b73215d1b9322d1e24280939c2746a988170cf9392505206cf0295ba57b" HandleID="k8s-pod-network.741a4b73215d1b9322d1e24280939c2746a988170cf9392505206cf0295ba57b" Workload="suraj--lk--cluster--pool--1--worker--2-k8s-rook--ceph--osd--prepare--suraj--lk--cluster--pool--1--worker--2--mcldx-eth0"
2020-05-05 07:49:28.256 [INFO][23806] ipam_plugin.go 261: IPAM Result ContainerID="741a4b73215d1b9322d1e24280939c2746a988170cf9392505206cf0295ba57b" HandleID="k8s-pod-network.741a4b73215d1b9322d1e24280939c2746a988170cf9392505206cf0295ba57b" Workload="suraj--lk--cluster--pool--1--worker--2-k8s-rook--ceph--osd--prepare--suraj--lk--cluster--pool--1--worker--2--mcldx-eth0" result.IPs=[]*current.IPConfig{(*current.IPConfig)(0xc0001483c0)}
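The IPAM entries above trace Calico's block-affinity assignment: the plugin looks up the host's affine block (10.2.76.0/24), loads it, claims the first free address (10.2.76.6/24) under a per-container handle, and writes the block back. A minimal sketch of that claim step, assuming a simple first-free-address policy (the function name and the `allocated` map are illustrative, not Calico's actual data structures):

```python
import ipaddress

def auto_assign(block_cidr, allocated, handle):
    """Claim the first free address from a block affine to this host.

    Mirrors the sequence in the log: load the block, then write it back
    to record the new allocation under the given handle.
    """
    block = ipaddress.ip_network(block_cidr)
    for ip in block.hosts():
        if ip not in allocated:
            allocated[ip] = handle  # "Writing block in order to claim IPs"
            return f"{ip}/{block.prefixlen}"
    return None  # block exhausted; real Calico would try another block
```

With 10.2.76.1–10.2.76.5 already allocated in the block, this returns `10.2.76.6/24`, matching the "Successfully claimed IPs" line above.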
2020-05-05 07:49:28.258 [INFO][23791] k8s.go 358: Populated endpoint ContainerID="741a4b73215d1b9322d1e24280939c2746a988170cf9392505206cf0295ba57b" Namespace="rook" Pod="rook-ceph-osd-prepare-suraj-lk-cluster-pool-1-worker-2-mcldx" WorkloadEndpoint="suraj--lk--cluster--pool--1--worker--2-k8s-rook--ceph--osd--prepare--suraj--lk--cluster--pool--1--worker--2--mcldx-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"suraj--lk--cluster--pool--1--worker--2-k8s-rook--ceph--osd--prepare--suraj--lk--cluster--pool--1--worker--2--mcldx-eth0", GenerateName:"rook-ceph-osd-prepare-suraj-lk-cluster-pool-1-worker-2-", Namespace:"rook", SelfLink:"", UID:"08dfe7e7-7320-49db-816e-d258fc45f440", ResourceVersion:"9442", Generation:0, CreationTimestamp:v1.Time{Time:time.Time{wall:0x0, ext:63724261767, loc:(*time.Location)(0x299c340)}}, DeletionTimestamp:(*v1.Time)(nil), DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app":"rook-ceph-osd-prepare", "ceph.rook.io/pvc":"", "controller-uid":"f36a8935-9397-45ca-8348-ce3192e179a5", "job-name":"rook-ceph-osd-prepare-suraj-lk-cluster-pool-1-worker-2", "projectcalico.org/namespace":"rook", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"rook-ceph-osd", "rook_cluster":"rook"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ClusterName:"", ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"suraj-lk-cluster-pool-1-worker-2", ContainerID:"", Pod:"rook-ceph-osd-prepare-suraj-lk-cluster-pool-1-worker-2-mcldx", Endpoint:"eth0", IPNetworks:[]string{"10.2.76.6/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.rook", "ksa.rook.rook-ceph-osd"}, InterfaceName:"cali50ef90ee42d", MAC:"", Ports:[]v3.EndpointPort(nil)}}
2020-05-05 07:49:28.258 [INFO][23791] k8s.go 359: Calico CNI using IPs: [10.2.76.6/32] ContainerID="741a4b73215d1b9322d1e24280939c2746a988170cf9392505206cf0295ba57b" Namespace="rook" Pod="rook-ceph-osd-prepare-suraj-lk-cluster-pool-1-worker-2-mcldx" WorkloadEndpoint="suraj--lk--cluster--pool--1--worker--2-k8s-rook--ceph--osd--prepare--suraj--lk--cluster--pool--1--worker--2--mcldx-eth0"
2020-05-05 07:49:28.258 [INFO][23791] network_linux.go 76: Setting the host side veth name to cali50ef90ee42d ContainerID="741a4b73215d1b9322d1e24280939c2746a988170cf9392505206cf0295ba57b" Namespace="rook" Pod="rook-ceph-osd-prepare-suraj-lk-cluster-pool-1-worker-2-mcldx" WorkloadEndpoint="suraj--lk--cluster--pool--1--worker--2-k8s-rook--ceph--osd--prepare--suraj--lk--cluster--pool--1--worker--2--mcldx-eth0"
2020-05-05 07:49:28.259 [INFO][23791] network_linux.go 396: Disabling IPv4 forwarding ContainerID="741a4b73215d1b9322d1e24280939c2746a988170cf9392505206cf0295ba57b" Namespace="rook" Pod="rook-ceph-osd-prepare-suraj-lk-cluster-pool-1-worker-2-mcldx" WorkloadEndpoint="suraj--lk--cluster--pool--1--worker--2-k8s-rook--ceph--osd--prepare--suraj--lk--cluster--pool--1--worker--2--mcldx-eth0"
2020-05-05 07:49:28.272 [INFO][23791] k8s.go 385: Added Mac, interface name, and active container ID to endpoint ContainerID="741a4b73215d1b9322d1e24280939c2746a988170cf9392505206cf0295ba57b" Namespace="rook" Pod="rook-ceph-osd-prepare-suraj-lk-cluster-pool-1-worker-2-mcldx" WorkloadEndpoint="suraj--lk--cluster--pool--1--worker--2-k8s-rook--ceph--osd--prepare--suraj--lk--cluster--pool--1--worker--2--mcldx-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"suraj--lk--cluster--pool--1--worker--2-k8s-rook--ceph--osd--prepare--suraj--lk--cluster--pool--1--worker--2--mcldx-eth0", GenerateName:"rook-ceph-osd-prepare-suraj-lk-cluster-pool-1-worker-2-", Namespace:"rook", SelfLink:"", UID:"08dfe7e7-7320-49db-816e-d258fc45f440", ResourceVersion:"9442", Generation:0, CreationTimestamp:v1.Time{Time:time.Time{wall:0x0, ext:63724261767, loc:(*time.Location)(0x299c340)}}, DeletionTimestamp:(*v1.Time)(nil), DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app":"rook-ceph-osd-prepare", "ceph.rook.io/pvc":"", "controller-uid":"f36a8935-9397-45ca-8348-ce3192e179a5", "job-name":"rook-ceph-osd-prepare-suraj-lk-cluster-pool-1-worker-2", "projectcalico.org/namespace":"rook", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"rook-ceph-osd", "rook_cluster":"rook"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ClusterName:"", ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"suraj-lk-cluster-pool-1-worker-2", ContainerID:"741a4b73215d1b9322d1e24280939c2746a988170cf9392505206cf0295ba57b", Pod:"rook-ceph-osd-prepare-suraj-lk-cluster-pool-1-worker-2-mcldx", Endpoint:"eth0", IPNetworks:[]string{"10.2.76.6/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.rook", "ksa.rook.rook-ceph-osd"}, InterfaceName:"cali50ef90ee42d", MAC:"b2:86:1a:79:59:03", Ports:[]v3.EndpointPort(nil)}}
2020-05-05 07:49:28.287 [INFO][23791] k8s.go 417: Wrote updated endpoint to datastore ContainerID="741a4b73215d1b9322d1e24280939c2746a988170cf9392505206cf0295ba57b" Namespace="rook" Pod="rook-ceph-osd-prepare-suraj-lk-cluster-pool-1-worker-2-mcldx" WorkloadEndpoint="suraj--lk--cluster--pool--1--worker--2-k8s-rook--ceph--osd--prepare--suraj--lk--cluster--pool--1--worker--2--mcldx-eth0"
2020-05-05 07:49:31.486 [INFO][24428] plugin.go 129: Extracted identifiers EndpointIDs=&utils.WEPIdentifiers{Namespace:"rook", WEPName:"", WorkloadEndpointIdentifiers:names.WorkloadEndpointIdentifiers{Node:"suraj-lk-cluster-pool-1-worker-2", Orchestrator:"k8s", Endpoint:"eth0", Workload:"", Pod:"rook-ceph-crashcollector-suraj-lk-cluster-pool-1-worker-2-j6b8l", ContainerID:"d281b97184c22093ef0cc982b1e511d6e640ca5d139dfb5431a3968764425ed8"}}
2020-05-05 07:49:31.516 [INFO][24428] plugin.go 220: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {suraj--lk--cluster--pool--1--worker--2-k8s-rook--ceph--crashcollector--suraj--lk--cluster--pool--1--worker--2--j6b8l-eth0 rook-ceph-crashcollector-suraj-lk-cluster-pool-1-worker-2-ddc8cbd7- rook 889e0bea-1099-49a7-a960-568cc6f0298a 9193 0 2020-05-05 07:48:59 +0000 UTC <nil> <nil> map[app:rook-ceph-crashcollector ceph-version:14.2.8-0 ceph_daemon_id:crash crashcollector:crash kubernetes.io/hostname:suraj-lk-cluster-pool-1-worker-2 node_name:suraj-lk-cluster-pool-1-worker-2 pod-template-hash:ddc8cbd7 projectcalico.org/namespace:rook projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:default rook-version:v1.3.2 rook_cluster:rook] map[] [] [] []} {k8s suraj-lk-cluster-pool-1-worker-2 rook-ceph-crashcollector-suraj-lk-cluster-pool-1-worker-2-j6b8l eth0 [] [] [kns.rook ksa.rook.default] calib1ab9bdc7e0 []}} ContainerID="d281b97184c22093ef0cc982b1e511d6e640ca5d139dfb5431a3968764425ed8" Namespace="rook" Pod="rook-ceph-crashcollector-suraj-lk-cluster-pool-1-worker-2-j6b8l" WorkloadEndpoint="suraj--lk--cluster--pool--1--worker--2-k8s-rook--ceph--crashcollector--suraj--lk--cluster--pool--1--worker--2--j6b8l-"
2020-05-05 07:49:31.516 [INFO][24428] k8s.go 60: Extracted identifiers for CmdAddK8s ContainerID="d281b97184c22093ef0cc982b1e511d6e640ca5d139dfb5431a3968764425ed8" Namespace="rook" Pod="rook-ceph-crashcollector-suraj-lk-cluster-pool-1-worker-2-j6b8l" WorkloadEndpoint="suraj--lk--cluster--pool--1--worker--2-k8s-rook--ceph--crashcollector--suraj--lk--cluster--pool--1--worker--2--j6b8l-eth0"
2020-05-05 07:49:31.521 [INFO][24428] k8s.go 775: namespace info &Namespace{ObjectMeta:{rook /api/v1/namespaces/rook 346082b0-52fc-4520-8796-5afff38e4f6c 7478 0 2020-05-05 07:45:13 +0000 UTC <nil> <nil> map[] map[] [] [] [{lokoctl Update v1 2020-05-05 07:45:13 +0000 UTC FieldsV1 FieldsV1{Raw:*[123 34 102 58 115 116 97 116 117 115 34 58 123 34 102 58 112 104 97 115 101 34 58 123 125 125 125],}}]},Spec:NamespaceSpec{Finalizers:[kubernetes],},Status:NamespaceStatus{Phase:Active,Conditions:[]NamespaceCondition{},},}
2020-05-05 07:49:31.526 [INFO][24428] k8s.go 784: pod info &Pod{ObjectMeta:{rook-ceph-crashcollector-suraj-lk-cluster-pool-1-worker-2-j6b8l rook-ceph-crashcollector-suraj-lk-cluster-pool-1-worker-2-ddc8cbd7- rook /api/v1/namespaces/rook/pods/rook-ceph-crashcollector-suraj-lk-cluster-pool-1-worker-2-j6b8l 889e0bea-1099-49a7-a960-568cc6f0298a 9193 0 2020-05-05 07:48:59 +0000 UTC <nil> <nil> map[app:rook-ceph-crashcollector ceph-version:14.2.8-0 ceph_daemon_id:crash crashcollector:crash kubernetes.io/hostname:suraj-lk-cluster-pool-1-worker-2 node_name:suraj-lk-cluster-pool-1-worker-2 pod-template-hash:ddc8cbd7 rook-version:v1.3.2 rook_cluster:rook] map[kubernetes.io/psp:00-rook-ceph-operator seccomp.security.alpha.kubernetes.io/pod:runtime/default] [{apps/v1 ReplicaSet rook-ceph-crashcollector-suraj-lk-cluster-pool-1-worker-2-ddc8cbd7 f4065a0c-2940-4e21-bef8-7ed79bb6b31e 0xc00041a167 0xc00041a168}] [] [{kube-controller-manager Update v1 2020-05-05 07:48:59 +0000 UTC FieldsV1 FieldsV1{Raw:*[123 34 102 58 109 101 116 97 100 97 116 97 34 58 123 34 102 58 103 101 110 101 114 97 116 101 78 97 109 101 34 58 123 125 44 34 102 58 108 97 98 101 108 115 34 58 123 34 46 34 58 123 125 44 34 102 58 97 112 112 34 58 123 125 44 34 102 58 99 101 112 104 45 118 101 114 115 105 111 110 34 58 123 125 44 34 102 58 99 101 112 104 95 100 97 101 109 111 110 95 105 100 34 58 123 125 44 34 102 58 99 114 97 115 104 99 111 108 108 101 99 116 111 114 34 58 123 125 44 34 102 58 107 117 98 101 114 110 101 116 101 115 46 105 111 47 104 111 115 116 110 97 109 101 34 58 123 125 44 34 102 58 110 111 100 101 95 110 97 109 101 34 58 123 125 44 34 102 58 112 111 100 45 116 101 109 112 108 97 116 101 45 104 97 115 104 34 58 123 125 44 34 102 58 114 111 111 107 45 118 101 114 115 105 111 110 34 58 123 125 44 34 102 58 114 111 111 107 95 99 108 117 115 116 101 114 34 58 123 125 125 44 34 102 58 111 119 110 101 114 82 101 102 101 114 101 110 99 101 115 34 58 123 34 46 34 58 123 125 44 34 107 58 123 92 34 117 
105 100 92 34 58 92 34 102 52 48 54 53 97 48 99 45 50 57 52 48 45 52 101 50 49 45 98 101 102 56 45 55 101 100 55 57 98 98 54 98 51 49 101 92 34 125 34 58 123 34 46 34 58 123 125 44 34 102 58 97 112 105 86 101 114 115 105 111 110 34 58 123 125 44 34 102 58 98 108 111 99 107 79 119 110 101 114 68 101 108 101 116 105 111 110 34 58 123 125 44 34 102 58 99 111 110 116 114 111 108 108 101 114 34 58 123 125 44 34 102 58 107 105 110 100 34 58 123 125 44 34 102 58 110 97 109 101 34 58 123 125 44 34 102 58 117 105 100 34 58 123 125 125 125 125 44 34 102 58 115 112 101 99 34 58 123 34 102 58 99 111 110 116 97 105 110 101 114 115 34 58 123 34 107 58 123 92 34 110 97 109 101 92 34 58 92 34 99 101 112 104 45 99 114 97 115 104 92 34 125 34 58 123 34 46 34 58 123 125 44 34 102 58 99 111 109 109 97 110 100 34 58 123 125 44 34 102 58 101 110 118 34 58 123 34 46 34 58 123 125 44 34 107 58 123 92 34 110 97 109 101 92 34 58 92 34 67 69 80 72 95 65 82 71 83 92 34 125 34 58 123 34 46 34 58 123 125 44 34 102 58 110 97 109 101 34 58 123 125 44 34 102 58 118 97 108 117 101 34 58 123 125 125 44 34 107 58 123 92 34 110 97 109 101 92 34 58 92 34 67 79 78 84 65 73 78 69 82 95 73 77 65 71 69 92 34 125 34 58 123 34 46 34 58 123 125 44 34 102 58 110 97 109 101 34 58 123 125 44 34 102 58 118 97 108 117 101 34 58 123 125 125 44 34 107 58 123 92 34 110 97 109 101 92 34 58 92 34 78 79 68 69 95 78 65 77 69 92 34 125 34 58 123 34 46 34 58 123 125 44 34 102 58 110 97 109 101 34 58 123 125 44 34 102 58 118 97 108 117 101 70 114 111 109 34 58 123 34 46 34 58 123 125 44 34 102 58 102 105 101 108 100 82 101 102 34 58 123 34 46 34 58 123 125 44 34 102 58 97 112 105 86 101 114 115 105 111 110 34 58 123 125 44 34 102 58 102 105 101 108 100 80 97 116 104 34 58 123 125 125 125 125 44 34 107 58 123 92 34 110 97 109 101 92 34 58 92 34 80 79 68 95 67 80 85 95 76 73 77 73 84 92 34 125 34 58 123 34 46 34 58 123 125 44 34 102 58 110 97 109 101 34 58 123 125 44 34 102 58 118 97 108 117 101 70 114 111 109 34 58 123 34 46 
34 58 123 125 44 34 102 58 114 101 115 111 117 114 99 101 70 105 101 108 100 82 101 102 34 58 123 34 46 34 58 123 125 44 34 102 58 100 105 118 105 115 111 114 34 58 123 125 44 34 102 58 114 101 115 111 117 114 99 101 34 58 123 125 125 125 125 44 34 107 58 123 92 34 110 97 109 101 92 34 58 92 34 80 79 68 95 67 80 85 95 82 69 81 85 69 83 84 92 34 125 34 58 123 34 46 34 58 123 125 44 34 102 58 110 97 109 101 34 58 123 125 44 34 102 58 118 97 108 117 101 70 114 111 109 34 58 123 34 46 34 58 123 125 44 34 102 58 114 101 115 111 117 114 99 101 70 105 101 108 100 82 101 102 34 58 123 34 46 34 58 123 125 44 34 102 58 100 105 118 105 115 111 114 34 58 123 125 44 34 102 58 114 101 115 111 117 114 99 101 34 58 123 125 125 125 125 44 34 107 58 123 92 34 110 97 109 101 92 34 58 92 34 80 79 68 95 77 69 77 79 82 89 95 76 73 77 73 84 92 34 125 34 58 123 34 46 34 58 123 125 44 34 102 58 110 97 109 101 34 58 123 125 44 34 102 58 118 97 108 117 101 70 114 111 109 34 58 123 34 46 34 58 123 125 44 34 102 58 114 101 115 111 117 114 99 101 70 105 101 108 100 82 101 102 34 58 123 34 46 34 58 123 125 44 34 102 58 100 105 118 105 115 111 114 34 58 123 125 44 34 102 58 114 101 115 111 117 114 99 101 34 58 123 125 125 125 125 44 34 107 58 123 92 34 110 97 109 101 92 34 58 92 34 80 79 68 95 77 69 77 79 82 89 95 82 69 81 85 69 83 84 92 34 125 34 58 123 34 46 34 58 123 125 44 34 102 58 110 97 109 101 34 58 123 125 44 34 102 58 118 97 108 117 101 70 114 111 109 34 58 123 34 46 34 58 123 125 44 34 102 58 114 101 115 111 117 114 99 101 70 105 101 108 100 82 101 102 34 58 123 34 46 34 58 123 125 44 34 102 58 100 105 118 105 115 111 114 34 58 123 125 44 34 102 58 114 101 115 111 117 114 99 101 34 58 123 125 125 125 125 44 34 107 58 123 92 34 110 97 109 101 92 34 58 92 34 80 79 68 95 78 65 77 69 92 34 125 34 58 123 34 46 34 58 123 125 44 34 102 58 110 97 109 101 34 58 123 125 44 34 102 58 118 97 108 117 101 70 114 111 109 34 58 123 34 46 34 58 123 125 44 34 102 58 102 105 101 108 100 82 101 102 34 58 
123 34 46 34 58 123 125 44 34 102 58 97 112 105 86 101 114 115 105 111 110 34 58 123 125 44 34 102 58 102 105 101 108 100 80 97 116 104 34 58 123 125 125 125 125 44 34 107 58 123 92 34 110 97 109 101 92 34 58 92 34 80 79 68 95 78 65 77 69 83 80 65 67 69 92 34 125 34 58 123 34 46 34 58 123 125 44 34 102 58 110 97 109 101 34 58 123 125 44 34 102 58 118 97 108 117 101 70 114 111 109 34 58 123 34 46 34 58 123 125 44 34 102 58 102 105 101 108 100 82 101 102 34 58 123 34 46 34 58 123 125 44 34 102 58 97 112 105 86 101 114 115 105 111 110 34 58 123 125 44 34 102 58 102 105 101 108 100 80 97 116 104 34 58 123 125 125 125 125 44 34 107 58 123 92 34 110 97 109 101 92 34 58 92 34 82 79 79 75 95 67 69 80 72 95 77 79 78 95 72 79 83 84 92 34 125 34 58 123 34 46 34 58 123 125 44 34 102 58 110 97 109 101 34 58 123 125 44 34 102 58 118 97 108 117 101 70 114 111 109 34 58 123 34 46 34 58 123 125 44 34 102 58 115 101 99 114 101 116 75 101 121 82 101 102 34 58 123 34 46 34 58 123 125 44 34 102 58 107 101 121 34 58 123 125 44 34 102 58 110 97 109 101 34 58 123 125 125 125 125 44 34 107 58 123 92 34 110 97 109 101 92 34 58 92 34 82 79 79 75 95 67 69 80 72 95 77 79 78 95 73 78 73 84 73 65 76 95 77 69 77 66 69 82 83 92 34 125 34 58 123 34 46 34 58 123 125 44 34 102 58 110 97 109 101 34 58 123 125 44 34 102 58 118 97 108 117 101 70 114 111 109 34 58 123 34 46 34 58 123 125 44 34 102 58 115 101 99 114 101 116 75 101 121 82 101 102 34 58 123 34 46 34 58 123 125 44 34 102 58 107 101 121 34 58 123 125 44 34 102 58 110 97 109 101 34 58 123 125 125 125 125 125 44 34 102 58 105 109 97 103 101 34 58 123 125 44 34 102 58 105 109 97 103 101 80 117 108 108 80 111 108 105 99 121 34 58 123 125 44 34 102 58 110 97 109 101 34 58 123 125 44 34 102 58 114 101 115 111 117 114 99 101 115 34 58 123 125 44 34 102 58 116 101 114 109 105 110 97 116 105 111 110 77 101 115 115 97 103 101 80 97 116 104 34 58 123 125 44 34 102 58 116 101 114 109 105 110 97 116 105 111 110 77 101 115 115 97 103 101 80 111 108 105 99 
121 34 58 123 125 44 34 102 58 118 111 108 117 109 101 77 111 117 110 116 115 34 58 123 34 46 34 58 123 125 44 34 107 58 123 92 34 109 111 117 110 116 80 97 116 104 92 34 58 92 34 47 101 116 99 47 99 101 112 104 92 34 125 34 58 123 34 46 34 58 123 125 44 34 102 58 109 111 117 110 116 80 97 116 104 34 58 123 125 44 34 102 58 110 97 109 101 34 58 123 125 44 34 102 58 114 101 97 100 79 110 108 121 34 58 123 125 125 44 34 107 58 123 92 34 109 111 117 110 116 80 97 116 104 92 34 58 92 34 47 101 116 99 47 99 101 112 104 47 99 114 97 115 104 45 99 111 108 108 101 99 116 111 114 45 107 101 121 114 105 110 103 45 115 116 111 114 101 47 92 34 125 34 58 123 34 46 34 58 123 125 44 34 102 58 109 111 117 110 116 80 97 116 104 34 58 123 125 44 34 102 58 110 97 109 101 34 58 123 125 44 34 102 58 114 101 97 100 79 110 108 121 34 58 123 125 125 44 34 107 58 123 92 34 109 111 117 110 116 80 97 116 104 92 34 58 92 34 47 118 97 114 47 108 105 98 47 99 101 112 104 47 99 114 97 115 104 92 34 125 34 58 123 34 46 34 58 123 125 44 34 102 58 109 111 117 110 116 80 97 116 104 34 58 123 125 44 34 102 58 110 97 109 101 34 58 123 125 125 44 34 107 58 123 92 34 109 111 117 110 116 80 97 116 104 92 34 58 92 34 47 118 97 114 47 108 111 103 47 99 101 112 104 92 34 125 34 58 123 34 46 34 58 123 125 44 34 102 58 109 111 117 110 116 80 97 116 104 34 58 123 125 44 34 102 58 110 97 109 101 34 58 123 125 125 125 125 125 44 34 102 58 100 110 115 80 111 108 105 99 121 34 58 123 125 44 34 102 58 101 110 97 98 108 101 83 101 114 118 105 99 101 76 105 110 107 115 34 58 123 125 44 34 102 58 105 110 105 116 67 111 110 116 97 105 110 101 114 115 34 58 123 34 46 34 58 123 125 44 34 107 58 123 92 34 110 97 109 101 92 34 58 92 34 99 104 111 119 110 45 99 111 110 116 97 105 110 101 114 45 100 97 116 97 45 100 105 114 92 34 125 34 58 123 34 46 34 58 123 125 44 34 102 58 97 114 103 115 34 58 123 125 44 34 102 58 99 111 109 109 97 110 100 34 58 123 125 44 34 102 58 105 109 97 103 101 34 58 123 125 44 34 102 58 105 109 
97 103 101 80 117 108 108 80 111 108 105 99 121 34 58 123 125 44 34 102 58 110 97 109 101 34 58 123 125 44 34 102 58 114 101 115 111 117 114 99 101 115 34 58 123 125 44 34 102 58 115 101 99 117 114 105 116 121 67 111 110 116 101 120 116 34 58 123 34 46 34 58 123 125 44 34 102 58 112 114 105 118 105 108 101 103 101 100 34 58 123 125 125 44 34 102 58 116 101 114 109 105 110 97 116 105 111 110 77 101 115 115 97 103 101 80 97 116 104 34 58 123 125 44 34 102 58 116 101 114 109 105 110 97 116 105 111 110 77 101 115 115 97 103 101 80 111 108 105 99 121 34 58 123 125 44 34 102 58 118 111 108 117 109 101 77 111 117 110 116 115 34 58 123 34 46 34 58 123 125 44 34 107 58 123 92 34 109 111 117 110 116 80 97 116 104 92 34 58 92 34 47 101 116 99 47 99 101 112 104 92 34 125 34 58 123 34 46 34 58 123 125 44 34 102 58 109 111 117 110 116 80 97 116 104 34 58 123 125 44 34 102 58 110 97 109 101 34 58 123 125 44 34 102 58 114 101 97 100 79 110 108 121 34 58 123 125 125 44 34 107 58 123 92 34 109 111 117 110 116 80 97 116 104 92 34 58 92 34 47 118 97 114 47 108 105 98 47 99 101 112 104 47 99 114 97 115 104 92 34 125 34 58 123 34 46 34 58 123 125 44 34 102 58 109 111 117 110 116 80 97 116 104 34 58 123 125 44 34 102 58 110 97 109 101 34 58 123 125 125 44 34 107 58 123 92 34 109 111 117 110 116 80 97 116 104 92 34 58 92 34 47 118 97 114 47 108 111 103 47 99 101 112 104 92 34 125 34 58 123 34 46 34 58 123 125 44 34 102 58 109 111 117 110 116 80 97 116 104 34 58 123 125 44 34 102 58 110 97 109 101 34 58 123 125 125 125 125 44 34 107 58 123 92 34 110 97 109 101 92 34 58 92 34 109 97 107 101 45 99 111 110 116 97 105 110 101 114 45 99 114 97 115 104 45 100 105 114 92 34 125 34 58 123 34 46 34 58 123 125 44 34 102 58 97 114 103 115 34 58 123 125 44 34 102 58 99 111 109 109 97 110 100 34 58 123 125 44 34 102 58 105 109 97 103 101 34 58 123 125 44 34 102 58 105 109 97 103 101 80 117 108 108 80 111 108 105 99 121 34 58 123 125 44 34 102 58 110 97 109 101 34 58 123 125 44 34 102 58 114 101 115 111 
117 114 99 101 115 34 58 123 125 44 34 102 58 115 101 99 117 114 105 116 121 67 111 110 116 101 120 116 34 58 123 34 46 34 58 123 125 44 34 102 58 112 114 105 118 105 108 101 103 101 100 34 58 123 125 125 44 34 102 58 116 101 114 109 105 110 97 116 105 111 110 77 101 115 115 97 103 101 80 97 116 104 34 58 123 125 44 34 102 58 116 101 114 109 105 110 97 116 105 111 110 77 101 115 115 97 103 101 80 111 108 105 99 121 34 58 123 125 44 34 102 58 118 111 108 117 109 101 77 111 117 110 116 115 34 58 123 34 46 34 58 123 125 44 34 107 58 123 92 34 109 111 117 110 116 80 97 116 104 92 34 58 92 34 47 101 116 99 47 99 101 112 104 92 34 125 34 58 123 34 46 34 58 123 125 44 34 102 58 109 111 117 110 116 80 97 116 104 34 58 123 125 44 34 102 58 110 97 109 101 34 58 123 125 44 34 102 58 114 101 97 100 79 110 108 121 34 58 123 125 125 44 34 107 58 123 92 34 109 111 117 110 116 80 97 116 104 92 34 58 92 34 47 118 97 114 47 108 105 98 47 99 101 112 104 47 99 114 97 115 104 92 34 125 34 58 123 34 46 34 58 123 125 44 34 102 58 109 111 117 110 116 80 97 116 104 34 58 123 125 44 34 102 58 110 97 109 101 34 58 123 125 125 44 34 107 58 123 92 34 109 111 117 110 116 80 97 116 104 92 34 58 92 34 47 118 97 114 47 108 111 103 47 99 101 112 104 92 34 125 34 58 123 34 46 34 58 123 125 44 34 102 58 109 111 117 110 116 80 97 116 104 34 58 123 125 44 34 102 58 110 97 109 101 34 58 123 125 125 125 125 125 44 34 102 58 110 111 100 101 83 101 108 101 99 116 111 114 34 58 123 34 46 34 58 123 125 44 34 102 58 107 117 98 101 114 110 101 116 101 115 46 105 111 47 104 111 115 116 110 97 109 101 34 58 123 125 125 44 34 102 58 114 101 115 116 97 114 116 80 111 108 105 99 121 34 58 123 125 44 34 102 58 115 99 104 101 100 117 108 101 114 78 97 109 101 34 58 123 125 44 34 102 58 115 101 99 117 114 105 116 121 67 111 110 116 101 120 116 34 58 123 125 44 34 102 58 116 101 114 109 105 110 97 116 105 111 110 71 114 97 99 101 80 101 114 105 111 100 83 101 99 111 110 100 115 34 58 123 125 44 34 102 58 116 111 108 
101 114 97 116 105 111 110 115 34 58 123 125 44 34 102 58 118 111 108 117 109 101 115 34 58 123 34 46 34 58 123 125 44 34 107 58 123 92 34 110 97 109 101 92 34 58 92 34 114 111 111 107 45 99 101 112 104 45 99 114 97 115 104 92 34 125 34 58 123 34 46 34 58 123 125 44 34 102 58 104 111 115 116 80 97 116 104 34 58 123 34 46 34 58 123 125 44 34 102 58 112 97 116 104 34 58 123 125 44 34 102 58 116 121 112 101 34 58 123 125 125 44 34 102 58 110 97 109 101 34 58 123 125 125 44 34 107 58 123 92 34 110 97 109 101 92 34 58 92 34 114 111 111 107 45 99 101 112 104 45 99 114 97 115 104 45 99 111 108 108 101 99 116 111 114 45 107 101 121 114 105 110 103 92 34 125 34 58 123 34 46 34 58 123 125 44 34 102 58 110 97 109 101 34 58 123 125 44 34 102 58 115 101 99 114 101 116 34 58 123 34 46 34 58 123 125 44 34 102 58 100 101 102 97 117 108 116 77 111 100 101 34 58 123 125 44 34 102 58 115 101 99 114 101 116 78 97 109 101 34 58 123 125 125 125 44 34 107 58 123 92 34 110 97 109 101 92 34 58 92 34 114 111 111 107 45 99 101 112 104 45 108 111 103 92 34 125 34 58 123 34 46 34 58 123 125 44 34 102 58 104 111 115 116 80 97 116 104 34 58 123 34 46 34 58 123 125 44 34 102 58 112 97 116 104 34 58 123 125 44 34 102 58 116 121 112 101 34 58 123 125 125 44 34 102 58 110 97 109 101 34 58 123 125 125 44 34 107 58 123 92 34 110 97 109 101 92 34 58 92 34 114 111 111 107 45 99 111 110 102 105 103 45 111 118 101 114 114 105 100 101 92 34 125 34 58 123 34 46 34 58 123 125 44 34 102 58 99 111 110 102 105 103 77 97 112 34 58 123 34 46 34 58 123 125 44 34 102 58 100 101 102 97 117 108 116 77 111 100 101 34 58 123 125 44 34 102 58 105 116 101 109 115 34 58 123 125 44 34 102 58 110 97 109 101 34 58 123 125 125 44 34 102 58 110 97 109 101 34 58 123 125 125 125 125 125],}} {kubelet Update v1 2020-05-05 07:48:59 +0000 UTC FieldsV1 &FieldsV1{Raw:*[123 34 102 58 115 116 97 116 117 115 34 58 123 34 102 58 99 111 110 100 105 116 105 111 110 115 34 58 123 34 107 58 123 92 34 116 121 112 101 92 34 58 92 34 67 111 110 
116 97 105 110 101 114 115 82 101 97 100 121 92 34 125 34 58 123 34 46 34 58 123 125 44 34 102 58 108 97 115 116 80 114 111 98 101 84 105 109 101 34 58 123 125 44 34 102 58 108 97 115 116 84 114 97 110 115 105 116 105 111 110 84 105 109 101 34 58 123 125 44 34 102 58 109 101 115 115 97 103 101 34 58 123 125 44 34 102 58 114 101 97 115 111 110 34 58 123 125 44 34 102 58 115 116 97 116 117 115 34 58 123 125 44 34 102 58 116 121 112 101 34 58 123 125 125 44 34 107 58 123 92 34 116 121 112 101 92 34 58 92 34 73 110 105 116 105 97 108 105 122 101 100 92 34 125 34 58 123 34 46 34 58 123 125 44 34 102 58 108 97 115 116 80 114 111 98 101 84 105 109 101 34 58 123 125 44 34 102 58 108 97 115 116 84 114 97 110 115 105 116 105 111 110 84 105 109 101 34 58 123 125 44 34 102 58 109 101 115 115 97 103 101 34 58 123 125 44 34 102 58 114 101 97 115 111 110 34 58 123 125 44 34 102 58 115 116 97 116 117 115 34 58 123 125 44 34 102 58 116 121 112 101 34 58 123 125 125 44 34 107 58 123 92 34 116 121 112 101 92 34 58 92 34 82 101 97 100 121 92 34 125 34 58 123 34 46 34 58 123 125 44 34 102 58 108 97 115 116 80 114 111 98 101 84 105 109 101 34 58 123 125 44 34 102 58 108 97 115 116 84 114 97 110 115 105 116 105 111 110 84 105 109 101 34 58 123 125 44 34 102 58 109 101 115 115 97 103 101 34 58 123 125 44 34 102 58 114 101 97 115 111 110 34 58 123 125 44 34 102 58 115 116 97 116 117 115 34 58 123 125 44 34 102 58 116 121 112 101 34 58 123 125 125 125 44 34 102 58 99 111 110 116 97 105 110 101 114 83 116 97 116 117 115 101 115 34 58 123 125 44 34 102 58 104 111 115 116 73 80 34 58 123 125 44 34 102 58 105 110 105 116 67 111 110 116 97 105 110 101 114 83 116 97 116 117 115 101 115 34 58 123 125 44 34 102 58 115 116 97 114 116 84 105 109 101 34 58 123 125 125 
125],}}]},Spec:PodSpec{Volumes:[]Volume{Volume{Name:rook-config-override,VolumeSource:VolumeSource{HostPath:nil,EmptyDir:nil,GCEPersistentDisk:nil,AWSElasticBlockStore:nil,GitRepo:nil,Secret:nil,NFS:nil,ISCSI:nil,Glusterfs:nil,PersistentVolumeClaim:nil,RBD:nil,FlexVolume:nil,Cinder:nil,CephFS:nil,Flocker:nil,DownwardAPI:nil,FC:nil,AzureFile:nil,ConfigMap:&ConfigMapVolumeSource{LocalObjectReference:LocalObjectReference{Name:rook-config-override,},Items:[]KeyToPath{KeyToPath{Key:config,Path:ceph.conf,Mode:*292,},},DefaultMode:*420,Optional:nil,},VsphereVolume:nil,Quobyte:nil,AzureDisk:nil,PhotonPersistentDisk:nil,PortworxVolume:nil,ScaleIO:nil,Projected:nil,StorageOS:nil,CSI:nil,},},Volume{Name:rook-ceph-log,VolumeSource:VolumeSource{HostPath:&HostPathVolumeSource{Path:/var/lib/rook/rook/log,Type:*,},EmptyDir:nil,GCEPersistentDisk:nil,AWSElasticBlockStore:nil,GitRepo:nil,Secret:nil,NFS:nil,ISCSI:nil,Glusterfs:nil,PersistentVolumeClaim:nil,RBD:nil,FlexVolume:nil,Cinder:nil,CephFS:nil,Flocker:nil,DownwardAPI:nil,FC:nil,AzureFile:nil,ConfigMap:nil,VsphereVolume:nil,Quobyte:nil,AzureDisk:nil,PhotonPersistentDisk:nil,PortworxVolume:nil,ScaleIO:nil,Projected:nil,StorageOS:nil,CSI:nil,},},Volume{Name:rook-ceph-crash,VolumeSource:VolumeSource{HostPath:&HostPathVolumeSource{Path:/var/lib/rook/rook/crash,Type:*,},EmptyDir:nil,GCEPersistentDisk:nil,AWSElasticBlockStore:nil,GitRepo:nil,Secret:nil,NFS:nil,ISCSI:nil,Glusterfs:nil,PersistentVolumeClaim:nil,RBD:nil,FlexVolume:nil,Cinder:nil,CephFS:nil,Flocker:nil,DownwardAPI:nil,FC:nil,AzureFile:nil,ConfigMap:nil,VsphereVolume:nil,Quobyte:nil,AzureDisk:nil,PhotonPersistentDisk:nil,PortworxVolume:nil,ScaleIO:nil,Projected:nil,StorageOS:nil,CSI:nil,},},Volume{Name:rook-ceph-crash-collector-keyring,VolumeSource:VolumeSource{HostPath:nil,EmptyDir:nil,GCEPersistentDisk:nil,AWSElasticBlockStore:nil,GitRepo:nil,Secret:&SecretVolumeSource{SecretName:rook-ceph-crash-collector-keyring,Items:[]KeyToPath{},DefaultMode:*420,Optional:nil,},NFS:nil
,ISCSI:nil,Glusterfs:nil,PersistentVolumeClaim:nil,RBD:nil,FlexVolume:nil,Cinder:nil,CephFS:nil,Flocker:nil,DownwardAPI:nil,FC:nil,AzureFile:nil,ConfigMap:nil,VsphereVolume:nil,Quobyte:nil,AzureDisk:nil,PhotonPersistentDisk:nil,PortworxVolume:nil,ScaleIO:nil,Projected:nil,StorageOS:nil,CSI:nil,},},Volume{Name:default-token-cdzd4,VolumeSource:VolumeSource{HostPath:nil,EmptyDir:nil,GCEPersistentDisk:nil,AWSElasticBlockStore:nil,GitRepo:nil,Secret:&SecretVolumeSource{SecretName:default-token-cdzd4,Items:[]KeyToPath{},DefaultMode:*420,Optional:nil,},NFS:nil,ISCSI:nil,Glusterfs:nil,PersistentVolumeClaim:nil,RBD:nil,FlexVolume:nil,Cinder:nil,CephFS:nil,Flocker:nil,DownwardAPI:nil,FC:nil,AzureFile:nil,ConfigMap:nil,VsphereVolume:nil,Quobyte:nil,AzureDisk:nil,PhotonPersistentDisk:nil,PortworxVolume:nil,ScaleIO:nil,Projected:nil,StorageOS:nil,CSI:nil,},},},Containers:[]Container{Container{Name:ceph-crash,Image:ceph/ceph:v14.2.8,Command:[ceph-crash],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONTAINER_IMAGE,Value:ceph/ceph:v14.2.8,ValueFrom:nil,},EnvVar{Name:POD_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:metadata.name,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},EnvVar{Name:POD_NAMESPACE,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:metadata.namespace,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},EnvVar{Name:NODE_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:spec.nodeName,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},EnvVar{Name:POD_MEMORY_LIMIT,Value:,ValueFrom:&EnvVarSource{FieldRef:nil,ResourceFieldRef:&ResourceFieldSelector{ContainerName:,Resource:limits.memory,Divisor:{{0 0} {<nil>} 0 
DecimalSI},},ConfigMapKeyRef:nil,SecretKeyRef:nil,},},EnvVar{Name:POD_MEMORY_REQUEST,Value:,ValueFrom:&EnvVarSource{FieldRef:nil,ResourceFieldRef:&ResourceFieldSelector{ContainerName:,Resource:requests.memory,Divisor:{{0 0} {<nil>} 0 DecimalSI},},ConfigMapKeyRef:nil,SecretKeyRef:nil,},},EnvVar{Name:POD_CPU_LIMIT,Value:,ValueFrom:&EnvVarSource{FieldRef:nil,ResourceFieldRef:&ResourceFieldSelector{ContainerName:,Resource:limits.cpu,Divisor:{{1 0} {<nil>} 1 DecimalSI},},ConfigMapKeyRef:nil,SecretKeyRef:nil,},},EnvVar{Name:POD_CPU_REQUEST,Value:,ValueFrom:&EnvVarSource{FieldRef:nil,ResourceFieldRef:&ResourceFieldSelector{ContainerName:,Resource:requests.cpu,Divisor:{{0 0} {<nil>} 0 DecimalSI},},ConfigMapKeyRef:nil,SecretKeyRef:nil,},},EnvVar{Name:ROOK_CEPH_MON_HOST,Value:,ValueFrom:&EnvVarSource{FieldRef:nil,ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:&SecretKeySelector{LocalObjectReference:LocalObjectReference{Name:rook-ceph-config,},Key:mon_host,Optional:nil,},},},EnvVar{Name:ROOK_CEPH_MON_INITIAL_MEMBERS,Value:,ValueFrom:&EnvVarSource{FieldRef:nil,ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:&SecretKeySelector{LocalObjectReference:LocalObjectReference{Name:rook-ceph-config,},Key:mon_initial_members,Optional:nil,},},},EnvVar{Name:CEPH_ARGS,Value:-m $(ROOK_CEPH_MON_HOST) -k 
/etc/ceph/crash-collector-keyring-store/keyring,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:rook-config-override,ReadOnly:true,MountPath:/etc/ceph,SubPath:,MountPropagation:nil,SubPathExpr:,},VolumeMount{Name:rook-ceph-log,ReadOnly:false,MountPath:/var/log/ceph,SubPath:,MountPropagation:nil,SubPathExpr:,},VolumeMount{Name:rook-ceph-crash,ReadOnly:false,MountPath:/var/lib/ceph/crash,SubPath:,MountPropagation:nil,SubPathExpr:,},VolumeMount{Name:rook-ceph-crash-collector-keyring,ReadOnly:true,MountPath:/etc/ceph/crash-collector-keyring-store/,SubPath:,MountPropagation:nil,SubPathExpr:,},VolumeMount{Name:default-token-cdzd4,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:nil,Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,},},RestartPolicy:Always,TerminationGracePeriodSeconds:*30,ActiveDeadlineSeconds:nil,DNSPolicy:ClusterFirst,NodeSelector:map[string]string{kubernetes.io/hostname: suraj-lk-cluster-pool-1-worker-2,},ServiceAccountName:default,DeprecatedServiceAccount:default,NodeName:suraj-lk-cluster-pool-1-worker-2,HostNetwork:false,HostPID:false,HostIPC:false,SecurityContext:&PodSecurityContext{SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,SupplementalGroups:[],FSGroup:nil,RunAsGroup:nil,Sysctls:[]Sysctl{},WindowsOptions:nil,},ImagePullSecrets:[]LocalObjectReference{},Hostname:,Subdomain:,Affinity:nil,SchedulerName:default-scheduler,InitContainers:[]Container{Container{Name:make-container-crash-dir,Image:ceph/ceph:v14.2.8,Command:[mkdir -p],Args:[/var/lib/ceph/crash/posted],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:rook-config-override,ReadOnly:true,MountPath:/etc/ceph,SubPath:,MountPropagation:nil,SubPathExpr:,},VolumeMount{Name:rook-ceph-log,ReadOnly:false,MountPath:/var/log/ceph,SubPath:,MountPropagation:nil,SubPathExpr:,},VolumeMount{Name:rook-ceph-crash,ReadOnly:false,MountPath:/var/lib/ceph/crash,SubPath:,MountPropagation:nil,SubPathExpr:,},VolumeMount{Name:default-token-cdzd4,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:nil,Privileged:*false,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,},Container{Name:chown-container-data-dir,Image:ceph/ceph:v14.2.8,Command:[chown],Args:[--verbose --recursive ceph:ceph /var/log/ceph /var/lib/ceph/crash],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:rook-config-override,ReadOnly:true,MountPath:/etc/ceph,SubPath:,MountPropagation:nil,SubPathExpr:,},VolumeMount{Name:rook-ceph-log,ReadOnly:false,MountPath:/var/log/ceph,SubPath:,MountPropagation:nil,SubPathExpr:,},VolumeMount{Name:rook-ceph-crash,ReadOnly:false,MountPath:/var/lib/ceph/crash,SubPath:,MountPropagation:nil,SubPathExpr:,},VolumeMount{Name:default-token-cdzd4,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:nil,Privileged:*false,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,},},AutomountServiceAccountToken:nil,Tolerations:[]Toleration{Toleration{Key:node.kubernetes.io/not-ready,Operator:Exists,Value:,Effect:NoExecute,TolerationSeconds:*300,},Toleration{Key:node.kubernetes.io/unreachable,Operator:Exists,Value:,Effect:NoExecute,TolerationSeconds:*300,},Toleration{Key:storage.lokomotive.io,Operator:Equal,Value:ceph,Effect:NoSchedule,TolerationSeconds:nil,},},HostAliases:[]HostAlias{},PriorityClassName:,Priority:*0,DNSConfig:nil,ShareProcessNamespace:nil,ReadinessGates:[]PodReadinessGate{},RuntimeClassName:nil,EnableServiceLinks:*true,PreemptionPolicy:nil,Overhead:ResourceList{},TopologySpreadConstraints:[]TopologySpreadConstraint{},EphemeralContainers:[]EphemeralContainer{},},Status:PodStatus{Phase:Pending,Conditions:[]PodCondition{PodCondition{Type:Initialized,Status:False,LastProbeTime:0001-01-01 00:00:00 +0000 UTC,LastTransitionTime:2020-05-05 07:48:59 +0000 UTC,Reason:ContainersNotInitialized,Message:containers with incomplete status: [make-container-crash-dir chown-container-data-dir],},PodCondition{Type:Ready,Status:False,LastProbeTime:0001-01-01 00:00:00 +0000 UTC,LastTransitionTime:2020-05-05 07:48:59 +0000 UTC,Reason:ContainersNotReady,Message:containers with unready status: [ceph-crash],},PodCondition{Type:ContainersReady,Status:False,LastProbeTime:0001-01-01 00:00:00 +0000 UTC,LastTransitionTime:2020-05-05 07:48:59 +0000 UTC,Reason:ContainersNotReady,Message:containers with unready status: [ceph-crash],},PodCondition{Type:PodScheduled,Status:True,LastProbeTime:0001-01-01 00:00:00 +0000 UTC,LastTransitionTime:2020-05-05 07:48:59 +0000 UTC,Reason:,Message:,},},Message:,Reason:,HostIP:10.88.81.5,PodIP:,StartTime:2020-05-05 07:48:59 +0000 UTC,ContainerStatuses:[]ContainerStatus{ContainerStatus{Name:ceph-crash,State:ContainerState{Waiting:&ContainerStateWaiting{Reason:PodInitializing,Message:,},Running:nil,Terminated:nil,},LastTerminationState:ContainerState{Waiting:nil,Running:nil,Terminated:nil,},Ready:false,RestartCount:0,Image:ceph/ceph:v14.2.8,ImageID:,ContainerID:,Started:*false,},},QOSClass:BestEffort,InitContainerStatuses:[]ContainerStatus{ContainerStatus{Name:make-container-crash-dir,State:ContainerState{Waiting:&ContainerStateWaiting{Reason:PodInitializing,Message:,},Running:nil,Terminated:nil,},LastTerminationState:ContainerState{Waiting:nil,Running:nil,Terminated:nil,},Ready:false,RestartCount:0,Image:ceph/ceph:v14.2.8,ImageID:,ContainerID:,Started:nil,},ContainerStatus{Name:chown-container-data-dir,State:ContainerState{Waiting:&ContainerStateWaiting{Reason:PodInitializing,Message:,},Running:nil,Terminated:nil,},LastTerminationState:ContainerState{Waiting:nil,Running:nil,Terminated:nil,},Ready:false,RestartCount:0,Image:ceph/ceph:v14.2.8,ImageID:,ContainerID:,Started:nil,},},NominatedNodeName:,PodIPs:[]PodIP{},EphemeralContainerStatuses:[]ContainerStatus{},},}
2020-05-05 07:49:31.542 [INFO][24442] ipam_plugin.go 211: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="d281b97184c22093ef0cc982b1e511d6e640ca5d139dfb5431a3968764425ed8" HandleID="k8s-pod-network.d281b97184c22093ef0cc982b1e511d6e640ca5d139dfb5431a3968764425ed8" Workload="suraj--lk--cluster--pool--1--worker--2-k8s-rook--ceph--crashcollector--suraj--lk--cluster--pool--1--worker--2--j6b8l-eth0"
2020-05-05 07:49:31.564 [INFO][24442] ipam_plugin.go 223: Calico CNI IPAM handle=k8s-pod-network.d281b97184c22093ef0cc982b1e511d6e640ca5d139dfb5431a3968764425ed8 ContainerID="d281b97184c22093ef0cc982b1e511d6e640ca5d139dfb5431a3968764425ed8" HandleID="k8s-pod-network.d281b97184c22093ef0cc982b1e511d6e640ca5d139dfb5431a3968764425ed8" Workload="suraj--lk--cluster--pool--1--worker--2-k8s-rook--ceph--crashcollector--suraj--lk--cluster--pool--1--worker--2--j6b8l-eth0"
2020-05-05 07:49:31.564 [INFO][24442] ipam_plugin.go 233: Auto assigning IP ContainerID="d281b97184c22093ef0cc982b1e511d6e640ca5d139dfb5431a3968764425ed8" HandleID="k8s-pod-network.d281b97184c22093ef0cc982b1e511d6e640ca5d139dfb5431a3968764425ed8" Workload="suraj--lk--cluster--pool--1--worker--2-k8s-rook--ceph--crashcollector--suraj--lk--cluster--pool--1--worker--2--j6b8l-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc00022b9d0), Attrs:map[string]string{"namespace":"rook", "node":"suraj-lk-cluster-pool-1-worker-2", "pod":"rook-ceph-crashcollector-suraj-lk-cluster-pool-1-worker-2-j6b8l"}, Hostname:"suraj-lk-cluster-pool-1-worker-2", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0}
2020-05-05 07:49:31.564 [INFO][24442] ipam.go 87: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'suraj-lk-cluster-pool-1-worker-2'
2020-05-05 07:49:31.573 [INFO][24442] ipam.go 313: Looking up existing affinities for host handle="k8s-pod-network.d281b97184c22093ef0cc982b1e511d6e640ca5d139dfb5431a3968764425ed8" host="suraj-lk-cluster-pool-1-worker-2"
2020-05-05 07:49:31.581 [INFO][24442] ipam.go 381: Trying affinity for 10.2.76.0/24 handle="k8s-pod-network.d281b97184c22093ef0cc982b1e511d6e640ca5d139dfb5431a3968764425ed8" host="suraj-lk-cluster-pool-1-worker-2"
2020-05-05 07:49:31.585 [INFO][24442] ipam.go 135: Attempting to load block cidr=10.2.76.0/24 host="suraj-lk-cluster-pool-1-worker-2"
2020-05-05 07:49:31.591 [INFO][24442] ipam.go 212: Affinity is confirmed and block has been loaded cidr=10.2.76.0/24 host="suraj-lk-cluster-pool-1-worker-2"
2020-05-05 07:49:31.591 [INFO][24442] ipam.go 804: Attempting to assign 1 addresses from block block=10.2.76.0/24 handle="k8s-pod-network.d281b97184c22093ef0cc982b1e511d6e640ca5d139dfb5431a3968764425ed8" host="suraj-lk-cluster-pool-1-worker-2"
2020-05-05 07:49:31.594 [INFO][24442] ipam.go 1265: Creating new handle: k8s-pod-network.d281b97184c22093ef0cc982b1e511d6e640ca5d139dfb5431a3968764425ed8
2020-05-05 07:49:31.601 [INFO][24442] ipam.go 827: Writing block in order to claim IPs block=10.2.76.0/24 handle="k8s-pod-network.d281b97184c22093ef0cc982b1e511d6e640ca5d139dfb5431a3968764425ed8" host="suraj-lk-cluster-pool-1-worker-2"
2020-05-05 07:49:31.610 [INFO][24442] ipam.go 840: Successfully claimed IPs: [10.2.76.7/24] block=10.2.76.0/24 handle="k8s-pod-network.d281b97184c22093ef0cc982b1e511d6e640ca5d139dfb5431a3968764425ed8" host="suraj-lk-cluster-pool-1-worker-2"
2020-05-05 07:49:31.610 [INFO][24442] ipam.go 413: Block '10.2.76.0/24' provided addresses: [10.2.76.7/24] handle="k8s-pod-network.d281b97184c22093ef0cc982b1e511d6e640ca5d139dfb5431a3968764425ed8" host="suraj-lk-cluster-pool-1-worker-2"
2020-05-05 07:49:31.614 [INFO][24442] ipam.go 569: Auto-assigned 1 out of 1 IPv4s: [10.2.76.7/24] handle="k8s-pod-network.d281b97184c22093ef0cc982b1e511d6e640ca5d139dfb5431a3968764425ed8" host="suraj-lk-cluster-pool-1-worker-2"
2020-05-05 07:49:31.614 [INFO][24442] ipam_plugin.go 235: Calico CNI IPAM assigned addresses IPv4=[10.2.76.7/24] IPv6=[] ContainerID="d281b97184c22093ef0cc982b1e511d6e640ca5d139dfb5431a3968764425ed8" HandleID="k8s-pod-network.d281b97184c22093ef0cc982b1e511d6e640ca5d139dfb5431a3968764425ed8" Workload="suraj--lk--cluster--pool--1--worker--2-k8s-rook--ceph--crashcollector--suraj--lk--cluster--pool--1--worker--2--j6b8l-eth0"
2020-05-05 07:49:31.614 [INFO][24442] ipam_plugin.go 261: IPAM Result ContainerID="d281b97184c22093ef0cc982b1e511d6e640ca5d139dfb5431a3968764425ed8" HandleID="k8s-pod-network.d281b97184c22093ef0cc982b1e511d6e640ca5d139dfb5431a3968764425ed8" Workload="suraj--lk--cluster--pool--1--worker--2-k8s-rook--ceph--crashcollector--suraj--lk--cluster--pool--1--worker--2--j6b8l-eth0" result.IPs=[]*current.IPConfig{(*current.IPConfig)(0xc00014a5a0)}
2020-05-05 07:49:31.615 [INFO][24428] k8s.go 358: Populated endpoint ContainerID="d281b97184c22093ef0cc982b1e511d6e640ca5d139dfb5431a3968764425ed8" Namespace="rook" Pod="rook-ceph-crashcollector-suraj-lk-cluster-pool-1-worker-2-j6b8l" WorkloadEndpoint="suraj--lk--cluster--pool--1--worker--2-k8s-rook--ceph--crashcollector--suraj--lk--cluster--pool--1--worker--2--j6b8l-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"suraj--lk--cluster--pool--1--worker--2-k8s-rook--ceph--crashcollector--suraj--lk--cluster--pool--1--worker--2--j6b8l-eth0", GenerateName:"rook-ceph-crashcollector-suraj-lk-cluster-pool-1-worker-2-ddc8cbd7-", Namespace:"rook", SelfLink:"", UID:"889e0bea-1099-49a7-a960-568cc6f0298a", ResourceVersion:"9193", Generation:0, CreationTimestamp:v1.Time{Time:time.Time{wall:0x0, ext:63724261739, loc:(*time.Location)(0x299c340)}}, DeletionTimestamp:(*v1.Time)(nil), DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app":"rook-ceph-crashcollector", "ceph-version":"14.2.8-0", "ceph_daemon_id":"crash", "crashcollector":"crash", "kubernetes.io/hostname":"suraj-lk-cluster-pool-1-worker-2", "node_name":"suraj-lk-cluster-pool-1-worker-2", "pod-template-hash":"ddc8cbd7", "projectcalico.org/namespace":"rook", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"default", "rook-version":"v1.3.2", "rook_cluster":"rook"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ClusterName:"", ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"suraj-lk-cluster-pool-1-worker-2", ContainerID:"", Pod:"rook-ceph-crashcollector-suraj-lk-cluster-pool-1-worker-2-j6b8l", Endpoint:"eth0", IPNetworks:[]string{"10.2.76.7/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.rook", "ksa.rook.default"}, InterfaceName:"calib1ab9bdc7e0", MAC:"", Ports:[]v3.EndpointPort(nil)}}
2020-05-05 07:49:31.616 [INFO][24428] k8s.go 359: Calico CNI using IPs: [10.2.76.7/32] ContainerID="d281b97184c22093ef0cc982b1e511d6e640ca5d139dfb5431a3968764425ed8" Namespace="rook" Pod="rook-ceph-crashcollector-suraj-lk-cluster-pool-1-worker-2-j6b8l" WorkloadEndpoint="suraj--lk--cluster--pool--1--worker--2-k8s-rook--ceph--crashcollector--suraj--lk--cluster--pool--1--worker--2--j6b8l-eth0"
2020-05-05 07:49:31.616 [INFO][24428] network_linux.go 76: Setting the host side veth name to calib1ab9bdc7e0 ContainerID="d281b97184c22093ef0cc982b1e511d6e640ca5d139dfb5431a3968764425ed8" Namespace="rook" Pod="rook-ceph-crashcollector-suraj-lk-cluster-pool-1-worker-2-j6b8l" WorkloadEndpoint="suraj--lk--cluster--pool--1--worker--2-k8s-rook--ceph--crashcollector--suraj--lk--cluster--pool--1--worker--2--j6b8l-eth0"
2020-05-05 07:49:31.617 [INFO][24428] network_linux.go 396: Disabling IPv4 forwarding ContainerID="d281b97184c22093ef0cc982b1e511d6e640ca5d139dfb5431a3968764425ed8" Namespace="rook" Pod="rook-ceph-crashcollector-suraj-lk-cluster-pool-1-worker-2-j6b8l" WorkloadEndpoint="suraj--lk--cluster--pool--1--worker--2-k8s-rook--ceph--crashcollector--suraj--lk--cluster--pool--1--worker--2--j6b8l-eth0"
2020-05-05 07:49:31.643 [INFO][24428] k8s.go 385: Added Mac, interface name, and active container ID to endpoint ContainerID="d281b97184c22093ef0cc982b1e511d6e640ca5d139dfb5431a3968764425ed8" Namespace="rook" Pod="rook-ceph-crashcollector-suraj-lk-cluster-pool-1-worker-2-j6b8l" WorkloadEndpoint="suraj--lk--cluster--pool--1--worker--2-k8s-rook--ceph--crashcollector--suraj--lk--cluster--pool--1--worker--2--j6b8l-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"suraj--lk--cluster--pool--1--worker--2-k8s-rook--ceph--crashcollector--suraj--lk--cluster--pool--1--worker--2--j6b8l-eth0", GenerateName:"rook-ceph-crashcollector-suraj-lk-cluster-pool-1-worker-2-ddc8cbd7-", Namespace:"rook", SelfLink:"", UID:"889e0bea-1099-49a7-a960-568cc6f0298a", ResourceVersion:"9193", Generation:0, CreationTimestamp:v1.Time{Time:time.Time{wall:0x0, ext:63724261739, loc:(*time.Location)(0x299c340)}}, DeletionTimestamp:(*v1.Time)(nil), DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app":"rook-ceph-crashcollector", "ceph-version":"14.2.8-0", "ceph_daemon_id":"crash", "crashcollector":"crash", "kubernetes.io/hostname":"suraj-lk-cluster-pool-1-worker-2", "node_name":"suraj-lk-cluster-pool-1-worker-2", "pod-template-hash":"ddc8cbd7", "projectcalico.org/namespace":"rook", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"default", "rook-version":"v1.3.2", "rook_cluster":"rook"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ClusterName:"", ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"suraj-lk-cluster-pool-1-worker-2", ContainerID:"d281b97184c22093ef0cc982b1e511d6e640ca5d139dfb5431a3968764425ed8", Pod:"rook-ceph-crashcollector-suraj-lk-cluster-pool-1-worker-2-j6b8l", Endpoint:"eth0", IPNetworks:[]string{"10.2.76.7/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.rook", "ksa.rook.default"}, InterfaceName:"calib1ab9bdc7e0", MAC:"32:94:9b:a2:aa:26", Ports:[]v3.EndpointPort(nil)}}
2020-05-05 07:49:31.657 [INFO][24428] k8s.go 417: Wrote updated endpoint to datastore ContainerID="d281b97184c22093ef0cc982b1e511d6e640ca5d139dfb5431a3968764425ed8" Namespace="rook" Pod="rook-ceph-crashcollector-suraj-lk-cluster-pool-1-worker-2-j6b8l" WorkloadEndpoint="suraj--lk--cluster--pool--1--worker--2-k8s-rook--ceph--crashcollector--suraj--lk--cluster--pool--1--worker--2--j6b8l-eth0"
I0505 07:49:33.813360 2827 kubelet_resources.go:45] allocatable: map[cpu:{{32 0} {<nil>} 32 DecimalSI} ephemeral-storage:{{108249701812 0} {<nil>} 108249701812 DecimalSI} hugepages-1Gi:{{0 0} {<nil>} 0 DecimalSI} hugepages-2Mi:{{0 0} {<nil>} 0 DecimalSI} memory:{{135041626112 0} {<nil>} 131876588Ki BinarySI} pods:{{110 0} {<nil>} 110 DecimalSI}]
I0505 07:49:33.813438 2827 kubelet_resources.go:45] allocatable: map[cpu:{{32 0} {<nil>} 32 DecimalSI} ephemeral-storage:{{108249701812 0} {<nil>} 108249701812 DecimalSI} hugepages-1Gi:{{0 0} {<nil>} 0 DecimalSI} hugepages-2Mi:{{0 0} {<nil>} 0 DecimalSI} memory:{{135041626112 0} {<nil>} 131876588Ki BinarySI} pods:{{110 0} {<nil>} 110 DecimalSI}]
I0505 07:49:33.813477 2827 kubelet_resources.go:45] allocatable: map[cpu:{{32 0} {<nil>} 32 DecimalSI} ephemeral-storage:{{108249701812 0} {<nil>} 108249701812 DecimalSI} hugepages-1Gi:{{0 0} {<nil>} 0 DecimalSI} hugepages-2Mi:{{0 0} {<nil>} 0 DecimalSI} memory:{{135041626112 0} {<nil>} 131876588Ki BinarySI} pods:{{110 0} {<nil>} 110 DecimalSI}]
I0505 07:49:33.813517 2827 kubelet_resources.go:45] allocatable: map[cpu:{{32 0} {<nil>} 32 DecimalSI} ephemeral-storage:{{108249701812 0} {<nil>} 108249701812 DecimalSI} hugepages-1Gi:{{0 0} {<nil>} 0 DecimalSI} hugepages-2Mi:{{0 0} {<nil>} 0 DecimalSI} memory:{{135041626112 0} {<nil>} 131876588Ki BinarySI} pods:{{110 0} {<nil>} 110 DecimalSI}]
I0505 07:59:49.151012 2827 topology_manager.go:219] [topologymanager] RemoveContainer - Container ID: a41c813ec57c96868929dc6d19e9d44dfee519bf35afeedc482def28f9b0badd
2020-05-05 07:59:49.220 [INFO][14244] plugin.go 496: Extracted identifiers ContainerID="741a4b73215d1b9322d1e24280939c2746a988170cf9392505206cf0295ba57b" Node="suraj-lk-cluster-pool-1-worker-2" Orchestrator="k8s" WorkloadEndpoint="suraj--lk--cluster--pool--1--worker--2-k8s-rook--ceph--osd--prepare--suraj--lk--cluster--pool--1--worker--2--mcldx-eth0"
2020-05-05 07:59:49.232 [WARNING][14244] workloadendpoint.go 77: Operation Delete is not supported on WorkloadEndpoint type
2020-05-05 07:59:49.232 [INFO][14244] k8s.go 467: Endpoint deletion will be handled by Kubernetes deletion of the Pod. ContainerID="741a4b73215d1b9322d1e24280939c2746a988170cf9392505206cf0295ba57b" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"suraj--lk--cluster--pool--1--worker--2-k8s-rook--ceph--osd--prepare--suraj--lk--cluster--pool--1--worker--2--mcldx-eth0", GenerateName:"rook-ceph-osd-prepare-suraj-lk-cluster-pool-1-worker-2-", Namespace:"rook", SelfLink:"", UID:"08dfe7e7-7320-49db-816e-d258fc45f440", ResourceVersion:"12711", Generation:0, CreationTimestamp:v1.Time{Time:time.Time{wall:0x0, ext:63724261767, loc:(*time.Location)(0x299c340)}}, DeletionTimestamp:(*v1.Time)(nil), DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app":"rook-ceph-osd-prepare", "ceph.rook.io/pvc":"", "controller-uid":"f36a8935-9397-45ca-8348-ce3192e179a5", "job-name":"rook-ceph-osd-prepare-suraj-lk-cluster-pool-1-worker-2", "projectcalico.org/namespace":"rook", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"rook-ceph-osd", "rook_cluster":"rook"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ClusterName:"", ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"suraj-lk-cluster-pool-1-worker-2", ContainerID:"", Pod:"rook-ceph-osd-prepare-suraj-lk-cluster-pool-1-worker-2-mcldx", Endpoint:"eth0", IPNetworks:[]string{}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.rook", "ksa.rook.rook-ceph-osd"}, InterfaceName:"cali50ef90ee42d", MAC:"", Ports:[]v3.EndpointPort(nil)}}
2020-05-05 07:59:49.232 [INFO][14244] k8s.go 474: Cleaning up netns ContainerID="741a4b73215d1b9322d1e24280939c2746a988170cf9392505206cf0295ba57b"
2020-05-05 07:59:49.232 [INFO][14244] network_linux.go 450: Calico CNI deleting device in netns /proc/23715/ns/net ContainerID="741a4b73215d1b9322d1e24280939c2746a988170cf9392505206cf0295ba57b"
2020-05-05 07:59:49.255 [INFO][14244] network_linux.go 467: Calico CNI deleted device in netns /proc/23715/ns/net ContainerID="741a4b73215d1b9322d1e24280939c2746a988170cf9392505206cf0295ba57b"
2020-05-05 07:59:49.256 [INFO][14244] k8s.go 481: Releasing IP address(es) ContainerID="741a4b73215d1b9322d1e24280939c2746a988170cf9392505206cf0295ba57b"
2020-05-05 07:59:49.256 [INFO][14244] utils.go 168: Calico CNI releasing IP address ContainerID="741a4b73215d1b9322d1e24280939c2746a988170cf9392505206cf0295ba57b"
2020-05-05 07:59:49.272 [INFO][14267] ipam_plugin.go 302: Releasing address using handleID ContainerID="741a4b73215d1b9322d1e24280939c2746a988170cf9392505206cf0295ba57b" HandleID="k8s-pod-network.741a4b73215d1b9322d1e24280939c2746a988170cf9392505206cf0295ba57b" Workload="suraj--lk--cluster--pool--1--worker--2-k8s-rook--ceph--osd--prepare--suraj--lk--cluster--pool--1--worker--2--mcldx-eth0"
2020-05-05 07:59:49.273 [INFO][14267] ipam.go 1166: Releasing all IPs with handle 'k8s-pod-network.741a4b73215d1b9322d1e24280939c2746a988170cf9392505206cf0295ba57b'
I0505 07:59:49.281157 2827 reconciler.go:196] operationExecutor.UnmountVolume started for volume "devices" (UniqueName: "kubernetes.io/host-path/08dfe7e7-7320-49db-816e-d258fc45f440-devices") pod "08dfe7e7-7320-49db-816e-d258fc45f440" (UID: "08dfe7e7-7320-49db-816e-d258fc45f440")
I0505 07:59:49.281201 2827 reconciler.go:196] operationExecutor.UnmountVolume started for volume "rook-binaries" (UniqueName: "kubernetes.io/empty-dir/08dfe7e7-7320-49db-816e-d258fc45f440-rook-binaries") pod "08dfe7e7-7320-49db-816e-d258fc45f440" (UID: "08dfe7e7-7320-49db-816e-d258fc45f440")
I0505 07:59:49.281220 2827 reconciler.go:196] operationExecutor.UnmountVolume started for volume "rook-ceph-crash" (UniqueName: "kubernetes.io/host-path/08dfe7e7-7320-49db-816e-d258fc45f440-rook-ceph-crash") pod "08dfe7e7-7320-49db-816e-d258fc45f440" (UID: "08dfe7e7-7320-49db-816e-d258fc45f440")
I0505 07:59:49.281244 2827 reconciler.go:196] operationExecutor.UnmountVolume started for volume "rook-ceph-log" (UniqueName: "kubernetes.io/host-path/08dfe7e7-7320-49db-816e-d258fc45f440-rook-ceph-log") pod "08dfe7e7-7320-49db-816e-d258fc45f440" (UID: "08dfe7e7-7320-49db-816e-d258fc45f440")
I0505 07:59:49.281281 2827 reconciler.go:196] operationExecutor.UnmountVolume started for volume "rook-data" (UniqueName: "kubernetes.io/host-path/08dfe7e7-7320-49db-816e-d258fc45f440-rook-data") pod "08dfe7e7-7320-49db-816e-d258fc45f440" (UID: "08dfe7e7-7320-49db-816e-d258fc45f440")
I0505 07:59:49.281320 2827 reconciler.go:196] operationExecutor.UnmountVolume started for volume "udev" (UniqueName: "kubernetes.io/host-path/08dfe7e7-7320-49db-816e-d258fc45f440-udev") pod "08dfe7e7-7320-49db-816e-d258fc45f440" (UID: "08dfe7e7-7320-49db-816e-d258fc45f440")
I0505 07:59:49.281332 2827 operation_generator.go:782] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/08dfe7e7-7320-49db-816e-d258fc45f440-rook-ceph-crash" (OuterVolumeSpecName: "rook-ceph-crash") pod "08dfe7e7-7320-49db-816e-d258fc45f440" (UID: "08dfe7e7-7320-49db-816e-d258fc45f440"). InnerVolumeSpecName "rook-ceph-crash". PluginName "kubernetes.io/host-path", VolumeGidValue ""
I0505 07:59:49.281361 2827 reconciler.go:196] operationExecutor.UnmountVolume started for volume "ceph-conf-emptydir" (UniqueName: "kubernetes.io/empty-dir/08dfe7e7-7320-49db-816e-d258fc45f440-ceph-conf-emptydir") pod "08dfe7e7-7320-49db-816e-d258fc45f440" (UID: "08dfe7e7-7320-49db-816e-d258fc45f440")
I0505 07:59:49.281348 2827 operation_generator.go:782] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/08dfe7e7-7320-49db-816e-d258fc45f440-devices" (OuterVolumeSpecName: "devices") pod "08dfe7e7-7320-49db-816e-d258fc45f440" (UID: "08dfe7e7-7320-49db-816e-d258fc45f440"). InnerVolumeSpecName "devices". PluginName "kubernetes.io/host-path", VolumeGidValue ""
W0505 07:59:49.281399 2827 empty_dir.go:453] Warning: Failed to clear quota on /var/lib/kubelet/pods/08dfe7e7-7320-49db-816e-d258fc45f440/volumes/kubernetes.io~empty-dir/rook-binaries: ClearQuota called, but quotas disabled
I0505 07:59:49.281448 2827 reconciler.go:196] operationExecutor.UnmountVolume started for volume "rook-ceph-osd-token-4dtzj" (UniqueName: "kubernetes.io/secret/08dfe7e7-7320-49db-816e-d258fc45f440-rook-ceph-osd-token-4dtzj") pod "08dfe7e7-7320-49db-816e-d258fc45f440" (UID: "08dfe7e7-7320-49db-816e-d258fc45f440")
I0505 07:59:49.281398 2827 operation_generator.go:782] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/08dfe7e7-7320-49db-816e-d258fc45f440-udev" (OuterVolumeSpecName: "udev") pod "08dfe7e7-7320-49db-816e-d258fc45f440" (UID: "08dfe7e7-7320-49db-816e-d258fc45f440"). InnerVolumeSpecName "udev". PluginName "kubernetes.io/host-path", VolumeGidValue ""
I0505 07:59:49.281421 2827 operation_generator.go:782] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/08dfe7e7-7320-49db-816e-d258fc45f440-rook-data" (OuterVolumeSpecName: "rook-data") pod "08dfe7e7-7320-49db-816e-d258fc45f440" (UID: "08dfe7e7-7320-49db-816e-d258fc45f440"). InnerVolumeSpecName "rook-data". PluginName "kubernetes.io/host-path", VolumeGidValue ""
I0505 07:59:49.281388 2827 operation_generator.go:782] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/08dfe7e7-7320-49db-816e-d258fc45f440-rook-ceph-log" (OuterVolumeSpecName: "rook-ceph-log") pod "08dfe7e7-7320-49db-816e-d258fc45f440" (UID: "08dfe7e7-7320-49db-816e-d258fc45f440"). InnerVolumeSpecName "rook-ceph-log". PluginName "kubernetes.io/host-path", VolumeGidValue ""
W0505 07:59:49.281540 2827 empty_dir.go:453] Warning: Failed to clear quota on /var/lib/kubelet/pods/08dfe7e7-7320-49db-816e-d258fc45f440/volumes/kubernetes.io~empty-dir/ceph-conf-emptydir: ClearQuota called, but quotas disabled
I0505 07:59:49.281566 2827 reconciler.go:319] Volume detached for volume "devices" (UniqueName: "kubernetes.io/host-path/08dfe7e7-7320-49db-816e-d258fc45f440-devices") on node "suraj-lk-cluster-pool-1-worker-2" DevicePath ""
I0505 07:59:49.281580 2827 reconciler.go:319] Volume detached for volume "rook-ceph-crash" (UniqueName: "kubernetes.io/host-path/08dfe7e7-7320-49db-816e-d258fc45f440-rook-ceph-crash") on node "suraj-lk-cluster-pool-1-worker-2" DevicePath ""
I0505 07:59:49.281590 2827 reconciler.go:319] Volume detached for volume "rook-ceph-log" (UniqueName: "kubernetes.io/host-path/08dfe7e7-7320-49db-816e-d258fc45f440-rook-ceph-log") on node "suraj-lk-cluster-pool-1-worker-2" DevicePath ""
I0505 07:59:49.281601 2827 reconciler.go:319] Volume detached for volume "rook-data" (UniqueName: "kubernetes.io/host-path/08dfe7e7-7320-49db-816e-d258fc45f440-rook-data") on node "suraj-lk-cluster-pool-1-worker-2" DevicePath ""
I0505 07:59:49.281609 2827 reconciler.go:319] Volume detached for volume "udev" (UniqueName: "kubernetes.io/host-path/08dfe7e7-7320-49db-816e-d258fc45f440-udev") on node "suraj-lk-cluster-pool-1-worker-2" DevicePath ""
I0505 07:59:49.281721 2827 operation_generator.go:782] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/08dfe7e7-7320-49db-816e-d258fc45f440-ceph-conf-emptydir" (OuterVolumeSpecName: "ceph-conf-emptydir") pod "08dfe7e7-7320-49db-816e-d258fc45f440" (UID: "08dfe7e7-7320-49db-816e-d258fc45f440"). InnerVolumeSpecName "ceph-conf-emptydir". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
I0505 07:59:49.288986 2827 operation_generator.go:782] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/08dfe7e7-7320-49db-816e-d258fc45f440-rook-binaries" (OuterVolumeSpecName: "rook-binaries") pod "08dfe7e7-7320-49db-816e-d258fc45f440" (UID: "08dfe7e7-7320-49db-816e-d258fc45f440"). InnerVolumeSpecName "rook-binaries". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
I0505 07:59:49.304192 2827 operation_generator.go:782] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/08dfe7e7-7320-49db-816e-d258fc45f440-rook-ceph-osd-token-4dtzj" (OuterVolumeSpecName: "rook-ceph-osd-token-4dtzj") pod "08dfe7e7-7320-49db-816e-d258fc45f440" (UID: "08dfe7e7-7320-49db-816e-d258fc45f440"). InnerVolumeSpecName "rook-ceph-osd-token-4dtzj". PluginName "kubernetes.io/secret", VolumeGidValue ""
2020-05-05 07:59:49.341 [INFO][14267] ipam_plugin.go 314: Released address using handleID ContainerID="741a4b73215d1b9322d1e24280939c2746a988170cf9392505206cf0295ba57b" HandleID="k8s-pod-network.741a4b73215d1b9322d1e24280939c2746a988170cf9392505206cf0295ba57b" Workload="suraj--lk--cluster--pool--1--worker--2-k8s-rook--ceph--osd--prepare--suraj--lk--cluster--pool--1--worker--2--mcldx-eth0"
2020-05-05 07:59:49.341 [INFO][14267] ipam_plugin.go 323: Releasing address using workloadID ContainerID="741a4b73215d1b9322d1e24280939c2746a988170cf9392505206cf0295ba57b" HandleID="k8s-pod-network.741a4b73215d1b9322d1e24280939c2746a988170cf9392505206cf0295ba57b" Workload="suraj--lk--cluster--pool--1--worker--2-k8s-rook--ceph--osd--prepare--suraj--lk--cluster--pool--1--worker--2--mcldx-eth0"
2020-05-05 07:59:49.341 [INFO][14267] ipam.go 1166: Releasing all IPs with handle 'rook.rook-ceph-osd-prepare-suraj-lk-cluster-pool-1-worker-2-mcldx'
2020-05-05 07:59:49.346 [INFO][14244] k8s.go 487: Teardown processing complete. ContainerID="741a4b73215d1b9322d1e24280939c2746a988170cf9392505206cf0295ba57b"
I0505 07:59:49.381899 2827 reconciler.go:319] Volume detached for volume "ceph-conf-emptydir" (UniqueName: "kubernetes.io/empty-dir/08dfe7e7-7320-49db-816e-d258fc45f440-ceph-conf-emptydir") on node "suraj-lk-cluster-pool-1-worker-2" DevicePath ""
I0505 07:59:49.381923 2827 reconciler.go:319] Volume detached for volume "rook-ceph-osd-token-4dtzj" (UniqueName: "kubernetes.io/secret/08dfe7e7-7320-49db-816e-d258fc45f440-rook-ceph-osd-token-4dtzj") on node "suraj-lk-cluster-pool-1-worker-2" DevicePath ""
I0505 07:59:49.381932 2827 reconciler.go:319] Volume detached for volume "rook-binaries" (UniqueName: "kubernetes.io/empty-dir/08dfe7e7-7320-49db-816e-d258fc45f440-rook-binaries") on node "suraj-lk-cluster-pool-1-worker-2" DevicePath ""
W0505 07:59:50.180278 2827 pod_container_deletor.go:77] Container "741a4b73215d1b9322d1e24280939c2746a988170cf9392505206cf0295ba57b" not found in pod's containers
I0505 08:00:04.616636 2827 topology_manager.go:233] [topologymanager] Topology Admit Handler
I0505 08:00:04.733092 2827 reconciler.go:224] operationExecutor.VerifyControllerAttachedVolume started for volume "rook-ceph-osd-token-4dtzj" (UniqueName: "kubernetes.io/secret/6f1e1c07-b491-417c-bb77-bcf2291a6d32-rook-ceph-osd-token-4dtzj") pod "rook-ceph-osd-34-7d6f567457-v96w9" (UID: "6f1e1c07-b491-417c-bb77-bcf2291a6d32")
I0505 08:00:04.733141 2827 reconciler.go:224] operationExecutor.VerifyControllerAttachedVolume started for volume "rook-config-override" (UniqueName: "kubernetes.io/configmap/6f1e1c07-b491-417c-bb77-bcf2291a6d32-rook-config-override") pod "rook-ceph-osd-34-7d6f567457-v96w9" (UID: "6f1e1c07-b491-417c-bb77-bcf2291a6d32")
I0505 08:00:04.733185 2827 reconciler.go:224] operationExecutor.VerifyControllerAttachedVolume started for volume "run-udev" (UniqueName: "kubernetes.io/host-path/6f1e1c07-b491-417c-bb77-bcf2291a6d32-run-udev") pod "rook-ceph-osd-34-7d6f567457-v96w9" (UID: "6f1e1c07-b491-417c-bb77-bcf2291a6d32")
I0505 08:00:04.733242 2827 reconciler.go:224] operationExecutor.VerifyControllerAttachedVolume started for volume "rook-ceph-log" (UniqueName: "kubernetes.io/host-path/6f1e1c07-b491-417c-bb77-bcf2291a6d32-rook-ceph-log") pod "rook-ceph-osd-34-7d6f567457-v96w9" (UID: "6f1e1c07-b491-417c-bb77-bcf2291a6d32")
I0505 08:00:04.733282 2827 reconciler.go:224] operationExecutor.VerifyControllerAttachedVolume started for volume "activate-osd" (UniqueName: "kubernetes.io/empty-dir/6f1e1c07-b491-417c-bb77-bcf2291a6d32-activate-osd") pod "rook-ceph-osd-34-7d6f567457-v96w9" (UID: "6f1e1c07-b491-417c-bb77-bcf2291a6d32")
I0505 08:00:04.733319 2827 reconciler.go:224] operationExecutor.VerifyControllerAttachedVolume started for volume "rook-ceph-crash" (UniqueName: "kubernetes.io/host-path/6f1e1c07-b491-417c-bb77-bcf2291a6d32-rook-ceph-crash") pod "rook-ceph-osd-34-7d6f567457-v96w9" (UID: "6f1e1c07-b491-417c-bb77-bcf2291a6d32")
I0505 08:00:04.733430 2827 reconciler.go:224] operationExecutor.VerifyControllerAttachedVolume started for volume "rook-data" (UniqueName: "kubernetes.io/host-path/6f1e1c07-b491-417c-bb77-bcf2291a6d32-rook-data") pod "rook-ceph-osd-34-7d6f567457-v96w9" (UID: "6f1e1c07-b491-417c-bb77-bcf2291a6d32")
I0505 08:00:04.733501 2827 reconciler.go:224] operationExecutor.VerifyControllerAttachedVolume started for volume "devices" (UniqueName: "kubernetes.io/host-path/6f1e1c07-b491-417c-bb77-bcf2291a6d32-devices") pod "rook-ceph-osd-34-7d6f567457-v96w9" (UID: "6f1e1c07-b491-417c-bb77-bcf2291a6d32")
W0505 08:00:05.458985 2827 pod_container_deletor.go:77] Container "392b97c25126fdfab7d3d1cd081fcf915b0a78f4e5e564a6c8c9724616c80277" not found in pod's containers
2020-05-05 08:00:05.475 [INFO][14823] plugin.go 129: Extracted identifiers EndpointIDs=&utils.WEPIdentifiers{Namespace:"rook", WEPName:"", WorkloadEndpointIdentifiers:names.WorkloadEndpointIdentifiers{Node:"suraj-lk-cluster-pool-1-worker-2", Orchestrator:"k8s", Endpoint:"eth0", Workload:"", Pod:"rook-ceph-osd-34-7d6f567457-v96w9", ContainerID:"392b97c25126fdfab7d3d1cd081fcf915b0a78f4e5e564a6c8c9724616c80277"}}
2020-05-05 08:00:05.505 [INFO][14823] plugin.go 220: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {suraj--lk--cluster--pool--1--worker--2-k8s-rook--ceph--osd--34--7d6f567457--v96w9-eth0 rook-ceph-osd-34-7d6f567457- rook 6f1e1c07-b491-417c-bb77-bcf2291a6d32 13139 0 2020-05-05 08:00:04 +0000 UTC <nil> <nil> map[app:rook-ceph-osd ceph-osd-id:34 failure-domain:suraj-lk-cluster-pool-1-worker-2 pod-template-hash:7d6f567457 portable:false projectcalico.org/namespace:rook projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:rook-ceph-osd rook_cluster:rook] map[] [] [] []} {k8s suraj-lk-cluster-pool-1-worker-2 rook-ceph-osd-34-7d6f567457-v96w9 eth0 [] [] [kns.rook ksa.rook.rook-ceph-osd] cali846d43d8c8b []}} ContainerID="392b97c25126fdfab7d3d1cd081fcf915b0a78f4e5e564a6c8c9724616c80277" Namespace="rook" Pod="rook-ceph-osd-34-7d6f567457-v96w9" WorkloadEndpoint="suraj--lk--cluster--pool--1--worker--2-k8s-rook--ceph--osd--34--7d6f567457--v96w9-"
2020-05-05 08:00:05.505 [INFO][14823] k8s.go 60: Extracted identifiers for CmdAddK8s ContainerID="392b97c25126fdfab7d3d1cd081fcf915b0a78f4e5e564a6c8c9724616c80277" Namespace="rook" Pod="rook-ceph-osd-34-7d6f567457-v96w9" WorkloadEndpoint="suraj--lk--cluster--pool--1--worker--2-k8s-rook--ceph--osd--34--7d6f567457--v96w9-eth0"
2020-05-05 08:00:05.510 [INFO][14823] k8s.go 775: namespace info &Namespace{ObjectMeta:{rook /api/v1/namespaces/rook 346082b0-52fc-4520-8796-5afff38e4f6c 7478 0 2020-05-05 07:45:13 +0000 UTC <nil> <nil> map[] map[] [] [] [{lokoctl Update v1 2020-05-05 07:45:13 +0000 UTC FieldsV1 FieldsV1{Raw:*[123 34 102 58 115 116 97 116 117 115 34 58 123 34 102 58 112 104 97 115 101 34 58 123 125 125 125],}}]},Spec:NamespaceSpec{Finalizers:[kubernetes],},Status:NamespaceStatus{Phase:Active,Conditions:[]NamespaceCondition{},},}
2020-05-05 08:00:05.515 [INFO][14823] k8s.go 784: pod info &Pod{ObjectMeta:{rook-ceph-osd-34-7d6f567457-v96w9 rook-ceph-osd-34-7d6f567457- rook /api/v1/namespaces/rook/pods/rook-ceph-osd-34-7d6f567457-v96w9 6f1e1c07-b491-417c-bb77-bcf2291a6d32 13139 0 2020-05-05 08:00:04 +0000 UTC <nil> <nil> map[app:rook-ceph-osd ceph-osd-id:34 failure-domain:suraj-lk-cluster-pool-1-worker-2 pod-template-hash:7d6f567457 portable:false rook_cluster:rook] map[kubernetes.io/psp:00-rook-ceph-operator seccomp.security.alpha.kubernetes.io/pod:runtime/default] [{apps/v1 ReplicaSet rook-ceph-osd-34-7d6f567457 67ebab23-707c-4de4-83b9-6d420b9df7aa 0xc000568687 0xc000568688}] [] []},Spec:PodSpec{Volumes:[]Volume{Volume{Name:rook-data,VolumeSource:VolumeSource{HostPath:&HostPathVolumeSource{Path:/var/lib/rook,Type:*,},EmptyDir:nil,GCEPersistentDisk:nil,AWSElasticBlockStore:nil,GitRepo:nil,Secret:nil,NFS:nil,ISCSI:nil,Glusterfs:nil,PersistentVolumeClaim:nil,RBD:nil,FlexVolume:nil,Cinder:nil,CephFS:nil,Flocker:nil,DownwardAPI:nil,FC:nil,AzureFile:nil,ConfigMap:nil,VsphereVolume:nil,Quobyte:nil,AzureDisk:nil,PhotonPersistentDisk:nil,PortworxVolume:nil,ScaleIO:nil,Projected:nil,StorageOS:nil,CSI:nil,},},Volume{Name:rook-config-override,VolumeSource:VolumeSource{HostPath:nil,EmptyDir:nil,GCEPersistentDisk:nil,AWSElasticBlockStore:nil,GitRepo:nil,Secret:nil,NFS:nil,ISCSI:nil,Glusterfs:nil,PersistentVolumeClaim:nil,RBD:nil,FlexVolume:nil,Cinder:nil,CephFS:nil,Flocker:nil,DownwardAPI:nil,FC:nil,AzureFile:nil,ConfigMap:&ConfigMapVolumeSource{LocalObjectReference:LocalObjectReference{Name:rook-config-override,},Items:[]KeyToPath{KeyToPath{Key:config,Path:ceph.conf,Mode:*292,},},DefaultMode:*420,Optional:nil,},VsphereVolume:nil,Quobyte:nil,AzureDisk:nil,PhotonPersistentDisk:nil,PortworxVolume:nil,ScaleIO:nil,Projected:nil,StorageOS:nil,CSI:nil,},},Volume{Name:rook-ceph-log,VolumeSource:VolumeSource{HostPath:&HostPathVolumeSource{Path:/var/lib/rook/rook/log,Type:*,},EmptyDir:nil,GCEPersistentDisk:nil,AWSE
lasticBlockStore:nil,GitRepo:nil,Secret:nil,NFS:nil,ISCSI:nil,Glusterfs:nil,PersistentVolumeClaim:nil,RBD:nil,FlexVolume:nil,Cinder:nil,CephFS:nil,Flocker:nil,DownwardAPI:nil,FC:nil,AzureFile:nil,ConfigMap:nil,VsphereVolume:nil,Quobyte:nil,AzureDisk:nil,PhotonPersistentDisk:nil,PortworxVolume:nil,ScaleIO:nil,Projected:nil,StorageOS:nil,CSI:nil,},},Volume{Name:rook-ceph-crash,VolumeSource:VolumeSource{HostPath:&HostPathVolumeSource{Path:/var/lib/rook/rook/crash,Type:*,},EmptyDir:nil,GCEPersistentDisk:nil,AWSElasticBlockStore:nil,GitRepo:nil,Secret:nil,NFS:nil,ISCSI:nil,Glusterfs:nil,PersistentVolumeClaim:nil,RBD:nil,FlexVolume:nil,Cinder:nil,CephFS:nil,Flocker:nil,DownwardAPI:nil,FC:nil,AzureFile:nil,ConfigMap:nil,VsphereVolume:nil,Quobyte:nil,AzureDisk:nil,PhotonPersistentDisk:nil,PortworxVolume:nil,ScaleIO:nil,Projected:nil,StorageOS:nil,CSI:nil,},},Volume{Name:devices,VolumeSource:VolumeSource{HostPath:&HostPathVolumeSource{Path:/dev,Type:*,},EmptyDir:nil,GCEPersistentDisk:nil,AWSElasticBlockStore:nil,GitRepo:nil,Secret:nil,NFS:nil,ISCSI:nil,Glusterfs:nil,PersistentVolumeClaim:nil,RBD:nil,FlexVolume:nil,Cinder:nil,CephFS:nil,Flocker:nil,DownwardAPI:nil,FC:nil,AzureFile:nil,ConfigMap:nil,VsphereVolume:nil,Quobyte:nil,AzureDisk:nil,PhotonPersistentDisk:nil,PortworxVolume:nil,ScaleIO:nil,Projected:nil,StorageOS:nil,CSI:nil,},},Volume{Name:run-udev,VolumeSource:VolumeSource{HostPath:&HostPathVolumeSource{Path:/run/udev,Type:*,},EmptyDir:nil,GCEPersistentDisk:nil,AWSElasticBlockStore:nil,GitRepo:nil,Secret:nil,NFS:nil,ISCSI:nil,Glusterfs:nil,PersistentVolumeClaim:nil,RBD:nil,FlexVolume:nil,Cinder:nil,CephFS:nil,Flocker:nil,DownwardAPI:nil,FC:nil,AzureFile:nil,ConfigMap:nil,VsphereVolume:nil,Quobyte:nil,AzureDisk:nil,PhotonPersistentDisk:nil,PortworxVolume:nil,ScaleIO:nil,Projected:nil,StorageOS:nil,CSI:nil,},},Volume{Name:activate-osd,VolumeSource:VolumeSource{HostPath:nil,EmptyDir:&EmptyDirVolumeSource{Medium:,SizeLimit:<nil>,},GCEPersistentDisk:nil,AWSElasticBlockSto
re:nil,GitRepo:nil,Secret:nil,NFS:nil,ISCSI:nil,Glusterfs:nil,PersistentVolumeClaim:nil,RBD:nil,FlexVolume:nil,Cinder:nil,CephFS:nil,Flocker:nil,DownwardAPI:nil,FC:nil,AzureFile:nil,ConfigMap:nil,VsphereVolume:nil,Quobyte:nil,AzureDisk:nil,PhotonPersistentDisk:nil,PortworxVolume:nil,ScaleIO:nil,Projected:nil,StorageOS:nil,CSI:nil,},},Volume{Name:rook-ceph-osd-token-4dtzj,VolumeSource:VolumeSource{HostPath:nil,EmptyDir:nil,GCEPersistentDisk:nil,AWSElasticBlockStore:nil,GitRepo:nil,Secret:&SecretVolumeSource{SecretName:rook-ceph-osd-token-4dtzj,Items:[]KeyToPath{},DefaultMode:*420,Optional:nil,},NFS:nil,ISCSI:nil,Glusterfs:nil,PersistentVolumeClaim:nil,RBD:nil,FlexVolume:nil,Cinder:nil,CephFS:nil,Flocker:nil,DownwardAPI:nil,FC:nil,AzureFile:nil,ConfigMap:nil,VsphereVolume:nil,Quobyte:nil,AzureDisk:nil,PhotonPersistentDisk:nil,PortworxVolume:nil,ScaleIO:nil,Projected:nil,StorageOS:nil,CSI:nil,},},},Containers:[]Container{Container{Name:osd,Image:ceph/ceph:v14.2.8,Command:[ceph-osd],Args:[--foreground --id 34 --fsid 504a40d4-a2d3-4492-a37d-27a58464958f --setuser ceph --setgroup ceph --crush-location=root=default host=suraj-lk-cluster-pool-1-worker-2 --default-log-to-file false 
--ms-learn-addr-from-peer=false],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:ROOK_NODE_NAME,Value:suraj-lk-cluster-pool-1-worker-2,ValueFrom:nil,},EnvVar{Name:ROOK_CLUSTER_ID,Value:77795d4f-fb40-4886-bc74-cb2065f0c61f,ValueFrom:nil,},EnvVar{Name:ROOK_PRIVATE_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},EnvVar{Name:ROOK_PUBLIC_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},EnvVar{Name:ROOK_CLUSTER_NAME,Value:rook,ValueFrom:nil,},EnvVar{Name:ROOK_MON_ENDPOINTS,Value:,ValueFrom:&EnvVarSource{FieldRef:nil,ResourceFieldRef:nil,ConfigMapKeyRef:&ConfigMapKeySelector{LocalObjectReference:LocalObjectReference{Name:rook-ceph-mon-endpoints,},Key:data,Optional:nil,},SecretKeyRef:nil,},},EnvVar{Name:ROOK_MON_SECRET,Value:,ValueFrom:&EnvVarSource{FieldRef:nil,ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:&SecretKeySelector{LocalObjectReference:LocalObjectReference{Name:rook-ceph-mon,},Key:mon-secret,Optional:nil,},},},EnvVar{Name:ROOK_ADMIN_SECRET,Value:,ValueFrom:&EnvVarSource{FieldRef:nil,ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:&SecretKeySelector{LocalObjectReference:LocalObjectReference{Name:rook-ceph-mon,},Key:admin-secret,Optional:nil,},},},EnvVar{Name:ROOK_CONFIG_DIR,Value:/var/lib/rook,ValueFrom:nil,},EnvVar{Name:ROOK_CEPH_CONFIG_OVERRIDE,Value:/etc/rook/config/override.conf,ValueFrom:nil,},EnvVar{Name:ROOK_FSID,Value:,ValueFrom:&EnvVarSource{FieldRef:nil,ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:&SecretKeySelector{LocalObjectReference:LocalObjectReference{Name:rook-ceph-mon,},Key:fsid,Optional:nil,},},},EnvVar{Name:NODE_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:spec.nodeName,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},Env
Var{Name:ROOK_CRUSHMAP_HOSTNAME,Value:suraj-lk-cluster-pool-1-worker-2,ValueFrom:nil,},EnvVar{Name:CEPH_VOLUME_DEBUG,Value:1,ValueFrom:nil,},EnvVar{Name:CEPH_VOLUME_SKIP_RESTORECON,Value:1,ValueFrom:nil,},EnvVar{Name:DM_DISABLE_UDEV,Value:1,ValueFrom:nil,},EnvVar{Name:ROOK_OSDS_PER_DEVICE,Value:1,ValueFrom:nil,},EnvVar{Name:TINI_SUBREAPER,Value:,ValueFrom:nil,},EnvVar{Name:CONTAINER_IMAGE,Value:ceph/ceph:v14.2.8,ValueFrom:nil,},EnvVar{Name:POD_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:metadata.name,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},EnvVar{Name:POD_NAMESPACE,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:metadata.namespace,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},EnvVar{Name:NODE_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:spec.nodeName,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},EnvVar{Name:POD_MEMORY_LIMIT,Value:,ValueFrom:&EnvVarSource{FieldRef:nil,ResourceFieldRef:&ResourceFieldSelector{ContainerName:,Resource:limits.memory,Divisor:{{0 0} {<nil>} 0 DecimalSI},},ConfigMapKeyRef:nil,SecretKeyRef:nil,},},EnvVar{Name:POD_MEMORY_REQUEST,Value:,ValueFrom:&EnvVarSource{FieldRef:nil,ResourceFieldRef:&ResourceFieldSelector{ContainerName:,Resource:requests.memory,Divisor:{{0 0} {<nil>} 0 DecimalSI},},ConfigMapKeyRef:nil,SecretKeyRef:nil,},},EnvVar{Name:POD_CPU_LIMIT,Value:,ValueFrom:&EnvVarSource{FieldRef:nil,ResourceFieldRef:&ResourceFieldSelector{ContainerName:,Resource:limits.cpu,Divisor:{{1 0} {<nil>} 1 DecimalSI},},ConfigMapKeyRef:nil,SecretKeyRef:nil,},},EnvVar{Name:POD_CPU_REQUEST,Value:,ValueFrom:&EnvVarSource{FieldRef:nil,ResourceFieldRef:&ResourceFieldSelector{ContainerName:,Resource:requests.cpu,Divisor:{{0 0} {<nil>} 0 
DecimalSI},},ConfigMapKeyRef:nil,SecretKeyRef:nil,},},EnvVar{Name:ROOK_OSD_UUID,Value:956a05fa-609a-40df-9a4c-c19bad2ee465,ValueFrom:nil,},EnvVar{Name:ROOK_OSD_ID,Value:34,ValueFrom:nil,},EnvVar{Name:ROOK_OSD_STORE_TYPE,Value:bluestore,ValueFrom:nil,},EnvVar{Name:ROOK_CEPH_MON_HOST,Value:,ValueFrom:&EnvVarSource{FieldRef:nil,ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:&SecretKeySelector{LocalObjectReference:LocalObjectReference{Name:rook-ceph-config,},Key:mon_host,Optional:nil,},},},EnvVar{Name:CEPH_ARGS,Value:-m $(ROOK_CEPH_MON_HOST),ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:rook-data,ReadOnly:false,MountPath:/var/lib/rook,SubPath:,MountPropagation:nil,SubPathExpr:,},VolumeMount{Name:rook-config-override,ReadOnly:true,MountPath:/etc/ceph,SubPath:,MountPropagation:nil,SubPathExpr:,},VolumeMount{Name:rook-ceph-log,ReadOnly:false,MountPath:/var/log/ceph,SubPath:,MountPropagation:nil,SubPathExpr:,},VolumeMount{Name:rook-ceph-crash,ReadOnly:false,MountPath:/var/lib/ceph/crash,SubPath:,MountPropagation:nil,SubPathExpr:,},VolumeMount{Name:devices,ReadOnly:false,MountPath:/dev,SubPath:,MountPropagation:nil,SubPathExpr:,},VolumeMount{Name:run-udev,ReadOnly:false,MountPath:/run/udev,SubPath:,MountPropagation:nil,SubPathExpr:,},VolumeMount{Name:activate-osd,ReadOnly:false,MountPath:/var/lib/ceph/osd/ceph-34,SubPath:,MountPropagation:nil,SubPathExpr:,},VolumeMount{Name:rook-ceph-osd-token-4dtzj,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,},},LivenessProbe:&Probe{Handler:Handler{Exec:&ExecAction{Command:[env -i sh -c ceph --admin-daemon /run/ceph/ceph-osd.34.asok 
status],},HTTPGet:nil,TCPSocket:nil,},InitialDelaySeconds:10,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,},ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:nil,Privileged:*true,SELinuxOptions:nil,RunAsUser:*0,RunAsNonRoot:nil,ReadOnlyRootFilesystem:*false,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,},},RestartPolicy:Always,TerminationGracePeriodSeconds:*30,ActiveDeadlineSeconds:nil,DNSPolicy:ClusterFirst,NodeSelector:map[string]string{kubernetes.io/hostname: suraj-lk-cluster-pool-1-worker-2,},ServiceAccountName:rook-ceph-osd,DeprecatedServiceAccount:rook-ceph-osd,NodeName:suraj-lk-cluster-pool-1-worker-2,HostNetwork:false,HostPID:true,HostIPC:false,SecurityContext:&PodSecurityContext{SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,SupplementalGroups:[],FSGroup:nil,RunAsGroup:nil,Sysctls:[]Sysctl{},WindowsOptions:nil,},ImagePullSecrets:[]LocalObjectReference{},Hostname:,Subdomain:,Affinity:&Affinity{NodeAffinity:&NodeAffinity{RequiredDuringSchedulingIgnoredDuringExecution:&NodeSelector{NodeSelectorTerms:[]NodeSelectorTerm{NodeSelectorTerm{MatchExpressions:[]NodeSelectorRequirement{NodeSelectorRequirement{Key:storage.lokomotive.io,Operator:Exists,Values:[],},},MatchFields:[]NodeSelectorRequirement{},},},},PreferredDuringSchedulingIgnoredDuringExecution:[]PreferredSchedulingTerm{},},PodAffinity:nil,PodAntiAffinity:nil,},SchedulerName:default-scheduler,InitContainers:[]Container{Container{Name:activate,Image:ceph/ceph:v14.2.8,Command:[/bin/bash -c
set -ex
OSD_ID=34
OSD_UUID=956a05fa-609a-40df-9a4c-c19bad2ee465
OSD_STORE_FLAG="--bluestore"
OSD_DATA_DIR=/var/lib/ceph/osd/ceph-"$OSD_ID"
CV_MODE=lvm
DEVICE=
METADATA_DEVICE="$ROOK_METADATA_DEVICE"
# activate the osd with ceph-volume
if [[ "$CV_MODE" == "lvm" ]]; then
TMP_DIR=$(mktemp -d)
# activate osd
ceph-volume "$CV_MODE" activate --no-systemd "$OSD_STORE_FLAG" "$OSD_ID" "$OSD_UUID"
# copy the tmpfs directory to a temporary directory
# this is needed because when the init container exits, the tmpfs goes away and its content with it
# this will result in the emptydir to be empty when accessed by the main osd container
cp --verbose --no-dereference "$OSD_DATA_DIR"/* "$TMP_DIR"/
# unmount the tmpfs since we don't need it anymore
umount "$OSD_DATA_DIR"
# copy back the content of the tmpfs into the original osd directory
cp --verbose --no-dereference "$TMP_DIR"/* "$OSD_DATA_DIR"
# retain ownership of files to the ceph user/group
chown --verbose --recursive ceph:ceph "$OSD_DATA_DIR"
# remove the temporary directory
rm --recursive --force "$TMP_DIR"
else
ARGS=(--device ${DEVICE} --no-systemd --no-tmpfs)
if [ -n "$METADATA_DEVICE" ]; then
ARGS+=(--block.db ${METADATA_DEVICE})
fi
# ceph-volume raw mode only supports bluestore so we don't need to pass a store flag
ceph-volume "$CV_MODE" activate "${ARGS[@]}"
fi
],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CEPH_VOLUME_DEBUG,Value:1,ValueFrom:nil,},EnvVar{Name:CEPH_VOLUME_SKIP_RESTORECON,Value:1,ValueFrom:nil,},EnvVar{Name:DM_DISABLE_UDEV,Value:1,ValueFrom:nil,},EnvVar{Name:ROOK_CEPH_MON_HOST,Value:,ValueFrom:&EnvVarSource{FieldRef:nil,ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:&SecretKeySelector{LocalObjectReference:LocalObjectReference{Name:rook-ceph-config,},Key:mon_host,Optional:nil,},},},EnvVar{Name:CEPH_ARGS,Value:-m $(ROOK_CEPH_MON_HOST),ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:activate-osd,ReadOnly:false,MountPath:/var/lib/ceph/osd/ceph-34,SubPath:,MountPropagation:nil,SubPathExpr:,},VolumeMount{Name:devices,ReadOnly:false,MountPath:/dev,SubPath:,MountPropagation:nil,SubPathExpr:,},VolumeMount{Name:rook-config-override,ReadOnly:true,MountPath:/etc/ceph,SubPath:,MountPropagation:nil,SubPathExpr:,},VolumeMount{Name:rook-ceph-osd-token-4dtzj,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:nil,Privileged:*true,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,},Container{Name:chown-container-data-dir,Image:ceph/ceph:v14.2.8,Command:[chown],Args:[--verbose --recursive ceph:ceph /var/log/ceph 
/var/lib/ceph/crash],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:rook-data,ReadOnly:false,MountPath:/var/lib/rook,SubPath:,MountPropagation:nil,SubPathExpr:,},VolumeMount{Name:rook-config-override,ReadOnly:true,MountPath:/etc/ceph,SubPath:,MountPropagation:nil,SubPathExpr:,},VolumeMount{Name:rook-ceph-log,ReadOnly:false,MountPath:/var/log/ceph,SubPath:,MountPropagation:nil,SubPathExpr:,},VolumeMount{Name:rook-ceph-crash,ReadOnly:false,MountPath:/var/lib/ceph/crash,SubPath:,MountPropagation:nil,SubPathExpr:,},VolumeMount{Name:devices,ReadOnly:false,MountPath:/dev,SubPath:,MountPropagation:nil,SubPathExpr:,},VolumeMount{Name:run-udev,ReadOnly:false,MountPath:/run/udev,SubPath:,MountPropagation:nil,SubPathExpr:,},VolumeMount{Name:activate-osd,ReadOnly:false,MountPath:/var/lib/ceph/osd/ceph-34,SubPath:,MountPropagation:nil,SubPathExpr:,},VolumeMount{Name:rook-ceph-osd-token-4dtzj,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:nil,Privileged:*true,SELinuxOptions:nil,RunAsUser:*0,RunAsNonRoot:nil,ReadOnlyRootFilesystem:*false,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,},},AutomountServiceAccountToken:nil,Tolerations:[]Toleration{Toleration{Key:storage.lokomotive.io,Operator:Equal,Value:ceph,Effect:NoSchedule,TolerationSeconds:nil,},Toleration{Key:node.kubernetes.io/not-ready,Operator:Exists,Value:,Effect:NoExecute,TolerationSeconds:*300,},Toleration{Key:node.kubernetes.io/unreachable,Operator:Exists,Value:,Effect:NoExecute,TolerationS
econds:*300,},},HostAliases:[]HostAlias{},PriorityClassName:,Priority:*0,DNSConfig:nil,ShareProcessNamespace:nil,ReadinessGates:[]PodReadinessGate{},RuntimeClassName:nil,EnableServiceLinks:*true,PreemptionPolicy:nil,Overhead:ResourceList{},TopologySpreadConstraints:[]TopologySpreadConstraint{},EphemeralContainers:[]EphemeralContainer{},},Status:PodStatus{Phase:Pending,Conditions:[]PodCondition{PodCondition{Type:Initialized,Status:False,LastProbeTime:0001-01-01 00:00:00 +0000 UTC,LastTransitionTime:2020-05-05 08:00:04 +0000 UTC,Reason:ContainersNotInitialized,Message:containers with incomplete status: [activate chown-container-data-dir],},PodCondition{Type:Ready,Status:False,LastProbeTime:0001-01-01 00:00:00 +0000 UTC,LastTransitionTime:2020-05-05 08:00:04 +0000 UTC,Reason:ContainersNotReady,Message:containers with unready status: [osd],},PodCondition{Type:ContainersReady,Status:False,LastProbeTime:0001-01-01 00:00:00 +0000 UTC,LastTransitionTime:2020-05-05 08:00:04 +0000 UTC,Reason:ContainersNotReady,Message:containers with unready status: [osd],},PodCondition{Type:PodScheduled,Status:True,LastProbeTime:0001-01-01 00:00:00 +0000 UTC,LastTransitionTime:2020-05-05 08:00:04 +0000 UTC,Reason:,Message:,},},Message:,Reason:,HostIP:10.88.81.5,PodIP:,StartTime:2020-05-05 08:00:04 +0000 
UTC,ContainerStatuses:[]ContainerStatus{ContainerStatus{Name:osd,State:ContainerState{Waiting:&ContainerStateWaiting{Reason:PodInitializing,Message:,},Running:nil,Terminated:nil,},LastTerminationState:ContainerState{Waiting:nil,Running:nil,Terminated:nil,},Ready:false,RestartCount:0,Image:ceph/ceph:v14.2.8,ImageID:,ContainerID:,Started:*false,},},QOSClass:BestEffort,InitContainerStatuses:[]ContainerStatus{ContainerStatus{Name:activate,State:ContainerState{Waiting:&ContainerStateWaiting{Reason:PodInitializing,Message:,},Running:nil,Terminated:nil,},LastTerminationState:ContainerState{Waiting:nil,Running:nil,Terminated:nil,},Ready:false,RestartCount:0,Image:ceph/ceph:v14.2.8,ImageID:,ContainerID:,Started:nil,},ContainerStatus{Name:chown-container-data-dir,State:ContainerState{Waiting:&ContainerStateWaiting{Reason:PodInitializing,Message:,},Running:nil,Terminated:nil,},LastTerminationState:ContainerState{Waiting:nil,Running:nil,Terminated:nil,},Ready:false,RestartCount:0,Image:ceph/ceph:v14.2.8,ImageID:,ContainerID:,Started:nil,},},NominatedNodeName:,PodIPs:[]PodIP{},EphemeralContainerStatuses:[]ContainerStatus{},},}
2020-05-05 08:00:05.530 [INFO][14841] ipam_plugin.go 211: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="392b97c25126fdfab7d3d1cd081fcf915b0a78f4e5e564a6c8c9724616c80277" HandleID="k8s-pod-network.392b97c25126fdfab7d3d1cd081fcf915b0a78f4e5e564a6c8c9724616c80277" Workload="suraj--lk--cluster--pool--1--worker--2-k8s-rook--ceph--osd--34--7d6f567457--v96w9-eth0"
2020-05-05 08:00:05.552 [INFO][14841] ipam_plugin.go 223: Calico CNI IPAM handle=k8s-pod-network.392b97c25126fdfab7d3d1cd081fcf915b0a78f4e5e564a6c8c9724616c80277 ContainerID="392b97c25126fdfab7d3d1cd081fcf915b0a78f4e5e564a6c8c9724616c80277" HandleID="k8s-pod-network.392b97c25126fdfab7d3d1cd081fcf915b0a78f4e5e564a6c8c9724616c80277" Workload="suraj--lk--cluster--pool--1--worker--2-k8s-rook--ceph--osd--34--7d6f567457--v96w9-eth0"
2020-05-05 08:00:05.552 [INFO][14841] ipam_plugin.go 233: Auto assigning IP ContainerID="392b97c25126fdfab7d3d1cd081fcf915b0a78f4e5e564a6c8c9724616c80277" HandleID="k8s-pod-network.392b97c25126fdfab7d3d1cd081fcf915b0a78f4e5e564a6c8c9724616c80277" Workload="suraj--lk--cluster--pool--1--worker--2-k8s-rook--ceph--osd--34--7d6f567457--v96w9-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc000217d40), Attrs:map[string]string{"namespace":"rook", "node":"suraj-lk-cluster-pool-1-worker-2", "pod":"rook-ceph-osd-34-7d6f567457-v96w9"}, Hostname:"suraj-lk-cluster-pool-1-worker-2", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0}
2020-05-05 08:00:05.552 [INFO][14841] ipam.go 87: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'suraj-lk-cluster-pool-1-worker-2'
2020-05-05 08:00:05.560 [INFO][14841] ipam.go 313: Looking up existing affinities for host handle="k8s-pod-network.392b97c25126fdfab7d3d1cd081fcf915b0a78f4e5e564a6c8c9724616c80277" host="suraj-lk-cluster-pool-1-worker-2"
2020-05-05 08:00:05.567 [INFO][14841] ipam.go 381: Trying affinity for 10.2.76.0/24 handle="k8s-pod-network.392b97c25126fdfab7d3d1cd081fcf915b0a78f4e5e564a6c8c9724616c80277" host="suraj-lk-cluster-pool-1-worker-2"
2020-05-05 08:00:05.572 [INFO][14841] ipam.go 135: Attempting to load block cidr=10.2.76.0/24 host="suraj-lk-cluster-pool-1-worker-2"
2020-05-05 08:00:05.577 [INFO][14841] ipam.go 212: Affinity is confirmed and block has been loaded cidr=10.2.76.0/24 host="suraj-lk-cluster-pool-1-worker-2"
2020-05-05 08:00:05.577 [INFO][14841] ipam.go 804: Attempting to assign 1 addresses from block block=10.2.76.0/24 handle="k8s-pod-network.392b97c25126fdfab7d3d1cd081fcf915b0a78f4e5e564a6c8c9724616c80277" host="suraj-lk-cluster-pool-1-worker-2"
2020-05-05 08:00:05.580 [INFO][14841] ipam.go 1265: Creating new handle: k8s-pod-network.392b97c25126fdfab7d3d1cd081fcf915b0a78f4e5e564a6c8c9724616c80277
2020-05-05 08:00:05.587 [INFO][14841] ipam.go 827: Writing block in order to claim IPs block=10.2.76.0/24 handle="k8s-pod-network.392b97c25126fdfab7d3d1cd081fcf915b0a78f4e5e564a6c8c9724616c80277" host="suraj-lk-cluster-pool-1-worker-2"
2020-05-05 08:00:05.596 [INFO][14841] ipam.go 840: Successfully claimed IPs: [10.2.76.8/24] block=10.2.76.0/24 handle="k8s-pod-network.392b97c25126fdfab7d3d1cd081fcf915b0a78f4e5e564a6c8c9724616c80277" host="suraj-lk-cluster-pool-1-worker-2"
2020-05-05 08:00:05.596 [INFO][14841] ipam.go 413: Block '10.2.76.0/24' provided addresses: [10.2.76.8/24] handle="k8s-pod-network.392b97c25126fdfab7d3d1cd081fcf915b0a78f4e5e564a6c8c9724616c80277" host="suraj-lk-cluster-pool-1-worker-2"
2020-05-05 08:00:05.600 [INFO][14841] ipam.go 569: Auto-assigned 1 out of 1 IPv4s: [10.2.76.8/24] handle="k8s-pod-network.392b97c25126fdfab7d3d1cd081fcf915b0a78f4e5e564a6c8c9724616c80277" host="suraj-lk-cluster-pool-1-worker-2"
2020-05-05 08:00:05.600 [INFO][14841] ipam_plugin.go 235: Calico CNI IPAM assigned addresses IPv4=[10.2.76.8/24] IPv6=[] ContainerID="392b97c25126fdfab7d3d1cd081fcf915b0a78f4e5e564a6c8c9724616c80277" HandleID="k8s-pod-network.392b97c25126fdfab7d3d1cd081fcf915b0a78f4e5e564a6c8c9724616c80277" Workload="suraj--lk--cluster--pool--1--worker--2-k8s-rook--ceph--osd--34--7d6f567457--v96w9-eth0"
2020-05-05 08:00:05.600 [INFO][14841] ipam_plugin.go 261: IPAM Result ContainerID="392b97c25126fdfab7d3d1cd081fcf915b0a78f4e5e564a6c8c9724616c80277" HandleID="k8s-pod-network.392b97c25126fdfab7d3d1cd081fcf915b0a78f4e5e564a6c8c9724616c80277" Workload="suraj--lk--cluster--pool--1--worker--2-k8s-rook--ceph--osd--34--7d6f567457--v96w9-eth0" result.IPs=[]*current.IPConfig{(*current.IPConfig)(0xc00002e2a0)}
2020-05-05 08:00:05.601 [INFO][14823] k8s.go 358: Populated endpoint ContainerID="392b97c25126fdfab7d3d1cd081fcf915b0a78f4e5e564a6c8c9724616c80277" Namespace="rook" Pod="rook-ceph-osd-34-7d6f567457-v96w9" WorkloadEndpoint="suraj--lk--cluster--pool--1--worker--2-k8s-rook--ceph--osd--34--7d6f567457--v96w9-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"suraj--lk--cluster--pool--1--worker--2-k8s-rook--ceph--osd--34--7d6f567457--v96w9-eth0", GenerateName:"rook-ceph-osd-34-7d6f567457-", Namespace:"rook", SelfLink:"", UID:"6f1e1c07-b491-417c-bb77-bcf2291a6d32", ResourceVersion:"13139", Generation:0, CreationTimestamp:v1.Time{Time:time.Time{wall:0x0, ext:63724262404, loc:(*time.Location)(0x299c340)}}, DeletionTimestamp:(*v1.Time)(nil), DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app":"rook-ceph-osd", "ceph-osd-id":"34", "failure-domain":"suraj-lk-cluster-pool-1-worker-2", "pod-template-hash":"7d6f567457", "portable":"false", "projectcalico.org/namespace":"rook", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"rook-ceph-osd", "rook_cluster":"rook"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ClusterName:"", ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"suraj-lk-cluster-pool-1-worker-2", ContainerID:"", Pod:"rook-ceph-osd-34-7d6f567457-v96w9", Endpoint:"eth0", IPNetworks:[]string{"10.2.76.8/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.rook", "ksa.rook.rook-ceph-osd"}, InterfaceName:"cali846d43d8c8b", MAC:"", Ports:[]v3.EndpointPort(nil)}}
2020-05-05 08:00:05.601 [INFO][14823] k8s.go 359: Calico CNI using IPs: [10.2.76.8/32] ContainerID="392b97c25126fdfab7d3d1cd081fcf915b0a78f4e5e564a6c8c9724616c80277" Namespace="rook" Pod="rook-ceph-osd-34-7d6f567457-v96w9" WorkloadEndpoint="suraj--lk--cluster--pool--1--worker--2-k8s-rook--ceph--osd--34--7d6f567457--v96w9-eth0"
2020-05-05 08:00:05.601 [INFO][14823] network_linux.go 76: Setting the host side veth name to cali846d43d8c8b ContainerID="392b97c25126fdfab7d3d1cd081fcf915b0a78f4e5e564a6c8c9724616c80277" Namespace="rook" Pod="rook-ceph-osd-34-7d6f567457-v96w9" WorkloadEndpoint="suraj--lk--cluster--pool--1--worker--2-k8s-rook--ceph--osd--34--7d6f567457--v96w9-eth0"
2020-05-05 08:00:05.603 [INFO][14823] network_linux.go 396: Disabling IPv4 forwarding ContainerID="392b97c25126fdfab7d3d1cd081fcf915b0a78f4e5e564a6c8c9724616c80277" Namespace="rook" Pod="rook-ceph-osd-34-7d6f567457-v96w9" WorkloadEndpoint="suraj--lk--cluster--pool--1--worker--2-k8s-rook--ceph--osd--34--7d6f567457--v96w9-eth0"
2020-05-05 08:00:05.616 [INFO][14823] k8s.go 385: Added Mac, interface name, and active container ID to endpoint ContainerID="392b97c25126fdfab7d3d1cd081fcf915b0a78f4e5e564a6c8c9724616c80277" Namespace="rook" Pod="rook-ceph-osd-34-7d6f567457-v96w9" WorkloadEndpoint="suraj--lk--cluster--pool--1--worker--2-k8s-rook--ceph--osd--34--7d6f567457--v96w9-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"suraj--lk--cluster--pool--1--worker--2-k8s-rook--ceph--osd--34--7d6f567457--v96w9-eth0", GenerateName:"rook-ceph-osd-34-7d6f567457-", Namespace:"rook", SelfLink:"", UID:"6f1e1c07-b491-417c-bb77-bcf2291a6d32", ResourceVersion:"13139", Generation:0, CreationTimestamp:v1.Time{Time:time.Time{wall:0x0, ext:63724262404, loc:(*time.Location)(0x299c340)}}, DeletionTimestamp:(*v1.Time)(nil), DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app":"rook-ceph-osd", "ceph-osd-id":"34", "failure-domain":"suraj-lk-cluster-pool-1-worker-2", "pod-template-hash":"7d6f567457", "portable":"false", "projectcalico.org/namespace":"rook", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"rook-ceph-osd", "rook_cluster":"rook"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ClusterName:"", ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"suraj-lk-cluster-pool-1-worker-2", ContainerID:"392b97c25126fdfab7d3d1cd081fcf915b0a78f4e5e564a6c8c9724616c80277", Pod:"rook-ceph-osd-34-7d6f567457-v96w9", Endpoint:"eth0", IPNetworks:[]string{"10.2.76.8/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.rook", "ksa.rook.rook-ceph-osd"}, InterfaceName:"cali846d43d8c8b", MAC:"fa:ca:be:00:b7:a9", Ports:[]v3.EndpointPort(nil)}}
2020-05-05 08:00:05.625 [INFO][14823] k8s.go 417: Wrote updated endpoint to datastore ContainerID="392b97c25126fdfab7d3d1cd081fcf915b0a78f4e5e564a6c8c9724616c80277" Namespace="rook" Pod="rook-ceph-osd-34-7d6f567457-v96w9" WorkloadEndpoint="suraj--lk--cluster--pool--1--worker--2-k8s-rook--ceph--osd--34--7d6f567457--v96w9-eth0"
I0505 08:00:06.131483 2827 topology_manager.go:233] [topologymanager] Topology Admit Handler
I0505 08:00:06.238722 2827 reconciler.go:224] operationExecutor.VerifyControllerAttachedVolume started for volume "rook-config-override" (UniqueName: "kubernetes.io/configmap/c3a6feef-1dc2-4571-bd9f-b2548704217f-rook-config-override") pod "rook-ceph-osd-28-7ff4f78487-fmgwr" (UID: "c3a6feef-1dc2-4571-bd9f-b2548704217f")
I0505 08:00:06.238759 2827 reconciler.go:224] operationExecutor.VerifyControllerAttachedVolume started for volume "rook-data" (UniqueName: "kubernetes.io/host-path/c3a6feef-1dc2-4571-bd9f-b2548704217f-rook-data") pod "rook-ceph-osd-28-7ff4f78487-fmgwr" (UID: "c3a6feef-1dc2-4571-bd9f-b2548704217f")
I0505 08:00:06.238787 2827 reconciler.go:224] operationExecutor.VerifyControllerAttachedVolume started for volume "rook-ceph-crash" (UniqueName: "kubernetes.io/host-path/c3a6feef-1dc2-4571-bd9f-b2548704217f-rook-ceph-crash") pod "rook-ceph-osd-28-7ff4f78487-fmgwr" (UID: "c3a6feef-1dc2-4571-bd9f-b2548704217f")
I0505 08:00:06.238913 2827 reconciler.go:224] operationExecutor.VerifyControllerAttachedVolume started for volume "run-udev" (UniqueName: "kubernetes.io/host-path/c3a6feef-1dc2-4571-bd9f-b2548704217f-run-udev") pod "rook-ceph-osd-28-7ff4f78487-fmgwr" (UID: "c3a6feef-1dc2-4571-bd9f-b2548704217f")
I0505 08:00:06.238950 2827 reconciler.go:224] operationExecutor.VerifyControllerAttachedVolume started for volume "rook-ceph-osd-token-4dtzj" (UniqueName: "kubernetes.io/secret/c3a6feef-1dc2-4571-bd9f-b2548704217f-rook-ceph-osd-token-4dtzj") pod "rook-ceph-osd-28-7ff4f78487-fmgwr" (UID: "c3a6feef-1dc2-4571-bd9f-b2548704217f")
I0505 08:00:06.238980 2827 reconciler.go:224] operationExecutor.VerifyControllerAttachedVolume started for volume "rook-ceph-log" (UniqueName: "kubernetes.io/host-path/c3a6feef-1dc2-4571-bd9f-b2548704217f-rook-ceph-log") pod "rook-ceph-osd-28-7ff4f78487-fmgwr" (UID: "c3a6feef-1dc2-4571-bd9f-b2548704217f")
I0505 08:00:06.239046 2827 reconciler.go:224] operationExecutor.VerifyControllerAttachedVolume started for volume "devices" (UniqueName: "kubernetes.io/host-path/c3a6feef-1dc2-4571-bd9f-b2548704217f-devices") pod "rook-ceph-osd-28-7ff4f78487-fmgwr" (UID: "c3a6feef-1dc2-4571-bd9f-b2548704217f")
I0505 08:00:06.239071 2827 reconciler.go:224] operationExecutor.VerifyControllerAttachedVolume started for volume "activate-osd" (UniqueName: "kubernetes.io/empty-dir/c3a6feef-1dc2-4571-bd9f-b2548704217f-activate-osd") pod "rook-ceph-osd-28-7ff4f78487-fmgwr" (UID: "c3a6feef-1dc2-4571-bd9f-b2548704217f")
2020-05-05 08:00:06.988 [INFO][15005] plugin.go 129: Extracted identifiers EndpointIDs=&utils.WEPIdentifiers{Namespace:"rook", WEPName:"", WorkloadEndpointIdentifiers:names.WorkloadEndpointIdentifiers{Node:"suraj-lk-cluster-pool-1-worker-2", Orchestrator:"k8s", Endpoint:"eth0", Workload:"", Pod:"rook-ceph-osd-28-7ff4f78487-fmgwr", ContainerID:"5efe6fce0adccd83d96aa2921fecf20a6277848e9a6ef4fc83ceb33b78b76a38"}}
2020-05-05 08:00:07.019 [INFO][15005] plugin.go 220: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {suraj--lk--cluster--pool--1--worker--2-k8s-rook--ceph--osd--28--7ff4f78487--fmgwr-eth0 rook-ceph-osd-28-7ff4f78487- rook c3a6feef-1dc2-4571-bd9f-b2548704217f 13182 0 2020-05-05 08:00:06 +0000 UTC <nil> <nil> map[app:rook-ceph-osd ceph-osd-id:28 failure-domain:suraj-lk-cluster-pool-1-worker-2 pod-template-hash:7ff4f78487 portable:false projectcalico.org/namespace:rook projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:rook-ceph-osd rook_cluster:rook] map[] [] [] []} {k8s suraj-lk-cluster-pool-1-worker-2 rook-ceph-osd-28-7ff4f78487-fmgwr eth0 [] [] [kns.rook ksa.rook.rook-ceph-osd] cali580c3981b32 []}} ContainerID="5efe6fce0adccd83d96aa2921fecf20a6277848e9a6ef4fc83ceb33b78b76a38" Namespace="rook" Pod="rook-ceph-osd-28-7ff4f78487-fmgwr" WorkloadEndpoint="suraj--lk--cluster--pool--1--worker--2-k8s-rook--ceph--osd--28--7ff4f78487--fmgwr-"
2020-05-05 08:00:07.019 [INFO][15005] k8s.go 60: Extracted identifiers for CmdAddK8s ContainerID="5efe6fce0adccd83d96aa2921fecf20a6277848e9a6ef4fc83ceb33b78b76a38" Namespace="rook" Pod="rook-ceph-osd-28-7ff4f78487-fmgwr" WorkloadEndpoint="suraj--lk--cluster--pool--1--worker--2-k8s-rook--ceph--osd--28--7ff4f78487--fmgwr-eth0"
2020-05-05 08:00:07.025 [INFO][15005] k8s.go 775: namespace info &Namespace{ObjectMeta:{rook /api/v1/namespaces/rook 346082b0-52fc-4520-8796-5afff38e4f6c 7478 0 2020-05-05 07:45:13 +0000 UTC <nil> <nil> map[] map[] [] [] [{lokoctl Update v1 2020-05-05 07:45:13 +0000 UTC FieldsV1 FieldsV1{Raw:*[123 34 102 58 115 116 97 116 117 115 34 58 123 34 102 58 112 104 97 115 101 34 58 123 125 125 125],}}]},Spec:NamespaceSpec{Finalizers:[kubernetes],},Status:NamespaceStatus{Phase:Active,Conditions:[]NamespaceCondition{},},}
2020-05-05 08:00:07.030 [INFO][15005] k8s.go 784: pod info &Pod{ObjectMeta:{rook-ceph-osd-28-7ff4f78487-fmgwr rook-ceph-osd-28-7ff4f78487- rook /api/v1/namespaces/rook/pods/rook-ceph-osd-28-7ff4f78487-fmgwr c3a6feef-1dc2-4571-bd9f-b2548704217f 13182 0 2020-05-05 08:00:06 +0000 UTC <nil> <nil> map[app:rook-ceph-osd ceph-osd-id:28 failure-domain:suraj-lk-cluster-pool-1-worker-2 pod-template-hash:7ff4f78487 portable:false rook_cluster:rook] map[kubernetes.io/psp:00-rook-ceph-operator seccomp.security.alpha.kubernetes.io/pod:runtime/default] [{apps/v1 ReplicaSet rook-ceph-osd-28-7ff4f78487 ffb9389f-a675-4300-aea8-637683376bdc 0xc000102697 0xc000102698}] [] []},Spec:PodSpec{Volumes:[]Volume{Volume{Name:rook-data,VolumeSource:VolumeSource{HostPath:&HostPathVolumeSource{Path:/var/lib/rook,Type:*,},EmptyDir:nil,GCEPersistentDisk:nil,AWSElasticBlockStore:nil,GitRepo:nil,Secret:nil,NFS:nil,ISCSI:nil,Glusterfs:nil,PersistentVolumeClaim:nil,RBD:nil,FlexVolume:nil,Cinder:nil,CephFS:nil,Flocker:nil,DownwardAPI:nil,FC:nil,AzureFile:nil,ConfigMap:nil,VsphereVolume:nil,Quobyte:nil,AzureDisk:nil,PhotonPersistentDisk:nil,PortworxVolume:nil,ScaleIO:nil,Projected:nil,StorageOS:nil,CSI:nil,},},Volume{Name:rook-config-override,VolumeSource:VolumeSource{HostPath:nil,EmptyDir:nil,GCEPersistentDisk:nil,AWSElasticBlockStore:nil,GitRepo:nil,Secret:nil,NFS:nil,ISCSI:nil,Glusterfs:nil,PersistentVolumeClaim:nil,RBD:nil,FlexVolume:nil,Cinder:nil,CephFS:nil,Flocker:nil,DownwardAPI:nil,FC:nil,AzureFile:nil,ConfigMap:&ConfigMapVolumeSource{LocalObjectReference:LocalObjectReference{Name:rook-config-override,},Items:[]KeyToPath{KeyToPath{Key:config,Path:ceph.conf,Mode:*292,},},DefaultMode:*420,Optional:nil,},VsphereVolume:nil,Quobyte:nil,AzureDisk:nil,PhotonPersistentDisk:nil,PortworxVolume:nil,ScaleIO:nil,Projected:nil,StorageOS:nil,CSI:nil,},},Volume{Name:rook-ceph-log,VolumeSource:VolumeSource{HostPath:&HostPathVolumeSource{Path:/var/lib/rook/rook/log,Type:*,},EmptyDir:nil,GCEPersistentDisk:nil,AWSE
lasticBlockStore:nil,GitRepo:nil,Secret:nil,NFS:nil,ISCSI:nil,Glusterfs:nil,PersistentVolumeClaim:nil,RBD:nil,FlexVolume:nil,Cinder:nil,CephFS:nil,Flocker:nil,DownwardAPI:nil,FC:nil,AzureFile:nil,ConfigMap:nil,VsphereVolume:nil,Quobyte:nil,AzureDisk:nil,PhotonPersistentDisk:nil,PortworxVolume:nil,ScaleIO:nil,Projected:nil,StorageOS:nil,CSI:nil,},},Volume{Name:rook-ceph-crash,VolumeSource:VolumeSource{HostPath:&HostPathVolumeSource{Path:/var/lib/rook/rook/crash,Type:*,},EmptyDir:nil,GCEPersistentDisk:nil,AWSElasticBlockStore:nil,GitRepo:nil,Secret:nil,NFS:nil,ISCSI:nil,Glusterfs:nil,PersistentVolumeClaim:nil,RBD:nil,FlexVolume:nil,Cinder:nil,CephFS:nil,Flocker:nil,DownwardAPI:nil,FC:nil,AzureFile:nil,ConfigMap:nil,VsphereVolume:nil,Quobyte:nil,AzureDisk:nil,PhotonPersistentDisk:nil,PortworxVolume:nil,ScaleIO:nil,Projected:nil,StorageOS:nil,CSI:nil,},},Volume{Name:devices,VolumeSource:VolumeSource{HostPath:&HostPathVolumeSource{Path:/dev,Type:*,},EmptyDir:nil,GCEPersistentDisk:nil,AWSElasticBlockStore:nil,GitRepo:nil,Secret:nil,NFS:nil,ISCSI:nil,Glusterfs:nil,PersistentVolumeClaim:nil,RBD:nil,FlexVolume:nil,Cinder:nil,CephFS:nil,Flocker:nil,DownwardAPI:nil,FC:nil,AzureFile:nil,ConfigMap:nil,VsphereVolume:nil,Quobyte:nil,AzureDisk:nil,PhotonPersistentDisk:nil,PortworxVolume:nil,ScaleIO:nil,Projected:nil,StorageOS:nil,CSI:nil,},},Volume{Name:run-udev,VolumeSource:VolumeSource{HostPath:&HostPathVolumeSource{Path:/run/udev,Type:*,},EmptyDir:nil,GCEPersistentDisk:nil,AWSElasticBlockStore:nil,GitRepo:nil,Secret:nil,NFS:nil,ISCSI:nil,Glusterfs:nil,PersistentVolumeClaim:nil,RBD:nil,FlexVolume:nil,Cinder:nil,CephFS:nil,Flocker:nil,DownwardAPI:nil,FC:nil,AzureFile:nil,ConfigMap:nil,VsphereVolume:nil,Quobyte:nil,AzureDisk:nil,PhotonPersistentDisk:nil,PortworxVolume:nil,ScaleIO:nil,Projected:nil,StorageOS:nil,CSI:nil,},},Volume{Name:activate-osd,VolumeSource:VolumeSource{HostPath:nil,EmptyDir:&EmptyDirVolumeSource{Medium:,SizeLimit:<nil>,},GCEPersistentDisk:nil,AWSElasticBlockSto
re:nil,GitRepo:nil,Secret:nil,NFS:nil,ISCSI:nil,Glusterfs:nil,PersistentVolumeClaim:nil,RBD:nil,FlexVolume:nil,Cinder:nil,CephFS:nil,Flocker:nil,DownwardAPI:nil,FC:nil,AzureFile:nil,ConfigMap:nil,VsphereVolume:nil,Quobyte:nil,AzureDisk:nil,PhotonPersistentDisk:nil,PortworxVolume:nil,ScaleIO:nil,Projected:nil,StorageOS:nil,CSI:nil,},},Volume{Name:rook-ceph-osd-token-4dtzj,VolumeSource:VolumeSource{HostPath:nil,EmptyDir:nil,GCEPersistentDisk:nil,AWSElasticBlockStore:nil,GitRepo:nil,Secret:&SecretVolumeSource{SecretName:rook-ceph-osd-token-4dtzj,Items:[]KeyToPath{},DefaultMode:*420,Optional:nil,},NFS:nil,ISCSI:nil,Glusterfs:nil,PersistentVolumeClaim:nil,RBD:nil,FlexVolume:nil,Cinder:nil,CephFS:nil,Flocker:nil,DownwardAPI:nil,FC:nil,AzureFile:nil,ConfigMap:nil,VsphereVolume:nil,Quobyte:nil,AzureDisk:nil,PhotonPersistentDisk:nil,PortworxVolume:nil,ScaleIO:nil,Projected:nil,StorageOS:nil,CSI:nil,},},},Containers:[]Container{Container{Name:osd,Image:ceph/ceph:v14.2.8,Command:[ceph-osd],Args:[--foreground --id 28 --fsid 504a40d4-a2d3-4492-a37d-27a58464958f --setuser ceph --setgroup ceph --crush-location=root=default host=suraj-lk-cluster-pool-1-worker-2 --default-log-to-file false 
--ms-learn-addr-from-peer=false],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:ROOK_NODE_NAME,Value:suraj-lk-cluster-pool-1-worker-2,ValueFrom:nil,},EnvVar{Name:ROOK_CLUSTER_ID,Value:77795d4f-fb40-4886-bc74-cb2065f0c61f,ValueFrom:nil,},EnvVar{Name:ROOK_PRIVATE_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},EnvVar{Name:ROOK_PUBLIC_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},EnvVar{Name:ROOK_CLUSTER_NAME,Value:rook,ValueFrom:nil,},EnvVar{Name:ROOK_MON_ENDPOINTS,Value:,ValueFrom:&EnvVarSource{FieldRef:nil,ResourceFieldRef:nil,ConfigMapKeyRef:&ConfigMapKeySelector{LocalObjectReference:LocalObjectReference{Name:rook-ceph-mon-endpoints,},Key:data,Optional:nil,},SecretKeyRef:nil,},},EnvVar{Name:ROOK_MON_SECRET,Value:,ValueFrom:&EnvVarSource{FieldRef:nil,ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:&SecretKeySelector{LocalObjectReference:LocalObjectReference{Name:rook-ceph-mon,},Key:mon-secret,Optional:nil,},},},EnvVar{Name:ROOK_ADMIN_SECRET,Value:,ValueFrom:&EnvVarSource{FieldRef:nil,ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:&SecretKeySelector{LocalObjectReference:LocalObjectReference{Name:rook-ceph-mon,},Key:admin-secret,Optional:nil,},},},EnvVar{Name:ROOK_CONFIG_DIR,Value:/var/lib/rook,ValueFrom:nil,},EnvVar{Name:ROOK_CEPH_CONFIG_OVERRIDE,Value:/etc/rook/config/override.conf,ValueFrom:nil,},EnvVar{Name:ROOK_FSID,Value:,ValueFrom:&EnvVarSource{FieldRef:nil,ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:&SecretKeySelector{LocalObjectReference:LocalObjectReference{Name:rook-ceph-mon,},Key:fsid,Optional:nil,},},},EnvVar{Name:NODE_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:spec.nodeName,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},Env
Var{Name:ROOK_CRUSHMAP_HOSTNAME,Value:suraj-lk-cluster-pool-1-worker-2,ValueFrom:nil,},EnvVar{Name:CEPH_VOLUME_DEBUG,Value:1,ValueFrom:nil,},EnvVar{Name:CEPH_VOLUME_SKIP_RESTORECON,Value:1,ValueFrom:nil,},EnvVar{Name:DM_DISABLE_UDEV,Value:1,ValueFrom:nil,},EnvVar{Name:ROOK_OSDS_PER_DEVICE,Value:1,ValueFrom:nil,},EnvVar{Name:TINI_SUBREAPER,Value:,ValueFrom:nil,},EnvVar{Name:CONTAINER_IMAGE,Value:ceph/ceph:v14.2.8,ValueFrom:nil,},EnvVar{Name:POD_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:metadata.name,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},EnvVar{Name:POD_NAMESPACE,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:metadata.namespace,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},EnvVar{Name:NODE_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:spec.nodeName,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},EnvVar{Name:POD_MEMORY_LIMIT,Value:,ValueFrom:&EnvVarSource{FieldRef:nil,ResourceFieldRef:&ResourceFieldSelector{ContainerName:,Resource:limits.memory,Divisor:{{0 0} {<nil>} 0 DecimalSI},},ConfigMapKeyRef:nil,SecretKeyRef:nil,},},EnvVar{Name:POD_MEMORY_REQUEST,Value:,ValueFrom:&EnvVarSource{FieldRef:nil,ResourceFieldRef:&ResourceFieldSelector{ContainerName:,Resource:requests.memory,Divisor:{{0 0} {<nil>} 0 DecimalSI},},ConfigMapKeyRef:nil,SecretKeyRef:nil,},},EnvVar{Name:POD_CPU_LIMIT,Value:,ValueFrom:&EnvVarSource{FieldRef:nil,ResourceFieldRef:&ResourceFieldSelector{ContainerName:,Resource:limits.cpu,Divisor:{{1 0} {<nil>} 1 DecimalSI},},ConfigMapKeyRef:nil,SecretKeyRef:nil,},},EnvVar{Name:POD_CPU_REQUEST,Value:,ValueFrom:&EnvVarSource{FieldRef:nil,ResourceFieldRef:&ResourceFieldSelector{ContainerName:,Resource:requests.cpu,Divisor:{{0 0} {<nil>} 0 
DecimalSI},},ConfigMapKeyRef:nil,SecretKeyRef:nil,},},EnvVar{Name:ROOK_OSD_UUID,Value:1fa8be45-18fa-4825-ab4e-9b96918615ce,ValueFrom:nil,},EnvVar{Name:ROOK_OSD_ID,Value:28,ValueFrom:nil,},EnvVar{Name:ROOK_OSD_STORE_TYPE,Value:bluestore,ValueFrom:nil,},EnvVar{Name:ROOK_CEPH_MON_HOST,Value:,ValueFrom:&EnvVarSource{FieldRef:nil,ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:&SecretKeySelector{LocalObjectReference:LocalObjectReference{Name:rook-ceph-config,},Key:mon_host,Optional:nil,},},},EnvVar{Name:CEPH_ARGS,Value:-m $(ROOK_CEPH_MON_HOST),ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:rook-data,ReadOnly:false,MountPath:/var/lib/rook,SubPath:,MountPropagation:nil,SubPathExpr:,},VolumeMount{Name:rook-config-override,ReadOnly:true,MountPath:/etc/ceph,SubPath:,MountPropagation:nil,SubPathExpr:,},VolumeMount{Name:rook-ceph-log,ReadOnly:false,MountPath:/var/log/ceph,SubPath:,MountPropagation:nil,SubPathExpr:,},VolumeMount{Name:rook-ceph-crash,ReadOnly:false,MountPath:/var/lib/ceph/crash,SubPath:,MountPropagation:nil,SubPathExpr:,},VolumeMount{Name:devices,ReadOnly:false,MountPath:/dev,SubPath:,MountPropagation:nil,SubPathExpr:,},VolumeMount{Name:run-udev,ReadOnly:false,MountPath:/run/udev,SubPath:,MountPropagation:nil,SubPathExpr:,},VolumeMount{Name:activate-osd,ReadOnly:false,MountPath:/var/lib/ceph/osd/ceph-28,SubPath:,MountPropagation:nil,SubPathExpr:,},VolumeMount{Name:rook-ceph-osd-token-4dtzj,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,},},LivenessProbe:&Probe{Handler:Handler{Exec:&ExecAction{Command:[env -i sh -c ceph --admin-daemon /run/ceph/ceph-osd.28.asok 
status],},HTTPGet:nil,TCPSocket:nil,},InitialDelaySeconds:10,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,},ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:nil,Privileged:*true,SELinuxOptions:nil,RunAsUser:*0,RunAsNonRoot:nil,ReadOnlyRootFilesystem:*false,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,},},RestartPolicy:Always,TerminationGracePeriodSeconds:*30,ActiveDeadlineSeconds:nil,DNSPolicy:ClusterFirst,NodeSelector:map[string]string{kubernetes.io/hostname: suraj-lk-cluster-pool-1-worker-2,},ServiceAccountName:rook-ceph-osd,DeprecatedServiceAccount:rook-ceph-osd,NodeName:suraj-lk-cluster-pool-1-worker-2,HostNetwork:false,HostPID:true,HostIPC:false,SecurityContext:&PodSecurityContext{SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,SupplementalGroups:[],FSGroup:nil,RunAsGroup:nil,Sysctls:[]Sysctl{},WindowsOptions:nil,},ImagePullSecrets:[]LocalObjectReference{},Hostname:,Subdomain:,Affinity:&Affinity{NodeAffinity:&NodeAffinity{RequiredDuringSchedulingIgnoredDuringExecution:&NodeSelector{NodeSelectorTerms:[]NodeSelectorTerm{NodeSelectorTerm{MatchExpressions:[]NodeSelectorRequirement{NodeSelectorRequirement{Key:storage.lokomotive.io,Operator:Exists,Values:[],},},MatchFields:[]NodeSelectorRequirement{},},},},PreferredDuringSchedulingIgnoredDuringExecution:[]PreferredSchedulingTerm{},},PodAffinity:nil,PodAntiAffinity:nil,},SchedulerName:default-scheduler,InitContainers:[]Container{Container{Name:activate,Image:ceph/ceph:v14.2.8,Command:[/bin/bash -c
set -ex
OSD_ID=28
OSD_UUID=1fa8be45-18fa-4825-ab4e-9b96918615ce
OSD_STORE_FLAG="--bluestore"
OSD_DATA_DIR=/var/lib/ceph/osd/ceph-"$OSD_ID"
CV_MODE=lvm
DEVICE=
METADATA_DEVICE="$ROOK_METADATA_DEVICE"
# activate the osd with ceph-volume
if [[ "$CV_MODE" == "lvm" ]]; then
TMP_DIR=$(mktemp -d)
# activate osd
ceph-volume "$CV_MODE" activate --no-systemd "$OSD_STORE_FLAG" "$OSD_ID" "$OSD_UUID"
# copy the tmpfs directory to a temporary directory
# this is needed because when the init container exits, the tmpfs goes away and its content with it
# this will result in the emptydir to be empty when accessed by the main osd container
cp --verbose --no-dereference "$OSD_DATA_DIR"/* "$TMP_DIR"/
# unmount the tmpfs since we don't need it anymore
umount "$OSD_DATA_DIR"
# copy back the content of the tmpfs into the original osd directory
cp --verbose --no-dereference "$TMP_DIR"/* "$OSD_DATA_DIR"
# retain ownership of files to the ceph user/group
chown --verbose --recursive ceph:ceph "$OSD_DATA_DIR"
# remove the temporary directory
rm --recursive --force "$TMP_DIR"
else
ARGS=(--device ${DEVICE} --no-systemd --no-tmpfs)
if [ -n "$METADATA_DEVICE" ]; then
ARGS+=(--block.db ${METADATA_DEVICE})
fi
# ceph-volume raw mode only supports bluestore so we don't need to pass a store flag
ceph-volume "$CV_MODE" activate "${ARGS[@]}"
fi
],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CEPH_VOLUME_DEBUG,Value:1,ValueFrom:nil,},EnvVar{Name:CEPH_VOLUME_SKIP_RESTORECON,Value:1,ValueFrom:nil,},EnvVar{Name:DM_DISABLE_UDEV,Value:1,ValueFrom:nil,},EnvVar{Name:ROOK_CEPH_MON_HOST,Value:,ValueFrom:&EnvVarSource{FieldRef:nil,ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:&SecretKeySelector{LocalObjectReference:LocalObjectReference{Name:rook-ceph-config,},Key:mon_host,Optional:nil,},},},EnvVar{Name:CEPH_ARGS,Value:-m $(ROOK_CEPH_MON_HOST),ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:activate-osd,ReadOnly:false,MountPath:/var/lib/ceph/osd/ceph-28,SubPath:,MountPropagation:nil,SubPathExpr:,},VolumeMount{Name:devices,ReadOnly:false,MountPath:/dev,SubPath:,MountPropagation:nil,SubPathExpr:,},VolumeMount{Name:rook-config-override,ReadOnly:true,MountPath:/etc/ceph,SubPath:,MountPropagation:nil,SubPathExpr:,},VolumeMount{Name:rook-ceph-osd-token-4dtzj,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:nil,Privileged:*true,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,},Container{Name:chown-container-data-dir,Image:ceph/ceph:v14.2.8,Command:[chown],Args:[--verbose --recursive ceph:ceph /var/log/ceph 
/var/lib/ceph/crash],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:rook-data,ReadOnly:false,MountPath:/var/lib/rook,SubPath:,MountPropagation:nil,SubPathExpr:,},VolumeMount{Name:rook-config-override,ReadOnly:true,MountPath:/etc/ceph,SubPath:,MountPropagation:nil,SubPathExpr:,},VolumeMount{Name:rook-ceph-log,ReadOnly:false,MountPath:/var/log/ceph,SubPath:,MountPropagation:nil,SubPathExpr:,},VolumeMount{Name:rook-ceph-crash,ReadOnly:false,MountPath:/var/lib/ceph/crash,SubPath:,MountPropagation:nil,SubPathExpr:,},VolumeMount{Name:devices,ReadOnly:false,MountPath:/dev,SubPath:,MountPropagation:nil,SubPathExpr:,},VolumeMount{Name:run-udev,ReadOnly:false,MountPath:/run/udev,SubPath:,MountPropagation:nil,SubPathExpr:,},VolumeMount{Name:activate-osd,ReadOnly:false,MountPath:/var/lib/ceph/osd/ceph-28,SubPath:,MountPropagation:nil,SubPathExpr:,},VolumeMount{Name:rook-ceph-osd-token-4dtzj,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:nil,Privileged:*true,SELinuxOptions:nil,RunAsUser:*0,RunAsNonRoot:nil,ReadOnlyRootFilesystem:*false,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,},},AutomountServiceAccountToken:nil,Tolerations:[]Toleration{Toleration{Key:storage.lokomotive.io,Operator:Equal,Value:ceph,Effect:NoSchedule,TolerationSeconds:nil,},Toleration{Key:node.kubernetes.io/not-ready,Operator:Exists,Value:,Effect:NoExecute,TolerationSeconds:*300,},Toleration{Key:node.kubernetes.io/unreachable,Operator:Exists,Value:,Effect:NoExecute,TolerationS
econds:*300,},},HostAliases:[]HostAlias{},PriorityClassName:,Priority:*0,DNSConfig:nil,ShareProcessNamespace:nil,ReadinessGates:[]PodReadinessGate{},RuntimeClassName:nil,EnableServiceLinks:*true,PreemptionPolicy:nil,Overhead:ResourceList{},TopologySpreadConstraints:[]TopologySpreadConstraint{},EphemeralContainers:[]EphemeralContainer{},},Status:PodStatus{Phase:Pending,Conditions:[]PodCondition{PodCondition{Type:Initialized,Status:False,LastProbeTime:0001-01-01 00:00:00 +0000 UTC,LastTransitionTime:2020-05-05 08:00:06 +0000 UTC,Reason:ContainersNotInitialized,Message:containers with incomplete status: [activate chown-container-data-dir],},PodCondition{Type:Ready,Status:False,LastProbeTime:0001-01-01 00:00:00 +0000 UTC,LastTransitionTime:2020-05-05 08:00:06 +0000 UTC,Reason:ContainersNotReady,Message:containers with unready status: [osd],},PodCondition{Type:ContainersReady,Status:False,LastProbeTime:0001-01-01 00:00:00 +0000 UTC,LastTransitionTime:2020-05-05 08:00:06 +0000 UTC,Reason:ContainersNotReady,Message:containers with unready status: [osd],},PodCondition{Type:PodScheduled,Status:True,LastProbeTime:0001-01-01 00:00:00 +0000 UTC,LastTransitionTime:2020-05-05 08:00:06 +0000 UTC,Reason:,Message:,},},Message:,Reason:,HostIP:10.88.81.5,PodIP:,StartTime:2020-05-05 08:00:06 +0000 
UTC,ContainerStatuses:[]ContainerStatus{ContainerStatus{Name:osd,State:ContainerState{Waiting:&ContainerStateWaiting{Reason:PodInitializing,Message:,},Running:nil,Terminated:nil,},LastTerminationState:ContainerState{Waiting:nil,Running:nil,Terminated:nil,},Ready:false,RestartCount:0,Image:ceph/ceph:v14.2.8,ImageID:,ContainerID:,Started:*false,},},QOSClass:BestEffort,InitContainerStatuses:[]ContainerStatus{ContainerStatus{Name:activate,State:ContainerState{Waiting:&ContainerStateWaiting{Reason:PodInitializing,Message:,},Running:nil,Terminated:nil,},LastTerminationState:ContainerState{Waiting:nil,Running:nil,Terminated:nil,},Ready:false,RestartCount:0,Image:ceph/ceph:v14.2.8,ImageID:,ContainerID:,Started:nil,},ContainerStatus{Name:chown-container-data-dir,State:ContainerState{Waiting:&ContainerStateWaiting{Reason:PodInitializing,Message:,},Running:nil,Terminated:nil,},LastTerminationState:ContainerState{Waiting:nil,Running:nil,Terminated:nil,},Ready:false,RestartCount:0,Image:ceph/ceph:v14.2.8,ImageID:,ContainerID:,Started:nil,},},NominatedNodeName:,PodIPs:[]PodIP{},EphemeralContainerStatuses:[]ContainerStatus{},},}
2020-05-05 08:00:07.046 [INFO][15021] ipam_plugin.go 211: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="5efe6fce0adccd83d96aa2921fecf20a6277848e9a6ef4fc83ceb33b78b76a38" HandleID="k8s-pod-network.5efe6fce0adccd83d96aa2921fecf20a6277848e9a6ef4fc83ceb33b78b76a38" Workload="suraj--lk--cluster--pool--1--worker--2-k8s-rook--ceph--osd--28--7ff4f78487--fmgwr-eth0"
2020-05-05 08:00:07.069 [INFO][15021] ipam_plugin.go 223: Calico CNI IPAM handle=k8s-pod-network.5efe6fce0adccd83d96aa2921fecf20a6277848e9a6ef4fc83ceb33b78b76a38 ContainerID="5efe6fce0adccd83d96aa2921fecf20a6277848e9a6ef4fc83ceb33b78b76a38" HandleID="k8s-pod-network.5efe6fce0adccd83d96aa2921fecf20a6277848e9a6ef4fc83ceb33b78b76a38" Workload="suraj--lk--cluster--pool--1--worker--2-k8s-rook--ceph--osd--28--7ff4f78487--fmgwr-eth0"
2020-05-05 08:00:07.070 [INFO][15021] ipam_plugin.go 233: Auto assigning IP ContainerID="5efe6fce0adccd83d96aa2921fecf20a6277848e9a6ef4fc83ceb33b78b76a38" HandleID="k8s-pod-network.5efe6fce0adccd83d96aa2921fecf20a6277848e9a6ef4fc83ceb33b78b76a38" Workload="suraj--lk--cluster--pool--1--worker--2-k8s-rook--ceph--osd--28--7ff4f78487--fmgwr-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc0002dcd50), Attrs:map[string]string{"namespace":"rook", "node":"suraj-lk-cluster-pool-1-worker-2", "pod":"rook-ceph-osd-28-7ff4f78487-fmgwr"}, Hostname:"suraj-lk-cluster-pool-1-worker-2", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0}
2020-05-05 08:00:07.070 [INFO][15021] ipam.go 87: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'suraj-lk-cluster-pool-1-worker-2'
2020-05-05 08:00:07.079 [INFO][15021] ipam.go 313: Looking up existing affinities for host handle="k8s-pod-network.5efe6fce0adccd83d96aa2921fecf20a6277848e9a6ef4fc83ceb33b78b76a38" host="suraj-lk-cluster-pool-1-worker-2"
2020-05-05 08:00:07.086 [INFO][15021] ipam.go 381: Trying affinity for 10.2.76.0/24 handle="k8s-pod-network.5efe6fce0adccd83d96aa2921fecf20a6277848e9a6ef4fc83ceb33b78b76a38" host="suraj-lk-cluster-pool-1-worker-2"
2020-05-05 08:00:07.090 [INFO][15021] ipam.go 135: Attempting to load block cidr=10.2.76.0/24 host="suraj-lk-cluster-pool-1-worker-2"
2020-05-05 08:00:07.096 [INFO][15021] ipam.go 212: Affinity is confirmed and block has been loaded cidr=10.2.76.0/24 host="suraj-lk-cluster-pool-1-worker-2"
2020-05-05 08:00:07.096 [INFO][15021] ipam.go 804: Attempting to assign 1 addresses from block block=10.2.76.0/24 handle="k8s-pod-network.5efe6fce0adccd83d96aa2921fecf20a6277848e9a6ef4fc83ceb33b78b76a38" host="suraj-lk-cluster-pool-1-worker-2"
2020-05-05 08:00:07.099 [INFO][15021] ipam.go 1265: Creating new handle: k8s-pod-network.5efe6fce0adccd83d96aa2921fecf20a6277848e9a6ef4fc83ceb33b78b76a38
2020-05-05 08:00:07.105 [INFO][15021] ipam.go 827: Writing block in order to claim IPs block=10.2.76.0/24 handle="k8s-pod-network.5efe6fce0adccd83d96aa2921fecf20a6277848e9a6ef4fc83ceb33b78b76a38" host="suraj-lk-cluster-pool-1-worker-2"
2020-05-05 08:00:07.115 [INFO][15021] ipam.go 840: Successfully claimed IPs: [10.2.76.9/24] block=10.2.76.0/24 handle="k8s-pod-network.5efe6fce0adccd83d96aa2921fecf20a6277848e9a6ef4fc83ceb33b78b76a38" host="suraj-lk-cluster-pool-1-worker-2"
2020-05-05 08:00:07.115 [INFO][15021] ipam.go 413: Block '10.2.76.0/24' provided addresses: [10.2.76.9/24] handle="k8s-pod-network.5efe6fce0adccd83d96aa2921fecf20a6277848e9a6ef4fc83ceb33b78b76a38" host="suraj-lk-cluster-pool-1-worker-2"
2020-05-05 08:00:07.118 [INFO][15021] ipam.go 569: Auto-assigned 1 out of 1 IPv4s: [10.2.76.9/24] handle="k8s-pod-network.5efe6fce0adccd83d96aa2921fecf20a6277848e9a6ef4fc83ceb33b78b76a38" host="suraj-lk-cluster-pool-1-worker-2"
2020-05-05 08:00:07.118 [INFO][15021] ipam_plugin.go 235: Calico CNI IPAM assigned addresses IPv4=[10.2.76.9/24] IPv6=[] ContainerID="5efe6fce0adccd83d96aa2921fecf20a6277848e9a6ef4fc83ceb33b78b76a38" HandleID="k8s-pod-network.5efe6fce0adccd83d96aa2921fecf20a6277848e9a6ef4fc83ceb33b78b76a38" Workload="suraj--lk--cluster--pool--1--worker--2-k8s-rook--ceph--osd--28--7ff4f78487--fmgwr-eth0"
2020-05-05 08:00:07.118 [INFO][15021] ipam_plugin.go 261: IPAM Result ContainerID="5efe6fce0adccd83d96aa2921fecf20a6277848e9a6ef4fc83ceb33b78b76a38" HandleID="k8s-pod-network.5efe6fce0adccd83d96aa2921fecf20a6277848e9a6ef4fc83ceb33b78b76a38" Workload="suraj--lk--cluster--pool--1--worker--2-k8s-rook--ceph--osd--28--7ff4f78487--fmgwr-eth0" result.IPs=[]*current.IPConfig{(*current.IPConfig)(0xc000108540)}
2020-05-05 08:00:07.119 [INFO][15005] k8s.go 358: Populated endpoint ContainerID="5efe6fce0adccd83d96aa2921fecf20a6277848e9a6ef4fc83ceb33b78b76a38" Namespace="rook" Pod="rook-ceph-osd-28-7ff4f78487-fmgwr" WorkloadEndpoint="suraj--lk--cluster--pool--1--worker--2-k8s-rook--ceph--osd--28--7ff4f78487--fmgwr-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"suraj--lk--cluster--pool--1--worker--2-k8s-rook--ceph--osd--28--7ff4f78487--fmgwr-eth0", GenerateName:"rook-ceph-osd-28-7ff4f78487-", Namespace:"rook", SelfLink:"", UID:"c3a6feef-1dc2-4571-bd9f-b2548704217f", ResourceVersion:"13182", Generation:0, CreationTimestamp:v1.Time{Time:time.Time{wall:0x0, ext:63724262406, loc:(*time.Location)(0x299c340)}}, DeletionTimestamp:(*v1.Time)(nil), DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app":"rook-ceph-osd", "ceph-osd-id":"28", "failure-domain":"suraj-lk-cluster-pool-1-worker-2", "pod-template-hash":"7ff4f78487", "portable":"false", "projectcalico.org/namespace":"rook", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"rook-ceph-osd", "rook_cluster":"rook"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ClusterName:"", ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"suraj-lk-cluster-pool-1-worker-2", ContainerID:"", Pod:"rook-ceph-osd-28-7ff4f78487-fmgwr", Endpoint:"eth0", IPNetworks:[]string{"10.2.76.9/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.rook", "ksa.rook.rook-ceph-osd"}, InterfaceName:"cali580c3981b32", MAC:"", Ports:[]v3.EndpointPort(nil)}}
2020-05-05 08:00:07.120 [INFO][15005] k8s.go 359: Calico CNI using IPs: [10.2.76.9/32] ContainerID="5efe6fce0adccd83d96aa2921fecf20a6277848e9a6ef4fc83ceb33b78b76a38" Namespace="rook" Pod="rook-ceph-osd-28-7ff4f78487-fmgwr" WorkloadEndpoint="suraj--lk--cluster--pool--1--worker--2-k8s-rook--ceph--osd--28--7ff4f78487--fmgwr-eth0"
2020-05-05 08:00:07.120 [INFO][15005] network_linux.go 76: Setting the host side veth name to cali580c3981b32 ContainerID="5efe6fce0adccd83d96aa2921fecf20a6277848e9a6ef4fc83ceb33b78b76a38" Namespace="rook" Pod="rook-ceph-osd-28-7ff4f78487-fmgwr" WorkloadEndpoint="suraj--lk--cluster--pool--1--worker--2-k8s-rook--ceph--osd--28--7ff4f78487--fmgwr-eth0"
2020-05-05 08:00:07.121 [INFO][15005] network_linux.go 396: Disabling IPv4 forwarding ContainerID="5efe6fce0adccd83d96aa2921fecf20a6277848e9a6ef4fc83ceb33b78b76a38" Namespace="rook" Pod="rook-ceph-osd-28-7ff4f78487-fmgwr" WorkloadEndpoint="suraj--lk--cluster--pool--1--worker--2-k8s-rook--ceph--osd--28--7ff4f78487--fmgwr-eth0"
2020-05-05 08:00:07.148 [INFO][15005] k8s.go 385: Added Mac, interface name, and active container ID to endpoint ContainerID="5efe6fce0adccd83d96aa2921fecf20a6277848e9a6ef4fc83ceb33b78b76a38" Namespace="rook" Pod="rook-ceph-osd-28-7ff4f78487-fmgwr" WorkloadEndpoint="suraj--lk--cluster--pool--1--worker--2-k8s-rook--ceph--osd--28--7ff4f78487--fmgwr-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"suraj--lk--cluster--pool--1--worker--2-k8s-rook--ceph--osd--28--7ff4f78487--fmgwr-eth0", GenerateName:"rook-ceph-osd-28-7ff4f78487-", Namespace:"rook", SelfLink:"", UID:"c3a6feef-1dc2-4571-bd9f-b2548704217f", ResourceVersion:"13182", Generation:0, CreationTimestamp:v1.Time{Time:time.Time{wall:0x0, ext:63724262406, loc:(*time.Location)(0x299c340)}}, DeletionTimestamp:(*v1.Time)(nil), DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app":"rook-ceph-osd", "ceph-osd-id":"28", "failure-domain":"suraj-lk-cluster-pool-1-worker-2", "pod-template-hash":"7ff4f78487", "portable":"false", "projectcalico.org/namespace":"rook", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"rook-ceph-osd", "rook_cluster":"rook"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ClusterName:"", ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"suraj-lk-cluster-pool-1-worker-2", ContainerID:"5efe6fce0adccd83d96aa2921fecf20a6277848e9a6ef4fc83ceb33b78b76a38", Pod:"rook-ceph-osd-28-7ff4f78487-fmgwr", Endpoint:"eth0", IPNetworks:[]string{"10.2.76.9/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.rook", "ksa.rook.rook-ceph-osd"}, InterfaceName:"cali580c3981b32", MAC:"0a:60:1a:96:77:7e", Ports:[]v3.EndpointPort(nil)}}
2020-05-05 08:00:07.158 [INFO][15005] k8s.go 417: Wrote updated endpoint to datastore ContainerID="5efe6fce0adccd83d96aa2921fecf20a6277848e9a6ef4fc83ceb33b78b76a38" Namespace="rook" Pod="rook-ceph-osd-28-7ff4f78487-fmgwr" WorkloadEndpoint="suraj--lk--cluster--pool--1--worker--2-k8s-rook--ceph--osd--28--7ff4f78487--fmgwr-eth0"
I0505 08:00:07.336097 2827 topology_manager.go:233] [topologymanager] Topology Admit Handler
I0505 08:00:07.443006 2827 reconciler.go:224] operationExecutor.VerifyControllerAttachedVolume started for volume "rook-config-override" (UniqueName: "kubernetes.io/configmap/71dd27f4-1862-4153-a950-49026d348fcb-rook-config-override") pod "rook-ceph-osd-31-6f9fb79fd9-drckn" (UID: "71dd27f4-1862-4153-a950-49026d348fcb")
I0505 08:00:07.443090 2827 reconciler.go:224] operationExecutor.VerifyControllerAttachedVolume started for volume "rook-ceph-log" (UniqueName: "kubernetes.io/host-path/71dd27f4-1862-4153-a950-49026d348fcb-rook-ceph-log") pod "rook-ceph-osd-31-6f9fb79fd9-drckn" (UID: "71dd27f4-1862-4153-a950-49026d348fcb")
I0505 08:00:07.443135 2827 reconciler.go:224] operationExecutor.VerifyControllerAttachedVolume started for volume "devices" (UniqueName: "kubernetes.io/host-path/71dd27f4-1862-4153-a950-49026d348fcb-devices") pod "rook-ceph-osd-31-6f9fb79fd9-drckn" (UID: "71dd27f4-1862-4153-a950-49026d348fcb")
I0505 08:00:07.443173 2827 reconciler.go:224] operationExecutor.VerifyControllerAttachedVolume started for volume "rook-ceph-osd-token-4dtzj" (UniqueName: "kubernetes.io/secret/71dd27f4-1862-4153-a950-49026d348fcb-rook-ceph-osd-token-4dtzj") pod "rook-ceph-osd-31-6f9fb79fd9-drckn" (UID: "71dd27f4-1862-4153-a950-49026d348fcb")
I0505 08:00:07.443195 2827 reconciler.go:224] operationExecutor.VerifyControllerAttachedVolume started for volume "activate-osd" (UniqueName: "kubernetes.io/empty-dir/71dd27f4-1862-4153-a950-49026d348fcb-activate-osd") pod "rook-ceph-osd-31-6f9fb79fd9-drckn" (UID: "71dd27f4-1862-4153-a950-49026d348fcb")
I0505 08:00:07.443244 2827 reconciler.go:224] operationExecutor.VerifyControllerAttachedVolume started for volume "run-udev" (UniqueName: "kubernetes.io/host-path/71dd27f4-1862-4153-a950-49026d348fcb-run-udev") pod "rook-ceph-osd-31-6f9fb79fd9-drckn" (UID: "71dd27f4-1862-4153-a950-49026d348fcb")
I0505 08:00:07.443289 2827 reconciler.go:224] operationExecutor.VerifyControllerAttachedVolume started for volume "rook-data" (UniqueName: "kubernetes.io/host-path/71dd27f4-1862-4153-a950-49026d348fcb-rook-data") pod "rook-ceph-osd-31-6f9fb79fd9-drckn" (UID: "71dd27f4-1862-4153-a950-49026d348fcb")
I0505 08:00:07.443326 2827 reconciler.go:224] operationExecutor.VerifyControllerAttachedVolume started for volume "rook-ceph-crash" (UniqueName: "kubernetes.io/host-path/71dd27f4-1862-4153-a950-49026d348fcb-rook-ceph-crash") pod "rook-ceph-osd-31-6f9fb79fd9-drckn" (UID: "71dd27f4-1862-4153-a950-49026d348fcb")
2020-05-05 08:00:08.176 [INFO][15196] plugin.go 129: Extracted identifiers EndpointIDs=&utils.WEPIdentifiers{Namespace:"rook", WEPName:"", WorkloadEndpointIdentifiers:names.WorkloadEndpointIdentifiers{Node:"suraj-lk-cluster-pool-1-worker-2", Orchestrator:"k8s", Endpoint:"eth0", Workload:"", Pod:"rook-ceph-osd-31-6f9fb79fd9-drckn", ContainerID:"caaefcd8644a0c3a375186bcf53f8fe9e1779293f5e9ef05ab127cae89f5b2a2"}}
2020-05-05 08:00:08.206 [INFO][15196] plugin.go 220: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {suraj--lk--cluster--pool--1--worker--2-k8s-rook--ceph--osd--31--6f9fb79fd9--drckn-eth0 rook-ceph-osd-31-6f9fb79fd9- rook 71dd27f4-1862-4153-a950-49026d348fcb 13217 0 2020-05-05 08:00:07 +0000 UTC <nil> <nil> map[app:rook-ceph-osd ceph-osd-id:31 failure-domain:suraj-lk-cluster-pool-1-worker-2 pod-template-hash:6f9fb79fd9 portable:false projectcalico.org/namespace:rook projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:rook-ceph-osd rook_cluster:rook] map[] [] [] []} {k8s suraj-lk-cluster-pool-1-worker-2 rook-ceph-osd-31-6f9fb79fd9-drckn eth0 [] [] [kns.rook ksa.rook.rook-ceph-osd] calief390596d06 []}} ContainerID="caaefcd8644a0c3a375186bcf53f8fe9e1779293f5e9ef05ab127cae89f5b2a2" Namespace="rook" Pod="rook-ceph-osd-31-6f9fb79fd9-drckn" WorkloadEndpoint="suraj--lk--cluster--pool--1--worker--2-k8s-rook--ceph--osd--31--6f9fb79fd9--drckn-"
2020-05-05 08:00:08.206 [INFO][15196] k8s.go 60: Extracted identifiers for CmdAddK8s ContainerID="caaefcd8644a0c3a375186bcf53f8fe9e1779293f5e9ef05ab127cae89f5b2a2" Namespace="rook" Pod="rook-ceph-osd-31-6f9fb79fd9-drckn" WorkloadEndpoint="suraj--lk--cluster--pool--1--worker--2-k8s-rook--ceph--osd--31--6f9fb79fd9--drckn-eth0"
2020-05-05 08:00:08.212 [INFO][15196] k8s.go 775: namespace info &Namespace{ObjectMeta:{rook /api/v1/namespaces/rook 346082b0-52fc-4520-8796-5afff38e4f6c 7478 0 2020-05-05 07:45:13 +0000 UTC <nil> <nil> map[] map[] [] [] [{lokoctl Update v1 2020-05-05 07:45:13 +0000 UTC FieldsV1 FieldsV1{Raw:*[123 34 102 58 115 116 97 116 117 115 34 58 123 34 102 58 112 104 97 115 101 34 58 123 125 125 125],}}]},Spec:NamespaceSpec{Finalizers:[kubernetes],},Status:NamespaceStatus{Phase:Active,Conditions:[]NamespaceCondition{},},}
2020-05-05 08:00:08.217 [INFO][15196] k8s.go 784: pod info &Pod{ObjectMeta:{rook-ceph-osd-31-6f9fb79fd9-drckn rook-ceph-osd-31-6f9fb79fd9- rook /api/v1/namespaces/rook/pods/rook-ceph-osd-31-6f9fb79fd9-drckn 71dd27f4-1862-4153-a950-49026d348fcb 13217 0 2020-05-05 08:00:07 +0000 UTC <nil> <nil> map[app:rook-ceph-osd ceph-osd-id:31 failure-domain:suraj-lk-cluster-pool-1-worker-2 pod-template-hash:6f9fb79fd9 portable:false rook_cluster:rook] map[kubernetes.io/psp:00-rook-ceph-operator seccomp.security.alpha.kubernetes.io/pod:runtime/default] [{apps/v1 ReplicaSet rook-ceph-osd-31-6f9fb79fd9 3a890131-698b-47af-9943-2813ca4adcfe 0xc0004a2677 0xc0004a2678}] [] []},Spec:PodSpec{Volumes:[]Volume{Volume{Name:rook-data,VolumeSource:VolumeSource{HostPath:&HostPathVolumeSource{Path:/var/lib/rook,Type:*,},EmptyDir:nil,GCEPersistentDisk:nil,AWSElasticBlockStore:nil,GitRepo:nil,Secret:nil,NFS:nil,ISCSI:nil,Glusterfs:nil,PersistentVolumeClaim:nil,RBD:nil,FlexVolume:nil,Cinder:nil,CephFS:nil,Flocker:nil,DownwardAPI:nil,FC:nil,AzureFile:nil,ConfigMap:nil,VsphereVolume:nil,Quobyte:nil,AzureDisk:nil,PhotonPersistentDisk:nil,PortworxVolume:nil,ScaleIO:nil,Projected:nil,StorageOS:nil,CSI:nil,},},Volume{Name:rook-config-override,VolumeSource:VolumeSource{HostPath:nil,EmptyDir:nil,GCEPersistentDisk:nil,AWSElasticBlockStore:nil,GitRepo:nil,Secret:nil,NFS:nil,ISCSI:nil,Glusterfs:nil,PersistentVolumeClaim:nil,RBD:nil,FlexVolume:nil,Cinder:nil,CephFS:nil,Flocker:nil,DownwardAPI:nil,FC:nil,AzureFile:nil,ConfigMap:&ConfigMapVolumeSource{LocalObjectReference:LocalObjectReference{Name:rook-config-override,},Items:[]KeyToPath{KeyToPath{Key:config,Path:ceph.conf,Mode:*292,},},DefaultMode:*420,Optional:nil,},VsphereVolume:nil,Quobyte:nil,AzureDisk:nil,PhotonPersistentDisk:nil,PortworxVolume:nil,ScaleIO:nil,Projected:nil,StorageOS:nil,CSI:nil,},},Volume{Name:rook-ceph-log,VolumeSource:VolumeSource{HostPath:&HostPathVolumeSource{Path:/var/lib/rook/rook/log,Type:*,},EmptyDir:nil,GCEPersistentDisk:nil,AWSE
lasticBlockStore:nil,GitRepo:nil,Secret:nil,NFS:nil,ISCSI:nil,Glusterfs:nil,PersistentVolumeClaim:nil,RBD:nil,FlexVolume:nil,Cinder:nil,CephFS:nil,Flocker:nil,DownwardAPI:nil,FC:nil,AzureFile:nil,ConfigMap:nil,VsphereVolume:nil,Quobyte:nil,AzureDisk:nil,PhotonPersistentDisk:nil,PortworxVolume:nil,ScaleIO:nil,Projected:nil,StorageOS:nil,CSI:nil,},},Volume{Name:rook-ceph-crash,VolumeSource:VolumeSource{HostPath:&HostPathVolumeSource{Path:/var/lib/rook/rook/crash,Type:*,},EmptyDir:nil,GCEPersistentDisk:nil,AWSElasticBlockStore:nil,GitRepo:nil,Secret:nil,NFS:nil,ISCSI:nil,Glusterfs:nil,PersistentVolumeClaim:nil,RBD:nil,FlexVolume:nil,Cinder:nil,CephFS:nil,Flocker:nil,DownwardAPI:nil,FC:nil,AzureFile:nil,ConfigMap:nil,VsphereVolume:nil,Quobyte:nil,AzureDisk:nil,PhotonPersistentDisk:nil,PortworxVolume:nil,ScaleIO:nil,Projected:nil,StorageOS:nil,CSI:nil,},},Volume{Name:devices,VolumeSource:VolumeSource{HostPath:&HostPathVolumeSource{Path:/dev,Type:*,},EmptyDir:nil,GCEPersistentDisk:nil,AWSElasticBlockStore:nil,GitRepo:nil,Secret:nil,NFS:nil,ISCSI:nil,Glusterfs:nil,PersistentVolumeClaim:nil,RBD:nil,FlexVolume:nil,Cinder:nil,CephFS:nil,Flocker:nil,DownwardAPI:nil,FC:nil,AzureFile:nil,ConfigMap:nil,VsphereVolume:nil,Quobyte:nil,AzureDisk:nil,PhotonPersistentDisk:nil,PortworxVolume:nil,ScaleIO:nil,Projected:nil,StorageOS:nil,CSI:nil,},},Volume{Name:run-udev,VolumeSource:VolumeSource{HostPath:&HostPathVolumeSource{Path:/run/udev,Type:*,},EmptyDir:nil,GCEPersistentDisk:nil,AWSElasticBlockStore:nil,GitRepo:nil,Secret:nil,NFS:nil,ISCSI:nil,Glusterfs:nil,PersistentVolumeClaim:nil,RBD:nil,FlexVolume:nil,Cinder:nil,CephFS:nil,Flocker:nil,DownwardAPI:nil,FC:nil,AzureFile:nil,ConfigMap:nil,VsphereVolume:nil,Quobyte:nil,AzureDisk:nil,PhotonPersistentDisk:nil,PortworxVolume:nil,ScaleIO:nil,Projected:nil,StorageOS:nil,CSI:nil,},},Volume{Name:activate-osd,VolumeSource:VolumeSource{HostPath:nil,EmptyDir:&EmptyDirVolumeSource{Medium:,SizeLimit:<nil>,},GCEPersistentDisk:nil,AWSElasticBlockSto
re:nil,GitRepo:nil,Secret:nil,NFS:nil,ISCSI:nil,Glusterfs:nil,PersistentVolumeClaim:nil,RBD:nil,FlexVolume:nil,Cinder:nil,CephFS:nil,Flocker:nil,DownwardAPI:nil,FC:nil,AzureFile:nil,ConfigMap:nil,VsphereVolume:nil,Quobyte:nil,AzureDisk:nil,PhotonPersistentDisk:nil,PortworxVolume:nil,ScaleIO:nil,Projected:nil,StorageOS:nil,CSI:nil,},},Volume{Name:rook-ceph-osd-token-4dtzj,VolumeSource:VolumeSource{HostPath:nil,EmptyDir:nil,GCEPersistentDisk:nil,AWSElasticBlockStore:nil,GitRepo:nil,Secret:&SecretVolumeSource{SecretName:rook-ceph-osd-token-4dtzj,Items:[]KeyToPath{},DefaultMode:*420,Optional:nil,},NFS:nil,ISCSI:nil,Glusterfs:nil,PersistentVolumeClaim:nil,RBD:nil,FlexVolume:nil,Cinder:nil,CephFS:nil,Flocker:nil,DownwardAPI:nil,FC:nil,AzureFile:nil,ConfigMap:nil,VsphereVolume:nil,Quobyte:nil,AzureDisk:nil,PhotonPersistentDisk:nil,PortworxVolume:nil,ScaleIO:nil,Projected:nil,StorageOS:nil,CSI:nil,},},},Containers:[]Container{Container{Name:osd,Image:ceph/ceph:v14.2.8,Command:[ceph-osd],Args:[--foreground --id 31 --fsid 504a40d4-a2d3-4492-a37d-27a58464958f --setuser ceph --setgroup ceph --crush-location=root=default host=suraj-lk-cluster-pool-1-worker-2 --default-log-to-file false 
--ms-learn-addr-from-peer=false],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:ROOK_NODE_NAME,Value:suraj-lk-cluster-pool-1-worker-2,ValueFrom:nil,},EnvVar{Name:ROOK_CLUSTER_ID,Value:77795d4f-fb40-4886-bc74-cb2065f0c61f,ValueFrom:nil,},EnvVar{Name:ROOK_PRIVATE_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},EnvVar{Name:ROOK_PUBLIC_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},EnvVar{Name:ROOK_CLUSTER_NAME,Value:rook,ValueFrom:nil,},EnvVar{Name:ROOK_MON_ENDPOINTS,Value:,ValueFrom:&EnvVarSource{FieldRef:nil,ResourceFieldRef:nil,ConfigMapKeyRef:&ConfigMapKeySelector{LocalObjectReference:LocalObjectReference{Name:rook-ceph-mon-endpoints,},Key:data,Optional:nil,},SecretKeyRef:nil,},},EnvVar{Name:ROOK_MON_SECRET,Value:,ValueFrom:&EnvVarSource{FieldRef:nil,ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:&SecretKeySelector{LocalObjectReference:LocalObjectReference{Name:rook-ceph-mon,},Key:mon-secret,Optional:nil,},},},EnvVar{Name:ROOK_ADMIN_SECRET,Value:,ValueFrom:&EnvVarSource{FieldRef:nil,ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:&SecretKeySelector{LocalObjectReference:LocalObjectReference{Name:rook-ceph-mon,},Key:admin-secret,Optional:nil,},},},EnvVar{Name:ROOK_CONFIG_DIR,Value:/var/lib/rook,ValueFrom:nil,},EnvVar{Name:ROOK_CEPH_CONFIG_OVERRIDE,Value:/etc/rook/config/override.conf,ValueFrom:nil,},EnvVar{Name:ROOK_FSID,Value:,ValueFrom:&EnvVarSource{FieldRef:nil,ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:&SecretKeySelector{LocalObjectReference:LocalObjectReference{Name:rook-ceph-mon,},Key:fsid,Optional:nil,},},},EnvVar{Name:NODE_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:spec.nodeName,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},Env
Var{Name:ROOK_CRUSHMAP_HOSTNAME,Value:suraj-lk-cluster-pool-1-worker-2,ValueFrom:nil,},EnvVar{Name:CEPH_VOLUME_DEBUG,Value:1,ValueFrom:nil,},EnvVar{Name:CEPH_VOLUME_SKIP_RESTORECON,Value:1,ValueFrom:nil,},EnvVar{Name:DM_DISABLE_UDEV,Value:1,ValueFrom:nil,},EnvVar{Name:ROOK_OSDS_PER_DEVICE,Value:1,ValueFrom:nil,},EnvVar{Name:TINI_SUBREAPER,Value:,ValueFrom:nil,},EnvVar{Name:CONTAINER_IMAGE,Value:ceph/ceph:v14.2.8,ValueFrom:nil,},EnvVar{Name:POD_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:metadata.name,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},EnvVar{Name:POD_NAMESPACE,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:metadata.namespace,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},EnvVar{Name:NODE_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:spec.nodeName,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},EnvVar{Name:POD_MEMORY_LIMIT,Value:,ValueFrom:&EnvVarSource{FieldRef:nil,ResourceFieldRef:&ResourceFieldSelector{ContainerName:,Resource:limits.memory,Divisor:{{0 0} {<nil>} 0 DecimalSI},},ConfigMapKeyRef:nil,SecretKeyRef:nil,},},EnvVar{Name:POD_MEMORY_REQUEST,Value:,ValueFrom:&EnvVarSource{FieldRef:nil,ResourceFieldRef:&ResourceFieldSelector{ContainerName:,Resource:requests.memory,Divisor:{{0 0} {<nil>} 0 DecimalSI},},ConfigMapKeyRef:nil,SecretKeyRef:nil,},},EnvVar{Name:POD_CPU_LIMIT,Value:,ValueFrom:&EnvVarSource{FieldRef:nil,ResourceFieldRef:&ResourceFieldSelector{ContainerName:,Resource:limits.cpu,Divisor:{{1 0} {<nil>} 1 DecimalSI},},ConfigMapKeyRef:nil,SecretKeyRef:nil,},},EnvVar{Name:POD_CPU_REQUEST,Value:,ValueFrom:&EnvVarSource{FieldRef:nil,ResourceFieldRef:&ResourceFieldSelector{ContainerName:,Resource:requests.cpu,Divisor:{{0 0} {<nil>} 0 
DecimalSI},},ConfigMapKeyRef:nil,SecretKeyRef:nil,},},EnvVar{Name:ROOK_OSD_UUID,Value:1b12fdb2-da36-422d-ba75-30209268d4d5,ValueFrom:nil,},EnvVar{Name:ROOK_OSD_ID,Value:31,ValueFrom:nil,},EnvVar{Name:ROOK_OSD_STORE_TYPE,Value:bluestore,ValueFrom:nil,},EnvVar{Name:ROOK_CEPH_MON_HOST,Value:,ValueFrom:&EnvVarSource{FieldRef:nil,ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:&SecretKeySelector{LocalObjectReference:LocalObjectReference{Name:rook-ceph-config,},Key:mon_host,Optional:nil,},},},EnvVar{Name:CEPH_ARGS,Value:-m $(ROOK_CEPH_MON_HOST),ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:rook-data,ReadOnly:false,MountPath:/var/lib/rook,SubPath:,MountPropagation:nil,SubPathExpr:,},VolumeMount{Name:rook-config-override,ReadOnly:true,MountPath:/etc/ceph,SubPath:,MountPropagation:nil,SubPathExpr:,},VolumeMount{Name:rook-ceph-log,ReadOnly:false,MountPath:/var/log/ceph,SubPath:,MountPropagation:nil,SubPathExpr:,},VolumeMount{Name:rook-ceph-crash,ReadOnly:false,MountPath:/var/lib/ceph/crash,SubPath:,MountPropagation:nil,SubPathExpr:,},VolumeMount{Name:devices,ReadOnly:false,MountPath:/dev,SubPath:,MountPropagation:nil,SubPathExpr:,},VolumeMount{Name:run-udev,ReadOnly:false,MountPath:/run/udev,SubPath:,MountPropagation:nil,SubPathExpr:,},VolumeMount{Name:activate-osd,ReadOnly:false,MountPath:/var/lib/ceph/osd/ceph-31,SubPath:,MountPropagation:nil,SubPathExpr:,},VolumeMount{Name:rook-ceph-osd-token-4dtzj,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,},},LivenessProbe:&Probe{Handler:Handler{Exec:&ExecAction{Command:[env -i sh -c ceph --admin-daemon /run/ceph/ceph-osd.31.asok 
status],},HTTPGet:nil,TCPSocket:nil,},InitialDelaySeconds:10,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,},ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:nil,Privileged:*true,SELinuxOptions:nil,RunAsUser:*0,RunAsNonRoot:nil,ReadOnlyRootFilesystem:*false,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,},},RestartPolicy:Always,TerminationGracePeriodSeconds:*30,ActiveDeadlineSeconds:nil,DNSPolicy:ClusterFirst,NodeSelector:map[string]string{kubernetes.io/hostname: suraj-lk-cluster-pool-1-worker-2,},ServiceAccountName:rook-ceph-osd,DeprecatedServiceAccount:rook-ceph-osd,NodeName:suraj-lk-cluster-pool-1-worker-2,HostNetwork:false,HostPID:true,HostIPC:false,SecurityContext:&PodSecurityContext{SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,SupplementalGroups:[],FSGroup:nil,RunAsGroup:nil,Sysctls:[]Sysctl{},WindowsOptions:nil,},ImagePullSecrets:[]LocalObjectReference{},Hostname:,Subdomain:,Affinity:&Affinity{NodeAffinity:&NodeAffinity{RequiredDuringSchedulingIgnoredDuringExecution:&NodeSelector{NodeSelectorTerms:[]NodeSelectorTerm{NodeSelectorTerm{MatchExpressions:[]NodeSelectorRequirement{NodeSelectorRequirement{Key:storage.lokomotive.io,Operator:Exists,Values:[],},},MatchFields:[]NodeSelectorRequirement{},},},},PreferredDuringSchedulingIgnoredDuringExecution:[]PreferredSchedulingTerm{},},PodAffinity:nil,PodAntiAffinity:nil,},SchedulerName:default-scheduler,InitContainers:[]Container{Container{Name:activate,Image:ceph/ceph:v14.2.8,Command:[/bin/bash -c
set -ex
OSD_ID=31
OSD_UUID=1b12fdb2-da36-422d-ba75-30209268d4d5
OSD_STORE_FLAG="--bluestore"
OSD_DATA_DIR=/var/lib/ceph/osd/ceph-"$OSD_ID"
CV_MODE=lvm
DEVICE=
METADATA_DEVICE="$ROOK_METADATA_DEVICE"
# activate the osd with ceph-volume
if [[ "$CV_MODE" == "lvm" ]]; then
TMP_DIR=$(mktemp -d)
# activate osd
ceph-volume "$CV_MODE" activate --no-systemd "$OSD_STORE_FLAG" "$OSD_ID" "$OSD_UUID"
# copy the tmpfs directory to a temporary directory
# this is needed because when the init container exits, the tmpfs goes away and its content with it
# this will result in the emptydir to be empty when accessed by the main osd container
cp --verbose --no-dereference "$OSD_DATA_DIR"/* "$TMP_DIR"/
# unmount the tmpfs since we don't need it anymore
umount "$OSD_DATA_DIR"
# copy back the content of the tmpfs into the original osd directory
cp --verbose --no-dereference "$TMP_DIR"/* "$OSD_DATA_DIR"
# retain ownership of files to the ceph user/group
chown --verbose --recursive ceph:ceph "$OSD_DATA_DIR"
# remove the temporary directory
rm --recursive --force "$TMP_DIR"
else
ARGS=(--device ${DEVICE} --no-systemd --no-tmpfs)
if [ -n "$METADATA_DEVICE" ]; then
ARGS+=(--block.db ${METADATA_DEVICE})
fi
# ceph-volume raw mode only supports bluestore so we don't need to pass a store flag
ceph-volume "$CV_MODE" activate "${ARGS[@]}"
fi
],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CEPH_VOLUME_DEBUG,Value:1,ValueFrom:nil,},EnvVar{Name:CEPH_VOLUME_SKIP_RESTORECON,Value:1,ValueFrom:nil,},EnvVar{Name:DM_DISABLE_UDEV,Value:1,ValueFrom:nil,},EnvVar{Name:ROOK_CEPH_MON_HOST,Value:,ValueFrom:&EnvVarSource{FieldRef:nil,ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:&SecretKeySelector{LocalObjectReference:LocalObjectReference{Name:rook-ceph-config,},Key:mon_host,Optional:nil,},},},EnvVar{Name:CEPH_ARGS,Value:-m $(ROOK_CEPH_MON_HOST),ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:activate-osd,ReadOnly:false,MountPath:/var/lib/ceph/osd/ceph-31,SubPath:,MountPropagation:nil,SubPathExpr:,},VolumeMount{Name:devices,ReadOnly:false,MountPath:/dev,SubPath:,MountPropagation:nil,SubPathExpr:,},VolumeMount{Name:rook-config-override,ReadOnly:true,MountPath:/etc/ceph,SubPath:,MountPropagation:nil,SubPathExpr:,},VolumeMount{Name:rook-ceph-osd-token-4dtzj,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:nil,Privileged:*true,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,},Container{Name:chown-container-data-dir,Image:ceph/ceph:v14.2.8,Command:[chown],Args:[--verbose --recursive ceph:ceph /var/log/ceph 
/var/lib/ceph/crash],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:rook-data,ReadOnly:false,MountPath:/var/lib/rook,SubPath:,MountPropagation:nil,SubPathExpr:,},VolumeMount{Name:rook-config-override,ReadOnly:true,MountPath:/etc/ceph,SubPath:,MountPropagation:nil,SubPathExpr:,},VolumeMount{Name:rook-ceph-log,ReadOnly:false,MountPath:/var/log/ceph,SubPath:,MountPropagation:nil,SubPathExpr:,},VolumeMount{Name:rook-ceph-crash,ReadOnly:false,MountPath:/var/lib/ceph/crash,SubPath:,MountPropagation:nil,SubPathExpr:,},VolumeMount{Name:devices,ReadOnly:false,MountPath:/dev,SubPath:,MountPropagation:nil,SubPathExpr:,},VolumeMount{Name:run-udev,ReadOnly:false,MountPath:/run/udev,SubPath:,MountPropagation:nil,SubPathExpr:,},VolumeMount{Name:activate-osd,ReadOnly:false,MountPath:/var/lib/ceph/osd/ceph-31,SubPath:,MountPropagation:nil,SubPathExpr:,},VolumeMount{Name:rook-ceph-osd-token-4dtzj,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:nil,Privileged:*true,SELinuxOptions:nil,RunAsUser:*0,RunAsNonRoot:nil,ReadOnlyRootFilesystem:*false,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,},},AutomountServiceAccountToken:nil,Tolerations:[]Toleration{Toleration{Key:storage.lokomotive.io,Operator:Equal,Value:ceph,Effect:NoSchedule,TolerationSeconds:nil,},Toleration{Key:node.kubernetes.io/not-ready,Operator:Exists,Value:,Effect:NoExecute,TolerationSeconds:*300,},Toleration{Key:node.kubernetes.io/unreachable,Operator:Exists,Value:,Effect:NoExecute,TolerationS
econds:*300,},},HostAliases:[]HostAlias{},PriorityClassName:,Priority:*0,DNSConfig:nil,ShareProcessNamespace:nil,ReadinessGates:[]PodReadinessGate{},RuntimeClassName:nil,EnableServiceLinks:*true,PreemptionPolicy:nil,Overhead:ResourceList{},TopologySpreadConstraints:[]TopologySpreadConstraint{},EphemeralContainers:[]EphemeralContainer{},},Status:PodStatus{Phase:Pending,Conditions:[]PodCondition{PodCondition{Type:Initialized,Status:False,LastProbeTime:0001-01-01 00:00:00 +0000 UTC,LastTransitionTime:2020-05-05 08:00:07 +0000 UTC,Reason:ContainersNotInitialized,Message:containers with incomplete status: [activate chown-container-data-dir],},PodCondition{Type:Ready,Status:False,LastProbeTime:0001-01-01 00:00:00 +0000 UTC,LastTransitionTime:2020-05-05 08:00:07 +0000 UTC,Reason:ContainersNotReady,Message:containers with unready status: [osd],},PodCondition{Type:ContainersReady,Status:False,LastProbeTime:0001-01-01 00:00:00 +0000 UTC,LastTransitionTime:2020-05-05 08:00:07 +0000 UTC,Reason:ContainersNotReady,Message:containers with unready status: [osd],},PodCondition{Type:PodScheduled,Status:True,LastProbeTime:0001-01-01 00:00:00 +0000 UTC,LastTransitionTime:2020-05-05 08:00:07 +0000 UTC,Reason:,Message:,},},Message:,Reason:,HostIP:10.88.81.5,PodIP:,StartTime:2020-05-05 08:00:07 +0000 
UTC,ContainerStatuses:[]ContainerStatus{ContainerStatus{Name:osd,State:ContainerState{Waiting:&ContainerStateWaiting{Reason:PodInitializing,Message:,},Running:nil,Terminated:nil,},LastTerminationState:ContainerState{Waiting:nil,Running:nil,Terminated:nil,},Ready:false,RestartCount:0,Image:ceph/ceph:v14.2.8,ImageID:,ContainerID:,Started:*false,},},QOSClass:BestEffort,InitContainerStatuses:[]ContainerStatus{ContainerStatus{Name:activate,State:ContainerState{Waiting:&ContainerStateWaiting{Reason:PodInitializing,Message:,},Running:nil,Terminated:nil,},LastTerminationState:ContainerState{Waiting:nil,Running:nil,Terminated:nil,},Ready:false,RestartCount:0,Image:ceph/ceph:v14.2.8,ImageID:,ContainerID:,Started:nil,},ContainerStatus{Name:chown-container-data-dir,State:ContainerState{Waiting:&ContainerStateWaiting{Reason:PodInitializing,Message:,},Running:nil,Terminated:nil,},LastTerminationState:ContainerState{Waiting:nil,Running:nil,Terminated:nil,},Ready:false,RestartCount:0,Image:ceph/ceph:v14.2.8,ImageID:,ContainerID:,Started:nil,},},NominatedNodeName:,PodIPs:[]PodIP{},EphemeralContainerStatuses:[]ContainerStatus{},},}
2020-05-05 08:00:08.233 [INFO][15215] ipam_plugin.go 211: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="caaefcd8644a0c3a375186bcf53f8fe9e1779293f5e9ef05ab127cae89f5b2a2" HandleID="k8s-pod-network.caaefcd8644a0c3a375186bcf53f8fe9e1779293f5e9ef05ab127cae89f5b2a2" Workload="suraj--lk--cluster--pool--1--worker--2-k8s-rook--ceph--osd--31--6f9fb79fd9--drckn-eth0"
2020-05-05 08:00:08.257 [INFO][15215] ipam_plugin.go 223: Calico CNI IPAM handle=k8s-pod-network.caaefcd8644a0c3a375186bcf53f8fe9e1779293f5e9ef05ab127cae89f5b2a2 ContainerID="caaefcd8644a0c3a375186bcf53f8fe9e1779293f5e9ef05ab127cae89f5b2a2" HandleID="k8s-pod-network.caaefcd8644a0c3a375186bcf53f8fe9e1779293f5e9ef05ab127cae89f5b2a2" Workload="suraj--lk--cluster--pool--1--worker--2-k8s-rook--ceph--osd--31--6f9fb79fd9--drckn-eth0"
2020-05-05 08:00:08.257 [INFO][15215] ipam_plugin.go 233: Auto assigning IP ContainerID="caaefcd8644a0c3a375186bcf53f8fe9e1779293f5e9ef05ab127cae89f5b2a2" HandleID="k8s-pod-network.caaefcd8644a0c3a375186bcf53f8fe9e1779293f5e9ef05ab127cae89f5b2a2" Workload="suraj--lk--cluster--pool--1--worker--2-k8s-rook--ceph--osd--31--6f9fb79fd9--drckn-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc00002ad80), Attrs:map[string]string{"namespace":"rook", "node":"suraj-lk-cluster-pool-1-worker-2", "pod":"rook-ceph-osd-31-6f9fb79fd9-drckn"}, Hostname:"suraj-lk-cluster-pool-1-worker-2", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0}
2020-05-05 08:00:08.257 [INFO][15215] ipam.go 87: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'suraj-lk-cluster-pool-1-worker-2'
2020-05-05 08:00:08.266 [INFO][15215] ipam.go 313: Looking up existing affinities for host handle="k8s-pod-network.caaefcd8644a0c3a375186bcf53f8fe9e1779293f5e9ef05ab127cae89f5b2a2" host="suraj-lk-cluster-pool-1-worker-2"
2020-05-05 08:00:08.274 [INFO][15215] ipam.go 381: Trying affinity for 10.2.76.0/24 handle="k8s-pod-network.caaefcd8644a0c3a375186bcf53f8fe9e1779293f5e9ef05ab127cae89f5b2a2" host="suraj-lk-cluster-pool-1-worker-2"
2020-05-05 08:00:08.278 [INFO][15215] ipam.go 135: Attempting to load block cidr=10.2.76.0/24 host="suraj-lk-cluster-pool-1-worker-2"
2020-05-05 08:00:08.284 [INFO][15215] ipam.go 212: Affinity is confirmed and block has been loaded cidr=10.2.76.0/24 host="suraj-lk-cluster-pool-1-worker-2"
2020-05-05 08:00:08.284 [INFO][15215] ipam.go 804: Attempting to assign 1 addresses from block block=10.2.76.0/24 handle="k8s-pod-network.caaefcd8644a0c3a375186bcf53f8fe9e1779293f5e9ef05ab127cae89f5b2a2" host="suraj-lk-cluster-pool-1-worker-2"
2020-05-05 08:00:08.287 [INFO][15215] ipam.go 1265: Creating new handle: k8s-pod-network.caaefcd8644a0c3a375186bcf53f8fe9e1779293f5e9ef05ab127cae89f5b2a2
2020-05-05 08:00:08.293 [INFO][15215] ipam.go 827: Writing block in order to claim IPs block=10.2.76.0/24 handle="k8s-pod-network.caaefcd8644a0c3a375186bcf53f8fe9e1779293f5e9ef05ab127cae89f5b2a2" host="suraj-lk-cluster-pool-1-worker-2"
2020-05-05 08:00:08.303 [INFO][15215] ipam.go 840: Successfully claimed IPs: [10.2.76.10/24] block=10.2.76.0/24 handle="k8s-pod-network.caaefcd8644a0c3a375186bcf53f8fe9e1779293f5e9ef05ab127cae89f5b2a2" host="suraj-lk-cluster-pool-1-worker-2"
2020-05-05 08:00:08.303 [INFO][15215] ipam.go 413: Block '10.2.76.0/24' provided addresses: [10.2.76.10/24] handle="k8s-pod-network.caaefcd8644a0c3a375186bcf53f8fe9e1779293f5e9ef05ab127cae89f5b2a2" host="suraj-lk-cluster-pool-1-worker-2"
2020-05-05 08:00:08.306 [INFO][15215] ipam.go 569: Auto-assigned 1 out of 1 IPv4s: [10.2.76.10/24] handle="k8s-pod-network.caaefcd8644a0c3a375186bcf53f8fe9e1779293f5e9ef05ab127cae89f5b2a2" host="suraj-lk-cluster-pool-1-worker-2"
2020-05-05 08:00:08.306 [INFO][15215] ipam_plugin.go 235: Calico CNI IPAM assigned addresses IPv4=[10.2.76.10/24] IPv6=[] ContainerID="caaefcd8644a0c3a375186bcf53f8fe9e1779293f5e9ef05ab127cae89f5b2a2" HandleID="k8s-pod-network.caaefcd8644a0c3a375186bcf53f8fe9e1779293f5e9ef05ab127cae89f5b2a2" Workload="suraj--lk--cluster--pool--1--worker--2-k8s-rook--ceph--osd--31--6f9fb79fd9--drckn-eth0"
2020-05-05 08:00:08.306 [INFO][15215] ipam_plugin.go 261: IPAM Result ContainerID="caaefcd8644a0c3a375186bcf53f8fe9e1779293f5e9ef05ab127cae89f5b2a2" HandleID="k8s-pod-network.caaefcd8644a0c3a375186bcf53f8fe9e1779293f5e9ef05ab127cae89f5b2a2" Workload="suraj--lk--cluster--pool--1--worker--2-k8s-rook--ceph--osd--31--6f9fb79fd9--drckn-eth0" result.IPs=[]*current.IPConfig{(*current.IPConfig)(0xc0001084e0)}
2020-05-05 08:00:08.307 [INFO][15196] k8s.go 358: Populated endpoint ContainerID="caaefcd8644a0c3a375186bcf53f8fe9e1779293f5e9ef05ab127cae89f5b2a2" Namespace="rook" Pod="rook-ceph-osd-31-6f9fb79fd9-drckn" WorkloadEndpoint="suraj--lk--cluster--pool--1--worker--2-k8s-rook--ceph--osd--31--6f9fb79fd9--drckn-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"suraj--lk--cluster--pool--1--worker--2-k8s-rook--ceph--osd--31--6f9fb79fd9--drckn-eth0", GenerateName:"rook-ceph-osd-31-6f9fb79fd9-", Namespace:"rook", SelfLink:"", UID:"71dd27f4-1862-4153-a950-49026d348fcb", ResourceVersion:"13217", Generation:0, CreationTimestamp:v1.Time{Time:time.Time{wall:0x0, ext:63724262407, loc:(*time.Location)(0x299c340)}}, DeletionTimestamp:(*v1.Time)(nil), DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app":"rook-ceph-osd", "ceph-osd-id":"31", "failure-domain":"suraj-lk-cluster-pool-1-worker-2", "pod-template-hash":"6f9fb79fd9", "portable":"false", "projectcalico.org/namespace":"rook", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"rook-ceph-osd", "rook_cluster":"rook"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ClusterName:"", ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"suraj-lk-cluster-pool-1-worker-2", ContainerID:"", Pod:"rook-ceph-osd-31-6f9fb79fd9-drckn", Endpoint:"eth0", IPNetworks:[]string{"10.2.76.10/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.rook", "ksa.rook.rook-ceph-osd"}, InterfaceName:"calief390596d06", MAC:"", Ports:[]v3.EndpointPort(nil)}}
2020-05-05 08:00:08.307 [INFO][15196] k8s.go 359: Calico CNI using IPs: [10.2.76.10/32] ContainerID="caaefcd8644a0c3a375186bcf53f8fe9e1779293f5e9ef05ab127cae89f5b2a2" Namespace="rook" Pod="rook-ceph-osd-31-6f9fb79fd9-drckn" WorkloadEndpoint="suraj--lk--cluster--pool--1--worker--2-k8s-rook--ceph--osd--31--6f9fb79fd9--drckn-eth0"
2020-05-05 08:00:08.307 [INFO][15196] network_linux.go 76: Setting the host side veth name to calief390596d06 ContainerID="caaefcd8644a0c3a375186bcf53f8fe9e1779293f5e9ef05ab127cae89f5b2a2" Namespace="rook" Pod="rook-ceph-osd-31-6f9fb79fd9-drckn" WorkloadEndpoint="suraj--lk--cluster--pool--1--worker--2-k8s-rook--ceph--osd--31--6f9fb79fd9--drckn-eth0"
2020-05-05 08:00:08.308 [INFO][15196] network_linux.go 396: Disabling IPv4 forwarding ContainerID="caaefcd8644a0c3a375186bcf53f8fe9e1779293f5e9ef05ab127cae89f5b2a2" Namespace="rook" Pod="rook-ceph-osd-31-6f9fb79fd9-drckn" WorkloadEndpoint="suraj--lk--cluster--pool--1--worker--2-k8s-rook--ceph--osd--31--6f9fb79fd9--drckn-eth0"
2020-05-05 08:00:08.322 [INFO][15196] k8s.go 385: Added Mac, interface name, and active container ID to endpoint ContainerID="caaefcd8644a0c3a375186bcf53f8fe9e1779293f5e9ef05ab127cae89f5b2a2" Namespace="rook" Pod="rook-ceph-osd-31-6f9fb79fd9-drckn" WorkloadEndpoint="suraj--lk--cluster--pool--1--worker--2-k8s-rook--ceph--osd--31--6f9fb79fd9--drckn-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"suraj--lk--cluster--pool--1--worker--2-k8s-rook--ceph--osd--31--6f9fb79fd9--drckn-eth0", GenerateName:"rook-ceph-osd-31-6f9fb79fd9-", Namespace:"rook", SelfLink:"", UID:"71dd27f4-1862-4153-a950-49026d348fcb", ResourceVersion:"13217", Generation:0, CreationTimestamp:v1.Time{Time:time.Time{wall:0x0, ext:63724262407, loc:(*time.Location)(0x299c340)}}, DeletionTimestamp:(*v1.Time)(nil), DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app":"rook-ceph-osd", "ceph-osd-id":"31", "failure-domain":"suraj-lk-cluster-pool-1-worker-2", "pod-template-hash":"6f9fb79fd9", "portable":"false", "projectcalico.org/namespace":"rook", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"rook-ceph-osd", "rook_cluster":"rook"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ClusterName:"", ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"suraj-lk-cluster-pool-1-worker-2", ContainerID:"caaefcd8644a0c3a375186bcf53f8fe9e1779293f5e9ef05ab127cae89f5b2a2", Pod:"rook-ceph-osd-31-6f9fb79fd9-drckn", Endpoint:"eth0", IPNetworks:[]string{"10.2.76.10/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.rook", "ksa.rook.rook-ceph-osd"}, InterfaceName:"calief390596d06", MAC:"be:ca:da:77:15:b7", Ports:[]v3.EndpointPort(nil)}}
2020-05-05 08:00:08.330 [INFO][15196] k8s.go 417: Wrote updated endpoint to datastore ContainerID="caaefcd8644a0c3a375186bcf53f8fe9e1779293f5e9ef05ab127cae89f5b2a2" Namespace="rook" Pod="rook-ceph-osd-31-6f9fb79fd9-drckn" WorkloadEndpoint="suraj--lk--cluster--pool--1--worker--2-k8s-rook--ceph--osd--31--6f9fb79fd9--drckn-eth0"
I0505 08:00:08.626681 2827 topology_manager.go:233] [topologymanager] Topology Admit Handler
I0505 08:00:08.749436 2827 reconciler.go:224] operationExecutor.VerifyControllerAttachedVolume started for volume "rook-config-override" (UniqueName: "kubernetes.io/configmap/c602b2f8-5c9e-45c4-bc6f-d50ed74100c6-rook-config-override") pod "rook-ceph-osd-40-586c89dcbf-j55s9" (UID: "c602b2f8-5c9e-45c4-bc6f-d50ed74100c6")
I0505 08:00:08.749473 2827 reconciler.go:224] operationExecutor.VerifyControllerAttachedVolume started for volume "run-udev" (UniqueName: "kubernetes.io/host-path/c602b2f8-5c9e-45c4-bc6f-d50ed74100c6-run-udev") pod "rook-ceph-osd-40-586c89dcbf-j55s9" (UID: "c602b2f8-5c9e-45c4-bc6f-d50ed74100c6")
I0505 08:00:08.749517 2827 reconciler.go:224] operationExecutor.VerifyControllerAttachedVolume started for volume "devices" (UniqueName: "kubernetes.io/host-path/c602b2f8-5c9e-45c4-bc6f-d50ed74100c6-devices") pod "rook-ceph-osd-40-586c89dcbf-j55s9" (UID: "c602b2f8-5c9e-45c4-bc6f-d50ed74100c6")
I0505 08:00:08.749646 2827 reconciler.go:224] operationExecutor.VerifyControllerAttachedVolume started for volume "rook-ceph-log" (UniqueName: "kubernetes.io/host-path/c602b2f8-5c9e-45c4-bc6f-d50ed74100c6-rook-ceph-log") pod "rook-ceph-osd-40-586c89dcbf-j55s9" (UID: "c602b2f8-5c9e-45c4-bc6f-d50ed74100c6")
I0505 08:00:08.749705 2827 reconciler.go:224] operationExecutor.VerifyControllerAttachedVolume started for volume "rook-ceph-crash" (UniqueName: "kubernetes.io/host-path/c602b2f8-5c9e-45c4-bc6f-d50ed74100c6-rook-ceph-crash") pod "rook-ceph-osd-40-586c89dcbf-j55s9" (UID: "c602b2f8-5c9e-45c4-bc6f-d50ed74100c6")
I0505 08:00:08.749725 2827 reconciler.go:224] operationExecutor.VerifyControllerAttachedVolume started for volume "rook-data" (UniqueName: "kubernetes.io/host-path/c602b2f8-5c9e-45c4-bc6f-d50ed74100c6-rook-data") pod "rook-ceph-osd-40-586c89dcbf-j55s9" (UID: "c602b2f8-5c9e-45c4-bc6f-d50ed74100c6")
I0505 08:00:08.749752 2827 reconciler.go:224] operationExecutor.VerifyControllerAttachedVolume started for volume "activate-osd" (UniqueName: "kubernetes.io/empty-dir/c602b2f8-5c9e-45c4-bc6f-d50ed74100c6-activate-osd") pod "rook-ceph-osd-40-586c89dcbf-j55s9" (UID: "c602b2f8-5c9e-45c4-bc6f-d50ed74100c6")
I0505 08:00:08.749846 2827 reconciler.go:224] operationExecutor.VerifyControllerAttachedVolume started for volume "rook-ceph-osd-token-4dtzj" (UniqueName: "kubernetes.io/secret/c602b2f8-5c9e-45c4-bc6f-d50ed74100c6-rook-ceph-osd-token-4dtzj") pod "rook-ceph-osd-40-586c89dcbf-j55s9" (UID: "c602b2f8-5c9e-45c4-bc6f-d50ed74100c6")
2020-05-05 08:00:09.520 [INFO][15450] plugin.go 129: Extracted identifiers EndpointIDs=&utils.WEPIdentifiers{Namespace:"rook", WEPName:"", WorkloadEndpointIdentifiers:names.WorkloadEndpointIdentifiers{Node:"suraj-lk-cluster-pool-1-worker-2", Orchestrator:"k8s", Endpoint:"eth0", Workload:"", Pod:"rook-ceph-osd-40-586c89dcbf-j55s9", ContainerID:"96046ef67ed529403e1c2f718ae6e847de4619ecb90a6c5f9ad4a3a9ae787175"}}
2020-05-05 08:00:09.550 [INFO][15450] plugin.go 220: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {suraj--lk--cluster--pool--1--worker--2-k8s-rook--ceph--osd--40--586c89dcbf--j55s9-eth0 rook-ceph-osd-40-586c89dcbf- rook c602b2f8-5c9e-45c4-bc6f-d50ed74100c6 13261 0 2020-05-05 08:00:08 +0000 UTC <nil> <nil> map[app:rook-ceph-osd ceph-osd-id:40 failure-domain:suraj-lk-cluster-pool-1-worker-2 pod-template-hash:586c89dcbf portable:false projectcalico.org/namespace:rook projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:rook-ceph-osd rook_cluster:rook] map[] [] [] []} {k8s suraj-lk-cluster-pool-1-worker-2 rook-ceph-osd-40-586c89dcbf-j55s9 eth0 [] [] [kns.rook ksa.rook.rook-ceph-osd] calibc73db732dc []}} ContainerID="96046ef67ed529403e1c2f718ae6e847de4619ecb90a6c5f9ad4a3a9ae787175" Namespace="rook" Pod="rook-ceph-osd-40-586c89dcbf-j55s9" WorkloadEndpoint="suraj--lk--cluster--pool--1--worker--2-k8s-rook--ceph--osd--40--586c89dcbf--j55s9-"
2020-05-05 08:00:09.550 [INFO][15450] k8s.go 60: Extracted identifiers for CmdAddK8s ContainerID="96046ef67ed529403e1c2f718ae6e847de4619ecb90a6c5f9ad4a3a9ae787175" Namespace="rook" Pod="rook-ceph-osd-40-586c89dcbf-j55s9" WorkloadEndpoint="suraj--lk--cluster--pool--1--worker--2-k8s-rook--ceph--osd--40--586c89dcbf--j55s9-eth0"
2020-05-05 08:00:09.556 [INFO][15450] k8s.go 775: namespace info &Namespace{ObjectMeta:{rook /api/v1/namespaces/rook 346082b0-52fc-4520-8796-5afff38e4f6c 7478 0 2020-05-05 07:45:13 +0000 UTC <nil> <nil> map[] map[] [] [] [{lokoctl Update v1 2020-05-05 07:45:13 +0000 UTC FieldsV1 FieldsV1{Raw:*[123 34 102 58 115 116 97 116 117 115 34 58 123 34 102 58 112 104 97 115 101 34 58 123 125 125 125],}}]},Spec:NamespaceSpec{Finalizers:[kubernetes],},Status:NamespaceStatus{Phase:Active,Conditions:[]NamespaceCondition{},},}
2020-05-05 08:00:09.561 [INFO][15450] k8s.go 784: pod info &Pod{ObjectMeta:{rook-ceph-osd-40-586c89dcbf-j55s9 rook-ceph-osd-40-586c89dcbf- rook /api/v1/namespaces/rook/pods/rook-ceph-osd-40-586c89dcbf-j55s9 c602b2f8-5c9e-45c4-bc6f-d50ed74100c6 13261 0 2020-05-05 08:00:08 +0000 UTC <nil> <nil> map[app:rook-ceph-osd ceph-osd-id:40 failure-domain:suraj-lk-cluster-pool-1-worker-2 pod-template-hash:586c89dcbf portable:false rook_cluster:rook] map[kubernetes.io/psp:00-rook-ceph-operator seccomp.security.alpha.kubernetes.io/pod:runtime/default] [{apps/v1 ReplicaSet rook-ceph-osd-40-586c89dcbf c4dfbc2b-fede-42ca-bb25-8cf5da417df9 0xc0001a8117 0xc0001a8118}] [] []},Spec:PodSpec{Volumes:[]Volume{Volume{Name:rook-data,VolumeSource:VolumeSource{HostPath:&HostPathVolumeSource{Path:/var/lib/rook,Type:*,},EmptyDir:nil,GCEPersistentDisk:nil,AWSElasticBlockStore:nil,GitRepo:nil,Secret:nil,NFS:nil,ISCSI:nil,Glusterfs:nil,PersistentVolumeClaim:nil,RBD:nil,FlexVolume:nil,Cinder:nil,CephFS:nil,Flocker:nil,DownwardAPI:nil,FC:nil,AzureFile:nil,ConfigMap:nil,VsphereVolume:nil,Quobyte:nil,AzureDisk:nil,PhotonPersistentDisk:nil,PortworxVolume:nil,ScaleIO:nil,Projected:nil,StorageOS:nil,CSI:nil,},},Volume{Name:rook-config-override,VolumeSource:VolumeSource{HostPath:nil,EmptyDir:nil,GCEPersistentDisk:nil,AWSElasticBlockStore:nil,GitRepo:nil,Secret:nil,NFS:nil,ISCSI:nil,Glusterfs:nil,PersistentVolumeClaim:nil,RBD:nil,FlexVolume:nil,Cinder:nil,CephFS:nil,Flocker:nil,DownwardAPI:nil,FC:nil,AzureFile:nil,ConfigMap:&ConfigMapVolumeSource{LocalObjectReference:LocalObjectReference{Name:rook-config-override,},Items:[]KeyToPath{KeyToPath{Key:config,Path:ceph.conf,Mode:*292,},},DefaultMode:*420,Optional:nil,},VsphereVolume:nil,Quobyte:nil,AzureDisk:nil,PhotonPersistentDisk:nil,PortworxVolume:nil,ScaleIO:nil,Projected:nil,StorageOS:nil,CSI:nil,},},Volume{Name:rook-ceph-log,VolumeSource:VolumeSource{HostPath:&HostPathVolumeSource{Path:/var/lib/rook/rook/log,Type:*,},EmptyDir:nil,GCEPersistentDisk:nil,AWSE
lasticBlockStore:nil,GitRepo:nil,Secret:nil,NFS:nil,ISCSI:nil,Glusterfs:nil,PersistentVolumeClaim:nil,RBD:nil,FlexVolume:nil,Cinder:nil,CephFS:nil,Flocker:nil,DownwardAPI:nil,FC:nil,AzureFile:nil,ConfigMap:nil,VsphereVolume:nil,Quobyte:nil,AzureDisk:nil,PhotonPersistentDisk:nil,PortworxVolume:nil,ScaleIO:nil,Projected:nil,StorageOS:nil,CSI:nil,},},Volume{Name:rook-ceph-crash,VolumeSource:VolumeSource{HostPath:&HostPathVolumeSource{Path:/var/lib/rook/rook/crash,Type:*,},EmptyDir:nil,GCEPersistentDisk:nil,AWSElasticBlockStore:nil,GitRepo:nil,Secret:nil,NFS:nil,ISCSI:nil,Glusterfs:nil,PersistentVolumeClaim:nil,RBD:nil,FlexVolume:nil,Cinder:nil,CephFS:nil,Flocker:nil,DownwardAPI:nil,FC:nil,AzureFile:nil,ConfigMap:nil,VsphereVolume:nil,Quobyte:nil,AzureDisk:nil,PhotonPersistentDisk:nil,PortworxVolume:nil,ScaleIO:nil,Projected:nil,StorageOS:nil,CSI:nil,},},Volume{Name:devices,VolumeSource:VolumeSource{HostPath:&HostPathVolumeSource{Path:/dev,Type:*,},EmptyDir:nil,GCEPersistentDisk:nil,AWSElasticBlockStore:nil,GitRepo:nil,Secret:nil,NFS:nil,ISCSI:nil,Glusterfs:nil,PersistentVolumeClaim:nil,RBD:nil,FlexVolume:nil,Cinder:nil,CephFS:nil,Flocker:nil,DownwardAPI:nil,FC:nil,AzureFile:nil,ConfigMap:nil,VsphereVolume:nil,Quobyte:nil,AzureDisk:nil,PhotonPersistentDisk:nil,PortworxVolume:nil,ScaleIO:nil,Projected:nil,StorageOS:nil,CSI:nil,},},Volume{Name:run-udev,VolumeSource:VolumeSource{HostPath:&HostPathVolumeSource{Path:/run/udev,Type:*,},EmptyDir:nil,GCEPersistentDisk:nil,AWSElasticBlockStore:nil,GitRepo:nil,Secret:nil,NFS:nil,ISCSI:nil,Glusterfs:nil,PersistentVolumeClaim:nil,RBD:nil,FlexVolume:nil,Cinder:nil,CephFS:nil,Flocker:nil,DownwardAPI:nil,FC:nil,AzureFile:nil,ConfigMap:nil,VsphereVolume:nil,Quobyte:nil,AzureDisk:nil,PhotonPersistentDisk:nil,PortworxVolume:nil,ScaleIO:nil,Projected:nil,StorageOS:nil,CSI:nil,},},Volume{Name:activate-osd,VolumeSource:VolumeSource{HostPath:nil,EmptyDir:&EmptyDirVolumeSource{Medium:,SizeLimit:<nil>,},GCEPersistentDisk:nil,AWSElasticBlockSto
re:nil,GitRepo:nil,Secret:nil,NFS:nil,ISCSI:nil,Glusterfs:nil,PersistentVolumeClaim:nil,RBD:nil,FlexVolume:nil,Cinder:nil,CephFS:nil,Flocker:nil,DownwardAPI:nil,FC:nil,AzureFile:nil,ConfigMap:nil,VsphereVolume:nil,Quobyte:nil,AzureDisk:nil,PhotonPersistentDisk:nil,PortworxVolume:nil,ScaleIO:nil,Projected:nil,StorageOS:nil,CSI:nil,},},Volume{Name:rook-ceph-osd-token-4dtzj,VolumeSource:VolumeSource{HostPath:nil,EmptyDir:nil,GCEPersistentDisk:nil,AWSElasticBlockStore:nil,GitRepo:nil,Secret:&SecretVolumeSource{SecretName:rook-ceph-osd-token-4dtzj,Items:[]KeyToPath{},DefaultMode:*420,Optional:nil,},NFS:nil,ISCSI:nil,Glusterfs:nil,PersistentVolumeClaim:nil,RBD:nil,FlexVolume:nil,Cinder:nil,CephFS:nil,Flocker:nil,DownwardAPI:nil,FC:nil,AzureFile:nil,ConfigMap:nil,VsphereVolume:nil,Quobyte:nil,AzureDisk:nil,PhotonPersistentDisk:nil,PortworxVolume:nil,ScaleIO:nil,Projected:nil,StorageOS:nil,CSI:nil,},},},Containers:[]Container{Container{Name:osd,Image:ceph/ceph:v14.2.8,Command:[ceph-osd],Args:[--foreground --id 40 --fsid 504a40d4-a2d3-4492-a37d-27a58464958f --setuser ceph --setgroup ceph --crush-location=root=default host=suraj-lk-cluster-pool-1-worker-2 --default-log-to-file false 
--ms-learn-addr-from-peer=false],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:ROOK_NODE_NAME,Value:suraj-lk-cluster-pool-1-worker-2,ValueFrom:nil,},EnvVar{Name:ROOK_CLUSTER_ID,Value:77795d4f-fb40-4886-bc74-cb2065f0c61f,ValueFrom:nil,},EnvVar{Name:ROOK_PRIVATE_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},EnvVar{Name:ROOK_PUBLIC_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},EnvVar{Name:ROOK_CLUSTER_NAME,Value:rook,ValueFrom:nil,},EnvVar{Name:ROOK_MON_ENDPOINTS,Value:,ValueFrom:&EnvVarSource{FieldRef:nil,ResourceFieldRef:nil,ConfigMapKeyRef:&ConfigMapKeySelector{LocalObjectReference:LocalObjectReference{Name:rook-ceph-mon-endpoints,},Key:data,Optional:nil,},SecretKeyRef:nil,},},EnvVar{Name:ROOK_MON_SECRET,Value:,ValueFrom:&EnvVarSource{FieldRef:nil,ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:&SecretKeySelector{LocalObjectReference:LocalObjectReference{Name:rook-ceph-mon,},Key:mon-secret,Optional:nil,},},},EnvVar{Name:ROOK_ADMIN_SECRET,Value:,ValueFrom:&EnvVarSource{FieldRef:nil,ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:&SecretKeySelector{LocalObjectReference:LocalObjectReference{Name:rook-ceph-mon,},Key:admin-secret,Optional:nil,},},},EnvVar{Name:ROOK_CONFIG_DIR,Value:/var/lib/rook,ValueFrom:nil,},EnvVar{Name:ROOK_CEPH_CONFIG_OVERRIDE,Value:/etc/rook/config/override.conf,ValueFrom:nil,},EnvVar{Name:ROOK_FSID,Value:,ValueFrom:&EnvVarSource{FieldRef:nil,ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:&SecretKeySelector{LocalObjectReference:LocalObjectReference{Name:rook-ceph-mon,},Key:fsid,Optional:nil,},},},EnvVar{Name:NODE_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:spec.nodeName,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},Env
Var{Name:ROOK_CRUSHMAP_HOSTNAME,Value:suraj-lk-cluster-pool-1-worker-2,ValueFrom:nil,},EnvVar{Name:CEPH_VOLUME_DEBUG,Value:1,ValueFrom:nil,},EnvVar{Name:CEPH_VOLUME_SKIP_RESTORECON,Value:1,ValueFrom:nil,},EnvVar{Name:DM_DISABLE_UDEV,Value:1,ValueFrom:nil,},EnvVar{Name:ROOK_OSDS_PER_DEVICE,Value:1,ValueFrom:nil,},EnvVar{Name:TINI_SUBREAPER,Value:,ValueFrom:nil,},EnvVar{Name:CONTAINER_IMAGE,Value:ceph/ceph:v14.2.8,ValueFrom:nil,},EnvVar{Name:POD_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:metadata.name,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},EnvVar{Name:POD_NAMESPACE,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:metadata.namespace,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},EnvVar{Name:NODE_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:spec.nodeName,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},EnvVar{Name:POD_MEMORY_LIMIT,Value:,ValueFrom:&EnvVarSource{FieldRef:nil,ResourceFieldRef:&ResourceFieldSelector{ContainerName:,Resource:limits.memory,Divisor:{{0 0} {<nil>} 0 DecimalSI},},ConfigMapKeyRef:nil,SecretKeyRef:nil,},},EnvVar{Name:POD_MEMORY_REQUEST,Value:,ValueFrom:&EnvVarSource{FieldRef:nil,ResourceFieldRef:&ResourceFieldSelector{ContainerName:,Resource:requests.memory,Divisor:{{0 0} {<nil>} 0 DecimalSI},},ConfigMapKeyRef:nil,SecretKeyRef:nil,},},EnvVar{Name:POD_CPU_LIMIT,Value:,ValueFrom:&EnvVarSource{FieldRef:nil,ResourceFieldRef:&ResourceFieldSelector{ContainerName:,Resource:limits.cpu,Divisor:{{1 0} {<nil>} 1 DecimalSI},},ConfigMapKeyRef:nil,SecretKeyRef:nil,},},EnvVar{Name:POD_CPU_REQUEST,Value:,ValueFrom:&EnvVarSource{FieldRef:nil,ResourceFieldRef:&ResourceFieldSelector{ContainerName:,Resource:requests.cpu,Divisor:{{0 0} {<nil>} 0 
DecimalSI},},ConfigMapKeyRef:nil,SecretKeyRef:nil,},},EnvVar{Name:ROOK_OSD_UUID,Value:d8e19f2b-b2d3-4f6b-bb0e-2ed2782e12e2,ValueFrom:nil,},EnvVar{Name:ROOK_OSD_ID,Value:40,ValueFrom:nil,},EnvVar{Name:ROOK_OSD_STORE_TYPE,Value:bluestore,ValueFrom:nil,},EnvVar{Name:ROOK_CEPH_MON_HOST,Value:,ValueFrom:&EnvVarSource{FieldRef:nil,ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:&SecretKeySelector{LocalObjectReference:LocalObjectReference{Name:rook-ceph-config,},Key:mon_host,Optional:nil,},},},EnvVar{Name:CEPH_ARGS,Value:-m $(ROOK_CEPH_MON_HOST),ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:rook-data,ReadOnly:false,MountPath:/var/lib/rook,SubPath:,MountPropagation:nil,SubPathExpr:,},VolumeMount{Name:rook-config-override,ReadOnly:true,MountPath:/etc/ceph,SubPath:,MountPropagation:nil,SubPathExpr:,},VolumeMount{Name:rook-ceph-log,ReadOnly:false,MountPath:/var/log/ceph,SubPath:,MountPropagation:nil,SubPathExpr:,},VolumeMount{Name:rook-ceph-crash,ReadOnly:false,MountPath:/var/lib/ceph/crash,SubPath:,MountPropagation:nil,SubPathExpr:,},VolumeMount{Name:devices,ReadOnly:false,MountPath:/dev,SubPath:,MountPropagation:nil,SubPathExpr:,},VolumeMount{Name:run-udev,ReadOnly:false,MountPath:/run/udev,SubPath:,MountPropagation:nil,SubPathExpr:,},VolumeMount{Name:activate-osd,ReadOnly:false,MountPath:/var/lib/ceph/osd/ceph-40,SubPath:,MountPropagation:nil,SubPathExpr:,},VolumeMount{Name:rook-ceph-osd-token-4dtzj,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,},},LivenessProbe:&Probe{Handler:Handler{Exec:&ExecAction{Command:[env -i sh -c ceph --admin-daemon /run/ceph/ceph-osd.40.asok 
status],},HTTPGet:nil,TCPSocket:nil,},InitialDelaySeconds:10,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,},ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:nil,Privileged:*true,SELinuxOptions:nil,RunAsUser:*0,RunAsNonRoot:nil,ReadOnlyRootFilesystem:*false,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,},},RestartPolicy:Always,TerminationGracePeriodSeconds:*30,ActiveDeadlineSeconds:nil,DNSPolicy:ClusterFirst,NodeSelector:map[string]string{kubernetes.io/hostname: suraj-lk-cluster-pool-1-worker-2,},ServiceAccountName:rook-ceph-osd,DeprecatedServiceAccount:rook-ceph-osd,NodeName:suraj-lk-cluster-pool-1-worker-2,HostNetwork:false,HostPID:true,HostIPC:false,SecurityContext:&PodSecurityContext{SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,SupplementalGroups:[],FSGroup:nil,RunAsGroup:nil,Sysctls:[]Sysctl{},WindowsOptions:nil,},ImagePullSecrets:[]LocalObjectReference{},Hostname:,Subdomain:,Affinity:&Affinity{NodeAffinity:&NodeAffinity{RequiredDuringSchedulingIgnoredDuringExecution:&NodeSelector{NodeSelectorTerms:[]NodeSelectorTerm{NodeSelectorTerm{MatchExpressions:[]NodeSelectorRequirement{NodeSelectorRequirement{Key:storage.lokomotive.io,Operator:Exists,Values:[],},},MatchFields:[]NodeSelectorRequirement{},},},},PreferredDuringSchedulingIgnoredDuringExecution:[]PreferredSchedulingTerm{},},PodAffinity:nil,PodAntiAffinity:nil,},SchedulerName:default-scheduler,InitContainers:[]Container{Container{Name:activate,Image:ceph/ceph:v14.2.8,Command:[/bin/bash -c
set -ex
OSD_ID=40
OSD_UUID=d8e19f2b-b2d3-4f6b-bb0e-2ed2782e12e2
OSD_STORE_FLAG="--bluestore"
OSD_DATA_DIR=/var/lib/ceph/osd/ceph-"$OSD_ID"
CV_MODE=lvm
DEVICE=
METADATA_DEVICE="$ROOK_METADATA_DEVICE"
# activate the osd with ceph-volume
if [[ "$CV_MODE" == "lvm" ]]; then
TMP_DIR=$(mktemp -d)
# activate osd
ceph-volume "$CV_MODE" activate --no-systemd "$OSD_STORE_FLAG" "$OSD_ID" "$OSD_UUID"
# copy the tmpfs directory to a temporary directory
# this is needed because when the init container exits, the tmpfs goes away and its content with it
# this would otherwise result in the emptydir being empty when accessed by the main osd container
cp --verbose --no-dereference "$OSD_DATA_DIR"/* "$TMP_DIR"/
# unmount the tmpfs since we don't need it anymore
umount "$OSD_DATA_DIR"
# copy back the content of the tmpfs into the original osd directory
cp --verbose --no-dereference "$TMP_DIR"/* "$OSD_DATA_DIR"
# retain ownership of files to the ceph user/group
chown --verbose --recursive ceph:ceph "$OSD_DATA_DIR"
# remove the temporary directory
rm --recursive --force "$TMP_DIR"
else
ARGS=(--device ${DEVICE} --no-systemd --no-tmpfs)
if [ -n "$METADATA_DEVICE" ]; then
ARGS+=(--block.db ${METADATA_DEVICE})
fi
# ceph-volume raw mode only supports bluestore so we don't need to pass a store flag
ceph-volume "$CV_MODE" activate "${ARGS[@]}"
fi
],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CEPH_VOLUME_DEBUG,Value:1,ValueFrom:nil,},EnvVar{Name:CEPH_VOLUME_SKIP_RESTORECON,Value:1,ValueFrom:nil,},EnvVar{Name:DM_DISABLE_UDEV,Value:1,ValueFrom:nil,},EnvVar{Name:ROOK_CEPH_MON_HOST,Value:,ValueFrom:&EnvVarSource{FieldRef:nil,ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:&SecretKeySelector{LocalObjectReference:LocalObjectReference{Name:rook-ceph-config,},Key:mon_host,Optional:nil,},},},EnvVar{Name:CEPH_ARGS,Value:-m $(ROOK_CEPH_MON_HOST),ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:activate-osd,ReadOnly:false,MountPath:/var/lib/ceph/osd/ceph-40,SubPath:,MountPropagation:nil,SubPathExpr:,},VolumeMount{Name:devices,ReadOnly:false,MountPath:/dev,SubPath:,MountPropagation:nil,SubPathExpr:,},VolumeMount{Name:rook-config-override,ReadOnly:true,MountPath:/etc/ceph,SubPath:,MountPropagation:nil,SubPathExpr:,},VolumeMount{Name:rook-ceph-osd-token-4dtzj,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:nil,Privileged:*true,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,},Container{Name:chown-container-data-dir,Image:ceph/ceph:v14.2.8,Command:[chown],Args:[--verbose --recursive ceph:ceph /var/log/ceph 
/var/lib/ceph/crash],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:rook-data,ReadOnly:false,MountPath:/var/lib/rook,SubPath:,MountPropagation:nil,SubPathExpr:,},VolumeMount{Name:rook-config-override,ReadOnly:true,MountPath:/etc/ceph,SubPath:,MountPropagation:nil,SubPathExpr:,},VolumeMount{Name:rook-ceph-log,ReadOnly:false,MountPath:/var/log/ceph,SubPath:,MountPropagation:nil,SubPathExpr:,},VolumeMount{Name:rook-ceph-crash,ReadOnly:false,MountPath:/var/lib/ceph/crash,SubPath:,MountPropagation:nil,SubPathExpr:,},VolumeMount{Name:devices,ReadOnly:false,MountPath:/dev,SubPath:,MountPropagation:nil,SubPathExpr:,},VolumeMount{Name:run-udev,ReadOnly:false,MountPath:/run/udev,SubPath:,MountPropagation:nil,SubPathExpr:,},VolumeMount{Name:activate-osd,ReadOnly:false,MountPath:/var/lib/ceph/osd/ceph-40,SubPath:,MountPropagation:nil,SubPathExpr:,},VolumeMount{Name:rook-ceph-osd-token-4dtzj,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:nil,Privileged:*true,SELinuxOptions:nil,RunAsUser:*0,RunAsNonRoot:nil,ReadOnlyRootFilesystem:*false,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,},},AutomountServiceAccountToken:nil,Tolerations:[]Toleration{Toleration{Key:storage.lokomotive.io,Operator:Equal,Value:ceph,Effect:NoSchedule,TolerationSeconds:nil,},Toleration{Key:node.kubernetes.io/not-ready,Operator:Exists,Value:,Effect:NoExecute,TolerationSeconds:*300,},Toleration{Key:node.kubernetes.io/unreachable,Operator:Exists,Value:,Effect:NoExecute,TolerationS
econds:*300,},},HostAliases:[]HostAlias{},PriorityClassName:,Priority:*0,DNSConfig:nil,ShareProcessNamespace:nil,ReadinessGates:[]PodReadinessGate{},RuntimeClassName:nil,EnableServiceLinks:*true,PreemptionPolicy:nil,Overhead:ResourceList{},TopologySpreadConstraints:[]TopologySpreadConstraint{},EphemeralContainers:[]EphemeralContainer{},},Status:PodStatus{Phase:Pending,Conditions:[]PodCondition{PodCondition{Type:Initialized,Status:False,LastProbeTime:0001-01-01 00:00:00 +0000 UTC,LastTransitionTime:2020-05-05 08:00:08 +0000 UTC,Reason:ContainersNotInitialized,Message:containers with incomplete status: [activate chown-container-data-dir],},PodCondition{Type:Ready,Status:False,LastProbeTime:0001-01-01 00:00:00 +0000 UTC,LastTransitionTime:2020-05-05 08:00:08 +0000 UTC,Reason:ContainersNotReady,Message:containers with unready status: [osd],},PodCondition{Type:ContainersReady,Status:False,LastProbeTime:0001-01-01 00:00:00 +0000 UTC,LastTransitionTime:2020-05-05 08:00:08 +0000 UTC,Reason:ContainersNotReady,Message:containers with unready status: [osd],},PodCondition{Type:PodScheduled,Status:True,LastProbeTime:0001-01-01 00:00:00 +0000 UTC,LastTransitionTime:2020-05-05 08:00:08 +0000 UTC,Reason:,Message:,},},Message:,Reason:,HostIP:10.88.81.5,PodIP:,StartTime:2020-05-05 08:00:08 +0000 
UTC,ContainerStatuses:[]ContainerStatus{ContainerStatus{Name:osd,State:ContainerState{Waiting:&ContainerStateWaiting{Reason:PodInitializing,Message:,},Running:nil,Terminated:nil,},LastTerminationState:ContainerState{Waiting:nil,Running:nil,Terminated:nil,},Ready:false,RestartCount:0,Image:ceph/ceph:v14.2.8,ImageID:,ContainerID:,Started:*false,},},QOSClass:BestEffort,InitContainerStatuses:[]ContainerStatus{ContainerStatus{Name:activate,State:ContainerState{Waiting:&ContainerStateWaiting{Reason:PodInitializing,Message:,},Running:nil,Terminated:nil,},LastTerminationState:ContainerState{Waiting:nil,Running:nil,Terminated:nil,},Ready:false,RestartCount:0,Image:ceph/ceph:v14.2.8,ImageID:,ContainerID:,Started:nil,},ContainerStatus{Name:chown-container-data-dir,State:ContainerState{Waiting:&ContainerStateWaiting{Reason:PodInitializing,Message:,},Running:nil,Terminated:nil,},LastTerminationState:ContainerState{Waiting:nil,Running:nil,Terminated:nil,},Ready:false,RestartCount:0,Image:ceph/ceph:v14.2.8,ImageID:,ContainerID:,Started:nil,},},NominatedNodeName:,PodIPs:[]PodIP{},EphemeralContainerStatuses:[]ContainerStatus{},},}
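The `activate` init container command captured in the pod spec above works around the fact that `ceph-volume lvm activate` mounts the OSD data dir as a tmpfs: it copies the tmpfs contents to a scratch directory, unmounts, and copies them back so the emptyDir-backed path still holds the files when the main `osd` container starts. A minimal sketch of that copy-out/copy-back pattern, with the `ceph-volume` and `umount` steps stubbed out (they need a real device and root):

```shell
#!/bin/bash
# Sketch of the activate init container's tmpfs rescue logic.
# Assumption: OSD_DATA_DIR is a plain directory standing in for the
# tmpfs that `ceph-volume lvm activate --no-systemd` would mount.
set -ex

OSD_DATA_DIR=$(mktemp -d)   # stand-in for /var/lib/ceph/osd/ceph-<id>
TMP_DIR=$(mktemp -d)

# ceph-volume would populate the tmpfs here; fake one file for the sketch
echo "osd-keyring" > "$OSD_DATA_DIR/keyring"

# copy the tmpfs contents out before the mount vanishes with the container
cp --verbose --no-dereference "$OSD_DATA_DIR"/* "$TMP_DIR"/

# umount "$OSD_DATA_DIR" would happen here (requires root)

# copy the contents back into the emptyDir-backed path for the osd container
cp --verbose --no-dereference "$TMP_DIR"/* "$OSD_DATA_DIR"/
chown --recursive "$(id -un):$(id -gn)" "$OSD_DATA_DIR"

# remove the scratch directory
rm --recursive --force "$TMP_DIR"
```

The real container additionally chowns to `ceph:ceph`; the sketch uses the current user so it runs unprivileged.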
2020-05-05 08:00:09.577 [INFO][15463] ipam_plugin.go 211: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="96046ef67ed529403e1c2f718ae6e847de4619ecb90a6c5f9ad4a3a9ae787175" HandleID="k8s-pod-network.96046ef67ed529403e1c2f718ae6e847de4619ecb90a6c5f9ad4a3a9ae787175" Workload="suraj--lk--cluster--pool--1--worker--2-k8s-rook--ceph--osd--40--586c89dcbf--j55s9-eth0"
2020-05-05 08:00:09.599 [INFO][15463] ipam_plugin.go 223: Calico CNI IPAM handle=k8s-pod-network.96046ef67ed529403e1c2f718ae6e847de4619ecb90a6c5f9ad4a3a9ae787175 ContainerID="96046ef67ed529403e1c2f718ae6e847de4619ecb90a6c5f9ad4a3a9ae787175" HandleID="k8s-pod-network.96046ef67ed529403e1c2f718ae6e847de4619ecb90a6c5f9ad4a3a9ae787175" Workload="suraj--lk--cluster--pool--1--worker--2-k8s-rook--ceph--osd--40--586c89dcbf--j55s9-eth0"
2020-05-05 08:00:09.599 [INFO][15463] ipam_plugin.go 233: Auto assigning IP ContainerID="96046ef67ed529403e1c2f718ae6e847de4619ecb90a6c5f9ad4a3a9ae787175" HandleID="k8s-pod-network.96046ef67ed529403e1c2f718ae6e847de4619ecb90a6c5f9ad4a3a9ae787175" Workload="suraj--lk--cluster--pool--1--worker--2-k8s-rook--ceph--osd--40--586c89dcbf--j55s9-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc0000274d0), Attrs:map[string]string{"namespace":"rook", "node":"suraj-lk-cluster-pool-1-worker-2", "pod":"rook-ceph-osd-40-586c89dcbf-j55s9"}, Hostname:"suraj-lk-cluster-pool-1-worker-2", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0}
2020-05-05 08:00:09.600 [INFO][15463] ipam.go 87: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'suraj-lk-cluster-pool-1-worker-2'
2020-05-05 08:00:09.609 [INFO][15463] ipam.go 313: Looking up existing affinities for host handle="k8s-pod-network.96046ef67ed529403e1c2f718ae6e847de4619ecb90a6c5f9ad4a3a9ae787175" host="suraj-lk-cluster-pool-1-worker-2"
2020-05-05 08:00:09.616 [INFO][15463] ipam.go 381: Trying affinity for 10.2.76.0/24 handle="k8s-pod-network.96046ef67ed529403e1c2f718ae6e847de4619ecb90a6c5f9ad4a3a9ae787175" host="suraj-lk-cluster-pool-1-worker-2"
2020-05-05 08:00:09.621 [INFO][15463] ipam.go 135: Attempting to load block cidr=10.2.76.0/24 host="suraj-lk-cluster-pool-1-worker-2"
2020-05-05 08:00:09.627 [INFO][15463] ipam.go 212: Affinity is confirmed and block has been loaded cidr=10.2.76.0/24 host="suraj-lk-cluster-pool-1-worker-2"
2020-05-05 08:00:09.627 [INFO][15463] ipam.go 804: Attempting to assign 1 addresses from block block=10.2.76.0/24 handle="k8s-pod-network.96046ef67ed529403e1c2f718ae6e847de4619ecb90a6c5f9ad4a3a9ae787175" host="suraj-lk-cluster-pool-1-worker-2"
2020-05-05 08:00:09.633 [INFO][15463] ipam.go 1265: Creating new handle: k8s-pod-network.96046ef67ed529403e1c2f718ae6e847de4619ecb90a6c5f9ad4a3a9ae787175
2020-05-05 08:00:09.639 [INFO][15463] ipam.go 827: Writing block in order to claim IPs block=10.2.76.0/24 handle="k8s-pod-network.96046ef67ed529403e1c2f718ae6e847de4619ecb90a6c5f9ad4a3a9ae787175" host="suraj-lk-cluster-pool-1-worker-2"
2020-05-05 08:00:09.651 [INFO][15463] ipam.go 840: Successfully claimed IPs: [10.2.76.11/24] block=10.2.76.0/24 handle="k8s-pod-network.96046ef67ed529403e1c2f718ae6e847de4619ecb90a6c5f9ad4a3a9ae787175" host="suraj-lk-cluster-pool-1-worker-2"
2020-05-05 08:00:09.651 [INFO][15463] ipam.go 413: Block '10.2.76.0/24' provided addresses: [10.2.76.11/24] handle="k8s-pod-network.96046ef67ed529403e1c2f718ae6e847de4619ecb90a6c5f9ad4a3a9ae787175" host="suraj-lk-cluster-pool-1-worker-2"
2020-05-05 08:00:09.655 [INFO][15463] ipam.go 569: Auto-assigned 1 out of 1 IPv4s: [10.2.76.11/24] handle="k8s-pod-network.96046ef67ed529403e1c2f718ae6e847de4619ecb90a6c5f9ad4a3a9ae787175" host="suraj-lk-cluster-pool-1-worker-2"
2020-05-05 08:00:09.655 [INFO][15463] ipam_plugin.go 235: Calico CNI IPAM assigned addresses IPv4=[10.2.76.11/24] IPv6=[] ContainerID="96046ef67ed529403e1c2f718ae6e847de4619ecb90a6c5f9ad4a3a9ae787175" HandleID="k8s-pod-network.96046ef67ed529403e1c2f718ae6e847de4619ecb90a6c5f9ad4a3a9ae787175" Workload="suraj--lk--cluster--pool--1--worker--2-k8s-rook--ceph--osd--40--586c89dcbf--j55s9-eth0"
2020-05-05 08:00:09.655 [INFO][15463] ipam_plugin.go 261: IPAM Result ContainerID="96046ef67ed529403e1c2f718ae6e847de4619ecb90a6c5f9ad4a3a9ae787175" HandleID="k8s-pod-network.96046ef67ed529403e1c2f718ae6e847de4619ecb90a6c5f9ad4a3a9ae787175" Workload="suraj--lk--cluster--pool--1--worker--2-k8s-rook--ceph--osd--40--586c89dcbf--j55s9-eth0" result.IPs=[]*current.IPConfig{(*current.IPConfig)(0xc000154300)}
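The IPAM lines above show Calico claiming the next free address (`10.2.76.11`) from the node's affine block `10.2.76.0/24` under the handle `k8s-pod-network.<containerID>`. A toy sketch of that "scan the block for an unclaimed host address" step follows; the allocated list is hypothetical, and the real allocator tracks assignments in a per-block structure in the datastore with compare-and-swap writes, not a flat list:

```shell
#!/bin/sh
# Toy version of Calico IPAM auto-assign from a node's affine /24 block:
# walk the host addresses and take the first one not yet claimed.
allocated=" 10.2.76.1 10.2.76.10 "   # hypothetical already-claimed IPs

next_ip=""
i=1
while [ "$i" -le 254 ]; do
  candidate="10.2.76.$i"
  case "$allocated" in
    *" $candidate "*) ;;              # already claimed, keep scanning
    *) next_ip="$candidate"; break ;;
  esac
  i=$((i + 1))
done

echo "assigned: $next_ip/24"
```

With the list above this prints `assigned: 10.2.76.2/24`; in the logs the block already holds earlier pod allocations, so the next free address is `.11`.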
2020-05-05 08:00:09.657 [INFO][15450] k8s.go 358: Populated endpoint ContainerID="96046ef67ed529403e1c2f718ae6e847de4619ecb90a6c5f9ad4a3a9ae787175" Namespace="rook" Pod="rook-ceph-osd-40-586c89dcbf-j55s9" WorkloadEndpoint="suraj--lk--cluster--pool--1--worker--2-k8s-rook--ceph--osd--40--586c89dcbf--j55s9-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"suraj--lk--cluster--pool--1--worker--2-k8s-rook--ceph--osd--40--586c89dcbf--j55s9-eth0", GenerateName:"rook-ceph-osd-40-586c89dcbf-", Namespace:"rook", SelfLink:"", UID:"c602b2f8-5c9e-45c4-bc6f-d50ed74100c6", ResourceVersion:"13261", Generation:0, CreationTimestamp:v1.Time{Time:time.Time{wall:0x0, ext:63724262408, loc:(*time.Location)(0x299c340)}}, DeletionTimestamp:(*v1.Time)(nil), DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app":"rook-ceph-osd", "ceph-osd-id":"40", "failure-domain":"suraj-lk-cluster-pool-1-worker-2", "pod-template-hash":"586c89dcbf", "portable":"false", "projectcalico.org/namespace":"rook", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"rook-ceph-osd", "rook_cluster":"rook"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ClusterName:"", ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"suraj-lk-cluster-pool-1-worker-2", ContainerID:"", Pod:"rook-ceph-osd-40-586c89dcbf-j55s9", Endpoint:"eth0", IPNetworks:[]string{"10.2.76.11/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.rook", "ksa.rook.rook-ceph-osd"}, InterfaceName:"calibc73db732dc", MAC:"", Ports:[]v3.EndpointPort(nil)}}
2020-05-05 08:00:09.657 [INFO][15450] k8s.go 359: Calico CNI using IPs: [10.2.76.11/32] ContainerID="96046ef67ed529403e1c2f718ae6e847de4619ecb90a6c5f9ad4a3a9ae787175" Namespace="rook" Pod="rook-ceph-osd-40-586c89dcbf-j55s9" WorkloadEndpoint="suraj--lk--cluster--pool--1--worker--2-k8s-rook--ceph--osd--40--586c89dcbf--j55s9-eth0"
2020-05-05 08:00:09.657 [INFO][15450] network_linux.go 76: Setting the host side veth name to calibc73db732dc ContainerID="96046ef67ed529403e1c2f718ae6e847de4619ecb90a6c5f9ad4a3a9ae787175" Namespace="rook" Pod="rook-ceph-osd-40-586c89dcbf-j55s9" WorkloadEndpoint="suraj--lk--cluster--pool--1--worker--2-k8s-rook--ceph--osd--40--586c89dcbf--j55s9-eth0"
2020-05-05 08:00:09.658 [INFO][15450] network_linux.go 396: Disabling IPv4 forwarding ContainerID="96046ef67ed529403e1c2f718ae6e847de4619ecb90a6c5f9ad4a3a9ae787175" Namespace="rook" Pod="rook-ceph-osd-40-586c89dcbf-j55s9" WorkloadEndpoint="suraj--lk--cluster--pool--1--worker--2-k8s-rook--ceph--osd--40--586c89dcbf--j55s9-eth0"
2020-05-05 08:00:09.672 [INFO][15450] k8s.go 385: Added Mac, interface name, and active container ID to endpoint ContainerID="96046ef67ed529403e1c2f718ae6e847de4619ecb90a6c5f9ad4a3a9ae787175" Namespace="rook" Pod="rook-ceph-osd-40-586c89dcbf-j55s9" WorkloadEndpoint="suraj--lk--cluster--pool--1--worker--2-k8s-rook--ceph--osd--40--586c89dcbf--j55s9-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"suraj--lk--cluster--pool--1--worker--2-k8s-rook--ceph--osd--40--586c89dcbf--j55s9-eth0", GenerateName:"rook-ceph-osd-40-586c89dcbf-", Namespace:"rook", SelfLink:"", UID:"c602b2f8-5c9e-45c4-bc6f-d50ed74100c6", ResourceVersion:"13261", Generation:0, CreationTimestamp:v1.Time{Time:time.Time{wall:0x0, ext:63724262408, loc:(*time.Location)(0x299c340)}}, DeletionTimestamp:(*v1.Time)(nil), DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app":"rook-ceph-osd", "ceph-osd-id":"40", "failure-domain":"suraj-lk-cluster-pool-1-worker-2", "pod-template-hash":"586c89dcbf", "portable":"false", "projectcalico.org/namespace":"rook", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"rook-ceph-osd", "rook_cluster":"rook"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ClusterName:"", ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"suraj-lk-cluster-pool-1-worker-2", ContainerID:"96046ef67ed529403e1c2f718ae6e847de4619ecb90a6c5f9ad4a3a9ae787175", Pod:"rook-ceph-osd-40-586c89dcbf-j55s9", Endpoint:"eth0", IPNetworks:[]string{"10.2.76.11/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.rook", "ksa.rook.rook-ceph-osd"}, InterfaceName:"calibc73db732dc", MAC:"3e:15:1c:fb:c5:1c", Ports:[]v3.EndpointPort(nil)}}
2020-05-05 08:00:09.680 [INFO][15450] k8s.go 417: Wrote updated endpoint to datastore ContainerID="96046ef67ed529403e1c2f718ae6e847de4619ecb90a6c5f9ad4a3a9ae787175" Namespace="rook" Pod="rook-ceph-osd-40-586c89dcbf-j55s9" WorkloadEndpoint="suraj--lk--cluster--pool--1--worker--2-k8s-rook--ceph--osd--40--586c89dcbf--j55s9-eth0"
I0505 08:00:10.120193 2827 topology_manager.go:233] [topologymanager] Topology Admit Handler
I0505 08:00:10.158021 2827 reconciler.go:224] operationExecutor.VerifyControllerAttachedVolume started for volume "run-udev" (UniqueName: "kubernetes.io/host-path/ee2b4cb0-68a5-4f70-ba63-cfdb6bd83b80-run-udev") pod "rook-ceph-osd-8-5d45b68997-dg6bh" (UID: "ee2b4cb0-68a5-4f70-ba63-cfdb6bd83b80")
I0505 08:00:10.158100 2827 reconciler.go:224] operationExecutor.VerifyControllerAttachedVolume started for volume "rook-config-override" (UniqueName: "kubernetes.io/configmap/ee2b4cb0-68a5-4f70-ba63-cfdb6bd83b80-rook-config-override") pod "rook-ceph-osd-8-5d45b68997-dg6bh" (UID: "ee2b4cb0-68a5-4f70-ba63-cfdb6bd83b80")
I0505 08:00:10.158252 2827 reconciler.go:224] operationExecutor.VerifyControllerAttachedVolume started for volume "rook-ceph-crash" (UniqueName: "kubernetes.io/host-path/ee2b4cb0-68a5-4f70-ba63-cfdb6bd83b80-rook-ceph-crash") pod "rook-ceph-osd-8-5d45b68997-dg6bh" (UID: "ee2b4cb0-68a5-4f70-ba63-cfdb6bd83b80")
I0505 08:00:10.158303 2827 reconciler.go:224] operationExecutor.VerifyControllerAttachedVolume started for volume "rook-ceph-osd-token-4dtzj" (UniqueName: "kubernetes.io/secret/ee2b4cb0-68a5-4f70-ba63-cfdb6bd83b80-rook-ceph-osd-token-4dtzj") pod "rook-ceph-osd-8-5d45b68997-dg6bh" (UID: "ee2b4cb0-68a5-4f70-ba63-cfdb6bd83b80")
I0505 08:00:10.158328 2827 reconciler.go:224] operationExecutor.VerifyControllerAttachedVolume started for volume "activate-osd" (UniqueName: "kubernetes.io/empty-dir/ee2b4cb0-68a5-4f70-ba63-cfdb6bd83b80-activate-osd") pod "rook-ceph-osd-8-5d45b68997-dg6bh" (UID: "ee2b4cb0-68a5-4f70-ba63-cfdb6bd83b80")
I0505 08:00:10.158357 2827 reconciler.go:224] operationExecutor.VerifyControllerAttachedVolume started for volume "rook-data" (UniqueName: "kubernetes.io/host-path/ee2b4cb0-68a5-4f70-ba63-cfdb6bd83b80-rook-data") pod "rook-ceph-osd-8-5d45b68997-dg6bh" (UID: "ee2b4cb0-68a5-4f70-ba63-cfdb6bd83b80")
I0505 08:00:10.158448 2827 reconciler.go:224] operationExecutor.VerifyControllerAttachedVolume started for volume "rook-ceph-log" (UniqueName: "kubernetes.io/host-path/ee2b4cb0-68a5-4f70-ba63-cfdb6bd83b80-rook-ceph-log") pod "rook-ceph-osd-8-5d45b68997-dg6bh" (UID: "ee2b4cb0-68a5-4f70-ba63-cfdb6bd83b80")
I0505 08:00:10.158482 2827 reconciler.go:224] operationExecutor.VerifyControllerAttachedVolume started for volume "devices" (UniqueName: "kubernetes.io/host-path/ee2b4cb0-68a5-4f70-ba63-cfdb6bd83b80-devices") pod "rook-ceph-osd-8-5d45b68997-dg6bh" (UID: "ee2b4cb0-68a5-4f70-ba63-cfdb6bd83b80")
W0505 08:00:11.213803 2827 pod_container_deletor.go:77] Container "d8558e0e457205a8d90c2061f2a3f4960d96c7ffa4434c897a0b2dcc978ace96" not found in pod's containers
2020-05-05 08:00:11.232 [INFO][15774] plugin.go 129: Extracted identifiers EndpointIDs=&utils.WEPIdentifiers{Namespace:"rook", WEPName:"", WorkloadEndpointIdentifiers:names.WorkloadEndpointIdentifiers{Node:"suraj-lk-cluster-pool-1-worker-2", Orchestrator:"k8s", Endpoint:"eth0", Workload:"", Pod:"rook-ceph-osd-8-5d45b68997-dg6bh", ContainerID:"d8558e0e457205a8d90c2061f2a3f4960d96c7ffa4434c897a0b2dcc978ace96"}}
2020-05-05 08:00:11.269 [INFO][15774] plugin.go 220: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {suraj--lk--cluster--pool--1--worker--2-k8s-rook--ceph--osd--8--5d45b68997--dg6bh-eth0 rook-ceph-osd-8-5d45b68997- rook ee2b4cb0-68a5-4f70-ba63-cfdb6bd83b80 13305 0 2020-05-05 08:00:10 +0000 UTC <nil> <nil> map[app:rook-ceph-osd ceph-osd-id:8 failure-domain:suraj-lk-cluster-pool-1-worker-2 pod-template-hash:5d45b68997 portable:false projectcalico.org/namespace:rook projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:rook-ceph-osd rook_cluster:rook] map[] [] [] []} {k8s suraj-lk-cluster-pool-1-worker-2 rook-ceph-osd-8-5d45b68997-dg6bh eth0 [] [] [kns.rook ksa.rook.rook-ceph-osd] caliaed69304c2a []}} ContainerID="d8558e0e457205a8d90c2061f2a3f4960d96c7ffa4434c897a0b2dcc978ace96" Namespace="rook" Pod="rook-ceph-osd-8-5d45b68997-dg6bh" WorkloadEndpoint="suraj--lk--cluster--pool--1--worker--2-k8s-rook--ceph--osd--8--5d45b68997--dg6bh-"
2020-05-05 08:00:11.269 [INFO][15774] k8s.go 60: Extracted identifiers for CmdAddK8s ContainerID="d8558e0e457205a8d90c2061f2a3f4960d96c7ffa4434c897a0b2dcc978ace96" Namespace="rook" Pod="rook-ceph-osd-8-5d45b68997-dg6bh" WorkloadEndpoint="suraj--lk--cluster--pool--1--worker--2-k8s-rook--ceph--osd--8--5d45b68997--dg6bh-eth0"
2020-05-05 08:00:11.275 [INFO][15774] k8s.go 775: namespace info &Namespace{ObjectMeta:{rook /api/v1/namespaces/rook 346082b0-52fc-4520-8796-5afff38e4f6c 7478 0 2020-05-05 07:45:13 +0000 UTC <nil> <nil> map[] map[] [] [] [{lokoctl Update v1 2020-05-05 07:45:13 +0000 UTC FieldsV1 FieldsV1{Raw:*[123 34 102 58 115 116 97 116 117 115 34 58 123 34 102 58 112 104 97 115 101 34 58 123 125 125 125],}}]},Spec:NamespaceSpec{Finalizers:[kubernetes],},Status:NamespaceStatus{Phase:Active,Conditions:[]NamespaceCondition{},},}
2020-05-05 08:00:11.281 [INFO][15774] k8s.go 784: pod info &Pod{ObjectMeta:{rook-ceph-osd-8-5d45b68997-dg6bh rook-ceph-osd-8-5d45b68997- rook /api/v1/namespaces/rook/pods/rook-ceph-osd-8-5d45b68997-dg6bh ee2b4cb0-68a5-4f70-ba63-cfdb6bd83b80 13305 0 2020-05-05 08:00:10 +0000 UTC <nil> <nil> map[app:rook-ceph-osd ceph-osd-id:8 failure-domain:suraj-lk-cluster-pool-1-worker-2 pod-template-hash:5d45b68997 portable:false rook_cluster:rook] map[kubernetes.io/psp:00-rook-ceph-operator seccomp.security.alpha.kubernetes.io/pod:runtime/default] [{apps/v1 ReplicaSet rook-ceph-osd-8-5d45b68997 91e8a6df-df89-421a-9f18-0ec720f27ac2 0xc00003e157 0xc00003e158}] [] []},Spec:PodSpec{Volumes:[]Volume{Volume{Name:rook-data,VolumeSource:VolumeSource{HostPath:&HostPathVolumeSource{Path:/var/lib/rook,Type:*,},EmptyDir:nil,GCEPersistentDisk:nil,AWSElasticBlockStore:nil,GitRepo:nil,Secret:nil,NFS:nil,ISCSI:nil,Glusterfs:nil,PersistentVolumeClaim:nil,RBD:nil,FlexVolume:nil,Cinder:nil,CephFS:nil,Flocker:nil,DownwardAPI:nil,FC:nil,AzureFile:nil,ConfigMap:nil,VsphereVolume:nil,Quobyte:nil,AzureDisk:nil,PhotonPersistentDisk:nil,PortworxVolume:nil,ScaleIO:nil,Projected:nil,StorageOS:nil,CSI:nil,},},Volume{Name:rook-config-override,VolumeSource:VolumeSource{HostPath:nil,EmptyDir:nil,GCEPersistentDisk:nil,AWSElasticBlockStore:nil,GitRepo:nil,Secret:nil,NFS:nil,ISCSI:nil,Glusterfs:nil,PersistentVolumeClaim:nil,RBD:nil,FlexVolume:nil,Cinder:nil,CephFS:nil,Flocker:nil,DownwardAPI:nil,FC:nil,AzureFile:nil,ConfigMap:&ConfigMapVolumeSource{LocalObjectReference:LocalObjectReference{Name:rook-config-override,},Items:[]KeyToPath{KeyToPath{Key:config,Path:ceph.conf,Mode:*292,},},DefaultMode:*420,Optional:nil,},VsphereVolume:nil,Quobyte:nil,AzureDisk:nil,PhotonPersistentDisk:nil,PortworxVolume:nil,ScaleIO:nil,Projected:nil,StorageOS:nil,CSI:nil,},},Volume{Name:rook-ceph-log,VolumeSource:VolumeSource{HostPath:&HostPathVolumeSource{Path:/var/lib/rook/rook/log,Type:*,},EmptyDir:nil,GCEPersistentDisk:nil,AWSElasti
cBlockStore:nil,GitRepo:nil,Secret:nil,NFS:nil,ISCSI:nil,Glusterfs:nil,PersistentVolumeClaim:nil,RBD:nil,FlexVolume:nil,Cinder:nil,CephFS:nil,Flocker:nil,DownwardAPI:nil,FC:nil,AzureFile:nil,ConfigMap:nil,VsphereVolume:nil,Quobyte:nil,AzureDisk:nil,PhotonPersistentDisk:nil,PortworxVolume:nil,ScaleIO:nil,Projected:nil,StorageOS:nil,CSI:nil,},},Volume{Name:rook-ceph-crash,VolumeSource:VolumeSource{HostPath:&HostPathVolumeSource{Path:/var/lib/rook/rook/crash,Type:*,},EmptyDir:nil,GCEPersistentDisk:nil,AWSElasticBlockStore:nil,GitRepo:nil,Secret:nil,NFS:nil,ISCSI:nil,Glusterfs:nil,PersistentVolumeClaim:nil,RBD:nil,FlexVolume:nil,Cinder:nil,CephFS:nil,Flocker:nil,DownwardAPI:nil,FC:nil,AzureFile:nil,ConfigMap:nil,VsphereVolume:nil,Quobyte:nil,AzureDisk:nil,PhotonPersistentDisk:nil,PortworxVolume:nil,ScaleIO:nil,Projected:nil,StorageOS:nil,CSI:nil,},},Volume{Name:devices,VolumeSource:VolumeSource{HostPath:&HostPathVolumeSource{Path:/dev,Type:*,},EmptyDir:nil,GCEPersistentDisk:nil,AWSElasticBlockStore:nil,GitRepo:nil,Secret:nil,NFS:nil,ISCSI:nil,Glusterfs:nil,PersistentVolumeClaim:nil,RBD:nil,FlexVolume:nil,Cinder:nil,CephFS:nil,Flocker:nil,DownwardAPI:nil,FC:nil,AzureFile:nil,ConfigMap:nil,VsphereVolume:nil,Quobyte:nil,AzureDisk:nil,PhotonPersistentDisk:nil,PortworxVolume:nil,ScaleIO:nil,Projected:nil,StorageOS:nil,CSI:nil,},},Volume{Name:run-udev,VolumeSource:VolumeSource{HostPath:&HostPathVolumeSource{Path:/run/udev,Type:*,},EmptyDir:nil,GCEPersistentDisk:nil,AWSElasticBlockStore:nil,GitRepo:nil,Secret:nil,NFS:nil,ISCSI:nil,Glusterfs:nil,PersistentVolumeClaim:nil,RBD:nil,FlexVolume:nil,Cinder:nil,CephFS:nil,Flocker:nil,DownwardAPI:nil,FC:nil,AzureFile:nil,ConfigMap:nil,VsphereVolume:nil,Quobyte:nil,AzureDisk:nil,PhotonPersistentDisk:nil,PortworxVolume:nil,ScaleIO:nil,Projected:nil,StorageOS:nil,CSI:nil,},},Volume{Name:activate-osd,VolumeSource:VolumeSource{HostPath:nil,EmptyDir:&EmptyDirVolumeSource{Medium:,SizeLimit:<nil>,},GCEPersistentDisk:nil,AWSElasticBlockStore:ni
l,GitRepo:nil,Secret:nil,NFS:nil,ISCSI:nil,Glusterfs:nil,PersistentVolumeClaim:nil,RBD:nil,FlexVolume:nil,Cinder:nil,CephFS:nil,Flocker:nil,DownwardAPI:nil,FC:nil,AzureFile:nil,ConfigMap:nil,VsphereVolume:nil,Quobyte:nil,AzureDisk:nil,PhotonPersistentDisk:nil,PortworxVolume:nil,ScaleIO:nil,Projected:nil,StorageOS:nil,CSI:nil,},},Volume{Name:rook-ceph-osd-token-4dtzj,VolumeSource:VolumeSource{HostPath:nil,EmptyDir:nil,GCEPersistentDisk:nil,AWSElasticBlockStore:nil,GitRepo:nil,Secret:&SecretVolumeSource{SecretName:rook-ceph-osd-token-4dtzj,Items:[]KeyToPath{},DefaultMode:*420,Optional:nil,},NFS:nil,ISCSI:nil,Glusterfs:nil,PersistentVolumeClaim:nil,RBD:nil,FlexVolume:nil,Cinder:nil,CephFS:nil,Flocker:nil,DownwardAPI:nil,FC:nil,AzureFile:nil,ConfigMap:nil,VsphereVolume:nil,Quobyte:nil,AzureDisk:nil,PhotonPersistentDisk:nil,PortworxVolume:nil,ScaleIO:nil,Projected:nil,StorageOS:nil,CSI:nil,},},},Containers:[]Container{Container{Name:osd,Image:ceph/ceph:v14.2.8,Command:[ceph-osd],Args:[--foreground --id 8 --fsid 504a40d4-a2d3-4492-a37d-27a58464958f --setuser ceph --setgroup ceph --crush-location=root=default host=suraj-lk-cluster-pool-1-worker-2 --default-log-to-file false 
--ms-learn-addr-from-peer=false],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:ROOK_NODE_NAME,Value:suraj-lk-cluster-pool-1-worker-2,ValueFrom:nil,},EnvVar{Name:ROOK_CLUSTER_ID,Value:77795d4f-fb40-4886-bc74-cb2065f0c61f,ValueFrom:nil,},EnvVar{Name:ROOK_PRIVATE_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},EnvVar{Name:ROOK_PUBLIC_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},EnvVar{Name:ROOK_CLUSTER_NAME,Value:rook,ValueFrom:nil,},EnvVar{Name:ROOK_MON_ENDPOINTS,Value:,ValueFrom:&EnvVarSource{FieldRef:nil,ResourceFieldRef:nil,ConfigMapKeyRef:&ConfigMapKeySelector{LocalObjectReference:LocalObjectReference{Name:rook-ceph-mon-endpoints,},Key:data,Optional:nil,},SecretKeyRef:nil,},},EnvVar{Name:ROOK_MON_SECRET,Value:,ValueFrom:&EnvVarSource{FieldRef:nil,ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:&SecretKeySelector{LocalObjectReference:LocalObjectReference{Name:rook-ceph-mon,},Key:mon-secret,Optional:nil,},},},EnvVar{Name:ROOK_ADMIN_SECRET,Value:,ValueFrom:&EnvVarSource{FieldRef:nil,ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:&SecretKeySelector{LocalObjectReference:LocalObjectReference{Name:rook-ceph-mon,},Key:admin-secret,Optional:nil,},},},EnvVar{Name:ROOK_CONFIG_DIR,Value:/var/lib/rook,ValueFrom:nil,},EnvVar{Name:ROOK_CEPH_CONFIG_OVERRIDE,Value:/etc/rook/config/override.conf,ValueFrom:nil,},EnvVar{Name:ROOK_FSID,Value:,ValueFrom:&EnvVarSource{FieldRef:nil,ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:&SecretKeySelector{LocalObjectReference:LocalObjectReference{Name:rook-ceph-mon,},Key:fsid,Optional:nil,},},},EnvVar{Name:NODE_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:spec.nodeName,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},Env
Var{Name:ROOK_CRUSHMAP_HOSTNAME,Value:suraj-lk-cluster-pool-1-worker-2,ValueFrom:nil,},EnvVar{Name:CEPH_VOLUME_DEBUG,Value:1,ValueFrom:nil,},EnvVar{Name:CEPH_VOLUME_SKIP_RESTORECON,Value:1,ValueFrom:nil,},EnvVar{Name:DM_DISABLE_UDEV,Value:1,ValueFrom:nil,},EnvVar{Name:ROOK_OSDS_PER_DEVICE,Value:1,ValueFrom:nil,},EnvVar{Name:TINI_SUBREAPER,Value:,ValueFrom:nil,},EnvVar{Name:CONTAINER_IMAGE,Value:ceph/ceph:v14.2.8,ValueFrom:nil,},EnvVar{Name:POD_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:metadata.name,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},EnvVar{Name:POD_NAMESPACE,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:metadata.namespace,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},EnvVar{Name:NODE_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:spec.nodeName,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},EnvVar{Name:POD_MEMORY_LIMIT,Value:,ValueFrom:&EnvVarSource{FieldRef:nil,ResourceFieldRef:&ResourceFieldSelector{ContainerName:,Resource:limits.memory,Divisor:{{0 0} {<nil>} 0 DecimalSI},},ConfigMapKeyRef:nil,SecretKeyRef:nil,},},EnvVar{Name:POD_MEMORY_REQUEST,Value:,ValueFrom:&EnvVarSource{FieldRef:nil,ResourceFieldRef:&ResourceFieldSelector{ContainerName:,Resource:requests.memory,Divisor:{{0 0} {<nil>} 0 DecimalSI},},ConfigMapKeyRef:nil,SecretKeyRef:nil,},},EnvVar{Name:POD_CPU_LIMIT,Value:,ValueFrom:&EnvVarSource{FieldRef:nil,ResourceFieldRef:&ResourceFieldSelector{ContainerName:,Resource:limits.cpu,Divisor:{{1 0} {<nil>} 1 DecimalSI},},ConfigMapKeyRef:nil,SecretKeyRef:nil,},},EnvVar{Name:POD_CPU_REQUEST,Value:,ValueFrom:&EnvVarSource{FieldRef:nil,ResourceFieldRef:&ResourceFieldSelector{ContainerName:,Resource:requests.cpu,Divisor:{{0 0} {<nil>} 0 
DecimalSI},},ConfigMapKeyRef:nil,SecretKeyRef:nil,},},EnvVar{Name:ROOK_OSD_UUID,Value:ce4041d0-0cab-4b95-8b49-4f1e22df1f92,ValueFrom:nil,},EnvVar{Name:ROOK_OSD_ID,Value:8,ValueFrom:nil,},EnvVar{Name:ROOK_OSD_STORE_TYPE,Value:bluestore,ValueFrom:nil,},EnvVar{Name:ROOK_CEPH_MON_HOST,Value:,ValueFrom:&EnvVarSource{FieldRef:nil,ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:&SecretKeySelector{LocalObjectReference:LocalObjectReference{Name:rook-ceph-config,},Key:mon_host,Optional:nil,},},},EnvVar{Name:CEPH_ARGS,Value:-m $(ROOK_CEPH_MON_HOST),ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:rook-data,ReadOnly:false,MountPath:/var/lib/rook,SubPath:,MountPropagation:nil,SubPathExpr:,},VolumeMount{Name:rook-config-override,ReadOnly:true,MountPath:/etc/ceph,SubPath:,MountPropagation:nil,SubPathExpr:,},VolumeMount{Name:rook-ceph-log,ReadOnly:false,MountPath:/var/log/ceph,SubPath:,MountPropagation:nil,SubPathExpr:,},VolumeMount{Name:rook-ceph-crash,ReadOnly:false,MountPath:/var/lib/ceph/crash,SubPath:,MountPropagation:nil,SubPathExpr:,},VolumeMount{Name:devices,ReadOnly:false,MountPath:/dev,SubPath:,MountPropagation:nil,SubPathExpr:,},VolumeMount{Name:run-udev,ReadOnly:false,MountPath:/run/udev,SubPath:,MountPropagation:nil,SubPathExpr:,},VolumeMount{Name:activate-osd,ReadOnly:false,MountPath:/var/lib/ceph/osd/ceph-8,SubPath:,MountPropagation:nil,SubPathExpr:,},VolumeMount{Name:rook-ceph-osd-token-4dtzj,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,},},LivenessProbe:&Probe{Handler:Handler{Exec:&ExecAction{Command:[env -i sh -c ceph --admin-daemon /run/ceph/ceph-osd.8.asok 
status],},HTTPGet:nil,TCPSocket:nil,},InitialDelaySeconds:10,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,},ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:nil,Privileged:*true,SELinuxOptions:nil,RunAsUser:*0,RunAsNonRoot:nil,ReadOnlyRootFilesystem:*false,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,},},RestartPolicy:Always,TerminationGracePeriodSeconds:*30,ActiveDeadlineSeconds:nil,DNSPolicy:ClusterFirst,NodeSelector:map[string]string{kubernetes.io/hostname: suraj-lk-cluster-pool-1-worker-2,},ServiceAccountName:rook-ceph-osd,DeprecatedServiceAccount:rook-ceph-osd,NodeName:suraj-lk-cluster-pool-1-worker-2,HostNetwork:false,HostPID:true,HostIPC:false,SecurityContext:&PodSecurityContext{SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,SupplementalGroups:[],FSGroup:nil,RunAsGroup:nil,Sysctls:[]Sysctl{},WindowsOptions:nil,},ImagePullSecrets:[]LocalObjectReference{},Hostname:,Subdomain:,Affinity:&Affinity{NodeAffinity:&NodeAffinity{RequiredDuringSchedulingIgnoredDuringExecution:&NodeSelector{NodeSelectorTerms:[]NodeSelectorTerm{NodeSelectorTerm{MatchExpressions:[]NodeSelectorRequirement{NodeSelectorRequirement{Key:storage.lokomotive.io,Operator:Exists,Values:[],},},MatchFields:[]NodeSelectorRequirement{},},},},PreferredDuringSchedulingIgnoredDuringExecution:[]PreferredSchedulingTerm{},},PodAffinity:nil,PodAntiAffinity:nil,},SchedulerName:default-scheduler,InitContainers:[]Container{Container{Name:activate,Image:ceph/ceph:v14.2.8,Command:[/bin/bash -c
set -ex
OSD_ID=8
OSD_UUID=ce4041d0-0cab-4b95-8b49-4f1e22df1f92
OSD_STORE_FLAG="--bluestore"
OSD_DATA_DIR=/var/lib/ceph/osd/ceph-"$OSD_ID"
CV_MODE=lvm
DEVICE=
METADATA_DEVICE="$ROOK_METADATA_DEVICE"
# activate the osd with ceph-volume
if [[ "$CV_MODE" == "lvm" ]]; then
TMP_DIR=$(mktemp -d)
# activate osd
ceph-volume "$CV_MODE" activate --no-systemd "$OSD_STORE_FLAG" "$OSD_ID" "$OSD_UUID"
# copy the tmpfs directory to a temporary directory
# this is needed because when the init container exits, the tmpfs goes away and its content with it
# this will result in the emptydir to be empty when accessed by the main osd container
cp --verbose --no-dereference "$OSD_DATA_DIR"/* "$TMP_DIR"/
# unmount the tmpfs since we don't need it anymore
umount "$OSD_DATA_DIR"
# copy back the content of the tmpfs into the original osd directory
cp --verbose --no-dereference "$TMP_DIR"/* "$OSD_DATA_DIR"
# retain ownership of files to the ceph user/group
chown --verbose --recursive ceph:ceph "$OSD_DATA_DIR"
# remove the temporary directory
rm --recursive --force "$TMP_DIR"
else
ARGS=(--device ${DEVICE} --no-systemd --no-tmpfs)
if [ -n "$METADATA_DEVICE" ]; then
ARGS+=(--block.db ${METADATA_DEVICE})
fi
# ceph-volume raw mode only supports bluestore so we don't need to pass a store flag
ceph-volume "$CV_MODE" activate "${ARGS[@]}"
fi
],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CEPH_VOLUME_DEBUG,Value:1,ValueFrom:nil,},EnvVar{Name:CEPH_VOLUME_SKIP_RESTORECON,Value:1,ValueFrom:nil,},EnvVar{Name:DM_DISABLE_UDEV,Value:1,ValueFrom:nil,},EnvVar{Name:ROOK_CEPH_MON_HOST,Value:,ValueFrom:&EnvVarSource{FieldRef:nil,ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:&SecretKeySelector{LocalObjectReference:LocalObjectReference{Name:rook-ceph-config,},Key:mon_host,Optional:nil,},},},EnvVar{Name:CEPH_ARGS,Value:-m $(ROOK_CEPH_MON_HOST),ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:activate-osd,ReadOnly:false,MountPath:/var/lib/ceph/osd/ceph-8,SubPath:,MountPropagation:nil,SubPathExpr:,},VolumeMount{Name:devices,ReadOnly:false,MountPath:/dev,SubPath:,MountPropagation:nil,SubPathExpr:,},VolumeMount{Name:rook-config-override,ReadOnly:true,MountPath:/etc/ceph,SubPath:,MountPropagation:nil,SubPathExpr:,},VolumeMount{Name:rook-ceph-osd-token-4dtzj,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:nil,Privileged:*true,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,},Container{Name:chown-container-data-dir,Image:ceph/ceph:v14.2.8,Command:[chown],Args:[--verbose --recursive ceph:ceph /var/log/ceph 
/var/lib/ceph/crash],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:rook-data,ReadOnly:false,MountPath:/var/lib/rook,SubPath:,MountPropagation:nil,SubPathExpr:,},VolumeMount{Name:rook-config-override,ReadOnly:true,MountPath:/etc/ceph,SubPath:,MountPropagation:nil,SubPathExpr:,},VolumeMount{Name:rook-ceph-log,ReadOnly:false,MountPath:/var/log/ceph,SubPath:,MountPropagation:nil,SubPathExpr:,},VolumeMount{Name:rook-ceph-crash,ReadOnly:false,MountPath:/var/lib/ceph/crash,SubPath:,MountPropagation:nil,SubPathExpr:,},VolumeMount{Name:devices,ReadOnly:false,MountPath:/dev,SubPath:,MountPropagation:nil,SubPathExpr:,},VolumeMount{Name:run-udev,ReadOnly:false,MountPath:/run/udev,SubPath:,MountPropagation:nil,SubPathExpr:,},VolumeMount{Name:activate-osd,ReadOnly:false,MountPath:/var/lib/ceph/osd/ceph-8,SubPath:,MountPropagation:nil,SubPathExpr:,},VolumeMount{Name:rook-ceph-osd-token-4dtzj,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:nil,Privileged:*true,SELinuxOptions:nil,RunAsUser:*0,RunAsNonRoot:nil,ReadOnlyRootFilesystem:*false,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,},},AutomountServiceAccountToken:nil,Tolerations:[]Toleration{Toleration{Key:storage.lokomotive.io,Operator:Equal,Value:ceph,Effect:NoSchedule,TolerationSeconds:nil,},Toleration{Key:node.kubernetes.io/not-ready,Operator:Exists,Value:,Effect:NoExecute,TolerationSeconds:*300,},Toleration{Key:node.kubernetes.io/unreachable,Operator:Exists,Value:,Effect:NoExecute,TolerationSe
conds:*300,},},HostAliases:[]HostAlias{},PriorityClassName:,Priority:*0,DNSConfig:nil,ShareProcessNamespace:nil,ReadinessGates:[]PodReadinessGate{},RuntimeClassName:nil,EnableServiceLinks:*true,PreemptionPolicy:nil,Overhead:ResourceList{},TopologySpreadConstraints:[]TopologySpreadConstraint{},EphemeralContainers:[]EphemeralContainer{},},Status:PodStatus{Phase:Pending,Conditions:[]PodCondition{PodCondition{Type:Initialized,Status:False,LastProbeTime:0001-01-01 00:00:00 +0000 UTC,LastTransitionTime:2020-05-05 08:00:10 +0000 UTC,Reason:ContainersNotInitialized,Message:containers with incomplete status: [activate chown-container-data-dir],},PodCondition{Type:Ready,Status:False,LastProbeTime:0001-01-01 00:00:00 +0000 UTC,LastTransitionTime:2020-05-05 08:00:10 +0000 UTC,Reason:ContainersNotReady,Message:containers with unready status: [osd],},PodCondition{Type:ContainersReady,Status:False,LastProbeTime:0001-01-01 00:00:00 +0000 UTC,LastTransitionTime:2020-05-05 08:00:10 +0000 UTC,Reason:ContainersNotReady,Message:containers with unready status: [osd],},PodCondition{Type:PodScheduled,Status:True,LastProbeTime:0001-01-01 00:00:00 +0000 UTC,LastTransitionTime:2020-05-05 08:00:10 +0000 UTC,Reason:,Message:,},},Message:,Reason:,HostIP:10.88.81.5,PodIP:,StartTime:2020-05-05 08:00:10 +0000 
UTC,ContainerStatuses:[]ContainerStatus{ContainerStatus{Name:osd,State:ContainerState{Waiting:&ContainerStateWaiting{Reason:PodInitializing,Message:,},Running:nil,Terminated:nil,},LastTerminationState:ContainerState{Waiting:nil,Running:nil,Terminated:nil,},Ready:false,RestartCount:0,Image:ceph/ceph:v14.2.8,ImageID:,ContainerID:,Started:*false,},},QOSClass:BestEffort,InitContainerStatuses:[]ContainerStatus{ContainerStatus{Name:activate,State:ContainerState{Waiting:&ContainerStateWaiting{Reason:PodInitializing,Message:,},Running:nil,Terminated:nil,},LastTerminationState:ContainerState{Waiting:nil,Running:nil,Terminated:nil,},Ready:false,RestartCount:0,Image:ceph/ceph:v14.2.8,ImageID:,ContainerID:,Started:nil,},ContainerStatus{Name:chown-container-data-dir,State:ContainerState{Waiting:&ContainerStateWaiting{Reason:PodInitializing,Message:,},Running:nil,Terminated:nil,},LastTerminationState:ContainerState{Waiting:nil,Running:nil,Terminated:nil,},Ready:false,RestartCount:0,Image:ceph/ceph:v14.2.8,ImageID:,ContainerID:,Started:nil,},},NominatedNodeName:,PodIPs:[]PodIP{},EphemeralContainerStatuses:[]ContainerStatus{},},}
2020-05-05 08:00:11.304 [INFO][15795] ipam_plugin.go 211: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="d8558e0e457205a8d90c2061f2a3f4960d96c7ffa4434c897a0b2dcc978ace96" HandleID="k8s-pod-network.d8558e0e457205a8d90c2061f2a3f4960d96c7ffa4434c897a0b2dcc978ace96" Workload="suraj--lk--cluster--pool--1--worker--2-k8s-rook--ceph--osd--8--5d45b68997--dg6bh-eth0"
I0505 08:00:11.314866 2827 topology_manager.go:233] [topologymanager] Topology Admit Handler
2020-05-05 08:00:11.332 [INFO][15795] ipam_plugin.go 223: Calico CNI IPAM handle=k8s-pod-network.d8558e0e457205a8d90c2061f2a3f4960d96c7ffa4434c897a0b2dcc978ace96 ContainerID="d8558e0e457205a8d90c2061f2a3f4960d96c7ffa4434c897a0b2dcc978ace96" HandleID="k8s-pod-network.d8558e0e457205a8d90c2061f2a3f4960d96c7ffa4434c897a0b2dcc978ace96" Workload="suraj--lk--cluster--pool--1--worker--2-k8s-rook--ceph--osd--8--5d45b68997--dg6bh-eth0"
2020-05-05 08:00:11.332 [INFO][15795] ipam_plugin.go 233: Auto assigning IP ContainerID="d8558e0e457205a8d90c2061f2a3f4960d96c7ffa4434c897a0b2dcc978ace96" HandleID="k8s-pod-network.d8558e0e457205a8d90c2061f2a3f4960d96c7ffa4434c897a0b2dcc978ace96" Workload="suraj--lk--cluster--pool--1--worker--2-k8s-rook--ceph--osd--8--5d45b68997--dg6bh-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc0002c6d50), Attrs:map[string]string{"namespace":"rook", "node":"suraj-lk-cluster-pool-1-worker-2", "pod":"rook-ceph-osd-8-5d45b68997-dg6bh"}, Hostname:"suraj-lk-cluster-pool-1-worker-2", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0}
2020-05-05 08:00:11.332 [INFO][15795] ipam.go 87: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'suraj-lk-cluster-pool-1-worker-2'
2020-05-05 08:00:11.343 [INFO][15795] ipam.go 313: Looking up existing affinities for host handle="k8s-pod-network.d8558e0e457205a8d90c2061f2a3f4960d96c7ffa4434c897a0b2dcc978ace96" host="suraj-lk-cluster-pool-1-worker-2"
2020-05-05 08:00:11.353 [INFO][15795] ipam.go 381: Trying affinity for 10.2.76.0/24 handle="k8s-pod-network.d8558e0e457205a8d90c2061f2a3f4960d96c7ffa4434c897a0b2dcc978ace96" host="suraj-lk-cluster-pool-1-worker-2"
2020-05-05 08:00:11.358 [INFO][15795] ipam.go 135: Attempting to load block cidr=10.2.76.0/24 host="suraj-lk-cluster-pool-1-worker-2"
2020-05-05 08:00:11.363 [INFO][15795] ipam.go 212: Affinity is confirmed and block has been loaded cidr=10.2.76.0/24 host="suraj-lk-cluster-pool-1-worker-2"
2020-05-05 08:00:11.363 [INFO][15795] ipam.go 804: Attempting to assign 1 addresses from block block=10.2.76.0/24 handle="k8s-pod-network.d8558e0e457205a8d90c2061f2a3f4960d96c7ffa4434c897a0b2dcc978ace96" host="suraj-lk-cluster-pool-1-worker-2"
I0505 08:00:11.363798 2827 reconciler.go:224] operationExecutor.VerifyControllerAttachedVolume started for volume "rook-ceph-osd-token-4dtzj" (UniqueName: "kubernetes.io/secret/c2e9690a-6ba6-4df2-a52c-874554237d9e-rook-ceph-osd-token-4dtzj") pod "rook-ceph-osd-1-58787d844c-kbvhl" (UID: "c2e9690a-6ba6-4df2-a52c-874554237d9e")
I0505 08:00:11.363935 2827 reconciler.go:224] operationExecutor.VerifyControllerAttachedVolume started for volume "devices" (UniqueName: "kubernetes.io/host-path/c2e9690a-6ba6-4df2-a52c-874554237d9e-devices") pod "rook-ceph-osd-1-58787d844c-kbvhl" (UID: "c2e9690a-6ba6-4df2-a52c-874554237d9e")
I0505 08:00:11.364008 2827 reconciler.go:224] operationExecutor.VerifyControllerAttachedVolume started for volume "run-udev" (UniqueName: "kubernetes.io/host-path/c2e9690a-6ba6-4df2-a52c-874554237d9e-run-udev") pod "rook-ceph-osd-1-58787d844c-kbvhl" (UID: "c2e9690a-6ba6-4df2-a52c-874554237d9e")
I0505 08:00:11.364059 2827 reconciler.go:224] operationExecutor.VerifyControllerAttachedVolume started for volume "activate-osd" (UniqueName: "kubernetes.io/empty-dir/c2e9690a-6ba6-4df2-a52c-874554237d9e-activate-osd") pod "rook-ceph-osd-1-58787d844c-kbvhl" (UID: "c2e9690a-6ba6-4df2-a52c-874554237d9e")
I0505 08:00:11.364164 2827 reconciler.go:224] operationExecutor.VerifyControllerAttachedVolume started for volume "rook-data" (UniqueName: "kubernetes.io/host-path/c2e9690a-6ba6-4df2-a52c-874554237d9e-rook-data") pod "rook-ceph-osd-1-58787d844c-kbvhl" (UID: "c2e9690a-6ba6-4df2-a52c-874554237d9e")
I0505 08:00:11.364326 2827 reconciler.go:224] operationExecutor.VerifyControllerAttachedVolume started for volume "rook-ceph-log" (UniqueName: "kubernetes.io/host-path/c2e9690a-6ba6-4df2-a52c-874554237d9e-rook-ceph-log") pod "rook-ceph-osd-1-58787d844c-kbvhl" (UID: "c2e9690a-6ba6-4df2-a52c-874554237d9e")
I0505 08:00:11.364426 2827 reconciler.go:224] operationExecutor.VerifyControllerAttachedVolume started for volume "rook-config-override" (UniqueName: "kubernetes.io/configmap/c2e9690a-6ba6-4df2-a52c-874554237d9e-rook-config-override") pod "rook-ceph-osd-1-58787d844c-kbvhl" (UID: "c2e9690a-6ba6-4df2-a52c-874554237d9e")
I0505 08:00:11.364496 2827 reconciler.go:224] operationExecutor.VerifyControllerAttachedVolume started for volume "rook-ceph-crash" (UniqueName: "kubernetes.io/host-path/c2e9690a-6ba6-4df2-a52c-874554237d9e-rook-ceph-crash") pod "rook-ceph-osd-1-58787d844c-kbvhl" (UID: "c2e9690a-6ba6-4df2-a52c-874554237d9e")
2020-05-05 08:00:11.367 [INFO][15795] ipam.go 1265: Creating new handle: k8s-pod-network.d8558e0e457205a8d90c2061f2a3f4960d96c7ffa4434c897a0b2dcc978ace96
2020-05-05 08:00:11.374 [INFO][15795] ipam.go 827: Writing block in order to claim IPs block=10.2.76.0/24 handle="k8s-pod-network.d8558e0e457205a8d90c2061f2a3f4960d96c7ffa4434c897a0b2dcc978ace96" host="suraj-lk-cluster-pool-1-worker-2"
2020-05-05 08:00:11.386 [INFO][15795] ipam.go 840: Successfully claimed IPs: [10.2.76.12/24] block=10.2.76.0/24 handle="k8s-pod-network.d8558e0e457205a8d90c2061f2a3f4960d96c7ffa4434c897a0b2dcc978ace96" host="suraj-lk-cluster-pool-1-worker-2"
2020-05-05 08:00:11.386 [INFO][15795] ipam.go 413: Block '10.2.76.0/24' provided addresses: [10.2.76.12/24] handle="k8s-pod-network.d8558e0e457205a8d90c2061f2a3f4960d96c7ffa4434c897a0b2dcc978ace96" host="suraj-lk-cluster-pool-1-worker-2"
2020-05-05 08:00:11.389 [INFO][15795] ipam.go 569: Auto-assigned 1 out of 1 IPv4s: [10.2.76.12/24] handle="k8s-pod-network.d8558e0e457205a8d90c2061f2a3f4960d96c7ffa4434c897a0b2dcc978ace96" host="suraj-lk-cluster-pool-1-worker-2"
2020-05-05 08:00:11.389 [INFO][15795] ipam_plugin.go 235: Calico CNI IPAM assigned addresses IPv4=[10.2.76.12/24] IPv6=[] ContainerID="d8558e0e457205a8d90c2061f2a3f4960d96c7ffa4434c897a0b2dcc978ace96" HandleID="k8s-pod-network.d8558e0e457205a8d90c2061f2a3f4960d96c7ffa4434c897a0b2dcc978ace96" Workload="suraj--lk--cluster--pool--1--worker--2-k8s-rook--ceph--osd--8--5d45b68997--dg6bh-eth0"
2020-05-05 08:00:11.389 [INFO][15795] ipam_plugin.go 261: IPAM Result ContainerID="d8558e0e457205a8d90c2061f2a3f4960d96c7ffa4434c897a0b2dcc978ace96" HandleID="k8s-pod-network.d8558e0e457205a8d90c2061f2a3f4960d96c7ffa4434c897a0b2dcc978ace96" Workload="suraj--lk--cluster--pool--1--worker--2-k8s-rook--ceph--osd--8--5d45b68997--dg6bh-eth0" result.IPs=[]*current.IPConfig{(*current.IPConfig)(0xc0000c64e0)}
2020-05-05 08:00:11.390 [INFO][15774] k8s.go 358: Populated endpoint ContainerID="d8558e0e457205a8d90c2061f2a3f4960d96c7ffa4434c897a0b2dcc978ace96" Namespace="rook" Pod="rook-ceph-osd-8-5d45b68997-dg6bh" WorkloadEndpoint="suraj--lk--cluster--pool--1--worker--2-k8s-rook--ceph--osd--8--5d45b68997--dg6bh-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"suraj--lk--cluster--pool--1--worker--2-k8s-rook--ceph--osd--8--5d45b68997--dg6bh-eth0", GenerateName:"rook-ceph-osd-8-5d45b68997-", Namespace:"rook", SelfLink:"", UID:"ee2b4cb0-68a5-4f70-ba63-cfdb6bd83b80", ResourceVersion:"13305", Generation:0, CreationTimestamp:v1.Time{Time:time.Time{wall:0x0, ext:63724262410, loc:(*time.Location)(0x299c340)}}, DeletionTimestamp:(*v1.Time)(nil), DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app":"rook-ceph-osd", "ceph-osd-id":"8", "failure-domain":"suraj-lk-cluster-pool-1-worker-2", "pod-template-hash":"5d45b68997", "portable":"false", "projectcalico.org/namespace":"rook", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"rook-ceph-osd", "rook_cluster":"rook"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ClusterName:"", ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"suraj-lk-cluster-pool-1-worker-2", ContainerID:"", Pod:"rook-ceph-osd-8-5d45b68997-dg6bh", Endpoint:"eth0", IPNetworks:[]string{"10.2.76.12/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.rook", "ksa.rook.rook-ceph-osd"}, InterfaceName:"caliaed69304c2a", MAC:"", Ports:[]v3.EndpointPort(nil)}}
2020-05-05 08:00:11.391 [INFO][15774] k8s.go 359: Calico CNI using IPs: [10.2.76.12/32] ContainerID="d8558e0e457205a8d90c2061f2a3f4960d96c7ffa4434c897a0b2dcc978ace96" Namespace="rook" Pod="rook-ceph-osd-8-5d45b68997-dg6bh" WorkloadEndpoint="suraj--lk--cluster--pool--1--worker--2-k8s-rook--ceph--osd--8--5d45b68997--dg6bh-eth0"
2020-05-05 08:00:11.391 [INFO][15774] network_linux.go 76: Setting the host side veth name to caliaed69304c2a ContainerID="d8558e0e457205a8d90c2061f2a3f4960d96c7ffa4434c897a0b2dcc978ace96" Namespace="rook" Pod="rook-ceph-osd-8-5d45b68997-dg6bh" WorkloadEndpoint="suraj--lk--cluster--pool--1--worker--2-k8s-rook--ceph--osd--8--5d45b68997--dg6bh-eth0"
2020-05-05 08:00:11.392 [INFO][15774] network_linux.go 396: Disabling IPv4 forwarding ContainerID="d8558e0e457205a8d90c2061f2a3f4960d96c7ffa4434c897a0b2dcc978ace96" Namespace="rook" Pod="rook-ceph-osd-8-5d45b68997-dg6bh" WorkloadEndpoint="suraj--lk--cluster--pool--1--worker--2-k8s-rook--ceph--osd--8--5d45b68997--dg6bh-eth0"
2020-05-05 08:00:11.411 [INFO][15774] k8s.go 385: Added Mac, interface name, and active container ID to endpoint ContainerID="d8558e0e457205a8d90c2061f2a3f4960d96c7ffa4434c897a0b2dcc978ace96" Namespace="rook" Pod="rook-ceph-osd-8-5d45b68997-dg6bh" WorkloadEndpoint="suraj--lk--cluster--pool--1--worker--2-k8s-rook--ceph--osd--8--5d45b68997--dg6bh-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"suraj--lk--cluster--pool--1--worker--2-k8s-rook--ceph--osd--8--5d45b68997--dg6bh-eth0", GenerateName:"rook-ceph-osd-8-5d45b68997-", Namespace:"rook", SelfLink:"", UID:"ee2b4cb0-68a5-4f70-ba63-cfdb6bd83b80", ResourceVersion:"13305", Generation:0, CreationTimestamp:v1.Time{Time:time.Time{wall:0x0, ext:63724262410, loc:(*time.Location)(0x299c340)}}, DeletionTimestamp:(*v1.Time)(nil), DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app":"rook-ceph-osd", "ceph-osd-id":"8", "failure-domain":"suraj-lk-cluster-pool-1-worker-2", "pod-template-hash":"5d45b68997", "portable":"false", "projectcalico.org/namespace":"rook", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"rook-ceph-osd", "rook_cluster":"rook"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ClusterName:"", ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"suraj-lk-cluster-pool-1-worker-2", ContainerID:"d8558e0e457205a8d90c2061f2a3f4960d96c7ffa4434c897a0b2dcc978ace96", Pod:"rook-ceph-osd-8-5d45b68997-dg6bh", Endpoint:"eth0", IPNetworks:[]string{"10.2.76.12/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.rook", "ksa.rook.rook-ceph-osd"}, InterfaceName:"caliaed69304c2a", MAC:"a6:38:8c:f5:e5:9f", Ports:[]v3.EndpointPort(nil)}}
2020-05-05 08:00:11.420 [INFO][15774] k8s.go 417: Wrote updated endpoint to datastore ContainerID="d8558e0e457205a8d90c2061f2a3f4960d96c7ffa4434c897a0b2dcc978ace96" Namespace="rook" Pod="rook-ceph-osd-8-5d45b68997-dg6bh" WorkloadEndpoint="suraj--lk--cluster--pool--1--worker--2-k8s-rook--ceph--osd--8--5d45b68997--dg6bh-eth0"
2020-05-05 08:00:12.193 [INFO][16153] plugin.go 129: Extracted identifiers EndpointIDs=&utils.WEPIdentifiers{Namespace:"rook", WEPName:"", WorkloadEndpointIdentifiers:names.WorkloadEndpointIdentifiers{Node:"suraj-lk-cluster-pool-1-worker-2", Orchestrator:"k8s", Endpoint:"eth0", Workload:"", Pod:"rook-ceph-osd-1-58787d844c-kbvhl", ContainerID:"d5ad4db75e6d395306ef6f90cb5e3af69f4014e6e9a100e2fb2aa54a370abdcf"}}
2020-05-05 08:00:12.228 [INFO][16153] plugin.go 220: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {suraj--lk--cluster--pool--1--worker--2-k8s-rook--ceph--osd--1--58787d844c--kbvhl-eth0 rook-ceph-osd-1-58787d844c- rook c2e9690a-6ba6-4df2-a52c-874554237d9e 13342 0 2020-05-05 08:00:11 +0000 UTC <nil> <nil> map[app:rook-ceph-osd ceph-osd-id:1 failure-domain:suraj-lk-cluster-pool-1-worker-2 pod-template-hash:58787d844c portable:false projectcalico.org/namespace:rook projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:rook-ceph-osd rook_cluster:rook] map[] [] [] []} {k8s suraj-lk-cluster-pool-1-worker-2 rook-ceph-osd-1-58787d844c-kbvhl eth0 [] [] [kns.rook ksa.rook.rook-ceph-osd] cali053489b3808 []}} ContainerID="d5ad4db75e6d395306ef6f90cb5e3af69f4014e6e9a100e2fb2aa54a370abdcf" Namespace="rook" Pod="rook-ceph-osd-1-58787d844c-kbvhl" WorkloadEndpoint="suraj--lk--cluster--pool--1--worker--2-k8s-rook--ceph--osd--1--58787d844c--kbvhl-"
2020-05-05 08:00:12.228 [INFO][16153] k8s.go 60: Extracted identifiers for CmdAddK8s ContainerID="d5ad4db75e6d395306ef6f90cb5e3af69f4014e6e9a100e2fb2aa54a370abdcf" Namespace="rook" Pod="rook-ceph-osd-1-58787d844c-kbvhl" WorkloadEndpoint="suraj--lk--cluster--pool--1--worker--2-k8s-rook--ceph--osd--1--58787d844c--kbvhl-eth0"
2020-05-05 08:00:12.233 [INFO][16153] k8s.go 775: namespace info &Namespace{ObjectMeta:{rook /api/v1/namespaces/rook 346082b0-52fc-4520-8796-5afff38e4f6c 7478 0 2020-05-05 07:45:13 +0000 UTC <nil> <nil> map[] map[] [] [] [{lokoctl Update v1 2020-05-05 07:45:13 +0000 UTC FieldsV1 FieldsV1{Raw:*[123 34 102 58 115 116 97 116 117 115 34 58 123 34 102 58 112 104 97 115 101 34 58 123 125 125 125],}}]},Spec:NamespaceSpec{Finalizers:[kubernetes],},Status:NamespaceStatus{Phase:Active,Conditions:[]NamespaceCondition{},},}
2020-05-05 08:00:12.239 [INFO][16153] k8s.go 784: pod info &Pod{ObjectMeta:{rook-ceph-osd-1-58787d844c-kbvhl rook-ceph-osd-1-58787d844c- rook /api/v1/namespaces/rook/pods/rook-ceph-osd-1-58787d844c-kbvhl c2e9690a-6ba6-4df2-a52c-874554237d9e 13342 0 2020-05-05 08:00:11 +0000 UTC <nil> <nil> map[app:rook-ceph-osd ceph-osd-id:1 failure-domain:suraj-lk-cluster-pool-1-worker-2 pod-template-hash:58787d844c portable:false rook_cluster:rook] map[kubernetes.io/psp:00-rook-ceph-operator seccomp.security.alpha.kubernetes.io/pod:runtime/default] [{apps/v1 ReplicaSet rook-ceph-osd-1-58787d844c 4b66f108-4b2a-49c0-b65b-ddef47a631a6 0xc00055c267 0xc00055c268}] [] []},Spec:PodSpec{Volumes:[]Volume{Volume{Name:rook-data,VolumeSource:VolumeSource{HostPath:&HostPathVolumeSource{Path:/var/lib/rook,Type:*,},EmptyDir:nil,GCEPersistentDisk:nil,AWSElasticBlockStore:nil,GitRepo:nil,Secret:nil,NFS:nil,ISCSI:nil,Glusterfs:nil,PersistentVolumeClaim:nil,RBD:nil,FlexVolume:nil,Cinder:nil,CephFS:nil,Flocker:nil,DownwardAPI:nil,FC:nil,AzureFile:nil,ConfigMap:nil,VsphereVolume:nil,Quobyte:nil,AzureDisk:nil,PhotonPersistentDisk:nil,PortworxVolume:nil,ScaleIO:nil,Projected:nil,StorageOS:nil,CSI:nil,},},Volume{Name:rook-config-override,VolumeSource:VolumeSource{HostPath:nil,EmptyDir:nil,GCEPersistentDisk:nil,AWSElasticBlockStore:nil,GitRepo:nil,Secret:nil,NFS:nil,ISCSI:nil,Glusterfs:nil,PersistentVolumeClaim:nil,RBD:nil,FlexVolume:nil,Cinder:nil,CephFS:nil,Flocker:nil,DownwardAPI:nil,FC:nil,AzureFile:nil,ConfigMap:&ConfigMapVolumeSource{LocalObjectReference:LocalObjectReference{Name:rook-config-override,},Items:[]KeyToPath{KeyToPath{Key:config,Path:ceph.conf,Mode:*292,},},DefaultMode:*420,Optional:nil,},VsphereVolume:nil,Quobyte:nil,AzureDisk:nil,PhotonPersistentDisk:nil,PortworxVolume:nil,ScaleIO:nil,Projected:nil,StorageOS:nil,CSI:nil,},},Volume{Name:rook-ceph-log,VolumeSource:VolumeSource{HostPath:&HostPathVolumeSource{Path:/var/lib/rook/rook/log,Type:*,},EmptyDir:nil,GCEPersistentDisk:nil,AWSElasticBlockStore:nil,GitRepo:nil,Secret:nil,NFS:nil,ISCSI:nil,Glusterfs:nil,PersistentVolumeClaim:nil,RBD:nil,FlexVolume:nil,Cinder:nil,CephFS:nil,Flocker:nil,DownwardAPI:nil,FC:nil,AzureFile:nil,ConfigMap:nil,VsphereVolume:nil,Quobyte:nil,AzureDisk:nil,PhotonPersistentDisk:nil,PortworxVolume:nil,ScaleIO:nil,Projected:nil,StorageOS:nil,CSI:nil,},},Volume{Name:rook-ceph-crash,VolumeSource:VolumeSource{HostPath:&HostPathVolumeSource{Path:/var/lib/rook/rook/crash,Type:*,},EmptyDir:nil,GCEPersistentDisk:nil,AWSElasticBlockStore:nil,GitRepo:nil,Secret:nil,NFS:nil,ISCSI:nil,Glusterfs:nil,PersistentVolumeClaim:nil,RBD:nil,FlexVolume:nil,Cinder:nil,CephFS:nil,Flocker:nil,DownwardAPI:nil,FC:nil,AzureFile:nil,ConfigMap:nil,VsphereVolume:nil,Quobyte:nil,AzureDisk:nil,PhotonPersistentDisk:nil,PortworxVolume:nil,ScaleIO:nil,Projected:nil,StorageOS:nil,CSI:nil,},},Volume{Name:devices,VolumeSource:VolumeSource{HostPath:&HostPathVolumeSource{Path:/dev,Type:*,},EmptyDir:nil,GCEPersistentDisk:nil,AWSElasticBlockStore:nil,GitRepo:nil,Secret:nil,NFS:nil,ISCSI:nil,Glusterfs:nil,PersistentVolumeClaim:nil,RBD:nil,FlexVolume:nil,Cinder:nil,CephFS:nil,Flocker:nil,DownwardAPI:nil,FC:nil,AzureFile:nil,ConfigMap:nil,VsphereVolume:nil,Quobyte:nil,AzureDisk:nil,PhotonPersistentDisk:nil,PortworxVolume:nil,ScaleIO:nil,Projected:nil,StorageOS:nil,CSI:nil,},},Volume{Name:run-udev,VolumeSource:VolumeSource{HostPath:&HostPathVolumeSource{Path:/run/udev,Type:*,},EmptyDir:nil,GCEPersistentDisk:nil,AWSElasticBlockStore:nil,GitRepo:nil,Secret:nil,NFS:nil,ISCSI:nil,Glusterfs:nil,PersistentVolumeClaim:nil,RBD:nil,FlexVolume:nil,Cinder:nil,CephFS:nil,Flocker:nil,DownwardAPI:nil,FC:nil,AzureFile:nil,ConfigMap:nil,VsphereVolume:nil,Quobyte:nil,AzureDisk:nil,PhotonPersistentDisk:nil,PortworxVolume:nil,ScaleIO:nil,Projected:nil,StorageOS:nil,CSI:nil,},},Volume{Name:activate-osd,VolumeSource:VolumeSource{HostPath:nil,EmptyDir:&EmptyDirVolumeSource{Medium:,SizeLimit:<nil>,},GCEPersistentDisk:nil,AWSElasticBlockStore:nil,GitRepo:nil,Secret:nil,NFS:nil,ISCSI:nil,Glusterfs:nil,PersistentVolumeClaim:nil,RBD:nil,FlexVolume:nil,Cinder:nil,CephFS:nil,Flocker:nil,DownwardAPI:nil,FC:nil,AzureFile:nil,ConfigMap:nil,VsphereVolume:nil,Quobyte:nil,AzureDisk:nil,PhotonPersistentDisk:nil,PortworxVolume:nil,ScaleIO:nil,Projected:nil,StorageOS:nil,CSI:nil,},},Volume{Name:rook-ceph-osd-token-4dtzj,VolumeSource:VolumeSource{HostPath:nil,EmptyDir:nil,GCEPersistentDisk:nil,AWSElasticBlockStore:nil,GitRepo:nil,Secret:&SecretVolumeSource{SecretName:rook-ceph-osd-token-4dtzj,Items:[]KeyToPath{},DefaultMode:*420,Optional:nil,},NFS:nil,ISCSI:nil,Glusterfs:nil,PersistentVolumeClaim:nil,RBD:nil,FlexVolume:nil,Cinder:nil,CephFS:nil,Flocker:nil,DownwardAPI:nil,FC:nil,AzureFile:nil,ConfigMap:nil,VsphereVolume:nil,Quobyte:nil,AzureDisk:nil,PhotonPersistentDisk:nil,PortworxVolume:nil,ScaleIO:nil,Projected:nil,StorageOS:nil,CSI:nil,},},},Containers:[]Container{Container{Name:osd,Image:ceph/ceph:v14.2.8,Command:[ceph-osd],Args:[--foreground --id 1 --fsid 504a40d4-a2d3-4492-a37d-27a58464958f --setuser ceph --setgroup ceph --crush-location=root=default host=suraj-lk-cluster-pool-1-worker-2 --default-log-to-file false --ms-learn-addr-from-peer=false],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:ROOK_NODE_NAME,Value:suraj-lk-cluster-pool-1-worker-2,ValueFrom:nil,},EnvVar{Name:ROOK_CLUSTER_ID,Value:77795d4f-fb40-4886-bc74-cb2065f0c61f,ValueFrom:nil,},EnvVar{Name:ROOK_PRIVATE_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},EnvVar{Name:ROOK_PUBLIC_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},EnvVar{Name:ROOK_CLUSTER_NAME,Value:rook,ValueFrom:nil,},EnvVar{Name:ROOK_MON_ENDPOINTS,Value:,ValueFrom:&EnvVarSource{FieldRef:nil,ResourceFieldRef:nil,ConfigMapKeyRef:&ConfigMapKeySelector{LocalObjectReference:LocalObjectReference{Name:rook-ceph-mon-endpoints,},Key:data,Optional:nil,},SecretKeyRef:nil,},},EnvVar{Name:ROOK_MON_SECRET,Value:,ValueFrom:&EnvVarSource{FieldRef:nil,ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:&SecretKeySelector{LocalObjectReference:LocalObjectReference{Name:rook-ceph-mon,},Key:mon-secret,Optional:nil,},},},EnvVar{Name:ROOK_ADMIN_SECRET,Value:,ValueFrom:&EnvVarSource{FieldRef:nil,ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:&SecretKeySelector{LocalObjectReference:LocalObjectReference{Name:rook-ceph-mon,},Key:admin-secret,Optional:nil,},},},EnvVar{Name:ROOK_CONFIG_DIR,Value:/var/lib/rook,ValueFrom:nil,},EnvVar{Name:ROOK_CEPH_CONFIG_OVERRIDE,Value:/etc/rook/config/override.conf,ValueFrom:nil,},EnvVar{Name:ROOK_FSID,Value:,ValueFrom:&EnvVarSource{FieldRef:nil,ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:&SecretKeySelector{LocalObjectReference:LocalObjectReference{Name:rook-ceph-mon,},Key:fsid,Optional:nil,},},},EnvVar{Name:NODE_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:spec.nodeName,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},EnvVar{Name:ROOK_CRUSHMAP_HOSTNAME,Value:suraj-lk-cluster-pool-1-worker-2,ValueFrom:nil,},EnvVar{Name:CEPH_VOLUME_DEBUG,Value:1,ValueFrom:nil,},EnvVar{Name:CEPH_VOLUME_SKIP_RESTORECON,Value:1,ValueFrom:nil,},EnvVar{Name:DM_DISABLE_UDEV,Value:1,ValueFrom:nil,},EnvVar{Name:ROOK_OSDS_PER_DEVICE,Value:1,ValueFrom:nil,},EnvVar{Name:TINI_SUBREAPER,Value:,ValueFrom:nil,},EnvVar{Name:CONTAINER_IMAGE,Value:ceph/ceph:v14.2.8,ValueFrom:nil,},EnvVar{Name:POD_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:metadata.name,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},EnvVar{Name:POD_NAMESPACE,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:metadata.namespace,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},EnvVar{Name:NODE_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:spec.nodeName,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},EnvVar{Name:POD_MEMORY_LIMIT,Value:,ValueFrom:&EnvVarSource{FieldRef:nil,ResourceFieldRef:&ResourceFieldSelector{ContainerName:,Resource:limits.memory,Divisor:{{0 0} {<nil>} 0 DecimalSI},},ConfigMapKeyRef:nil,SecretKeyRef:nil,},},EnvVar{Name:POD_MEMORY_REQUEST,Value:,ValueFrom:&EnvVarSource{FieldRef:nil,ResourceFieldRef:&ResourceFieldSelector{ContainerName:,Resource:requests.memory,Divisor:{{0 0} {<nil>} 0 DecimalSI},},ConfigMapKeyRef:nil,SecretKeyRef:nil,},},EnvVar{Name:POD_CPU_LIMIT,Value:,ValueFrom:&EnvVarSource{FieldRef:nil,ResourceFieldRef:&ResourceFieldSelector{ContainerName:,Resource:limits.cpu,Divisor:{{1 0} {<nil>} 1 DecimalSI},},ConfigMapKeyRef:nil,SecretKeyRef:nil,},},EnvVar{Name:POD_CPU_REQUEST,Value:,ValueFrom:&EnvVarSource{FieldRef:nil,ResourceFieldRef:&ResourceFieldSelector{ContainerName:,Resource:requests.cpu,Divisor:{{0 0} {<nil>} 0 DecimalSI},},ConfigMapKeyRef:nil,SecretKeyRef:nil,},},EnvVar{Name:ROOK_OSD_UUID,Value:ac9c9458-ea17-4741-9957-2688f1173c38,ValueFrom:nil,},EnvVar{Name:ROOK_OSD_ID,Value:1,ValueFrom:nil,},EnvVar{Name:ROOK_OSD_STORE_TYPE,Value:bluestore,ValueFrom:nil,},EnvVar{Name:ROOK_CEPH_MON_HOST,Value:,ValueFrom:&EnvVarSource{FieldRef:nil,ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:&SecretKeySelector{LocalObjectReference:LocalObjectReference{Name:rook-ceph-config,},Key:mon_host,Optional:nil,},},},EnvVar{Name:CEPH_ARGS,Value:-m $(ROOK_CEPH_MON_HOST),ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:rook-data,ReadOnly:false,MountPath:/var/lib/rook,SubPath:,MountPropagation:nil,SubPathExpr:,},VolumeMount{Name:rook-config-override,ReadOnly:true,MountPath:/etc/ceph,SubPath:,MountPropagation:nil,SubPathExpr:,},VolumeMount{Name:rook-ceph-log,ReadOnly:false,MountPath:/var/log/ceph,SubPath:,MountPropagation:nil,SubPathExpr:,},VolumeMount{Name:rook-ceph-crash,ReadOnly:false,MountPath:/var/lib/ceph/crash,SubPath:,MountPropagation:nil,SubPathExpr:,},VolumeMount{Name:devices,ReadOnly:false,MountPath:/dev,SubPath:,MountPropagation:nil,SubPathExpr:,},VolumeMount{Name:run-udev,ReadOnly:false,MountPath:/run/udev,SubPath:,MountPropagation:nil,SubPathExpr:,},VolumeMount{Name:activate-osd,ReadOnly:false,MountPath:/var/lib/ceph/osd/ceph-1,SubPath:,MountPropagation:nil,SubPathExpr:,},VolumeMount{Name:rook-ceph-osd-token-4dtzj,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,},},LivenessProbe:&Probe{Handler:Handler{Exec:&ExecAction{Command:[env -i sh -c ceph --admin-daemon /run/ceph/ceph-osd.1.asok status],},HTTPGet:nil,TCPSocket:nil,},InitialDelaySeconds:10,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,},ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:nil,Privileged:*true,SELinuxOptions:nil,RunAsUser:*0,RunAsNonRoot:nil,ReadOnlyRootFilesystem:*false,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,},},RestartPolicy:Always,TerminationGracePeriodSeconds:*30,ActiveDeadlineSeconds:nil,DNSPolicy:ClusterFirst,NodeSelector:map[string]string{kubernetes.io/hostname: suraj-lk-cluster-pool-1-worker-2,},ServiceAccountName:rook-ceph-osd,DeprecatedServiceAccount:rook-ceph-osd,NodeName:suraj-lk-cluster-pool-1-worker-2,HostNetwork:false,HostPID:true,HostIPC:false,SecurityContext:&PodSecurityContext{SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,SupplementalGroups:[],FSGroup:nil,RunAsGroup:nil,Sysctls:[]Sysctl{},WindowsOptions:nil,},ImagePullSecrets:[]LocalObjectReference{},Hostname:,Subdomain:,Affinity:&Affinity{NodeAffinity:&NodeAffinity{RequiredDuringSchedulingIgnoredDuringExecution:&NodeSelector{NodeSelectorTerms:[]NodeSelectorTerm{NodeSelectorTerm{MatchExpressions:[]NodeSelectorRequirement{NodeSelectorRequirement{Key:storage.lokomotive.io,Operator:Exists,Values:[],},},MatchFields:[]NodeSelectorRequirement{},},},},PreferredDuringSchedulingIgnoredDuringExecution:[]PreferredSchedulingTerm{},},PodAffinity:nil,PodAntiAffinity:nil,},SchedulerName:default-scheduler,InitContainers:[]Container{Container{Name:activate,Image:ceph/ceph:v14.2.8,Command:[/bin/bash -c
set -ex
OSD_ID=1
OSD_UUID=ac9c9458-ea17-4741-9957-2688f1173c38
OSD_STORE_FLAG="--bluestore"
OSD_DATA_DIR=/var/lib/ceph/osd/ceph-"$OSD_ID"
CV_MODE=lvm
DEVICE=
METADATA_DEVICE="$ROOK_METADATA_DEVICE"
# activate the osd with ceph-volume
if [[ "$CV_MODE" == "lvm" ]]; then
TMP_DIR=$(mktemp -d)
# activate osd
ceph-volume "$CV_MODE" activate --no-systemd "$OSD_STORE_FLAG" "$OSD_ID" "$OSD_UUID"
# copy the tmpfs directory to a temporary directory
# this is needed because when the init container exits, the tmpfs goes away and its content with it
# this will result in the emptydir to be empty when accessed by the main osd container
cp --verbose --no-dereference "$OSD_DATA_DIR"/* "$TMP_DIR"/
# unmount the tmpfs since we don't need it anymore
umount "$OSD_DATA_DIR"
# copy back the content of the tmpfs into the original osd directory
cp --verbose --no-dereference "$TMP_DIR"/* "$OSD_DATA_DIR"
# retain ownership of files to the ceph user/group
chown --verbose --recursive ceph:ceph "$OSD_DATA_DIR"
# remove the temporary directory
rm --recursive --force "$TMP_DIR"
else
ARGS=(--device ${DEVICE} --no-systemd --no-tmpfs)
if [ -n "$METADATA_DEVICE" ]; then
ARGS+=(--block.db ${METADATA_DEVICE})
fi
# ceph-volume raw mode only supports bluestore so we don't need to pass a store flag
ceph-volume "$CV_MODE" activate "${ARGS[@]}"
fi
],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CEPH_VOLUME_DEBUG,Value:1,ValueFrom:nil,},EnvVar{Name:CEPH_VOLUME_SKIP_RESTORECON,Value:1,ValueFrom:nil,},EnvVar{Name:DM_DISABLE_UDEV,Value:1,ValueFrom:nil,},EnvVar{Name:ROOK_CEPH_MON_HOST,Value:,ValueFrom:&EnvVarSource{FieldRef:nil,ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:&SecretKeySelector{LocalObjectReference:LocalObjectReference{Name:rook-ceph-config,},Key:mon_host,Optional:nil,},},},EnvVar{Name:CEPH_ARGS,Value:-m $(ROOK_CEPH_MON_HOST),ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:activate-osd,ReadOnly:false,MountPath:/var/lib/ceph/osd/ceph-1,SubPath:,MountPropagation:nil,SubPathExpr:,},VolumeMount{Name:devices,ReadOnly:false,MountPath:/dev,SubPath:,MountPropagation:nil,SubPathExpr:,},VolumeMount{Name:rook-config-override,ReadOnly:true,MountPath:/etc/ceph,SubPath:,MountPropagation:nil,SubPathExpr:,},VolumeMount{Name:rook-ceph-osd-token-4dtzj,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:nil,Privileged:*true,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,},Container{Name:chown-container-data-dir,Image:ceph/ceph:v14.2.8,Command:[chown],Args:[--verbose --recursive ceph:ceph /var/log/ceph /var/lib/ceph/crash],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:rook-data,ReadOnly:false,MountPath:/var/lib/rook,SubPath:,MountPropagation:nil,SubPathExpr:,},VolumeMount{Name:rook-config-override,ReadOnly:true,MountPath:/etc/ceph,SubPath:,MountPropagation:nil,SubPathExpr:,},VolumeMount{Name:rook-ceph-log,ReadOnly:false,MountPath:/var/log/ceph,SubPath:,MountPropagation:nil,SubPathExpr:,},VolumeMount{Name:rook-ceph-crash,ReadOnly:false,MountPath:/var/lib/ceph/crash,SubPath:,MountPropagation:nil,SubPathExpr:,},VolumeMount{Name:devices,ReadOnly:false,MountPath:/dev,SubPath:,MountPropagation:nil,SubPathExpr:,},VolumeMount{Name:run-udev,ReadOnly:false,MountPath:/run/udev,SubPath:,MountPropagation:nil,SubPathExpr:,},VolumeMount{Name:activate-osd,ReadOnly:false,MountPath:/var/lib/ceph/osd/ceph-1,SubPath:,MountPropagation:nil,SubPathExpr:,},VolumeMount{Name:rook-ceph-osd-token-4dtzj,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:nil,Privileged:*true,SELinuxOptions:nil,RunAsUser:*0,RunAsNonRoot:nil,ReadOnlyRootFilesystem:*false,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,},},AutomountServiceAccountToken:nil,Tolerations:[]Toleration{Toleration{Key:storage.lokomotive.io,Operator:Equal,Value:ceph,Effect:NoSchedule,TolerationSeconds:nil,},Toleration{Key:node.kubernetes.io/not-ready,Operator:Exists,Value:,Effect:NoExecute,TolerationSeconds:*300,},Toleration{Key:node.kubernetes.io/unreachable,Operator:Exists,Value:,Effect:NoExecute,TolerationSeconds:*300,},},HostAliases:[]HostAlias{},PriorityClassName:,Priority:*0,DNSConfig:nil,ShareProcessNamespace:nil,ReadinessGates:[]PodReadinessGate{},RuntimeClassName:nil,EnableServiceLinks:*true,PreemptionPolicy:nil,Overhead:ResourceList{},TopologySpreadConstraints:[]TopologySpreadConstraint{},EphemeralContainers:[]EphemeralContainer{},},Status:PodStatus{Phase:Pending,Conditions:[]PodCondition{PodCondition{Type:Initialized,Status:False,LastProbeTime:0001-01-01 00:00:00 +0000 UTC,LastTransitionTime:2020-05-05 08:00:11 +0000 UTC,Reason:ContainersNotInitialized,Message:containers with incomplete status: [activate chown-container-data-dir],},PodCondition{Type:Ready,Status:False,LastProbeTime:0001-01-01 00:00:00 +0000 UTC,LastTransitionTime:2020-05-05 08:00:11 +0000 UTC,Reason:ContainersNotReady,Message:containers with unready status: [osd],},PodCondition{Type:ContainersReady,Status:False,LastProbeTime:0001-01-01 00:00:00 +0000 UTC,LastTransitionTime:2020-05-05 08:00:11 +0000 UTC,Reason:ContainersNotReady,Message:containers with unready status: [osd],},PodCondition{Type:PodScheduled,Status:True,LastProbeTime:0001-01-01 00:00:00 +0000 UTC,LastTransitionTime:2020-05-05 08:00:11 +0000 UTC,Reason:,Message:,},},Message:,Reason:,HostIP:10.88.81.5,PodIP:,StartTime:2020-05-05 08:00:11 +0000 UTC,ContainerStatuses:[]ContainerStatus{ContainerStatus{Name:osd,State:ContainerState{Waiting:&ContainerStateWaiting{Reason:PodInitializing,Message:,},Running:nil,Terminated:nil,},LastTerminationState:ContainerState{Waiting:nil,Running:nil,Terminated:nil,},Ready:false,RestartCount:0,Image:ceph/ceph:v14.2.8,ImageID:,ContainerID:,Started:*false,},},QOSClass:BestEffort,InitContainerStatuses:[]ContainerStatus{ContainerStatus{Name:activate,State:ContainerState{Waiting:&ContainerStateWaiting{Reason:PodInitializing,Message:,},Running:nil,Terminated:nil,},LastTerminationState:ContainerState{Waiting:nil,Running:nil,Terminated:nil,},Ready:false,RestartCount:0,Image:ceph/ceph:v14.2.8,ImageID:,ContainerID:,Started:nil,},ContainerStatus{Name:chown-container-data-dir,State:ContainerState{Waiting:&ContainerStateWaiting{Reason:PodInitializing,Message:,},Running:nil,Terminated:nil,},LastTerminationState:ContainerState{Waiting:nil,Running:nil,Terminated:nil,},Ready:false,RestartCount:0,Image:ceph/ceph:v14.2.8,ImageID:,ContainerID:,Started:nil,},},NominatedNodeName:,PodIPs:[]PodIP{},EphemeralContainerStatuses:[]ContainerStatus{},},}
2020-05-05 08:00:12.259 [INFO][16173] ipam_plugin.go 211: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="d5ad4db75e6d395306ef6f90cb5e3af69f4014e6e9a100e2fb2aa54a370abdcf" HandleID="k8s-pod-network.d5ad4db75e6d395306ef6f90cb5e3af69f4014e6e9a100e2fb2aa54a370abdcf" Workload="suraj--lk--cluster--pool--1--worker--2-k8s-rook--ceph--osd--1--58787d844c--kbvhl-eth0"
2020-05-05 08:00:12.285 [INFO][16173] ipam_plugin.go 223: Calico CNI IPAM handle=k8s-pod-network.d5ad4db75e6d395306ef6f90cb5e3af69f4014e6e9a100e2fb2aa54a370abdcf ContainerID="d5ad4db75e6d395306ef6f90cb5e3af69f4014e6e9a100e2fb2aa54a370abdcf" HandleID="k8s-pod-network.d5ad4db75e6d395306ef6f90cb5e3af69f4014e6e9a100e2fb2aa54a370abdcf" Workload="suraj--lk--cluster--pool--1--worker--2-k8s-rook--ceph--osd--1--58787d844c--kbvhl-eth0"
2020-05-05 08:00:12.285 [INFO][16173] ipam_plugin.go 233: Auto assigning IP ContainerID="d5ad4db75e6d395306ef6f90cb5e3af69f4014e6e9a100e2fb2aa54a370abdcf" HandleID="k8s-pod-network.d5ad4db75e6d395306ef6f90cb5e3af69f4014e6e9a100e2fb2aa54a370abdcf" Workload="suraj--lk--cluster--pool--1--worker--2-k8s-rook--ceph--osd--1--58787d844c--kbvhl-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc000536760), Attrs:map[string]string{"namespace":"rook", "node":"suraj-lk-cluster-pool-1-worker-2", "pod":"rook-ceph-osd-1-58787d844c-kbvhl"}, Hostname:"suraj-lk-cluster-pool-1-worker-2", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0}
2020-05-05 08:00:12.285 [INFO][16173] ipam.go 87: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'suraj-lk-cluster-pool-1-worker-2'
I0505 08:00:12.287881 2827 kubelet_resources.go:45] allocatable: map[cpu:{{32 0} {<nil>} 32 DecimalSI} ephemeral-storage:{{108249701812 0} {<nil>} 108249701812 DecimalSI} hugepages-1Gi:{{0 0} {<nil>} 0 DecimalSI} hugepages-2Mi:{{0 0} {<nil>} 0 DecimalSI} memory:{{135041626112 0} {<nil>} 131876588Ki BinarySI} pods:{{110 0} {<nil>} 110 DecimalSI}]
I0505 08:00:12.287995 2827 kubelet_resources.go:45] allocatable: map[cpu:{{32 0} {<nil>} 32 DecimalSI} ephemeral-storage:{{108249701812 0} {<nil>} 108249701812 DecimalSI} hugepages-1Gi:{{0 0} {<nil>} 0 DecimalSI} hugepages-2Mi:{{0 0} {<nil>} 0 DecimalSI} memory:{{135041626112 0} {<nil>} 131876588Ki BinarySI} pods:{{110 0} {<nil>} 110 DecimalSI}]
I0505 08:00:12.288074 2827 kubelet_resources.go:45] allocatable: map[cpu:{{32 0} {<nil>} 32 DecimalSI} ephemeral-storage:{{108249701812 0} {<nil>} 108249701812 DecimalSI} hugepages-1Gi:{{0 0} {<nil>} 0 DecimalSI} hugepages-2Mi:{{0 0} {<nil>} 0 DecimalSI} memory:{{135041626112 0} {<nil>} 131876588Ki BinarySI} pods:{{110 0} {<nil>} 110 DecimalSI}]
I0505 08:00:12.288143 2827 kubelet_resources.go:45] allocatable: map[cpu:{{32 0} {<nil>} 32 DecimalSI} ephemeral-storage:{{108249701812 0} {<nil>} 108249701812 DecimalSI} hugepages-1Gi:{{0 0} {<nil>} 0 DecimalSI} hugepages-2Mi:{{0 0} {<nil>} 0 DecimalSI} memory:{{135041626112 0} {<nil>} 131876588Ki BinarySI} pods:{{110 0} {<nil>} 110 DecimalSI}]
2020-05-05 08:00:12.295 [INFO][16173] ipam.go 313: Looking up existing affinities for host handle="k8s-pod-network.d5ad4db75e6d395306ef6f90cb5e3af69f4014e6e9a100e2fb2aa54a370abdcf" host="suraj-lk-cluster-pool-1-worker-2"
2020-05-05 08:00:12.306 [INFO][16173] ipam.go 381: Trying affinity for 10.2.76.0/24 handle="k8s-pod-network.d5ad4db75e6d395306ef6f90cb5e3af69f4014e6e9a100e2fb2aa54a370abdcf" host="suraj-lk-cluster-pool-1-worker-2"
2020-05-05 08:00:12.310 [INFO][16173] ipam.go 135: Attempting to load block cidr=10.2.76.0/24 host="suraj-lk-cluster-pool-1-worker-2"
2020-05-05 08:00:12.316 [INFO][16173] ipam.go 212: Affinity is confirmed and block has been loaded cidr=10.2.76.0/24 host="suraj-lk-cluster-pool-1-worker-2"
2020-05-05 08:00:12.316 [INFO][16173] ipam.go 804: Attempting to assign 1 addresses from block block=10.2.76.0/24 handle="k8s-pod-network.d5ad4db75e6d395306ef6f90cb5e3af69f4014e6e9a100e2fb2aa54a370abdcf" host="suraj-lk-cluster-pool-1-worker-2"
2020-05-05 08:00:12.321 [INFO][16173] ipam.go 1265: Creating new handle: k8s-pod-network.d5ad4db75e6d395306ef6f90cb5e3af69f4014e6e9a100e2fb2aa54a370abdcf
2020-05-05 08:00:12.327 [INFO][16173] ipam.go 827: Writing block in order to claim IPs block=10.2.76.0/24 handle="k8s-pod-network.d5ad4db75e6d395306ef6f90cb5e3af69f4014e6e9a100e2fb2aa54a370abdcf" host="suraj-lk-cluster-pool-1-worker-2"
2020-05-05 08:00:12.337 [INFO][16173] ipam.go 840: Successfully claimed IPs: [10.2.76.13/24] block=10.2.76.0/24 handle="k8s-pod-network.d5ad4db75e6d395306ef6f90cb5e3af69f4014e6e9a100e2fb2aa54a370abdcf" host="suraj-lk-cluster-pool-1-worker-2"
2020-05-05 08:00:12.337 [INFO][16173] ipam.go 413: Block '10.2.76.0/24' provided addresses: [10.2.76.13/24] handle="k8s-pod-network.d5ad4db75e6d395306ef6f90cb5e3af69f4014e6e9a100e2fb2aa54a370abdcf" host="suraj-lk-cluster-pool-1-worker-2"
2020-05-05 08:00:12.345 [INFO][16173] ipam.go 569: Auto-assigned 1 out of 1 IPv4s: [10.2.76.13/24] handle="k8s-pod-network.d5ad4db75e6d395306ef6f90cb5e3af69f4014e6e9a100e2fb2aa54a370abdcf" host="suraj-lk-cluster-pool-1-worker-2"
2020-05-05 08:00:12.345 [INFO][16173] ipam_plugin.go 235: Calico CNI IPAM assigned addresses IPv4=[10.2.76.13/24] IPv6=[] ContainerID="d5ad4db75e6d395306ef6f90cb5e3af69f4014e6e9a100e2fb2aa54a370abdcf" HandleID="k8s-pod-network.d5ad4db75e6d395306ef6f90cb5e3af69f4014e6e9a100e2fb2aa54a370abdcf" Workload="suraj--lk--cluster--pool--1--worker--2-k8s-rook--ceph--osd--1--58787d844c--kbvhl-eth0"
2020-05-05 08:00:12.345 [INFO][16173] ipam_plugin.go 261: IPAM Result ContainerID="d5ad4db75e6d395306ef6f90cb5e3af69f4014e6e9a100e2fb2aa54a370abdcf" HandleID="k8s-pod-network.d5ad4db75e6d395306ef6f90cb5e3af69f4014e6e9a100e2fb2aa54a370abdcf" Workload="suraj--lk--cluster--pool--1--worker--2-k8s-rook--ceph--osd--1--58787d844c--kbvhl-eth0" result.IPs=[]*current.IPConfig{(*current.IPConfig)(0xc000119c80)}
2020-05-05 08:00:12.346 [INFO][16153] k8s.go 358: Populated endpoint ContainerID="d5ad4db75e6d395306ef6f90cb5e3af69f4014e6e9a100e2fb2aa54a370abdcf" Namespace="rook" Pod="rook-ceph-osd-1-58787d844c-kbvhl" WorkloadEndpoint="suraj--lk--cluster--pool--1--worker--2-k8s-rook--ceph--osd--1--58787d844c--kbvhl-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"suraj--lk--cluster--pool--1--worker--2-k8s-rook--ceph--osd--1--58787d844c--kbvhl-eth0", GenerateName:"rook-ceph-osd-1-58787d844c-", Namespace:"rook", SelfLink:"", UID:"c2e9690a-6ba6-4df2-a52c-874554237d9e", ResourceVersion:"13342", Generation:0, CreationTimestamp:v1.Time{Time:time.Time{wall:0x0, ext:63724262411, loc:(*time.Location)(0x299c340)}}, DeletionTimestamp:(*v1.Time)(nil), DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app":"rook-ceph-osd", "ceph-osd-id":"1", "failure-domain":"suraj-lk-cluster-pool-1-worker-2", "pod-template-hash":"58787d844c", "portable":"false", "projectcalico.org/namespace":"rook", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"rook-ceph-osd", "rook_cluster":"rook"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ClusterName:"", ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"suraj-lk-cluster-pool-1-worker-2", ContainerID:"", Pod:"rook-ceph-osd-1-58787d844c-kbvhl", Endpoint:"eth0", IPNetworks:[]string{"10.2.76.13/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.rook", "ksa.rook.rook-ceph-osd"}, InterfaceName:"cali053489b3808", MAC:"", Ports:[]v3.EndpointPort(nil)}}
2020-05-05 08:00:12.346 [INFO][16153] k8s.go 359: Calico CNI using IPs: [10.2.76.13/32] ContainerID="d5ad4db75e6d395306ef6f90cb5e3af69f4014e6e9a100e2fb2aa54a370abdcf" Namespace="rook" Pod="rook-ceph-osd-1-58787d844c-kbvhl" WorkloadEndpoint="suraj--lk--cluster--pool--1--worker--2-k8s-rook--ceph--osd--1--58787d844c--kbvhl-eth0"
2020-05-05 08:00:12.346 [INFO][16153] network_linux.go 76: Setting the host side veth name to cali053489b3808 ContainerID="d5ad4db75e6d395306ef6f90cb5e3af69f4014e6e9a100e2fb2aa54a370abdcf" Namespace="rook" Pod="rook-ceph-osd-1-58787d844c-kbvhl" WorkloadEndpoint="suraj--lk--cluster--pool--1--worker--2-k8s-rook--ceph--osd--1--58787d844c--kbvhl-eth0"
2020-05-05 08:00:12.347 [INFO][16153] network_linux.go 396: Disabling IPv4 forwarding ContainerID="d5ad4db75e6d395306ef6f90cb5e3af69f4014e6e9a100e2fb2aa54a370abdcf" Namespace="rook" Pod="rook-ceph-osd-1-58787d844c-kbvhl" WorkloadEndpoint="suraj--lk--cluster--pool--1--worker--2-k8s-rook--ceph--osd--1--58787d844c--kbvhl-eth0"
2020-05-05 08:00:12.362 [INFO][16153] k8s.go 385: Added Mac, interface name, and active container ID to endpoint ContainerID="d5ad4db75e6d395306ef6f90cb5e3af69f4014e6e9a100e2fb2aa54a370abdcf" Namespace="rook" Pod="rook-ceph-osd-1-58787d844c-kbvhl" WorkloadEndpoint="suraj--lk--cluster--pool--1--worker--2-k8s-rook--ceph--osd--1--58787d844c--kbvhl-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"suraj--lk--cluster--pool--1--worker--2-k8s-rook--ceph--osd--1--58787d844c--kbvhl-eth0", GenerateName:"rook-ceph-osd-1-58787d844c-", Namespace:"rook", SelfLink:"", UID:"c2e9690a-6ba6-4df2-a52c-874554237d9e", ResourceVersion:"13342", Generation:0, CreationTimestamp:v1.Time{Time:time.Time{wall:0x0, ext:63724262411, loc:(*time.Location)(0x299c340)}}, DeletionTimestamp:(*v1.Time)(nil), DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app":"rook-ceph-osd", "ceph-osd-id":"1", "failure-domain":"suraj-lk-cluster-pool-1-worker-2", "pod-template-hash":"58787d844c", "portable":"false", "projectcalico.org/namespace":"rook", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"rook-ceph-osd", "rook_cluster":"rook"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ClusterName:"", ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"suraj-lk-cluster-pool-1-worker-2", ContainerID:"d5ad4db75e6d395306ef6f90cb5e3af69f4014e6e9a100e2fb2aa54a370abdcf", Pod:"rook-ceph-osd-1-58787d844c-kbvhl", Endpoint:"eth0", IPNetworks:[]string{"10.2.76.13/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.rook", "ksa.rook.rook-ceph-osd"}, InterfaceName:"cali053489b3808", MAC:"e6:ff:13:8a:e0:e5", Ports:[]v3.EndpointPort(nil)}}
2020-05-05 08:00:12.373 [INFO][16153] k8s.go 417: Wrote updated endpoint to datastore ContainerID="d5ad4db75e6d395306ef6f90cb5e3af69f4014e6e9a100e2fb2aa54a370abdcf" Namespace="rook" Pod="rook-ceph-osd-1-58787d844c-kbvhl" WorkloadEndpoint="suraj--lk--cluster--pool--1--worker--2-k8s-rook--ceph--osd--1--58787d844c--kbvhl-eth0"
I0505 08:00:12.508253 2827 topology_manager.go:233] [topologymanager] Topology Admit Handler
I0505 08:00:12.573085 2827 reconciler.go:224] operationExecutor.VerifyControllerAttachedVolume started for volume "rook-ceph-osd-token-4dtzj" (UniqueName: "kubernetes.io/secret/27c38234-78be-4830-a7f4-eea4018cea52-rook-ceph-osd-token-4dtzj") pod "rook-ceph-osd-16-5df96f74dc-5rv7s" (UID: "27c38234-78be-4830-a7f4-eea4018cea52")
I0505 08:00:12.573144 2827 reconciler.go:224] operationExecutor.VerifyControllerAttachedVolume started for volume "rook-config-override" (UniqueName: "kubernetes.io/configmap/27c38234-78be-4830-a7f4-eea4018cea52-rook-config-override") pod "rook-ceph-osd-16-5df96f74dc-5rv7s" (UID: "27c38234-78be-4830-a7f4-eea4018cea52")
I0505 08:00:12.573182 2827 reconciler.go:224] operationExecutor.VerifyControllerAttachedVolume started for volume "rook-ceph-log" (UniqueName: "kubernetes.io/host-path/27c38234-78be-4830-a7f4-eea4018cea52-rook-ceph-log") pod "rook-ceph-osd-16-5df96f74dc-5rv7s" (UID: "27c38234-78be-4830-a7f4-eea4018cea52")
I0505 08:00:12.573362 2827 reconciler.go:224] operationExecutor.VerifyControllerAttachedVolume started for volume "devices" (UniqueName: "kubernetes.io/host-path/27c38234-78be-4830-a7f4-eea4018cea52-devices") pod "rook-ceph-osd-16-5df96f74dc-5rv7s" (UID: "27c38234-78be-4830-a7f4-eea4018cea52")
I0505 08:00:12.573400 2827 reconciler.go:224] operationExecutor.VerifyControllerAttachedVolume started for volume "run-udev" (UniqueName: "kubernetes.io/host-path/27c38234-78be-4830-a7f4-eea4018cea52-run-udev") pod "rook-ceph-osd-16-5df96f74dc-5rv7s" (UID: "27c38234-78be-4830-a7f4-eea4018cea52")
I0505 08:00:12.573436 2827 reconciler.go:224] operationExecutor.VerifyControllerAttachedVolume started for volume "activate-osd" (UniqueName: "kubernetes.io/empty-dir/27c38234-78be-4830-a7f4-eea4018cea52-activate-osd") pod "rook-ceph-osd-16-5df96f74dc-5rv7s" (UID: "27c38234-78be-4830-a7f4-eea4018cea52")
I0505 08:00:12.573466 2827 reconciler.go:224] operationExecutor.VerifyControllerAttachedVolume started for volume "rook-data" (UniqueName: "kubernetes.io/host-path/27c38234-78be-4830-a7f4-eea4018cea52-rook-data") pod "rook-ceph-osd-16-5df96f74dc-5rv7s" (UID: "27c38234-78be-4830-a7f4-eea4018cea52")
I0505 08:00:12.573501 2827 reconciler.go:224] operationExecutor.VerifyControllerAttachedVolume started for volume "rook-ceph-crash" (UniqueName: "kubernetes.io/host-path/27c38234-78be-4830-a7f4-eea4018cea52-rook-ceph-crash") pod "rook-ceph-osd-16-5df96f74dc-5rv7s" (UID: "27c38234-78be-4830-a7f4-eea4018cea52")
I0505 08:00:13.352050 2827 kubelet_resources.go:45] allocatable: map[cpu:{{32 0} {<nil>} 32 DecimalSI} ephemeral-storage:{{108249701812 0} {<nil>} 108249701812 DecimalSI} hugepages-1Gi:{{0 0} {<nil>} 0 DecimalSI} hugepages-2Mi:{{0 0} {<nil>} 0 DecimalSI} memory:{{135041626112 0} {<nil>} 131876588Ki BinarySI} pods:{{110 0} {<nil>} 110 DecimalSI}]
I0505 08:00:13.352137 2827 kubelet_resources.go:45] allocatable: map[cpu:{{32 0} {<nil>} 32 DecimalSI} ephemeral-storage:{{108249701812 0} {<nil>} 108249701812 DecimalSI} hugepages-1Gi:{{0 0} {<nil>} 0 DecimalSI} hugepages-2Mi:{{0 0} {<nil>} 0 DecimalSI} memory:{{135041626112 0} {<nil>} 131876588Ki BinarySI} pods:{{110 0} {<nil>} 110 DecimalSI}]
I0505 08:00:13.352204 2827 kubelet_resources.go:45] allocatable: map[cpu:{{32 0} {<nil>} 32 DecimalSI} ephemeral-storage:{{108249701812 0} {<nil>} 108249701812 DecimalSI} hugepages-1Gi:{{0 0} {<nil>} 0 DecimalSI} hugepages-2Mi:{{0 0} {<nil>} 0 DecimalSI} memory:{{135041626112 0} {<nil>} 131876588Ki BinarySI} pods:{{110 0} {<nil>} 110 DecimalSI}]
I0505 08:00:13.352248 2827 kubelet_resources.go:45] allocatable: map[cpu:{{32 0} {<nil>} 32 DecimalSI} ephemeral-storage:{{108249701812 0} {<nil>} 108249701812 DecimalSI} hugepages-1Gi:{{0 0} {<nil>} 0 DecimalSI} hugepages-2Mi:{{0 0} {<nil>} 0 DecimalSI} memory:{{135041626112 0} {<nil>} 131876588Ki BinarySI} pods:{{110 0} {<nil>} 110 DecimalSI}]
W0505 08:00:13.531969 2827 pod_container_deletor.go:77] Container "197c08bb9fa7d2b237749df44615c6d498acf850f913e3d14719b204a35bac38" not found in pod's containers
2020-05-05 08:00:13.550 [INFO][16608] plugin.go 129: Extracted identifiers EndpointIDs=&utils.WEPIdentifiers{Namespace:"rook", WEPName:"", WorkloadEndpointIdentifiers:names.WorkloadEndpointIdentifiers{Node:"suraj-lk-cluster-pool-1-worker-2", Orchestrator:"k8s", Endpoint:"eth0", Workload:"", Pod:"rook-ceph-osd-16-5df96f74dc-5rv7s", ContainerID:"197c08bb9fa7d2b237749df44615c6d498acf850f913e3d14719b204a35bac38"}}
2020-05-05 08:00:13.582 [INFO][16608] plugin.go 220: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {suraj--lk--cluster--pool--1--worker--2-k8s-rook--ceph--osd--16--5df96f74dc--5rv7s-eth0 rook-ceph-osd-16-5df96f74dc- rook 27c38234-78be-4830-a7f4-eea4018cea52 13391 0 2020-05-05 08:00:12 +0000 UTC <nil> <nil> map[app:rook-ceph-osd ceph-osd-id:16 failure-domain:suraj-lk-cluster-pool-1-worker-2 pod-template-hash:5df96f74dc portable:false projectcalico.org/namespace:rook projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:rook-ceph-osd rook_cluster:rook] map[] [] [] []} {k8s suraj-lk-cluster-pool-1-worker-2 rook-ceph-osd-16-5df96f74dc-5rv7s eth0 [] [] [kns.rook ksa.rook.rook-ceph-osd] calid482c6effaa []}} ContainerID="197c08bb9fa7d2b237749df44615c6d498acf850f913e3d14719b204a35bac38" Namespace="rook" Pod="rook-ceph-osd-16-5df96f74dc-5rv7s" WorkloadEndpoint="suraj--lk--cluster--pool--1--worker--2-k8s-rook--ceph--osd--16--5df96f74dc--5rv7s-"
2020-05-05 08:00:13.582 [INFO][16608] k8s.go 60: Extracted identifiers for CmdAddK8s ContainerID="197c08bb9fa7d2b237749df44615c6d498acf850f913e3d14719b204a35bac38" Namespace="rook" Pod="rook-ceph-osd-16-5df96f74dc-5rv7s" WorkloadEndpoint="suraj--lk--cluster--pool--1--worker--2-k8s-rook--ceph--osd--16--5df96f74dc--5rv7s-eth0"
2020-05-05 08:00:13.587 [INFO][16608] k8s.go 775: namespace info &Namespace{ObjectMeta:{rook /api/v1/namespaces/rook 346082b0-52fc-4520-8796-5afff38e4f6c 7478 0 2020-05-05 07:45:13 +0000 UTC <nil> <nil> map[] map[] [] [] [{lokoctl Update v1 2020-05-05 07:45:13 +0000 UTC FieldsV1 FieldsV1{Raw:*[123 34 102 58 115 116 97 116 117 115 34 58 123 34 102 58 112 104 97 115 101 34 58 123 125 125 125],}}]},Spec:NamespaceSpec{Finalizers:[kubernetes],},Status:NamespaceStatus{Phase:Active,Conditions:[]NamespaceCondition{},},}
2020-05-05 08:00:13.593 [INFO][16608] k8s.go 784: pod info &Pod{ObjectMeta:{rook-ceph-osd-16-5df96f74dc-5rv7s rook-ceph-osd-16-5df96f74dc- rook /api/v1/namespaces/rook/pods/rook-ceph-osd-16-5df96f74dc-5rv7s 27c38234-78be-4830-a7f4-eea4018cea52 13391 0 2020-05-05 08:00:12 +0000 UTC <nil> <nil> map[app:rook-ceph-osd ceph-osd-id:16 failure-domain:suraj-lk-cluster-pool-1-worker-2 pod-template-hash:5df96f74dc portable:false rook_cluster:rook] map[kubernetes.io/psp:00-rook-ceph-operator seccomp.security.alpha.kubernetes.io/pod:runtime/default] [{apps/v1 ReplicaSet rook-ceph-osd-16-5df96f74dc 231cb6b0-5f96-4ae7-aa9d-3791f1175558 0xc0003a0dd7 0xc0003a0dd8}] [] []},Spec:PodSpec{Volumes:[]Volume{Volume{Name:rook-data,VolumeSource:VolumeSource{HostPath:&HostPathVolumeSource{Path:/var/lib/rook,Type:*,},EmptyDir:nil,GCEPersistentDisk:nil,AWSElasticBlockStore:nil,GitRepo:nil,Secret:nil,NFS:nil,ISCSI:nil,Glusterfs:nil,PersistentVolumeClaim:nil,RBD:nil,FlexVolume:nil,Cinder:nil,CephFS:nil,Flocker:nil,DownwardAPI:nil,FC:nil,AzureFile:nil,ConfigMap:nil,VsphereVolume:nil,Quobyte:nil,AzureDisk:nil,PhotonPersistentDisk:nil,PortworxVolume:nil,ScaleIO:nil,Projected:nil,StorageOS:nil,CSI:nil,},},Volume{Name:rook-config-override,VolumeSource:VolumeSource{HostPath:nil,EmptyDir:nil,GCEPersistentDisk:nil,AWSElasticBlockStore:nil,GitRepo:nil,Secret:nil,NFS:nil,ISCSI:nil,Glusterfs:nil,PersistentVolumeClaim:nil,RBD:nil,FlexVolume:nil,Cinder:nil,CephFS:nil,Flocker:nil,DownwardAPI:nil,FC:nil,AzureFile:nil,ConfigMap:&ConfigMapVolumeSource{LocalObjectReference:LocalObjectReference{Name:rook-config-override,},Items:[]KeyToPath{KeyToPath{Key:config,Path:ceph.conf,Mode:*292,},},DefaultMode:*420,Optional:nil,},VsphereVolume:nil,Quobyte:nil,AzureDisk:nil,PhotonPersistentDisk:nil,PortworxVolume:nil,ScaleIO:nil,Projected:nil,StorageOS:nil,CSI:nil,},},Volume{Name:rook-ceph-log,VolumeSource:VolumeSource{HostPath:&HostPathVolumeSource{Path:/var/lib/rook/rook/log,Type:*,},EmptyDir:nil,GCEPersistentDisk:nil,AWSElasticBlockStore:nil,GitRepo:nil,Secret:nil,NFS:nil,ISCSI:nil,Glusterfs:nil,PersistentVolumeClaim:nil,RBD:nil,FlexVolume:nil,Cinder:nil,CephFS:nil,Flocker:nil,DownwardAPI:nil,FC:nil,AzureFile:nil,ConfigMap:nil,VsphereVolume:nil,Quobyte:nil,AzureDisk:nil,PhotonPersistentDisk:nil,PortworxVolume:nil,ScaleIO:nil,Projected:nil,StorageOS:nil,CSI:nil,},},Volume{Name:rook-ceph-crash,VolumeSource:VolumeSource{HostPath:&HostPathVolumeSource{Path:/var/lib/rook/rook/crash,Type:*,},EmptyDir:nil,GCEPersistentDisk:nil,AWSElasticBlockStore:nil,GitRepo:nil,Secret:nil,NFS:nil,ISCSI:nil,Glusterfs:nil,PersistentVolumeClaim:nil,RBD:nil,FlexVolume:nil,Cinder:nil,CephFS:nil,Flocker:nil,DownwardAPI:nil,FC:nil,AzureFile:nil,ConfigMap:nil,VsphereVolume:nil,Quobyte:nil,AzureDisk:nil,PhotonPersistentDisk:nil,PortworxVolume:nil,ScaleIO:nil,Projected:nil,StorageOS:nil,CSI:nil,},},Volume{Name:devices,VolumeSource:VolumeSource{HostPath:&HostPathVolumeSource{Path:/dev,Type:*,},EmptyDir:nil,GCEPersistentDisk:nil,AWSElasticBlockStore:nil,GitRepo:nil,Secret:nil,NFS:nil,ISCSI:nil,Glusterfs:nil,PersistentVolumeClaim:nil,RBD:nil,FlexVolume:nil,Cinder:nil,CephFS:nil,Flocker:nil,DownwardAPI:nil,FC:nil,AzureFile:nil,ConfigMap:nil,VsphereVolume:nil,Quobyte:nil,AzureDisk:nil,PhotonPersistentDisk:nil,PortworxVolume:nil,ScaleIO:nil,Projected:nil,StorageOS:nil,CSI:nil,},},Volume{Name:run-udev,VolumeSource:VolumeSource{HostPath:&HostPathVolumeSource{Path:/run/udev,Type:*,},EmptyDir:nil,GCEPersistentDisk:nil,AWSElasticBlockStore:nil,GitRepo:nil,Secret:nil,NFS:nil,ISCSI:nil,Glusterfs:nil,PersistentVolumeClaim:nil,RBD:nil,FlexVolume:nil,Cinder:nil,CephFS:nil,Flocker:nil,DownwardAPI:nil,FC:nil,AzureFile:nil,ConfigMap:nil,VsphereVolume:nil,Quobyte:nil,AzureDisk:nil,PhotonPersistentDisk:nil,PortworxVolume:nil,ScaleIO:nil,Projected:nil,StorageOS:nil,CSI:nil,},},Volume{Name:activate-osd,VolumeSource:VolumeSource{HostPath:nil,EmptyDir:&EmptyDirVolumeSource{Medium:,SizeLimit:<nil>,},GCEPersistentDisk:nil,AWSElasticBlockStore:nil,GitRepo:nil,Secret:nil,NFS:nil,ISCSI:nil,Glusterfs:nil,PersistentVolumeClaim:nil,RBD:nil,FlexVolume:nil,Cinder:nil,CephFS:nil,Flocker:nil,DownwardAPI:nil,FC:nil,AzureFile:nil,ConfigMap:nil,VsphereVolume:nil,Quobyte:nil,AzureDisk:nil,PhotonPersistentDisk:nil,PortworxVolume:nil,ScaleIO:nil,Projected:nil,StorageOS:nil,CSI:nil,},},Volume{Name:rook-ceph-osd-token-4dtzj,VolumeSource:VolumeSource{HostPath:nil,EmptyDir:nil,GCEPersistentDisk:nil,AWSElasticBlockStore:nil,GitRepo:nil,Secret:&SecretVolumeSource{SecretName:rook-ceph-osd-token-4dtzj,Items:[]KeyToPath{},DefaultMode:*420,Optional:nil,},NFS:nil,ISCSI:nil,Glusterfs:nil,PersistentVolumeClaim:nil,RBD:nil,FlexVolume:nil,Cinder:nil,CephFS:nil,Flocker:nil,DownwardAPI:nil,FC:nil,AzureFile:nil,ConfigMap:nil,VsphereVolume:nil,Quobyte:nil,AzureDisk:nil,PhotonPersistentDisk:nil,PortworxVolume:nil,ScaleIO:nil,Projected:nil,StorageOS:nil,CSI:nil,},},},Containers:[]Container{Container{Name:osd,Image:ceph/ceph:v14.2.8,Command:[ceph-osd],Args:[--foreground --id 16 --fsid 504a40d4-a2d3-4492-a37d-27a58464958f --setuser ceph --setgroup ceph --crush-location=root=default host=suraj-lk-cluster-pool-1-worker-2 --default-log-to-file false --ms-learn-addr-from-peer=false],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:ROOK_NODE_NAME,Value:suraj-lk-cluster-pool-1-worker-2,ValueFrom:nil,},EnvVar{Name:ROOK_CLUSTER_ID,Value:77795d4f-fb40-4886-bc74-cb2065f0c61f,ValueFrom:nil,},EnvVar{Name:ROOK_PRIVATE_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},EnvVar{Name:ROOK_PUBLIC_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},EnvVar{Name:ROOK_CLUSTER_NAME,Value:rook,ValueFrom:nil,},EnvVar{Name:ROOK_MON_ENDPOINTS,Value:,ValueFrom:&EnvVarSource{FieldRef:nil,ResourceFieldRef:nil,ConfigMapKeyRef:&ConfigMapKeySelector{LocalObjectReference:LocalObjectReference{Name:rook-ceph-mon-endpoints,},Key:data,Optional:nil,},SecretKeyRef:nil,},},EnvVar{Name:ROOK_MON_SECRET,Value:,ValueFrom:&EnvVarSource{FieldRef:nil,ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:&SecretKeySelector{LocalObjectReference:LocalObjectReference{Name:rook-ceph-mon,},Key:mon-secret,Optional:nil,},},},EnvVar{Name:ROOK_ADMIN_SECRET,Value:,ValueFrom:&EnvVarSource{FieldRef:nil,ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:&SecretKeySelector{LocalObjectReference:LocalObjectReference{Name:rook-ceph-mon,},Key:admin-secret,Optional:nil,},},},EnvVar{Name:ROOK_CONFIG_DIR,Value:/var/lib/rook,ValueFrom:nil,},EnvVar{Name:ROOK_CEPH_CONFIG_OVERRIDE,Value:/etc/rook/config/override.conf,ValueFrom:nil,},EnvVar{Name:ROOK_FSID,Value:,ValueFrom:&EnvVarSource{FieldRef:nil,ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:&SecretKeySelector{LocalObjectReference:LocalObjectReference{Name:rook-ceph-mon,},Key:fsid,Optional:nil,},},},EnvVar{Name:NODE_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:spec.nodeName,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},EnvVar{Name:ROOK_CRUSHMAP_HOSTNAME,Value:suraj-lk-cluster-pool-1-worker-2,ValueFrom:nil,},EnvVar{Name:CEPH_VOLUME_DEBUG,Value:1,ValueFrom:nil,},EnvVar{Name:CEPH_VOLUME_SKIP_RESTORECON,Value:1,ValueFrom:nil,},EnvVar{Name:DM_DISABLE_UDEV,Value:1,ValueFrom:nil,},EnvVar{Name:ROOK_OSDS_PER_DEVICE,Value:1,ValueFrom:nil,},EnvVar{Name:TINI_SUBREAPER,Value:,ValueFrom:nil,},EnvVar{Name:CONTAINER_IMAGE,Value:ceph/ceph:v14.2.8,ValueFrom:nil,},EnvVar{Name:POD_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:metadata.name,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},EnvVar{Name:POD_NAMESPACE,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:metadata.namespace,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},EnvVar{Name:NODE_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:spec.nodeName,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},EnvVar{Name:POD_MEMORY_LIMIT,Value:,ValueFrom:&EnvVarSource{FieldRef:nil,ResourceFieldRef:&ResourceFieldSelector{ContainerName:,Resource:limits.memory,Divisor:{{0 0} {<nil>} 0 DecimalSI},},ConfigMapKeyRef:nil,SecretKeyRef:nil,},},EnvVar{Name:POD_MEMORY_REQUEST,Value:,ValueFrom:&EnvVarSource{FieldRef:nil,ResourceFieldRef:&ResourceFieldSelector{ContainerName:,Resource:requests.memory,Divisor:{{0 0} {<nil>} 0 DecimalSI},},ConfigMapKeyRef:nil,SecretKeyRef:nil,},},EnvVar{Name:POD_CPU_LIMIT,Value:,ValueFrom:&EnvVarSource{FieldRef:nil,ResourceFieldRef:&ResourceFieldSelector{ContainerName:,Resource:limits.cpu,Divisor:{{1 0} {<nil>} 1 DecimalSI},},ConfigMapKeyRef:nil,SecretKeyRef:nil,},},EnvVar{Name:POD_CPU_REQUEST,Value:,ValueFrom:&EnvVarSource{FieldRef:nil,ResourceFieldRef:&ResourceFieldSelector{ContainerName:,Resource:requests.cpu,Divisor:{{0 0} {<nil>} 0 DecimalSI},},ConfigMapKeyRef:nil,SecretKeyRef:nil,},},EnvVar{Name:ROOK_OSD_UUID,Value:2d1ceeef-dff4-41c9-8b80-f700a6221265,ValueFrom:nil,},EnvVar{Name:ROOK_OSD_ID,Value:16,ValueFrom:nil,},EnvVar{Name:ROOK_OSD_STORE_TYPE,Value:bluestore,ValueFrom:nil,},EnvVar{Name:ROOK_CEPH_MON_HOST,Value:,ValueFrom:&EnvVarSource{FieldRef:nil,ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:&SecretKeySelector{LocalObjectReference:LocalObjectReference{Name:rook-ceph-config,},Key:mon_host,Optional:nil,},},},EnvVar{Name:CEPH_ARGS,Value:-m $(ROOK_CEPH_MON_HOST),ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:rook-data,ReadOnly:false,MountPath:/var/lib/rook,SubPath:,MountPropagation:nil,SubPathExpr:,},VolumeMount{Name:rook-config-override,ReadOnly:true,MountPath:/etc/ceph,SubPath:,MountPropagation:nil,SubPathExpr:,},VolumeMount{Name:rook-ceph-log,ReadOnly:false,MountPath:/var/log/ceph,SubPath:,MountPropagation:nil,SubPathExpr:,},VolumeMount{Name:rook-ceph-crash,ReadOnly:false,MountPath:/var/lib/ceph/crash,SubPath:,MountPropagation:nil,SubPathExpr:,},VolumeMount{Name:devices,ReadOnly:false,MountPath:/dev,SubPath:,MountPropagation:nil,SubPathExpr:,},VolumeMount{Name:run-udev,ReadOnly:false,MountPath:/run/udev,SubPath:,MountPropagation:nil,SubPathExpr:,},VolumeMount{Name:activate-osd,ReadOnly:false,MountPath:/var/lib/ceph/osd/ceph-16,SubPath:,MountPropagation:nil,SubPathExpr:,},VolumeMount{Name:rook-ceph-osd-token-4dtzj,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,},},LivenessProbe:&Probe{Handler:Handler{Exec:&ExecAction{Command:[env -i sh -c ceph --admin-daemon /run/ceph/ceph-osd.16.asok status],},HTTPGet:nil,TCPSocket:nil,},InitialDelaySeconds:10,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,},ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:nil,Privileged:*true,SELinuxOptions:nil,RunAsUser:*0,RunAsNonRoot:nil,ReadOnlyRootFilesystem:*false,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,},},RestartPolicy:Always,TerminationGracePeriodSeconds:*30,ActiveDeadlineSeconds:nil,DNSPolicy:ClusterFirst,NodeSelector:map[string]string{kubernetes.io/hostname: suraj-lk-cluster-pool-1-worker-2,},ServiceAccountName:rook-ceph-osd,DeprecatedServiceAccount:rook-ceph-osd,NodeName:suraj-lk-cluster-pool-1-worker-2,HostNetwork:false,HostPID:true,HostIPC:false,SecurityContext:&PodSecurityContext{SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,SupplementalGroups:[],FSGroup:nil,RunAsGroup:nil,Sysctls:[]Sysctl{},WindowsOptions:nil,},ImagePullSecrets:[]LocalObjectReference{},Hostname:,Subdomain:,Affinity:&Affinity{NodeAffinity:&NodeAffinity{RequiredDuringSchedulingIgnoredDuringExecution:&NodeSelector{NodeSelectorTerms:[]NodeSelectorTerm{NodeSelectorTerm{MatchExpressions:[]NodeSelectorRequirement{NodeSelectorRequirement{Key:storage.lokomotive.io,Operator:Exists,Values:[],},},MatchFields:[]NodeSelectorRequirement{},},},},PreferredDuringSchedulingIgnoredDuringExecution:[]PreferredSchedulingTerm{},},PodAffinity:nil,PodAntiAffinity:nil,},SchedulerName:default-scheduler,InitContainers:[]Container{Container{Name:activate,Image:ceph/ceph:v14.2.8,Command:[/bin/bash -c
set -ex
OSD_ID=16
OSD_UUID=2d1ceeef-dff4-41c9-8b80-f700a6221265
OSD_STORE_FLAG="--bluestore"
OSD_DATA_DIR=/var/lib/ceph/osd/ceph-"$OSD_ID"
CV_MODE=lvm
DEVICE=
METADATA_DEVICE="$ROOK_METADATA_DEVICE"
# active the osd with ceph-volume
if [[ "$CV_MODE" == "lvm" ]]; then
TMP_DIR=$(mktemp -d)
# activate osd
ceph-volume "$CV_MODE" activate --no-systemd "$OSD_STORE_FLAG" "$OSD_ID" "$OSD_UUID"
# copy the tmpfs directory to a temporary directory
# this is needed because when the init container exits, the tmpfs goes away and its content with it
# this will result in the emptydir to be empty when accessed by the main osd container
cp --verbose --no-dereference "$OSD_DATA_DIR"/* "$TMP_DIR"/
# unmount the tmpfs since we don't need it anymore
umount "$OSD_DATA_DIR"
# copy back the content of the tmpfs into the original osd directory
cp --verbose --no-dereference "$TMP_DIR"/* "$OSD_DATA_DIR"
# retain ownership of files to the ceph user/group
chown --verbose --recursive ceph:ceph "$OSD_DATA_DIR"
# remove the temporary directory
rm --recursive --force "$TMP_DIR"
else
ARGS=(--device ${DEVICE} --no-systemd --no-tmpfs)
if [ -n "$METADATA_DEVICE" ]; then
ARGS+=(--block.db ${METADATA_DEVICE})
fi
# ceph-volume raw mode only supports bluestore so we don't need to pass a store flag
ceph-volume "$CV_MODE" activate "${ARGS[@]}"
fi
],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CEPH_VOLUME_DEBUG,Value:1,ValueFrom:nil,},EnvVar{Name:CEPH_VOLUME_SKIP_RESTORECON,Value:1,ValueFrom:nil,},EnvVar{Name:DM_DISABLE_UDEV,Value:1,ValueFrom:nil,},EnvVar{Name:ROOK_CEPH_MON_HOST,Value:,ValueFrom:&EnvVarSource{FieldRef:nil,ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:&SecretKeySelector{LocalObjectReference:LocalObjectReference{Name:rook-ceph-config,},Key:mon_host,Optional:nil,},},},EnvVar{Name:CEPH_ARGS,Value:-m $(ROOK_CEPH_MON_HOST),ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:activate-osd,ReadOnly:false,MountPath:/var/lib/ceph/osd/ceph-16,SubPath:,MountPropagation:nil,SubPathExpr:,},VolumeMount{Name:devices,ReadOnly:false,MountPath:/dev,SubPath:,MountPropagation:nil,SubPathExpr:,},VolumeMount{Name:rook-config-override,ReadOnly:true,MountPath:/etc/ceph,SubPath:,MountPropagation:nil,SubPathExpr:,},VolumeMount{Name:rook-ceph-osd-token-4dtzj,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:nil,Privileged:*true,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,},Container{Name:chown-container-data-dir,Image:ceph/ceph:v14.2.8,Command:[chown],Args:[--verbose --recursive ceph:ceph /var/log/ceph /var/lib/ceph/crash],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:rook-data,ReadOnly:false,MountPath:/var/lib/rook,SubPath:,MountPropagation:nil,SubPathExpr:,},VolumeMount{Name:rook-config-override,ReadOnly:true,MountPath:/etc/ceph,SubPath:,MountPropagation:nil,SubPathExpr:,},VolumeMount{Name:rook-ceph-log,ReadOnly:false,MountPath:/var/log/ceph,SubPath:,MountPropagation:nil,SubPathExpr:,},VolumeMount{Name:rook-ceph-crash,ReadOnly:false,MountPath:/var/lib/ceph/crash,SubPath:,MountPropagation:nil,SubPathExpr:,},VolumeMount{Name:devices,ReadOnly:false,MountPath:/dev,SubPath:,MountPropagation:nil,SubPathExpr:,},VolumeMount{Name:run-udev,ReadOnly:false,MountPath:/run/udev,SubPath:,MountPropagation:nil,SubPathExpr:,},VolumeMount{Name:activate-osd,ReadOnly:false,MountPath:/var/lib/ceph/osd/ceph-16,SubPath:,MountPropagation:nil,SubPathExpr:,},VolumeMount{Name:rook-ceph-osd-token-4dtzj,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:nil,Privileged:*true,SELinuxOptions:nil,RunAsUser:*0,RunAsNonRoot:nil,ReadOnlyRootFilesystem:*false,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,},},AutomountServiceAccountToken:nil,Tolerations:[]Toleration{Toleration{Key:storage.lokomotive.io,Operator:Equal,Value:ceph,Effect:NoSchedule,TolerationSeconds:nil,},Toleration{Key:node.kubernetes.io/not-ready,Operator:Exists,Value:,Effect:NoExecute,TolerationSeconds:*300,},Toleration{Key:node.kubernetes.io/unreachable,Operator:Exists,Value:,Effect:NoExecute,TolerationSeconds:*300,},},HostAliases:[]HostAlias{},PriorityClassName:,Priority:*0,DNSConfig:nil,ShareProcessNamespace:nil,ReadinessGates:[]PodReadinessGate{},RuntimeClassName:nil,EnableServiceLinks:*true,PreemptionPolicy:nil,Overhead:ResourceList{},TopologySpreadConstraints:[]TopologySpreadConstraint{},EphemeralContainers:[]EphemeralContainer{},},Status:PodStatus{Phase:Pending,Conditions:[]PodCondition{PodCondition{Type:Initialized,Status:False,LastProbeTime:0001-01-01 00:00:00 +0000 UTC,LastTransitionTime:2020-05-05 08:00:12 +0000 UTC,Reason:ContainersNotInitialized,Message:containers with incomplete status: [activate chown-container-data-dir],},PodCondition{Type:Ready,Status:False,LastProbeTime:0001-01-01 00:00:00 +0000 UTC,LastTransitionTime:2020-05-05 08:00:12 +0000 UTC,Reason:ContainersNotReady,Message:containers with unready status: [osd],},PodCondition{Type:ContainersReady,Status:False,LastProbeTime:0001-01-01 00:00:00 +0000 UTC,LastTransitionTime:2020-05-05 08:00:12 +0000 UTC,Reason:ContainersNotReady,Message:containers with unready status: [osd],},PodCondition{Type:PodScheduled,Status:True,LastProbeTime:0001-01-01 00:00:00 +0000 UTC,LastTransitionTime:2020-05-05 08:00:12 +0000 UTC,Reason:,Message:,},},Message:,Reason:,HostIP:10.88.81.5,PodIP:,StartTime:2020-05-05 08:00:12 +0000 UTC,ContainerStatuses:[]ContainerStatus{ContainerStatus{Name:osd,State:ContainerState{Waiting:&ContainerStateWaiting{Reason:PodInitializing,Message:,},Running:nil,Terminated:nil,},LastTerminationState:ContainerState{Waiting:nil,Running:nil,Terminated:nil,},Ready:false,RestartCount:0,Image:ceph/ceph:v14.2.8,ImageID:,ContainerID:,Started:*false,},},QOSClass:BestEffort,InitContainerStatuses:[]ContainerStatus{ContainerStatus{Name:activate,State:ContainerState{Waiting:&ContainerStateWaiting{Reason:PodInitializing,Message:,},Running:nil,Terminated:nil,},LastTerminationState:ContainerState{Waiting:nil,Running:nil,Terminated:nil,},Ready:false,RestartCount:0,Image:ceph/ceph:v14.2.8,ImageID:,ContainerID:,Started:nil,},ContainerStatus{Name:chown-container-data-dir,State:ContainerState{Waiting:&ContainerStateWaiting{Reason:PodInitializing,Message:,},Running:nil,Terminated:nil,},LastTerminationState:ContainerState{Waiting:nil,Running:nil,Terminated:nil,},Ready:false,RestartCount:0,Image:ceph/ceph:v14.2.8,ImageID:,ContainerID:,Started:nil,},},NominatedNodeName:,PodIPs:[]PodIP{},EphemeralContainerStatuses:[]ContainerStatus{},},}
2020-05-05 08:00:13.610 [INFO][16645] ipam_plugin.go 211: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="197c08bb9fa7d2b237749df44615c6d498acf850f913e3d14719b204a35bac38" HandleID="k8s-pod-network.197c08bb9fa7d2b237749df44615c6d498acf850f913e3d14719b204a35bac38" Workload="suraj--lk--cluster--pool--1--worker--2-k8s-rook--ceph--osd--16--5df96f74dc--5rv7s-eth0"
2020-05-05 08:00:13.633 [INFO][16645] ipam_plugin.go 223: Calico CNI IPAM handle=k8s-pod-network.197c08bb9fa7d2b237749df44615c6d498acf850f913e3d14719b204a35bac38 ContainerID="197c08bb9fa7d2b237749df44615c6d498acf850f913e3d14719b204a35bac38" HandleID="k8s-pod-network.197c08bb9fa7d2b237749df44615c6d498acf850f913e3d14719b204a35bac38" Workload="suraj--lk--cluster--pool--1--worker--2-k8s-rook--ceph--osd--16--5df96f74dc--5rv7s-eth0"
2020-05-05 08:00:13.633 [INFO][16645] ipam_plugin.go 233: Auto assigning IP ContainerID="197c08bb9fa7d2b237749df44615c6d498acf850f913e3d14719b204a35bac38" HandleID="k8s-pod-network.197c08bb9fa7d2b237749df44615c6d498acf850f913e3d14719b204a35bac38" Workload="suraj--lk--cluster--pool--1--worker--2-k8s-rook--ceph--osd--16--5df96f74dc--5rv7s-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc0002b0d40), Attrs:map[string]string{"namespace":"rook", "node":"suraj-lk-cluster-pool-1-worker-2", "pod":"rook-ceph-osd-16-5df96f74dc-5rv7s"}, Hostname:"suraj-lk-cluster-pool-1-worker-2", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0}
2020-05-05 08:00:13.633 [INFO][16645] ipam.go 87: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'suraj-lk-cluster-pool-1-worker-2'
2020-05-05 08:00:13.642 [INFO][16645] ipam.go 313: Looking up existing affinities for host handle="k8s-pod-network.197c08bb9fa7d2b237749df44615c6d498acf850f913e3d14719b204a35bac38" host="suraj-lk-cluster-pool-1-worker-2"
2020-05-05 08:00:13.649 [INFO][16645] ipam.go 381: Trying affinity for 10.2.76.0/24 handle="k8s-pod-network.197c08bb9fa7d2b237749df44615c6d498acf850f913e3d14719b204a35bac38" host="suraj-lk-cluster-pool-1-worker-2"
2020-05-05 08:00:13.654 [INFO][16645] ipam.go 135: Attempting to load block cidr=10.2.76.0/24 host="suraj-lk-cluster-pool-1-worker-2"
2020-05-05 08:00:13.659 [INFO][16645] ipam.go 212: Affinity is confirmed and block has been loaded cidr=10.2.76.0/24 host="suraj-lk-cluster-pool-1-worker-2"
2020-05-05 08:00:13.659 [INFO][16645] ipam.go 804: Attempting to assign 1 addresses from block block=10.2.76.0/24 handle="k8s-pod-network.197c08bb9fa7d2b237749df44615c6d498acf850f913e3d14719b204a35bac38" host="suraj-lk-cluster-pool-1-worker-2"
2020-05-05 08:00:13.662 [INFO][16645] ipam.go 1265: Creating new handle: k8s-pod-network.197c08bb9fa7d2b237749df44615c6d498acf850f913e3d14719b204a35bac38
2020-05-05 08:00:13.669 [INFO][16645] ipam.go 827: Writing block in order to claim IPs block=10.2.76.0/24 handle="k8s-pod-network.197c08bb9fa7d2b237749df44615c6d498acf850f913e3d14719b204a35bac38" host="suraj-lk-cluster-pool-1-worker-2"
2020-05-05 08:00:13.680 [INFO][16645] ipam.go 840: Successfully claimed IPs: [10.2.76.14/24] block=10.2.76.0/24 handle="k8s-pod-network.197c08bb9fa7d2b237749df44615c6d498acf850f913e3d14719b204a35bac38" host="suraj-lk-cluster-pool-1-worker-2"
2020-05-05 08:00:13.680 [INFO][16645] ipam.go 413: Block '10.2.76.0/24' provided addresses: [10.2.76.14/24] handle="k8s-pod-network.197c08bb9fa7d2b237749df44615c6d498acf850f913e3d14719b204a35bac38" host="suraj-lk-cluster-pool-1-worker-2"
2020-05-05 08:00:13.684 [INFO][16645] ipam.go 569: Auto-assigned 1 out of 1 IPv4s: [10.2.76.14/24] handle="k8s-pod-network.197c08bb9fa7d2b237749df44615c6d498acf850f913e3d14719b204a35bac38" host="suraj-lk-cluster-pool-1-worker-2"
2020-05-05 08:00:13.684 [INFO][16645] ipam_plugin.go 235: Calico CNI IPAM assigned addresses IPv4=[10.2.76.14/24] IPv6=[] ContainerID="197c08bb9fa7d2b237749df44615c6d498acf850f913e3d14719b204a35bac38" HandleID="k8s-pod-network.197c08bb9fa7d2b237749df44615c6d498acf850f913e3d14719b204a35bac38" Workload="suraj--lk--cluster--pool--1--worker--2-k8s-rook--ceph--osd--16--5df96f74dc--5rv7s-eth0"
2020-05-05 08:00:13.684 [INFO][16645] ipam_plugin.go 261: IPAM Result ContainerID="197c08bb9fa7d2b237749df44615c6d498acf850f913e3d14719b204a35bac38" HandleID="k8s-pod-network.197c08bb9fa7d2b237749df44615c6d498acf850f913e3d14719b204a35bac38" Workload="suraj--lk--cluster--pool--1--worker--2-k8s-rook--ceph--osd--16--5df96f74dc--5rv7s-eth0" result.IPs=[]*current.IPConfig{(*current.IPConfig)(0xc0002e6180)}
2020-05-05 08:00:13.685 [INFO][16608] k8s.go 358: Populated endpoint ContainerID="197c08bb9fa7d2b237749df44615c6d498acf850f913e3d14719b204a35bac38" Namespace="rook" Pod="rook-ceph-osd-16-5df96f74dc-5rv7s" WorkloadEndpoint="suraj--lk--cluster--pool--1--worker--2-k8s-rook--ceph--osd--16--5df96f74dc--5rv7s-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"suraj--lk--cluster--pool--1--worker--2-k8s-rook--ceph--osd--16--5df96f74dc--5rv7s-eth0", GenerateName:"rook-ceph-osd-16-5df96f74dc-", Namespace:"rook", SelfLink:"", UID:"27c38234-78be-4830-a7f4-eea4018cea52", ResourceVersion:"13391", Generation:0, CreationTimestamp:v1.Time{Time:time.Time{wall:0x0, ext:63724262412, loc:(*time.Location)(0x299c340)}}, DeletionTimestamp:(*v1.Time)(nil), DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app":"rook-ceph-osd", "ceph-osd-id":"16", "failure-domain":"suraj-lk-cluster-pool-1-worker-2", "pod-template-hash":"5df96f74dc", "portable":"false", "projectcalico.org/namespace":"rook", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"rook-ceph-osd", "rook_cluster":"rook"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ClusterName:"", ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"suraj-lk-cluster-pool-1-worker-2", ContainerID:"", Pod:"rook-ceph-osd-16-5df96f74dc-5rv7s", Endpoint:"eth0", IPNetworks:[]string{"10.2.76.14/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.rook", "ksa.rook.rook-ceph-osd"}, InterfaceName:"calid482c6effaa", MAC:"", Ports:[]v3.EndpointPort(nil)}}
2020-05-05 08:00:13.686 [INFO][16608] k8s.go 359: Calico CNI using IPs: [10.2.76.14/32] ContainerID="197c08bb9fa7d2b237749df44615c6d498acf850f913e3d14719b204a35bac38" Namespace="rook" Pod="rook-ceph-osd-16-5df96f74dc-5rv7s" WorkloadEndpoint="suraj--lk--cluster--pool--1--worker--2-k8s-rook--ceph--osd--16--5df96f74dc--5rv7s-eth0"
2020-05-05 08:00:13.686 [INFO][16608] network_linux.go 76: Setting the host side veth name to calid482c6effaa ContainerID="197c08bb9fa7d2b237749df44615c6d498acf850f913e3d14719b204a35bac38" Namespace="rook" Pod="rook-ceph-osd-16-5df96f74dc-5rv7s" WorkloadEndpoint="suraj--lk--cluster--pool--1--worker--2-k8s-rook--ceph--osd--16--5df96f74dc--5rv7s-eth0"
2020-05-05 08:00:13.687 [INFO][16608] network_linux.go 396: Disabling IPv4 forwarding ContainerID="197c08bb9fa7d2b237749df44615c6d498acf850f913e3d14719b204a35bac38" Namespace="rook" Pod="rook-ceph-osd-16-5df96f74dc-5rv7s" WorkloadEndpoint="suraj--lk--cluster--pool--1--worker--2-k8s-rook--ceph--osd--16--5df96f74dc--5rv7s-eth0"
2020-05-05 08:00:13.708 [INFO][16608] k8s.go 385: Added Mac, interface name, and active container ID to endpoint ContainerID="197c08bb9fa7d2b237749df44615c6d498acf850f913e3d14719b204a35bac38" Namespace="rook" Pod="rook-ceph-osd-16-5df96f74dc-5rv7s" WorkloadEndpoint="suraj--lk--cluster--pool--1--worker--2-k8s-rook--ceph--osd--16--5df96f74dc--5rv7s-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"suraj--lk--cluster--pool--1--worker--2-k8s-rook--ceph--osd--16--5df96f74dc--5rv7s-eth0", GenerateName:"rook-ceph-osd-16-5df96f74dc-", Namespace:"rook", SelfLink:"", UID:"27c38234-78be-4830-a7f4-eea4018cea52", ResourceVersion:"13391", Generation:0, CreationTimestamp:v1.Time{Time:time.Time{wall:0x0, ext:63724262412, loc:(*time.Location)(0x299c340)}}, DeletionTimestamp:(*v1.Time)(nil), DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app":"rook-ceph-osd", "ceph-osd-id":"16", "failure-domain":"suraj-lk-cluster-pool-1-worker-2", "pod-template-hash":"5df96f74dc", "portable":"false", "projectcalico.org/namespace":"rook", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"rook-ceph-osd", "rook_cluster":"rook"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ClusterName:"", ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"suraj-lk-cluster-pool-1-worker-2", ContainerID:"197c08bb9fa7d2b237749df44615c6d498acf850f913e3d14719b204a35bac38", Pod:"rook-ceph-osd-16-5df96f74dc-5rv7s", Endpoint:"eth0", IPNetworks:[]string{"10.2.76.14/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.rook", "ksa.rook.rook-ceph-osd"}, InterfaceName:"calid482c6effaa", MAC:"ba:9f:7f:ed:ec:81", Ports:[]v3.EndpointPort(nil)}}
2020-05-05 08:00:13.718 [INFO][16608] k8s.go 417: Wrote updated endpoint to datastore ContainerID="197c08bb9fa7d2b237749df44615c6d498acf850f913e3d14719b204a35bac38" Namespace="rook" Pod="rook-ceph-osd-16-5df96f74dc-5rv7s" WorkloadEndpoint="suraj--lk--cluster--pool--1--worker--2-k8s-rook--ceph--osd--16--5df96f74dc--5rv7s-eth0"
I0505 08:00:13.746191 2827 topology_manager.go:233] [topologymanager] Topology Admit Handler
I0505 08:00:13.779808 2827 reconciler.go:224] operationExecutor.VerifyControllerAttachedVolume started for volume "rook-ceph-crash" (UniqueName: "kubernetes.io/host-path/034a7e17-889c-4789-8467-627ac8c6a2fd-rook-ceph-crash") pod "rook-ceph-osd-19-574d768c4f-2vhzk" (UID: "034a7e17-889c-4789-8467-627ac8c6a2fd")
I0505 08:00:13.779869 2827 reconciler.go:224] operationExecutor.VerifyControllerAttachedVolume started for volume "rook-ceph-log" (UniqueName: "kubernetes.io/host-path/034a7e17-889c-4789-8467-627ac8c6a2fd-rook-ceph-log") pod "rook-ceph-osd-19-574d768c4f-2vhzk" (UID: "034a7e17-889c-4789-8467-627ac8c6a2fd")
I0505 08:00:13.779918 2827 reconciler.go:224] operationExecutor.VerifyControllerAttachedVolume started for volume "devices" (UniqueName: "kubernetes.io/host-path/034a7e17-889c-4789-8467-627ac8c6a2fd-devices") pod "rook-ceph-osd-19-574d768c4f-2vhzk" (UID: "034a7e17-889c-4789-8467-627ac8c6a2fd")
I0505 08:00:13.779966 2827 reconciler.go:224] operationExecutor.VerifyControllerAttachedVolume started for volume "activate-osd" (UniqueName: "kubernetes.io/empty-dir/034a7e17-889c-4789-8467-627ac8c6a2fd-activate-osd") pod "rook-ceph-osd-19-574d768c4f-2vhzk" (UID: "034a7e17-889c-4789-8467-627ac8c6a2fd")
I0505 08:00:13.780061 2827 reconciler.go:224] operationExecutor.VerifyControllerAttachedVolume started for volume "rook-config-override" (UniqueName: "kubernetes.io/configmap/034a7e17-889c-4789-8467-627ac8c6a2fd-rook-config-override") pod "rook-ceph-osd-19-574d768c4f-2vhzk" (UID: "034a7e17-889c-4789-8467-627ac8c6a2fd")
I0505 08:00:13.780094 2827 reconciler.go:224] operationExecutor.VerifyControllerAttachedVolume started for volume "rook-data" (UniqueName: "kubernetes.io/host-path/034a7e17-889c-4789-8467-627ac8c6a2fd-rook-data") pod "rook-ceph-osd-19-574d768c4f-2vhzk" (UID: "034a7e17-889c-4789-8467-627ac8c6a2fd")
I0505 08:00:13.780120 2827 reconciler.go:224] operationExecutor.VerifyControllerAttachedVolume started for volume "rook-ceph-osd-token-4dtzj" (UniqueName: "kubernetes.io/secret/034a7e17-889c-4789-8467-627ac8c6a2fd-rook-ceph-osd-token-4dtzj") pod "rook-ceph-osd-19-574d768c4f-2vhzk" (UID: "034a7e17-889c-4789-8467-627ac8c6a2fd")
I0505 08:00:13.780159 2827 reconciler.go:224] operationExecutor.VerifyControllerAttachedVolume started for volume "run-udev" (UniqueName: "kubernetes.io/host-path/034a7e17-889c-4789-8467-627ac8c6a2fd-run-udev") pod "rook-ceph-osd-19-574d768c4f-2vhzk" (UID: "034a7e17-889c-4789-8467-627ac8c6a2fd")
W0505 08:00:14.626957 2827 pod_container_deletor.go:77] Container "142323bcf3814fb2fcf423f1266db49a31ddf4a90f8ce026e4b28d12db95dce7" not found in pod's containers
2020-05-05 08:00:14.645 [INFO][16950] plugin.go 129: Extracted identifiers EndpointIDs=&utils.WEPIdentifiers{Namespace:"rook", WEPName:"", WorkloadEndpointIdentifiers:names.WorkloadEndpointIdentifiers{Node:"suraj-lk-cluster-pool-1-worker-2", Orchestrator:"k8s", Endpoint:"eth0", Workload:"", Pod:"rook-ceph-osd-19-574d768c4f-2vhzk", ContainerID:"142323bcf3814fb2fcf423f1266db49a31ddf4a90f8ce026e4b28d12db95dce7"}}
2020-05-05 08:00:14.676 [INFO][16950] plugin.go 220: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {suraj--lk--cluster--pool--1--worker--2-k8s-rook--ceph--osd--19--574d768c4f--2vhzk-eth0 rook-ceph-osd-19-574d768c4f- rook 034a7e17-889c-4789-8467-627ac8c6a2fd 13440 0 2020-05-05 08:00:13 +0000 UTC <nil> <nil> map[app:rook-ceph-osd ceph-osd-id:19 failure-domain:suraj-lk-cluster-pool-1-worker-2 pod-template-hash:574d768c4f portable:false projectcalico.org/namespace:rook projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:rook-ceph-osd rook_cluster:rook] map[] [] [] []} {k8s suraj-lk-cluster-pool-1-worker-2 rook-ceph-osd-19-574d768c4f-2vhzk eth0 [] [] [kns.rook ksa.rook.rook-ceph-osd] calid14f9626b39 []}} ContainerID="142323bcf3814fb2fcf423f1266db49a31ddf4a90f8ce026e4b28d12db95dce7" Namespace="rook" Pod="rook-ceph-osd-19-574d768c4f-2vhzk" WorkloadEndpoint="suraj--lk--cluster--pool--1--worker--2-k8s-rook--ceph--osd--19--574d768c4f--2vhzk-"
2020-05-05 08:00:14.677 [INFO][16950] k8s.go 60: Extracted identifiers for CmdAddK8s ContainerID="142323bcf3814fb2fcf423f1266db49a31ddf4a90f8ce026e4b28d12db95dce7" Namespace="rook" Pod="rook-ceph-osd-19-574d768c4f-2vhzk" WorkloadEndpoint="suraj--lk--cluster--pool--1--worker--2-k8s-rook--ceph--osd--19--574d768c4f--2vhzk-eth0"
2020-05-05 08:00:14.681 [INFO][16950] k8s.go 775: namespace info &Namespace{ObjectMeta:{rook /api/v1/namespaces/rook 346082b0-52fc-4520-8796-5afff38e4f6c 7478 0 2020-05-05 07:45:13 +0000 UTC <nil> <nil> map[] map[] [] [] [{lokoctl Update v1 2020-05-05 07:45:13 +0000 UTC FieldsV1 FieldsV1{Raw:*[123 34 102 58 115 116 97 116 117 115 34 58 123 34 102 58 112 104 97 115 101 34 58 123 125 125 125],}}]},Spec:NamespaceSpec{Finalizers:[kubernetes],},Status:NamespaceStatus{Phase:Active,Conditions:[]NamespaceCondition{},},}
2020-05-05 08:00:14.687 [INFO][16950] k8s.go 784: pod info &Pod{ObjectMeta:{rook-ceph-osd-19-574d768c4f-2vhzk rook-ceph-osd-19-574d768c4f- rook /api/v1/namespaces/rook/pods/rook-ceph-osd-19-574d768c4f-2vhzk 034a7e17-889c-4789-8467-627ac8c6a2fd 13440 0 2020-05-05 08:00:13 +0000 UTC <nil> <nil> map[app:rook-ceph-osd ceph-osd-id:19 failure-domain:suraj-lk-cluster-pool-1-worker-2 pod-template-hash:574d768c4f portable:false rook_cluster:rook] map[kubernetes.io/psp:00-rook-ceph-operator seccomp.security.alpha.kubernetes.io/pod:runtime/default] [{apps/v1 ReplicaSet rook-ceph-osd-19-574d768c4f ad6f8b41-2e90-45b3-8140-6a80fd269ac5 0xc00040a167 0xc00040a168}] [] []},Spec:PodSpec{Volumes:[]Volume{Volume{Name:rook-data,VolumeSource:VolumeSource{HostPath:&HostPathVolumeSource{Path:/var/lib/rook,Type:*,},EmptyDir:nil,GCEPersistentDisk:nil,AWSElasticBlockStore:nil,GitRepo:nil,Secret:nil,NFS:nil,ISCSI:nil,Glusterfs:nil,PersistentVolumeClaim:nil,RBD:nil,FlexVolume:nil,Cinder:nil,CephFS:nil,Flocker:nil,DownwardAPI:nil,FC:nil,AzureFile:nil,ConfigMap:nil,VsphereVolume:nil,Quobyte:nil,AzureDisk:nil,PhotonPersistentDisk:nil,PortworxVolume:nil,ScaleIO:nil,Projected:nil,StorageOS:nil,CSI:nil,},},Volume{Name:rook-config-override,VolumeSource:VolumeSource{HostPath:nil,EmptyDir:nil,GCEPersistentDisk:nil,AWSElasticBlockStore:nil,GitRepo:nil,Secret:nil,NFS:nil,ISCSI:nil,Glusterfs:nil,PersistentVolumeClaim:nil,RBD:nil,FlexVolume:nil,Cinder:nil,CephFS:nil,Flocker:nil,DownwardAPI:nil,FC:nil,AzureFile:nil,ConfigMap:&ConfigMapVolumeSource{LocalObjectReference:LocalObjectReference{Name:rook-config-override,},Items:[]KeyToPath{KeyToPath{Key:config,Path:ceph.conf,Mode:*292,},},DefaultMode:*420,Optional:nil,},VsphereVolume:nil,Quobyte:nil,AzureDisk:nil,PhotonPersistentDisk:nil,PortworxVolume:nil,ScaleIO:nil,Projected:nil,StorageOS:nil,CSI:nil,},},Volume{Name:rook-ceph-log,VolumeSource:VolumeSource{HostPath:&HostPathVolumeSource{Path:/var/lib/rook/rook/log,Type:*,},EmptyDir:nil,GCEPersistentDisk:nil,AWSE
lasticBlockStore:nil,GitRepo:nil,Secret:nil,NFS:nil,ISCSI:nil,Glusterfs:nil,PersistentVolumeClaim:nil,RBD:nil,FlexVolume:nil,Cinder:nil,CephFS:nil,Flocker:nil,DownwardAPI:nil,FC:nil,AzureFile:nil,ConfigMap:nil,VsphereVolume:nil,Quobyte:nil,AzureDisk:nil,PhotonPersistentDisk:nil,PortworxVolume:nil,ScaleIO:nil,Projected:nil,StorageOS:nil,CSI:nil,},},Volume{Name:rook-ceph-crash,VolumeSource:VolumeSource{HostPath:&HostPathVolumeSource{Path:/var/lib/rook/rook/crash,Type:*,},EmptyDir:nil,GCEPersistentDisk:nil,AWSElasticBlockStore:nil,GitRepo:nil,Secret:nil,NFS:nil,ISCSI:nil,Glusterfs:nil,PersistentVolumeClaim:nil,RBD:nil,FlexVolume:nil,Cinder:nil,CephFS:nil,Flocker:nil,DownwardAPI:nil,FC:nil,AzureFile:nil,ConfigMap:nil,VsphereVolume:nil,Quobyte:nil,AzureDisk:nil,PhotonPersistentDisk:nil,PortworxVolume:nil,ScaleIO:nil,Projected:nil,StorageOS:nil,CSI:nil,},},Volume{Name:devices,VolumeSource:VolumeSource{HostPath:&HostPathVolumeSource{Path:/dev,Type:*,},EmptyDir:nil,GCEPersistentDisk:nil,AWSElasticBlockStore:nil,GitRepo:nil,Secret:nil,NFS:nil,ISCSI:nil,Glusterfs:nil,PersistentVolumeClaim:nil,RBD:nil,FlexVolume:nil,Cinder:nil,CephFS:nil,Flocker:nil,DownwardAPI:nil,FC:nil,AzureFile:nil,ConfigMap:nil,VsphereVolume:nil,Quobyte:nil,AzureDisk:nil,PhotonPersistentDisk:nil,PortworxVolume:nil,ScaleIO:nil,Projected:nil,StorageOS:nil,CSI:nil,},},Volume{Name:run-udev,VolumeSource:VolumeSource{HostPath:&HostPathVolumeSource{Path:/run/udev,Type:*,},EmptyDir:nil,GCEPersistentDisk:nil,AWSElasticBlockStore:nil,GitRepo:nil,Secret:nil,NFS:nil,ISCSI:nil,Glusterfs:nil,PersistentVolumeClaim:nil,RBD:nil,FlexVolume:nil,Cinder:nil,CephFS:nil,Flocker:nil,DownwardAPI:nil,FC:nil,AzureFile:nil,ConfigMap:nil,VsphereVolume:nil,Quobyte:nil,AzureDisk:nil,PhotonPersistentDisk:nil,PortworxVolume:nil,ScaleIO:nil,Projected:nil,StorageOS:nil,CSI:nil,},},Volume{Name:activate-osd,VolumeSource:VolumeSource{HostPath:nil,EmptyDir:&EmptyDirVolumeSource{Medium:,SizeLimit:<nil>,},GCEPersistentDisk:nil,AWSElasticBlockSto
re:nil,GitRepo:nil,Secret:nil,NFS:nil,ISCSI:nil,Glusterfs:nil,PersistentVolumeClaim:nil,RBD:nil,FlexVolume:nil,Cinder:nil,CephFS:nil,Flocker:nil,DownwardAPI:nil,FC:nil,AzureFile:nil,ConfigMap:nil,VsphereVolume:nil,Quobyte:nil,AzureDisk:nil,PhotonPersistentDisk:nil,PortworxVolume:nil,ScaleIO:nil,Projected:nil,StorageOS:nil,CSI:nil,},},Volume{Name:rook-ceph-osd-token-4dtzj,VolumeSource:VolumeSource{HostPath:nil,EmptyDir:nil,GCEPersistentDisk:nil,AWSElasticBlockStore:nil,GitRepo:nil,Secret:&SecretVolumeSource{SecretName:rook-ceph-osd-token-4dtzj,Items:[]KeyToPath{},DefaultMode:*420,Optional:nil,},NFS:nil,ISCSI:nil,Glusterfs:nil,PersistentVolumeClaim:nil,RBD:nil,FlexVolume:nil,Cinder:nil,CephFS:nil,Flocker:nil,DownwardAPI:nil,FC:nil,AzureFile:nil,ConfigMap:nil,VsphereVolume:nil,Quobyte:nil,AzureDisk:nil,PhotonPersistentDisk:nil,PortworxVolume:nil,ScaleIO:nil,Projected:nil,StorageOS:nil,CSI:nil,},},},Containers:[]Container{Container{Name:osd,Image:ceph/ceph:v14.2.8,Command:[ceph-osd],Args:[--foreground --id 19 --fsid 504a40d4-a2d3-4492-a37d-27a58464958f --setuser ceph --setgroup ceph --crush-location=root=default host=suraj-lk-cluster-pool-1-worker-2 --default-log-to-file false 
--ms-learn-addr-from-peer=false],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:ROOK_NODE_NAME,Value:suraj-lk-cluster-pool-1-worker-2,ValueFrom:nil,},EnvVar{Name:ROOK_CLUSTER_ID,Value:77795d4f-fb40-4886-bc74-cb2065f0c61f,ValueFrom:nil,},EnvVar{Name:ROOK_PRIVATE_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},EnvVar{Name:ROOK_PUBLIC_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},EnvVar{Name:ROOK_CLUSTER_NAME,Value:rook,ValueFrom:nil,},EnvVar{Name:ROOK_MON_ENDPOINTS,Value:,ValueFrom:&EnvVarSource{FieldRef:nil,ResourceFieldRef:nil,ConfigMapKeyRef:&ConfigMapKeySelector{LocalObjectReference:LocalObjectReference{Name:rook-ceph-mon-endpoints,},Key:data,Optional:nil,},SecretKeyRef:nil,},},EnvVar{Name:ROOK_MON_SECRET,Value:,ValueFrom:&EnvVarSource{FieldRef:nil,ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:&SecretKeySelector{LocalObjectReference:LocalObjectReference{Name:rook-ceph-mon,},Key:mon-secret,Optional:nil,},},},EnvVar{Name:ROOK_ADMIN_SECRET,Value:,ValueFrom:&EnvVarSource{FieldRef:nil,ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:&SecretKeySelector{LocalObjectReference:LocalObjectReference{Name:rook-ceph-mon,},Key:admin-secret,Optional:nil,},},},EnvVar{Name:ROOK_CONFIG_DIR,Value:/var/lib/rook,ValueFrom:nil,},EnvVar{Name:ROOK_CEPH_CONFIG_OVERRIDE,Value:/etc/rook/config/override.conf,ValueFrom:nil,},EnvVar{Name:ROOK_FSID,Value:,ValueFrom:&EnvVarSource{FieldRef:nil,ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:&SecretKeySelector{LocalObjectReference:LocalObjectReference{Name:rook-ceph-mon,},Key:fsid,Optional:nil,},},},EnvVar{Name:NODE_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:spec.nodeName,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},Env
Var{Name:ROOK_CRUSHMAP_HOSTNAME,Value:suraj-lk-cluster-pool-1-worker-2,ValueFrom:nil,},EnvVar{Name:CEPH_VOLUME_DEBUG,Value:1,ValueFrom:nil,},EnvVar{Name:CEPH_VOLUME_SKIP_RESTORECON,Value:1,ValueFrom:nil,},EnvVar{Name:DM_DISABLE_UDEV,Value:1,ValueFrom:nil,},EnvVar{Name:ROOK_OSDS_PER_DEVICE,Value:1,ValueFrom:nil,},EnvVar{Name:TINI_SUBREAPER,Value:,ValueFrom:nil,},EnvVar{Name:CONTAINER_IMAGE,Value:ceph/ceph:v14.2.8,ValueFrom:nil,},EnvVar{Name:POD_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:metadata.name,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},EnvVar{Name:POD_NAMESPACE,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:metadata.namespace,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},EnvVar{Name:NODE_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:spec.nodeName,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},EnvVar{Name:POD_MEMORY_LIMIT,Value:,ValueFrom:&EnvVarSource{FieldRef:nil,ResourceFieldRef:&ResourceFieldSelector{ContainerName:,Resource:limits.memory,Divisor:{{0 0} {<nil>} 0 DecimalSI},},ConfigMapKeyRef:nil,SecretKeyRef:nil,},},EnvVar{Name:POD_MEMORY_REQUEST,Value:,ValueFrom:&EnvVarSource{FieldRef:nil,ResourceFieldRef:&ResourceFieldSelector{ContainerName:,Resource:requests.memory,Divisor:{{0 0} {<nil>} 0 DecimalSI},},ConfigMapKeyRef:nil,SecretKeyRef:nil,},},EnvVar{Name:POD_CPU_LIMIT,Value:,ValueFrom:&EnvVarSource{FieldRef:nil,ResourceFieldRef:&ResourceFieldSelector{ContainerName:,Resource:limits.cpu,Divisor:{{1 0} {<nil>} 1 DecimalSI},},ConfigMapKeyRef:nil,SecretKeyRef:nil,},},EnvVar{Name:POD_CPU_REQUEST,Value:,ValueFrom:&EnvVarSource{FieldRef:nil,ResourceFieldRef:&ResourceFieldSelector{ContainerName:,Resource:requests.cpu,Divisor:{{0 0} {<nil>} 0 
DecimalSI},},ConfigMapKeyRef:nil,SecretKeyRef:nil,},},EnvVar{Name:ROOK_OSD_UUID,Value:08756d31-40e4-4322-bc93-091d042906d8,ValueFrom:nil,},EnvVar{Name:ROOK_OSD_ID,Value:19,ValueFrom:nil,},EnvVar{Name:ROOK_OSD_STORE_TYPE,Value:bluestore,ValueFrom:nil,},EnvVar{Name:ROOK_CEPH_MON_HOST,Value:,ValueFrom:&EnvVarSource{FieldRef:nil,ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:&SecretKeySelector{LocalObjectReference:LocalObjectReference{Name:rook-ceph-config,},Key:mon_host,Optional:nil,},},},EnvVar{Name:CEPH_ARGS,Value:-m $(ROOK_CEPH_MON_HOST),ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:rook-data,ReadOnly:false,MountPath:/var/lib/rook,SubPath:,MountPropagation:nil,SubPathExpr:,},VolumeMount{Name:rook-config-override,ReadOnly:true,MountPath:/etc/ceph,SubPath:,MountPropagation:nil,SubPathExpr:,},VolumeMount{Name:rook-ceph-log,ReadOnly:false,MountPath:/var/log/ceph,SubPath:,MountPropagation:nil,SubPathExpr:,},VolumeMount{Name:rook-ceph-crash,ReadOnly:false,MountPath:/var/lib/ceph/crash,SubPath:,MountPropagation:nil,SubPathExpr:,},VolumeMount{Name:devices,ReadOnly:false,MountPath:/dev,SubPath:,MountPropagation:nil,SubPathExpr:,},VolumeMount{Name:run-udev,ReadOnly:false,MountPath:/run/udev,SubPath:,MountPropagation:nil,SubPathExpr:,},VolumeMount{Name:activate-osd,ReadOnly:false,MountPath:/var/lib/ceph/osd/ceph-19,SubPath:,MountPropagation:nil,SubPathExpr:,},VolumeMount{Name:rook-ceph-osd-token-4dtzj,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,},},LivenessProbe:&Probe{Handler:Handler{Exec:&ExecAction{Command:[env -i sh -c ceph --admin-daemon /run/ceph/ceph-osd.19.asok 
status],},HTTPGet:nil,TCPSocket:nil,},InitialDelaySeconds:10,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,},ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:nil,Privileged:*true,SELinuxOptions:nil,RunAsUser:*0,RunAsNonRoot:nil,ReadOnlyRootFilesystem:*false,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,},},RestartPolicy:Always,TerminationGracePeriodSeconds:*30,ActiveDeadlineSeconds:nil,DNSPolicy:ClusterFirst,NodeSelector:map[string]string{kubernetes.io/hostname: suraj-lk-cluster-pool-1-worker-2,},ServiceAccountName:rook-ceph-osd,DeprecatedServiceAccount:rook-ceph-osd,NodeName:suraj-lk-cluster-pool-1-worker-2,HostNetwork:false,HostPID:true,HostIPC:false,SecurityContext:&PodSecurityContext{SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,SupplementalGroups:[],FSGroup:nil,RunAsGroup:nil,Sysctls:[]Sysctl{},WindowsOptions:nil,},ImagePullSecrets:[]LocalObjectReference{},Hostname:,Subdomain:,Affinity:&Affinity{NodeAffinity:&NodeAffinity{RequiredDuringSchedulingIgnoredDuringExecution:&NodeSelector{NodeSelectorTerms:[]NodeSelectorTerm{NodeSelectorTerm{MatchExpressions:[]NodeSelectorRequirement{NodeSelectorRequirement{Key:storage.lokomotive.io,Operator:Exists,Values:[],},},MatchFields:[]NodeSelectorRequirement{},},},},PreferredDuringSchedulingIgnoredDuringExecution:[]PreferredSchedulingTerm{},},PodAffinity:nil,PodAntiAffinity:nil,},SchedulerName:default-scheduler,InitContainers:[]Container{Container{Name:activate,Image:ceph/ceph:v14.2.8,Command:[/bin/bash -c
set -ex
OSD_ID=19
OSD_UUID=08756d31-40e4-4322-bc93-091d042906d8
OSD_STORE_FLAG="--bluestore"
OSD_DATA_DIR=/var/lib/ceph/osd/ceph-"$OSD_ID"
CV_MODE=lvm
DEVICE=
METADATA_DEVICE="$ROOK_METADATA_DEVICE"
# activate the osd with ceph-volume
if [[ "$CV_MODE" == "lvm" ]]; then
TMP_DIR=$(mktemp -d)
# activate osd
ceph-volume "$CV_MODE" activate --no-systemd "$OSD_STORE_FLAG" "$OSD_ID" "$OSD_UUID"
# copy the tmpfs directory to a temporary directory
# this is needed because when the init container exits, the tmpfs goes away and its content with it
# this will result in the emptydir to be empty when accessed by the main osd container
cp --verbose --no-dereference "$OSD_DATA_DIR"/* "$TMP_DIR"/
# unmount the tmpfs since we don't need it anymore
umount "$OSD_DATA_DIR"
# copy back the content of the tmpfs into the original osd directory
cp --verbose --no-dereference "$TMP_DIR"/* "$OSD_DATA_DIR"
# retain ownership of files to the ceph user/group
chown --verbose --recursive ceph:ceph "$OSD_DATA_DIR"
# remove the temporary directory
rm --recursive --force "$TMP_DIR"
else
ARGS=(--device ${DEVICE} --no-systemd --no-tmpfs)
if [ -n "$METADATA_DEVICE" ]; then
ARGS+=(--block.db ${METADATA_DEVICE})
fi
# ceph-volume raw mode only supports bluestore so we don't need to pass a store flag
ceph-volume "$CV_MODE" activate "${ARGS[@]}"
fi
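The activate script above works around an init-container lifetime issue: `ceph-volume lvm activate` mounts the OSD data dir as a tmpfs, which vanishes when the init container exits, so the script copies the files out, unmounts, and copies them back into the emptyDir-backed path. Below is a minimal, unprivileged sketch of that copy-out/copy-back pattern; plain `mktemp` directories stand in for the real tmpfs mount and `rm` stands in for `umount` (which needs root), and the `keyring` file name is just an illustrative placeholder.

```shell
set -e

# stands in for /var/lib/ceph/osd/ceph-19 (really a tmpfs mount)
OSD_DATA_DIR=$(mktemp -d)
echo "keyring-content" > "$OSD_DATA_DIR/keyring"

# copy the tmpfs contents to a directory that survives the unmount
TMP_DIR=$(mktemp -d)
cp --no-dereference "$OSD_DATA_DIR"/* "$TMP_DIR"/

# real script: umount "$OSD_DATA_DIR" -- the tmpfs content is gone;
# emptying the directory simulates that here
rm -f "$OSD_DATA_DIR"/*

# copy the preserved content back into the now-empty directory
cp --no-dereference "$TMP_DIR"/* "$OSD_DATA_DIR"/
rm -rf "$TMP_DIR"

cat "$OSD_DATA_DIR/keyring"
```

After the round trip, the main `osd` container sees the activation artifacts in the emptyDir even though the tmpfs from the init container is long gone.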
],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CEPH_VOLUME_DEBUG,Value:1,ValueFrom:nil,},EnvVar{Name:CEPH_VOLUME_SKIP_RESTORECON,Value:1,ValueFrom:nil,},EnvVar{Name:DM_DISABLE_UDEV,Value:1,ValueFrom:nil,},EnvVar{Name:ROOK_CEPH_MON_HOST,Value:,ValueFrom:&EnvVarSource{FieldRef:nil,ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:&SecretKeySelector{LocalObjectReference:LocalObjectReference{Name:rook-ceph-config,},Key:mon_host,Optional:nil,},},},EnvVar{Name:CEPH_ARGS,Value:-m $(ROOK_CEPH_MON_HOST),ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:activate-osd,ReadOnly:false,MountPath:/var/lib/ceph/osd/ceph-19,SubPath:,MountPropagation:nil,SubPathExpr:,},VolumeMount{Name:devices,ReadOnly:false,MountPath:/dev,SubPath:,MountPropagation:nil,SubPathExpr:,},VolumeMount{Name:rook-config-override,ReadOnly:true,MountPath:/etc/ceph,SubPath:,MountPropagation:nil,SubPathExpr:,},VolumeMount{Name:rook-ceph-osd-token-4dtzj,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:nil,Privileged:*true,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,},Container{Name:chown-container-data-dir,Image:ceph/ceph:v14.2.8,Command:[chown],Args:[--verbose --recursive ceph:ceph /var/log/ceph 
/var/lib/ceph/crash],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:rook-data,ReadOnly:false,MountPath:/var/lib/rook,SubPath:,MountPropagation:nil,SubPathExpr:,},VolumeMount{Name:rook-config-override,ReadOnly:true,MountPath:/etc/ceph,SubPath:,MountPropagation:nil,SubPathExpr:,},VolumeMount{Name:rook-ceph-log,ReadOnly:false,MountPath:/var/log/ceph,SubPath:,MountPropagation:nil,SubPathExpr:,},VolumeMount{Name:rook-ceph-crash,ReadOnly:false,MountPath:/var/lib/ceph/crash,SubPath:,MountPropagation:nil,SubPathExpr:,},VolumeMount{Name:devices,ReadOnly:false,MountPath:/dev,SubPath:,MountPropagation:nil,SubPathExpr:,},VolumeMount{Name:run-udev,ReadOnly:false,MountPath:/run/udev,SubPath:,MountPropagation:nil,SubPathExpr:,},VolumeMount{Name:activate-osd,ReadOnly:false,MountPath:/var/lib/ceph/osd/ceph-19,SubPath:,MountPropagation:nil,SubPathExpr:,},VolumeMount{Name:rook-ceph-osd-token-4dtzj,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:nil,Privileged:*true,SELinuxOptions:nil,RunAsUser:*0,RunAsNonRoot:nil,ReadOnlyRootFilesystem:*false,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,},},AutomountServiceAccountToken:nil,Tolerations:[]Toleration{Toleration{Key:storage.lokomotive.io,Operator:Equal,Value:ceph,Effect:NoSchedule,TolerationSeconds:nil,},Toleration{Key:node.kubernetes.io/not-ready,Operator:Exists,Value:,Effect:NoExecute,TolerationSeconds:*300,},Toleration{Key:node.kubernetes.io/unreachable,Operator:Exists,Value:,Effect:NoExecute,TolerationS
econds:*300,},},HostAliases:[]HostAlias{},PriorityClassName:,Priority:*0,DNSConfig:nil,ShareProcessNamespace:nil,ReadinessGates:[]PodReadinessGate{},RuntimeClassName:nil,EnableServiceLinks:*true,PreemptionPolicy:nil,Overhead:ResourceList{},TopologySpreadConstraints:[]TopologySpreadConstraint{},EphemeralContainers:[]EphemeralContainer{},},Status:PodStatus{Phase:Pending,Conditions:[]PodCondition{PodCondition{Type:Initialized,Status:False,LastProbeTime:0001-01-01 00:00:00 +0000 UTC,LastTransitionTime:2020-05-05 08:00:13 +0000 UTC,Reason:ContainersNotInitialized,Message:containers with incomplete status: [activate chown-container-data-dir],},PodCondition{Type:Ready,Status:False,LastProbeTime:0001-01-01 00:00:00 +0000 UTC,LastTransitionTime:2020-05-05 08:00:13 +0000 UTC,Reason:ContainersNotReady,Message:containers with unready status: [osd],},PodCondition{Type:ContainersReady,Status:False,LastProbeTime:0001-01-01 00:00:00 +0000 UTC,LastTransitionTime:2020-05-05 08:00:13 +0000 UTC,Reason:ContainersNotReady,Message:containers with unready status: [osd],},PodCondition{Type:PodScheduled,Status:True,LastProbeTime:0001-01-01 00:00:00 +0000 UTC,LastTransitionTime:2020-05-05 08:00:13 +0000 UTC,Reason:,Message:,},},Message:,Reason:,HostIP:10.88.81.5,PodIP:,StartTime:2020-05-05 08:00:13 +0000 
UTC,ContainerStatuses:[]ContainerStatus{ContainerStatus{Name:osd,State:ContainerState{Waiting:&ContainerStateWaiting{Reason:PodInitializing,Message:,},Running:nil,Terminated:nil,},LastTerminationState:ContainerState{Waiting:nil,Running:nil,Terminated:nil,},Ready:false,RestartCount:0,Image:ceph/ceph:v14.2.8,ImageID:,ContainerID:,Started:*false,},},QOSClass:BestEffort,InitContainerStatuses:[]ContainerStatus{ContainerStatus{Name:activate,State:ContainerState{Waiting:&ContainerStateWaiting{Reason:PodInitializing,Message:,},Running:nil,Terminated:nil,},LastTerminationState:ContainerState{Waiting:nil,Running:nil,Terminated:nil,},Ready:false,RestartCount:0,Image:ceph/ceph:v14.2.8,ImageID:,ContainerID:,Started:nil,},ContainerStatus{Name:chown-container-data-dir,State:ContainerState{Waiting:&ContainerStateWaiting{Reason:PodInitializing,Message:,},Running:nil,Terminated:nil,},LastTerminationState:ContainerState{Waiting:nil,Running:nil,Terminated:nil,},Ready:false,RestartCount:0,Image:ceph/ceph:v14.2.8,ImageID:,ContainerID:,Started:nil,},},NominatedNodeName:,PodIPs:[]PodIP{},EphemeralContainerStatuses:[]ContainerStatus{},},}
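The pod status dump above reports `QOSClass:BestEffort`, which follows from the OSD containers having empty `Resources` (no requests or limits). A minimal sketch of the documented QoS classification rules, assuming simplified container dicts rather than the real kubelet types:

```python
# Simplified sketch of how Kubernetes derives a pod's QoS class from its
# containers' resource requests and limits. This mirrors the documented
# rules, not the actual kubelet implementation.

def qos_class(containers):
    """containers: list of dicts with optional 'requests' and 'limits' maps."""
    requests = [c.get("requests", {}) for c in containers]
    limits = [c.get("limits", {}) for c in containers]

    # BestEffort: no container sets any request or limit (as in the OSD pod
    # above, whose Limits/Requests ResourceLists are empty).
    if not any(requests) and not any(limits):
        return "BestEffort"

    # Guaranteed: every container sets cpu and memory limits, and for each
    # resource the request (defaulting to the limit) equals the limit.
    guaranteed = all(
        c.get("limits", {}).get(r) is not None
        and c.get("requests", {}).get(r, c["limits"][r]) == c["limits"][r]
        for c in containers
        for r in ("cpu", "memory")
    )
    if guaranteed:
        return "Guaranteed"

    return "Burstable"


print(qos_class([{}, {}]))                     # -> BestEffort, like the OSD pod
print(qos_class([{"requests": {"cpu": "1"}}]))  # -> Burstable
```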
2020-05-05 08:00:14.716 [INFO][16969] ipam_plugin.go 211: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="142323bcf3814fb2fcf423f1266db49a31ddf4a90f8ce026e4b28d12db95dce7" HandleID="k8s-pod-network.142323bcf3814fb2fcf423f1266db49a31ddf4a90f8ce026e4b28d12db95dce7" Workload="suraj--lk--cluster--pool--1--worker--2-k8s-rook--ceph--osd--19--574d768c4f--2vhzk-eth0"
2020-05-05 08:00:14.741 [INFO][16969] ipam_plugin.go 223: Calico CNI IPAM handle=k8s-pod-network.142323bcf3814fb2fcf423f1266db49a31ddf4a90f8ce026e4b28d12db95dce7 ContainerID="142323bcf3814fb2fcf423f1266db49a31ddf4a90f8ce026e4b28d12db95dce7" HandleID="k8s-pod-network.142323bcf3814fb2fcf423f1266db49a31ddf4a90f8ce026e4b28d12db95dce7" Workload="suraj--lk--cluster--pool--1--worker--2-k8s-rook--ceph--osd--19--574d768c4f--2vhzk-eth0"
2020-05-05 08:00:14.741 [INFO][16969] ipam_plugin.go 233: Auto assigning IP ContainerID="142323bcf3814fb2fcf423f1266db49a31ddf4a90f8ce026e4b28d12db95dce7" HandleID="k8s-pod-network.142323bcf3814fb2fcf423f1266db49a31ddf4a90f8ce026e4b28d12db95dce7" Workload="suraj--lk--cluster--pool--1--worker--2-k8s-rook--ceph--osd--19--574d768c4f--2vhzk-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc00020f9c0), Attrs:map[string]string{"namespace":"rook", "node":"suraj-lk-cluster-pool-1-worker-2", "pod":"rook-ceph-osd-19-574d768c4f-2vhzk"}, Hostname:"suraj-lk-cluster-pool-1-worker-2", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0}
2020-05-05 08:00:14.741 [INFO][16969] ipam.go 87: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'suraj-lk-cluster-pool-1-worker-2'
2020-05-05 08:00:14.751 [INFO][16969] ipam.go 313: Looking up existing affinities for host handle="k8s-pod-network.142323bcf3814fb2fcf423f1266db49a31ddf4a90f8ce026e4b28d12db95dce7" host="suraj-lk-cluster-pool-1-worker-2"
2020-05-05 08:00:14.758 [INFO][16969] ipam.go 381: Trying affinity for 10.2.76.0/24 handle="k8s-pod-network.142323bcf3814fb2fcf423f1266db49a31ddf4a90f8ce026e4b28d12db95dce7" host="suraj-lk-cluster-pool-1-worker-2"
2020-05-05 08:00:14.762 [INFO][16969] ipam.go 135: Attempting to load block cidr=10.2.76.0/24 host="suraj-lk-cluster-pool-1-worker-2"
2020-05-05 08:00:14.768 [INFO][16969] ipam.go 212: Affinity is confirmed and block has been loaded cidr=10.2.76.0/24 host="suraj-lk-cluster-pool-1-worker-2"
2020-05-05 08:00:14.768 [INFO][16969] ipam.go 804: Attempting to assign 1 addresses from block block=10.2.76.0/24 handle="k8s-pod-network.142323bcf3814fb2fcf423f1266db49a31ddf4a90f8ce026e4b28d12db95dce7" host="suraj-lk-cluster-pool-1-worker-2"
2020-05-05 08:00:14.771 [INFO][16969] ipam.go 1265: Creating new handle: k8s-pod-network.142323bcf3814fb2fcf423f1266db49a31ddf4a90f8ce026e4b28d12db95dce7
2020-05-05 08:00:14.778 [INFO][16969] ipam.go 827: Writing block in order to claim IPs block=10.2.76.0/24 handle="k8s-pod-network.142323bcf3814fb2fcf423f1266db49a31ddf4a90f8ce026e4b28d12db95dce7" host="suraj-lk-cluster-pool-1-worker-2"
2020-05-05 08:00:14.789 [INFO][16969] ipam.go 840: Successfully claimed IPs: [10.2.76.15/24] block=10.2.76.0/24 handle="k8s-pod-network.142323bcf3814fb2fcf423f1266db49a31ddf4a90f8ce026e4b28d12db95dce7" host="suraj-lk-cluster-pool-1-worker-2"
2020-05-05 08:00:14.789 [INFO][16969] ipam.go 413: Block '10.2.76.0/24' provided addresses: [10.2.76.15/24] handle="k8s-pod-network.142323bcf3814fb2fcf423f1266db49a31ddf4a90f8ce026e4b28d12db95dce7" host="suraj-lk-cluster-pool-1-worker-2"
2020-05-05 08:00:14.792 [INFO][16969] ipam.go 569: Auto-assigned 1 out of 1 IPv4s: [10.2.76.15/24] handle="k8s-pod-network.142323bcf3814fb2fcf423f1266db49a31ddf4a90f8ce026e4b28d12db95dce7" host="suraj-lk-cluster-pool-1-worker-2"
2020-05-05 08:00:14.792 [INFO][16969] ipam_plugin.go 235: Calico CNI IPAM assigned addresses IPv4=[10.2.76.15/24] IPv6=[] ContainerID="142323bcf3814fb2fcf423f1266db49a31ddf4a90f8ce026e4b28d12db95dce7" HandleID="k8s-pod-network.142323bcf3814fb2fcf423f1266db49a31ddf4a90f8ce026e4b28d12db95dce7" Workload="suraj--lk--cluster--pool--1--worker--2-k8s-rook--ceph--osd--19--574d768c4f--2vhzk-eth0"
2020-05-05 08:00:14.792 [INFO][16969] ipam_plugin.go 261: IPAM Result ContainerID="142323bcf3814fb2fcf423f1266db49a31ddf4a90f8ce026e4b28d12db95dce7" HandleID="k8s-pod-network.142323bcf3814fb2fcf423f1266db49a31ddf4a90f8ce026e4b28d12db95dce7" Workload="suraj--lk--cluster--pool--1--worker--2-k8s-rook--ceph--osd--19--574d768c4f--2vhzk-eth0" result.IPs=[]*current.IPConfig{(*current.IPConfig)(0xc0001482a0)}
2020-05-05 08:00:14.794 [INFO][16950] k8s.go 358: Populated endpoint ContainerID="142323bcf3814fb2fcf423f1266db49a31ddf4a90f8ce026e4b28d12db95dce7" Namespace="rook" Pod="rook-ceph-osd-19-574d768c4f-2vhzk" WorkloadEndpoint="suraj--lk--cluster--pool--1--worker--2-k8s-rook--ceph--osd--19--574d768c4f--2vhzk-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"suraj--lk--cluster--pool--1--worker--2-k8s-rook--ceph--osd--19--574d768c4f--2vhzk-eth0", GenerateName:"rook-ceph-osd-19-574d768c4f-", Namespace:"rook", SelfLink:"", UID:"034a7e17-889c-4789-8467-627ac8c6a2fd", ResourceVersion:"13440", Generation:0, CreationTimestamp:v1.Time{Time:time.Time{wall:0x0, ext:63724262413, loc:(*time.Location)(0x299c340)}}, DeletionTimestamp:(*v1.Time)(nil), DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app":"rook-ceph-osd", "ceph-osd-id":"19", "failure-domain":"suraj-lk-cluster-pool-1-worker-2", "pod-template-hash":"574d768c4f", "portable":"false", "projectcalico.org/namespace":"rook", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"rook-ceph-osd", "rook_cluster":"rook"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ClusterName:"", ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"suraj-lk-cluster-pool-1-worker-2", ContainerID:"", Pod:"rook-ceph-osd-19-574d768c4f-2vhzk", Endpoint:"eth0", IPNetworks:[]string{"10.2.76.15/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.rook", "ksa.rook.rook-ceph-osd"}, InterfaceName:"calid14f9626b39", MAC:"", Ports:[]v3.EndpointPort(nil)}}
2020-05-05 08:00:14.795 [INFO][16950] k8s.go 359: Calico CNI using IPs: [10.2.76.15/32] ContainerID="142323bcf3814fb2fcf423f1266db49a31ddf4a90f8ce026e4b28d12db95dce7" Namespace="rook" Pod="rook-ceph-osd-19-574d768c4f-2vhzk" WorkloadEndpoint="suraj--lk--cluster--pool--1--worker--2-k8s-rook--ceph--osd--19--574d768c4f--2vhzk-eth0"
2020-05-05 08:00:14.795 [INFO][16950] network_linux.go 76: Setting the host side veth name to calid14f9626b39 ContainerID="142323bcf3814fb2fcf423f1266db49a31ddf4a90f8ce026e4b28d12db95dce7" Namespace="rook" Pod="rook-ceph-osd-19-574d768c4f-2vhzk" WorkloadEndpoint="suraj--lk--cluster--pool--1--worker--2-k8s-rook--ceph--osd--19--574d768c4f--2vhzk-eth0"
2020-05-05 08:00:14.796 [INFO][16950] network_linux.go 396: Disabling IPv4 forwarding ContainerID="142323bcf3814fb2fcf423f1266db49a31ddf4a90f8ce026e4b28d12db95dce7" Namespace="rook" Pod="rook-ceph-osd-19-574d768c4f-2vhzk" WorkloadEndpoint="suraj--lk--cluster--pool--1--worker--2-k8s-rook--ceph--osd--19--574d768c4f--2vhzk-eth0"
2020-05-05 08:00:14.816 [INFO][16950] k8s.go 385: Added Mac, interface name, and active container ID to endpoint ContainerID="142323bcf3814fb2fcf423f1266db49a31ddf4a90f8ce026e4b28d12db95dce7" Namespace="rook" Pod="rook-ceph-osd-19-574d768c4f-2vhzk" WorkloadEndpoint="suraj--lk--cluster--pool--1--worker--2-k8s-rook--ceph--osd--19--574d768c4f--2vhzk-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"suraj--lk--cluster--pool--1--worker--2-k8s-rook--ceph--osd--19--574d768c4f--2vhzk-eth0", GenerateName:"rook-ceph-osd-19-574d768c4f-", Namespace:"rook", SelfLink:"", UID:"034a7e17-889c-4789-8467-627ac8c6a2fd", ResourceVersion:"13440", Generation:0, CreationTimestamp:v1.Time{Time:time.Time{wall:0x0, ext:63724262413, loc:(*time.Location)(0x299c340)}}, DeletionTimestamp:(*v1.Time)(nil), DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app":"rook-ceph-osd", "ceph-osd-id":"19", "failure-domain":"suraj-lk-cluster-pool-1-worker-2", "pod-template-hash":"574d768c4f", "portable":"false", "projectcalico.org/namespace":"rook", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"rook-ceph-osd", "rook_cluster":"rook"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ClusterName:"", ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"suraj-lk-cluster-pool-1-worker-2", ContainerID:"142323bcf3814fb2fcf423f1266db49a31ddf4a90f8ce026e4b28d12db95dce7", Pod:"rook-ceph-osd-19-574d768c4f-2vhzk", Endpoint:"eth0", IPNetworks:[]string{"10.2.76.15/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.rook", "ksa.rook.rook-ceph-osd"}, InterfaceName:"calid14f9626b39", MAC:"16:5c:e8:63:ab:67", Ports:[]v3.EndpointPort(nil)}}
2020-05-05 08:00:14.825 [INFO][16950] k8s.go 417: Wrote updated endpoint to datastore ContainerID="142323bcf3814fb2fcf423f1266db49a31ddf4a90f8ce026e4b28d12db95dce7" Namespace="rook" Pod="rook-ceph-osd-19-574d768c4f-2vhzk" WorkloadEndpoint="suraj--lk--cluster--pool--1--worker--2-k8s-rook--ceph--osd--19--574d768c4f--2vhzk-eth0"
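The Calico IPAM sequence above (try host affinity for 10.2.76.0/24, load the block, claim the next free address, write the block back) can be sketched as follows. This is an illustrative model of the block-affinity pattern, not Calico's actual implementation; the function name and data structures are assumptions:

```python
# Minimal sketch of the block-affinity IPAM pattern visible in the logs:
# the host holds an affinity for a /24 block and claims the next free
# address from it.
import ipaddress

def auto_assign(block_cidr, allocated, num=1):
    """Claim `num` free addresses from the host's affine block."""
    block = ipaddress.ip_network(block_cidr)
    assigned = []
    for ip in block.hosts():
        if len(assigned) == num:
            break
        if str(ip) not in allocated:
            allocated.add(str(ip))  # "Writing block in order to claim IPs"
            assigned.append(str(ip))
    return assigned

# Addresses .1-.14 already claimed by earlier pods on this node:
allocated = {f"10.2.76.{i}" for i in range(1, 15)}
print(auto_assign("10.2.76.0/24", allocated))  # -> ['10.2.76.15'], as in the log
```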
I0505 08:00:14.923864 2827 topology_manager.go:233] [topologymanager] Topology Admit Handler
I0505 08:00:14.986133 2827 reconciler.go:224] operationExecutor.VerifyControllerAttachedVolume started for volume "devices" (UniqueName: "kubernetes.io/host-path/5ca66d38-244e-4b92-99a7-df543d3a0cba-devices") pod "rook-ceph-osd-25-d6f885cd7-kfw86" (UID: "5ca66d38-244e-4b92-99a7-df543d3a0cba")
I0505 08:00:14.986210 2827 reconciler.go:224] operationExecutor.VerifyControllerAttachedVolume started for volume "activate-osd" (UniqueName: "kubernetes.io/empty-dir/5ca66d38-244e-4b92-99a7-df543d3a0cba-activate-osd") pod "rook-ceph-osd-25-d6f885cd7-kfw86" (UID: "5ca66d38-244e-4b92-99a7-df543d3a0cba")
I0505 08:00:14.986299 2827 reconciler.go:224] operationExecutor.VerifyControllerAttachedVolume started for volume "rook-ceph-osd-token-4dtzj" (UniqueName: "kubernetes.io/secret/5ca66d38-244e-4b92-99a7-df543d3a0cba-rook-ceph-osd-token-4dtzj") pod "rook-ceph-osd-25-d6f885cd7-kfw86" (UID: "5ca66d38-244e-4b92-99a7-df543d3a0cba")
I0505 08:00:14.986338 2827 reconciler.go:224] operationExecutor.VerifyControllerAttachedVolume started for volume "run-udev" (UniqueName: "kubernetes.io/host-path/5ca66d38-244e-4b92-99a7-df543d3a0cba-run-udev") pod "rook-ceph-osd-25-d6f885cd7-kfw86" (UID: "5ca66d38-244e-4b92-99a7-df543d3a0cba")
I0505 08:00:14.986417 2827 reconciler.go:224] operationExecutor.VerifyControllerAttachedVolume started for volume "rook-config-override" (UniqueName: "kubernetes.io/configmap/5ca66d38-244e-4b92-99a7-df543d3a0cba-rook-config-override") pod "rook-ceph-osd-25-d6f885cd7-kfw86" (UID: "5ca66d38-244e-4b92-99a7-df543d3a0cba")
I0505 08:00:14.986457 2827 reconciler.go:224] operationExecutor.VerifyControllerAttachedVolume started for volume "rook-ceph-crash" (UniqueName: "kubernetes.io/host-path/5ca66d38-244e-4b92-99a7-df543d3a0cba-rook-ceph-crash") pod "rook-ceph-osd-25-d6f885cd7-kfw86" (UID: "5ca66d38-244e-4b92-99a7-df543d3a0cba")
I0505 08:00:14.986496 2827 reconciler.go:224] operationExecutor.VerifyControllerAttachedVolume started for volume "rook-ceph-log" (UniqueName: "kubernetes.io/host-path/5ca66d38-244e-4b92-99a7-df543d3a0cba-rook-ceph-log") pod "rook-ceph-osd-25-d6f885cd7-kfw86" (UID: "5ca66d38-244e-4b92-99a7-df543d3a0cba")
I0505 08:00:14.986556 2827 reconciler.go:224] operationExecutor.VerifyControllerAttachedVolume started for volume "rook-data" (UniqueName: "kubernetes.io/host-path/5ca66d38-244e-4b92-99a7-df543d3a0cba-rook-data") pod "rook-ceph-osd-25-d6f885cd7-kfw86" (UID: "5ca66d38-244e-4b92-99a7-df543d3a0cba")
I0505 08:00:15.666048 2827 kubelet_resources.go:45] allocatable: map[cpu:{{32 0} {<nil>} 32 DecimalSI} ephemeral-storage:{{108249701812 0} {<nil>} 108249701812 DecimalSI} hugepages-1Gi:{{0 0} {<nil>} 0 DecimalSI} hugepages-2Mi:{{0 0} {<nil>} 0 DecimalSI} memory:{{135041626112 0} {<nil>} 131876588Ki BinarySI} pods:{{110 0} {<nil>} 110 DecimalSI}]
I0505 08:00:15.666159 2827 kubelet_resources.go:45] allocatable: map[cpu:{{32 0} {<nil>} 32 DecimalSI} ephemeral-storage:{{108249701812 0} {<nil>} 108249701812 DecimalSI} hugepages-1Gi:{{0 0} {<nil>} 0 DecimalSI} hugepages-2Mi:{{0 0} {<nil>} 0 DecimalSI} memory:{{135041626112 0} {<nil>} 131876588Ki BinarySI} pods:{{110 0} {<nil>} 110 DecimalSI}]
I0505 08:00:15.666249 2827 kubelet_resources.go:45] allocatable: map[cpu:{{32 0} {<nil>} 32 DecimalSI} ephemeral-storage:{{108249701812 0} {<nil>} 108249701812 DecimalSI} hugepages-1Gi:{{0 0} {<nil>} 0 DecimalSI} hugepages-2Mi:{{0 0} {<nil>} 0 DecimalSI} memory:{{135041626112 0} {<nil>} 131876588Ki BinarySI} pods:{{110 0} {<nil>} 110 DecimalSI}]
I0505 08:00:15.666290 2827 kubelet_resources.go:45] allocatable: map[cpu:{{32 0} {<nil>} 32 DecimalSI} ephemeral-storage:{{108249701812 0} {<nil>} 108249701812 DecimalSI} hugepages-1Gi:{{0 0} {<nil>} 0 DecimalSI} hugepages-2Mi:{{0 0} {<nil>} 0 DecimalSI} memory:{{135041626112 0} {<nil>} 131876588Ki BinarySI} pods:{{110 0} {<nil>} 110 DecimalSI}]
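The kubelet lines above print allocatable quantities in Kubernetes quantity notation: `131876588Ki` (BinarySI) alongside plain integers (DecimalSI). A small sketch of parsing those suffixes into base units; it covers only the suffixes seen in this log, not the full Kubernetes quantity grammar:

```python
# Binary suffixes must be checked before decimal ones so "Mi" is not
# mistaken for "M".
SUFFIXES = {
    "Ki": 1024, "Mi": 1024**2, "Gi": 1024**3,   # BinarySI
    "k": 1000, "M": 1000**2, "G": 1000**3,      # DecimalSI
}

def parse_quantity(q):
    for suffix, factor in SUFFIXES.items():
        if q.endswith(suffix):
            return int(q[: -len(suffix)]) * factor
    return int(q)  # bare DecimalSI, e.g. "32" cpus or "110" pods

print(parse_quantity("131876588Ki"))  # -> 135041626112, matching memory above
print(parse_quantity("32"))           # -> 32
```

Note that `131876588Ki * 1024` equals the `135041626112` byte figure the kubelet logs for the same memory value.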
W0505 08:00:15.831576 2827 pod_container_deletor.go:77] Container "6436b1421eed914751bd6c38c0065eb07397e6a82424bdf6dbcc5a1f99e0e00e" not found in pod's containers
2020-05-05 08:00:15.859 [INFO][17452] plugin.go 129: Extracted identifiers EndpointIDs=&utils.WEPIdentifiers{Namespace:"rook", WEPName:"", WorkloadEndpointIdentifiers:names.WorkloadEndpointIdentifiers{Node:"suraj-lk-cluster-pool-1-worker-2", Orchestrator:"k8s", Endpoint:"eth0", Workload:"", Pod:"rook-ceph-osd-25-d6f885cd7-kfw86", ContainerID:"6436b1421eed914751bd6c38c0065eb07397e6a82424bdf6dbcc5a1f99e0e00e"}}
2020-05-05 08:00:15.896 [INFO][17452] plugin.go 220: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {suraj--lk--cluster--pool--1--worker--2-k8s-rook--ceph--osd--25--d6f885cd7--kfw86-eth0 rook-ceph-osd-25-d6f885cd7- rook 5ca66d38-244e-4b92-99a7-df543d3a0cba 13489 0 2020-05-05 08:00:14 +0000 UTC <nil> <nil> map[app:rook-ceph-osd ceph-osd-id:25 failure-domain:suraj-lk-cluster-pool-1-worker-2 pod-template-hash:d6f885cd7 portable:false projectcalico.org/namespace:rook projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:rook-ceph-osd rook_cluster:rook] map[] [] [] []} {k8s suraj-lk-cluster-pool-1-worker-2 rook-ceph-osd-25-d6f885cd7-kfw86 eth0 [] [] [kns.rook ksa.rook.rook-ceph-osd] cali7275bc2105b []}} ContainerID="6436b1421eed914751bd6c38c0065eb07397e6a82424bdf6dbcc5a1f99e0e00e" Namespace="rook" Pod="rook-ceph-osd-25-d6f885cd7-kfw86" WorkloadEndpoint="suraj--lk--cluster--pool--1--worker--2-k8s-rook--ceph--osd--25--d6f885cd7--kfw86-"
2020-05-05 08:00:15.896 [INFO][17452] k8s.go 60: Extracted identifiers for CmdAddK8s ContainerID="6436b1421eed914751bd6c38c0065eb07397e6a82424bdf6dbcc5a1f99e0e00e" Namespace="rook" Pod="rook-ceph-osd-25-d6f885cd7-kfw86" WorkloadEndpoint="suraj--lk--cluster--pool--1--worker--2-k8s-rook--ceph--osd--25--d6f885cd7--kfw86-eth0"
2020-05-05 08:00:15.903 [INFO][17452] k8s.go 775: namespace info &Namespace{ObjectMeta:{rook /api/v1/namespaces/rook 346082b0-52fc-4520-8796-5afff38e4f6c 7478 0 2020-05-05 07:45:13 +0000 UTC <nil> <nil> map[] map[] [] [] [{lokoctl Update v1 2020-05-05 07:45:13 +0000 UTC FieldsV1 FieldsV1{Raw:*[123 34 102 58 115 116 97 116 117 115 34 58 123 34 102 58 112 104 97 115 101 34 58 123 125 125 125],}}]},Spec:NamespaceSpec{Finalizers:[kubernetes],},Status:NamespaceStatus{Phase:Active,Conditions:[]NamespaceCondition{},},}
2020-05-05 08:00:15.909 [INFO][17452] k8s.go 784: pod info &Pod{ObjectMeta:{rook-ceph-osd-25-d6f885cd7-kfw86 rook-ceph-osd-25-d6f885cd7- rook /api/v1/namespaces/rook/pods/rook-ceph-osd-25-d6f885cd7-kfw86 5ca66d38-244e-4b92-99a7-df543d3a0cba 13489 0 2020-05-05 08:00:14 +0000 UTC <nil> <nil> map[app:rook-ceph-osd ceph-osd-id:25 failure-domain:suraj-lk-cluster-pool-1-worker-2 pod-template-hash:d6f885cd7 portable:false rook_cluster:rook] map[kubernetes.io/psp:00-rook-ceph-operator seccomp.security.alpha.kubernetes.io/pod:runtime/default] [{apps/v1 ReplicaSet rook-ceph-osd-25-d6f885cd7 a205083a-ae12-49d2-8d0f-a7ef5b6397a4 0xc00044a137 0xc00044a138}] [] []},Spec:PodSpec{Volumes:[]Volume{Volume{Name:rook-data,VolumeSource:VolumeSource{HostPath:&HostPathVolumeSource{Path:/var/lib/rook,Type:*,},EmptyDir:nil,GCEPersistentDisk:nil,AWSElasticBlockStore:nil,GitRepo:nil,Secret:nil,NFS:nil,ISCSI:nil,Glusterfs:nil,PersistentVolumeClaim:nil,RBD:nil,FlexVolume:nil,Cinder:nil,CephFS:nil,Flocker:nil,DownwardAPI:nil,FC:nil,AzureFile:nil,ConfigMap:nil,VsphereVolume:nil,Quobyte:nil,AzureDisk:nil,PhotonPersistentDisk:nil,PortworxVolume:nil,ScaleIO:nil,Projected:nil,StorageOS:nil,CSI:nil,},},Volume{Name:rook-config-override,VolumeSource:VolumeSource{HostPath:nil,EmptyDir:nil,GCEPersistentDisk:nil,AWSElasticBlockStore:nil,GitRepo:nil,Secret:nil,NFS:nil,ISCSI:nil,Glusterfs:nil,PersistentVolumeClaim:nil,RBD:nil,FlexVolume:nil,Cinder:nil,CephFS:nil,Flocker:nil,DownwardAPI:nil,FC:nil,AzureFile:nil,ConfigMap:&ConfigMapVolumeSource{LocalObjectReference:LocalObjectReference{Name:rook-config-override,},Items:[]KeyToPath{KeyToPath{Key:config,Path:ceph.conf,Mode:*292,},},DefaultMode:*420,Optional:nil,},VsphereVolume:nil,Quobyte:nil,AzureDisk:nil,PhotonPersistentDisk:nil,PortworxVolume:nil,ScaleIO:nil,Projected:nil,StorageOS:nil,CSI:nil,},},Volume{Name:rook-ceph-log,VolumeSource:VolumeSource{HostPath:&HostPathVolumeSource{Path:/var/lib/rook/rook/log,Type:*,},EmptyDir:nil,GCEPersistentDisk:nil,AWSElasti
cBlockStore:nil,GitRepo:nil,Secret:nil,NFS:nil,ISCSI:nil,Glusterfs:nil,PersistentVolumeClaim:nil,RBD:nil,FlexVolume:nil,Cinder:nil,CephFS:nil,Flocker:nil,DownwardAPI:nil,FC:nil,AzureFile:nil,ConfigMap:nil,VsphereVolume:nil,Quobyte:nil,AzureDisk:nil,PhotonPersistentDisk:nil,PortworxVolume:nil,ScaleIO:nil,Projected:nil,StorageOS:nil,CSI:nil,},},Volume{Name:rook-ceph-crash,VolumeSource:VolumeSource{HostPath:&HostPathVolumeSource{Path:/var/lib/rook/rook/crash,Type:*,},EmptyDir:nil,GCEPersistentDisk:nil,AWSElasticBlockStore:nil,GitRepo:nil,Secret:nil,NFS:nil,ISCSI:nil,Glusterfs:nil,PersistentVolumeClaim:nil,RBD:nil,FlexVolume:nil,Cinder:nil,CephFS:nil,Flocker:nil,DownwardAPI:nil,FC:nil,AzureFile:nil,ConfigMap:nil,VsphereVolume:nil,Quobyte:nil,AzureDisk:nil,PhotonPersistentDisk:nil,PortworxVolume:nil,ScaleIO:nil,Projected:nil,StorageOS:nil,CSI:nil,},},Volume{Name:devices,VolumeSource:VolumeSource{HostPath:&HostPathVolumeSource{Path:/dev,Type:*,},EmptyDir:nil,GCEPersistentDisk:nil,AWSElasticBlockStore:nil,GitRepo:nil,Secret:nil,NFS:nil,ISCSI:nil,Glusterfs:nil,PersistentVolumeClaim:nil,RBD:nil,FlexVolume:nil,Cinder:nil,CephFS:nil,Flocker:nil,DownwardAPI:nil,FC:nil,AzureFile:nil,ConfigMap:nil,VsphereVolume:nil,Quobyte:nil,AzureDisk:nil,PhotonPersistentDisk:nil,PortworxVolume:nil,ScaleIO:nil,Projected:nil,StorageOS:nil,CSI:nil,},},Volume{Name:run-udev,VolumeSource:VolumeSource{HostPath:&HostPathVolumeSource{Path:/run/udev,Type:*,},EmptyDir:nil,GCEPersistentDisk:nil,AWSElasticBlockStore:nil,GitRepo:nil,Secret:nil,NFS:nil,ISCSI:nil,Glusterfs:nil,PersistentVolumeClaim:nil,RBD:nil,FlexVolume:nil,Cinder:nil,CephFS:nil,Flocker:nil,DownwardAPI:nil,FC:nil,AzureFile:nil,ConfigMap:nil,VsphereVolume:nil,Quobyte:nil,AzureDisk:nil,PhotonPersistentDisk:nil,PortworxVolume:nil,ScaleIO:nil,Projected:nil,StorageOS:nil,CSI:nil,},},Volume{Name:activate-osd,VolumeSource:VolumeSource{HostPath:nil,EmptyDir:&EmptyDirVolumeSource{Medium:,SizeLimit:<nil>,},GCEPersistentDisk:nil,AWSElasticBlockStore:ni
l,GitRepo:nil,Secret:nil,NFS:nil,ISCSI:nil,Glusterfs:nil,PersistentVolumeClaim:nil,RBD:nil,FlexVolume:nil,Cinder:nil,CephFS:nil,Flocker:nil,DownwardAPI:nil,FC:nil,AzureFile:nil,ConfigMap:nil,VsphereVolume:nil,Quobyte:nil,AzureDisk:nil,PhotonPersistentDisk:nil,PortworxVolume:nil,ScaleIO:nil,Projected:nil,StorageOS:nil,CSI:nil,},},Volume{Name:rook-ceph-osd-token-4dtzj,VolumeSource:VolumeSource{HostPath:nil,EmptyDir:nil,GCEPersistentDisk:nil,AWSElasticBlockStore:nil,GitRepo:nil,Secret:&SecretVolumeSource{SecretName:rook-ceph-osd-token-4dtzj,Items:[]KeyToPath{},DefaultMode:*420,Optional:nil,},NFS:nil,ISCSI:nil,Glusterfs:nil,PersistentVolumeClaim:nil,RBD:nil,FlexVolume:nil,Cinder:nil,CephFS:nil,Flocker:nil,DownwardAPI:nil,FC:nil,AzureFile:nil,ConfigMap:nil,VsphereVolume:nil,Quobyte:nil,AzureDisk:nil,PhotonPersistentDisk:nil,PortworxVolume:nil,ScaleIO:nil,Projected:nil,StorageOS:nil,CSI:nil,},},},Containers:[]Container{Container{Name:osd,Image:ceph/ceph:v14.2.8,Command:[ceph-osd],Args:[--foreground --id 25 --fsid 504a40d4-a2d3-4492-a37d-27a58464958f --setuser ceph --setgroup ceph --crush-location=root=default host=suraj-lk-cluster-pool-1-worker-2 --default-log-to-file false 
--ms-learn-addr-from-peer=false],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:ROOK_NODE_NAME,Value:suraj-lk-cluster-pool-1-worker-2,ValueFrom:nil,},EnvVar{Name:ROOK_CLUSTER_ID,Value:77795d4f-fb40-4886-bc74-cb2065f0c61f,ValueFrom:nil,},EnvVar{Name:ROOK_PRIVATE_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},EnvVar{Name:ROOK_PUBLIC_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},EnvVar{Name:ROOK_CLUSTER_NAME,Value:rook,ValueFrom:nil,},EnvVar{Name:ROOK_MON_ENDPOINTS,Value:,ValueFrom:&EnvVarSource{FieldRef:nil,ResourceFieldRef:nil,ConfigMapKeyRef:&ConfigMapKeySelector{LocalObjectReference:LocalObjectReference{Name:rook-ceph-mon-endpoints,},Key:data,Optional:nil,},SecretKeyRef:nil,},},EnvVar{Name:ROOK_MON_SECRET,Value:,ValueFrom:&EnvVarSource{FieldRef:nil,ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:&SecretKeySelector{LocalObjectReference:LocalObjectReference{Name:rook-ceph-mon,},Key:mon-secret,Optional:nil,},},},EnvVar{Name:ROOK_ADMIN_SECRET,Value:,ValueFrom:&EnvVarSource{FieldRef:nil,ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:&SecretKeySelector{LocalObjectReference:LocalObjectReference{Name:rook-ceph-mon,},Key:admin-secret,Optional:nil,},},},EnvVar{Name:ROOK_CONFIG_DIR,Value:/var/lib/rook,ValueFrom:nil,},EnvVar{Name:ROOK_CEPH_CONFIG_OVERRIDE,Value:/etc/rook/config/override.conf,ValueFrom:nil,},EnvVar{Name:ROOK_FSID,Value:,ValueFrom:&EnvVarSource{FieldRef:nil,ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:&SecretKeySelector{LocalObjectReference:LocalObjectReference{Name:rook-ceph-mon,},Key:fsid,Optional:nil,},},},EnvVar{Name:NODE_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:spec.nodeName,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},Env
Var{Name:ROOK_CRUSHMAP_HOSTNAME,Value:suraj-lk-cluster-pool-1-worker-2,ValueFrom:nil,},EnvVar{Name:CEPH_VOLUME_DEBUG,Value:1,ValueFrom:nil,},EnvVar{Name:CEPH_VOLUME_SKIP_RESTORECON,Value:1,ValueFrom:nil,},EnvVar{Name:DM_DISABLE_UDEV,Value:1,ValueFrom:nil,},EnvVar{Name:ROOK_OSDS_PER_DEVICE,Value:1,ValueFrom:nil,},EnvVar{Name:TINI_SUBREAPER,Value:,ValueFrom:nil,},EnvVar{Name:CONTAINER_IMAGE,Value:ceph/ceph:v14.2.8,ValueFrom:nil,},EnvVar{Name:POD_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:metadata.name,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},EnvVar{Name:POD_NAMESPACE,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:metadata.namespace,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},EnvVar{Name:NODE_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:spec.nodeName,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},EnvVar{Name:POD_MEMORY_LIMIT,Value:,ValueFrom:&EnvVarSource{FieldRef:nil,ResourceFieldRef:&ResourceFieldSelector{ContainerName:,Resource:limits.memory,Divisor:{{0 0} {<nil>} 0 DecimalSI},},ConfigMapKeyRef:nil,SecretKeyRef:nil,},},EnvVar{Name:POD_MEMORY_REQUEST,Value:,ValueFrom:&EnvVarSource{FieldRef:nil,ResourceFieldRef:&ResourceFieldSelector{ContainerName:,Resource:requests.memory,Divisor:{{0 0} {<nil>} 0 DecimalSI},},ConfigMapKeyRef:nil,SecretKeyRef:nil,},},EnvVar{Name:POD_CPU_LIMIT,Value:,ValueFrom:&EnvVarSource{FieldRef:nil,ResourceFieldRef:&ResourceFieldSelector{ContainerName:,Resource:limits.cpu,Divisor:{{1 0} {<nil>} 1 DecimalSI},},ConfigMapKeyRef:nil,SecretKeyRef:nil,},},EnvVar{Name:POD_CPU_REQUEST,Value:,ValueFrom:&EnvVarSource{FieldRef:nil,ResourceFieldRef:&ResourceFieldSelector{ContainerName:,Resource:requests.cpu,Divisor:{{0 0} {<nil>} 0 
DecimalSI},},ConfigMapKeyRef:nil,SecretKeyRef:nil,},},EnvVar{Name:ROOK_OSD_UUID,Value:68012b22-0671-4031-8b15-867e874a8be4,ValueFrom:nil,},EnvVar{Name:ROOK_OSD_ID,Value:25,ValueFrom:nil,},EnvVar{Name:ROOK_OSD_STORE_TYPE,Value:bluestore,ValueFrom:nil,},EnvVar{Name:ROOK_CEPH_MON_HOST,Value:,ValueFrom:&EnvVarSource{FieldRef:nil,ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:&SecretKeySelector{LocalObjectReference:LocalObjectReference{Name:rook-ceph-config,},Key:mon_host,Optional:nil,},},},EnvVar{Name:CEPH_ARGS,Value:-m $(ROOK_CEPH_MON_HOST),ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:rook-data,ReadOnly:false,MountPath:/var/lib/rook,SubPath:,MountPropagation:nil,SubPathExpr:,},VolumeMount{Name:rook-config-override,ReadOnly:true,MountPath:/etc/ceph,SubPath:,MountPropagation:nil,SubPathExpr:,},VolumeMount{Name:rook-ceph-log,ReadOnly:false,MountPath:/var/log/ceph,SubPath:,MountPropagation:nil,SubPathExpr:,},VolumeMount{Name:rook-ceph-crash,ReadOnly:false,MountPath:/var/lib/ceph/crash,SubPath:,MountPropagation:nil,SubPathExpr:,},VolumeMount{Name:devices,ReadOnly:false,MountPath:/dev,SubPath:,MountPropagation:nil,SubPathExpr:,},VolumeMount{Name:run-udev,ReadOnly:false,MountPath:/run/udev,SubPath:,MountPropagation:nil,SubPathExpr:,},VolumeMount{Name:activate-osd,ReadOnly:false,MountPath:/var/lib/ceph/osd/ceph-25,SubPath:,MountPropagation:nil,SubPathExpr:,},VolumeMount{Name:rook-ceph-osd-token-4dtzj,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,},},LivenessProbe:&Probe{Handler:Handler{Exec:&ExecAction{Command:[env -i sh -c ceph --admin-daemon /run/ceph/ceph-osd.25.asok 
status],},HTTPGet:nil,TCPSocket:nil,},InitialDelaySeconds:10,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,},ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:nil,Privileged:*true,SELinuxOptions:nil,RunAsUser:*0,RunAsNonRoot:nil,ReadOnlyRootFilesystem:*false,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,},},RestartPolicy:Always,TerminationGracePeriodSeconds:*30,ActiveDeadlineSeconds:nil,DNSPolicy:ClusterFirst,NodeSelector:map[string]string{kubernetes.io/hostname: suraj-lk-cluster-pool-1-worker-2,},ServiceAccountName:rook-ceph-osd,DeprecatedServiceAccount:rook-ceph-osd,NodeName:suraj-lk-cluster-pool-1-worker-2,HostNetwork:false,HostPID:true,HostIPC:false,SecurityContext:&PodSecurityContext{SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,SupplementalGroups:[],FSGroup:nil,RunAsGroup:nil,Sysctls:[]Sysctl{},WindowsOptions:nil,},ImagePullSecrets:[]LocalObjectReference{},Hostname:,Subdomain:,Affinity:&Affinity{NodeAffinity:&NodeAffinity{RequiredDuringSchedulingIgnoredDuringExecution:&NodeSelector{NodeSelectorTerms:[]NodeSelectorTerm{NodeSelectorTerm{MatchExpressions:[]NodeSelectorRequirement{NodeSelectorRequirement{Key:storage.lokomotive.io,Operator:Exists,Values:[],},},MatchFields:[]NodeSelectorRequirement{},},},},PreferredDuringSchedulingIgnoredDuringExecution:[]PreferredSchedulingTerm{},},PodAffinity:nil,PodAntiAffinity:nil,},SchedulerName:default-scheduler,InitContainers:[]Container{Container{Name:activate,Image:ceph/ceph:v14.2.8,Command:[/bin/bash -c
set -ex
OSD_ID=25
OSD_UUID=68012b22-0671-4031-8b15-867e874a8be4
OSD_STORE_FLAG="--bluestore"
OSD_DATA_DIR=/var/lib/ceph/osd/ceph-"$OSD_ID"
CV_MODE=lvm
DEVICE=
METADATA_DEVICE="$ROOK_METADATA_DEVICE"
# activate the osd with ceph-volume
if [[ "$CV_MODE" == "lvm" ]]; then
TMP_DIR=$(mktemp -d)
# activate osd
ceph-volume "$CV_MODE" activate --no-systemd "$OSD_STORE_FLAG" "$OSD_ID" "$OSD_UUID"
# copy the tmpfs directory to a temporary directory
# this is needed because when the init container exits, the tmpfs goes away and its content with it
# otherwise the emptydir would be empty when accessed by the main osd container
cp --verbose --no-dereference "$OSD_DATA_DIR"/* "$TMP_DIR"/
# unmount the tmpfs since we don't need it anymore
umount "$OSD_DATA_DIR"
# copy back the content of the tmpfs into the original osd directory
cp --verbose --no-dereference "$TMP_DIR"/* "$OSD_DATA_DIR"
# retain ownership of files to the ceph user/group
chown --verbose --recursive ceph:ceph "$OSD_DATA_DIR"
# remove the temporary directory
rm --recursive --force "$TMP_DIR"
else
ARGS=(--device ${DEVICE} --no-systemd --no-tmpfs)
if [ -n "$METADATA_DEVICE" ]; then
ARGS+=(--block.db ${METADATA_DEVICE})
fi
# ceph-volume raw mode only supports bluestore so we don't need to pass a store flag
ceph-volume "$CV_MODE" activate "${ARGS[@]}"
fi
],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CEPH_VOLUME_DEBUG,Value:1,ValueFrom:nil,},EnvVar{Name:CEPH_VOLUME_SKIP_RESTORECON,Value:1,ValueFrom:nil,},EnvVar{Name:DM_DISABLE_UDEV,Value:1,ValueFrom:nil,},EnvVar{Name:ROOK_CEPH_MON_HOST,Value:,ValueFrom:&EnvVarSource{FieldRef:nil,ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:&SecretKeySelector{LocalObjectReference:LocalObjectReference{Name:rook-ceph-config,},Key:mon_host,Optional:nil,},},},EnvVar{Name:CEPH_ARGS,Value:-m $(ROOK_CEPH_MON_HOST),ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:activate-osd,ReadOnly:false,MountPath:/var/lib/ceph/osd/ceph-25,SubPath:,MountPropagation:nil,SubPathExpr:,},VolumeMount{Name:devices,ReadOnly:false,MountPath:/dev,SubPath:,MountPropagation:nil,SubPathExpr:,},VolumeMount{Name:rook-config-override,ReadOnly:true,MountPath:/etc/ceph,SubPath:,MountPropagation:nil,SubPathExpr:,},VolumeMount{Name:rook-ceph-osd-token-4dtzj,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:nil,Privileged:*true,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,},Container{Name:chown-container-data-dir,Image:ceph/ceph:v14.2.8,Command:[chown],Args:[--verbose --recursive ceph:ceph /var/log/ceph 
/var/lib/ceph/crash],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:rook-data,ReadOnly:false,MountPath:/var/lib/rook,SubPath:,MountPropagation:nil,SubPathExpr:,},VolumeMount{Name:rook-config-override,ReadOnly:true,MountPath:/etc/ceph,SubPath:,MountPropagation:nil,SubPathExpr:,},VolumeMount{Name:rook-ceph-log,ReadOnly:false,MountPath:/var/log/ceph,SubPath:,MountPropagation:nil,SubPathExpr:,},VolumeMount{Name:rook-ceph-crash,ReadOnly:false,MountPath:/var/lib/ceph/crash,SubPath:,MountPropagation:nil,SubPathExpr:,},VolumeMount{Name:devices,ReadOnly:false,MountPath:/dev,SubPath:,MountPropagation:nil,SubPathExpr:,},VolumeMount{Name:run-udev,ReadOnly:false,MountPath:/run/udev,SubPath:,MountPropagation:nil,SubPathExpr:,},VolumeMount{Name:activate-osd,ReadOnly:false,MountPath:/var/lib/ceph/osd/ceph-25,SubPath:,MountPropagation:nil,SubPathExpr:,},VolumeMount{Name:rook-ceph-osd-token-4dtzj,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:nil,Privileged:*true,SELinuxOptions:nil,RunAsUser:*0,RunAsNonRoot:nil,ReadOnlyRootFilesystem:*false,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,},},AutomountServiceAccountToken:nil,Tolerations:[]Toleration{Toleration{Key:storage.lokomotive.io,Operator:Equal,Value:ceph,Effect:NoSchedule,TolerationSeconds:nil,},Toleration{Key:node.kubernetes.io/not-ready,Operator:Exists,Value:,Effect:NoExecute,TolerationSeconds:*300,},Toleration{Key:node.kubernetes.io/unreachable,Operator:Exists,Value:,Effect:NoExecute,TolerationS
econds:*300,},},HostAliases:[]HostAlias{},PriorityClassName:,Priority:*0,DNSConfig:nil,ShareProcessNamespace:nil,ReadinessGates:[]PodReadinessGate{},RuntimeClassName:nil,EnableServiceLinks:*true,PreemptionPolicy:nil,Overhead:ResourceList{},TopologySpreadConstraints:[]TopologySpreadConstraint{},EphemeralContainers:[]EphemeralContainer{},},Status:PodStatus{Phase:Pending,Conditions:[]PodCondition{PodCondition{Type:Initialized,Status:False,LastProbeTime:0001-01-01 00:00:00 +0000 UTC,LastTransitionTime:2020-05-05 08:00:14 +0000 UTC,Reason:ContainersNotInitialized,Message:containers with incomplete status: [activate chown-container-data-dir],},PodCondition{Type:Ready,Status:False,LastProbeTime:0001-01-01 00:00:00 +0000 UTC,LastTransitionTime:2020-05-05 08:00:14 +0000 UTC,Reason:ContainersNotReady,Message:containers with unready status: [osd],},PodCondition{Type:ContainersReady,Status:False,LastProbeTime:0001-01-01 00:00:00 +0000 UTC,LastTransitionTime:2020-05-05 08:00:14 +0000 UTC,Reason:ContainersNotReady,Message:containers with unready status: [osd],},PodCondition{Type:PodScheduled,Status:True,LastProbeTime:0001-01-01 00:00:00 +0000 UTC,LastTransitionTime:2020-05-05 08:00:14 +0000 UTC,Reason:,Message:,},},Message:,Reason:,HostIP:10.88.81.5,PodIP:,StartTime:2020-05-05 08:00:14 +0000 
UTC,ContainerStatuses:[]ContainerStatus{ContainerStatus{Name:osd,State:ContainerState{Waiting:&ContainerStateWaiting{Reason:PodInitializing,Message:,},Running:nil,Terminated:nil,},LastTerminationState:ContainerState{Waiting:nil,Running:nil,Terminated:nil,},Ready:false,RestartCount:0,Image:ceph/ceph:v14.2.8,ImageID:,ContainerID:,Started:*false,},},QOSClass:BestEffort,InitContainerStatuses:[]ContainerStatus{ContainerStatus{Name:activate,State:ContainerState{Waiting:&ContainerStateWaiting{Reason:PodInitializing,Message:,},Running:nil,Terminated:nil,},LastTerminationState:ContainerState{Waiting:nil,Running:nil,Terminated:nil,},Ready:false,RestartCount:0,Image:ceph/ceph:v14.2.8,ImageID:,ContainerID:,Started:nil,},ContainerStatus{Name:chown-container-data-dir,State:ContainerState{Waiting:&ContainerStateWaiting{Reason:PodInitializing,Message:,},Running:nil,Terminated:nil,},LastTerminationState:ContainerState{Waiting:nil,Running:nil,Terminated:nil,},Ready:false,RestartCount:0,Image:ceph/ceph:v14.2.8,ImageID:,ContainerID:,Started:nil,},},NominatedNodeName:,PodIPs:[]PodIP{},EphemeralContainerStatuses:[]ContainerStatus{},},}
2020-05-05 08:00:15.935 [INFO][17489] ipam_plugin.go 211: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="6436b1421eed914751bd6c38c0065eb07397e6a82424bdf6dbcc5a1f99e0e00e" HandleID="k8s-pod-network.6436b1421eed914751bd6c38c0065eb07397e6a82424bdf6dbcc5a1f99e0e00e" Workload="suraj--lk--cluster--pool--1--worker--2-k8s-rook--ceph--osd--25--d6f885cd7--kfw86-eth0"
2020-05-05 08:00:15.957 [INFO][17489] ipam_plugin.go 223: Calico CNI IPAM handle=k8s-pod-network.6436b1421eed914751bd6c38c0065eb07397e6a82424bdf6dbcc5a1f99e0e00e ContainerID="6436b1421eed914751bd6c38c0065eb07397e6a82424bdf6dbcc5a1f99e0e00e" HandleID="k8s-pod-network.6436b1421eed914751bd6c38c0065eb07397e6a82424bdf6dbcc5a1f99e0e00e" Workload="suraj--lk--cluster--pool--1--worker--2-k8s-rook--ceph--osd--25--d6f885cd7--kfw86-eth0"
2020-05-05 08:00:15.957 [INFO][17489] ipam_plugin.go 233: Auto assigning IP ContainerID="6436b1421eed914751bd6c38c0065eb07397e6a82424bdf6dbcc5a1f99e0e00e" HandleID="k8s-pod-network.6436b1421eed914751bd6c38c0065eb07397e6a82424bdf6dbcc5a1f99e0e00e" Workload="suraj--lk--cluster--pool--1--worker--2-k8s-rook--ceph--osd--25--d6f885cd7--kfw86-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc0002b70c0), Attrs:map[string]string{"namespace":"rook", "node":"suraj-lk-cluster-pool-1-worker-2", "pod":"rook-ceph-osd-25-d6f885cd7-kfw86"}, Hostname:"suraj-lk-cluster-pool-1-worker-2", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0}
2020-05-05 08:00:15.957 [INFO][17489] ipam.go 87: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'suraj-lk-cluster-pool-1-worker-2'
2020-05-05 08:00:15.966 [INFO][17489] ipam.go 313: Looking up existing affinities for host handle="k8s-pod-network.6436b1421eed914751bd6c38c0065eb07397e6a82424bdf6dbcc5a1f99e0e00e" host="suraj-lk-cluster-pool-1-worker-2"
2020-05-05 08:00:15.974 [INFO][17489] ipam.go 381: Trying affinity for 10.2.76.0/24 handle="k8s-pod-network.6436b1421eed914751bd6c38c0065eb07397e6a82424bdf6dbcc5a1f99e0e00e" host="suraj-lk-cluster-pool-1-worker-2"
2020-05-05 08:00:15.979 [INFO][17489] ipam.go 135: Attempting to load block cidr=10.2.76.0/24 host="suraj-lk-cluster-pool-1-worker-2"
2020-05-05 08:00:15.984 [INFO][17489] ipam.go 212: Affinity is confirmed and block has been loaded cidr=10.2.76.0/24 host="suraj-lk-cluster-pool-1-worker-2"
2020-05-05 08:00:15.984 [INFO][17489] ipam.go 804: Attempting to assign 1 addresses from block block=10.2.76.0/24 handle="k8s-pod-network.6436b1421eed914751bd6c38c0065eb07397e6a82424bdf6dbcc5a1f99e0e00e" host="suraj-lk-cluster-pool-1-worker-2"
2020-05-05 08:00:15.988 [INFO][17489] ipam.go 1265: Creating new handle: k8s-pod-network.6436b1421eed914751bd6c38c0065eb07397e6a82424bdf6dbcc5a1f99e0e00e
2020-05-05 08:00:15.995 [INFO][17489] ipam.go 827: Writing block in order to claim IPs block=10.2.76.0/24 handle="k8s-pod-network.6436b1421eed914751bd6c38c0065eb07397e6a82424bdf6dbcc5a1f99e0e00e" host="suraj-lk-cluster-pool-1-worker-2"
2020-05-05 08:00:16.006 [INFO][17489] ipam.go 840: Successfully claimed IPs: [10.2.76.16/24] block=10.2.76.0/24 handle="k8s-pod-network.6436b1421eed914751bd6c38c0065eb07397e6a82424bdf6dbcc5a1f99e0e00e" host="suraj-lk-cluster-pool-1-worker-2"
2020-05-05 08:00:16.006 [INFO][17489] ipam.go 413: Block '10.2.76.0/24' provided addresses: [10.2.76.16/24] handle="k8s-pod-network.6436b1421eed914751bd6c38c0065eb07397e6a82424bdf6dbcc5a1f99e0e00e" host="suraj-lk-cluster-pool-1-worker-2"
2020-05-05 08:00:16.009 [INFO][17489] ipam.go 569: Auto-assigned 1 out of 1 IPv4s: [10.2.76.16/24] handle="k8s-pod-network.6436b1421eed914751bd6c38c0065eb07397e6a82424bdf6dbcc5a1f99e0e00e" host="suraj-lk-cluster-pool-1-worker-2"
2020-05-05 08:00:16.009 [INFO][17489] ipam_plugin.go 235: Calico CNI IPAM assigned addresses IPv4=[10.2.76.16/24] IPv6=[] ContainerID="6436b1421eed914751bd6c38c0065eb07397e6a82424bdf6dbcc5a1f99e0e00e" HandleID="k8s-pod-network.6436b1421eed914751bd6c38c0065eb07397e6a82424bdf6dbcc5a1f99e0e00e" Workload="suraj--lk--cluster--pool--1--worker--2-k8s-rook--ceph--osd--25--d6f885cd7--kfw86-eth0"
2020-05-05 08:00:16.009 [INFO][17489] ipam_plugin.go 261: IPAM Result ContainerID="6436b1421eed914751bd6c38c0065eb07397e6a82424bdf6dbcc5a1f99e0e00e" HandleID="k8s-pod-network.6436b1421eed914751bd6c38c0065eb07397e6a82424bdf6dbcc5a1f99e0e00e" Workload="suraj--lk--cluster--pool--1--worker--2-k8s-rook--ceph--osd--25--d6f885cd7--kfw86-eth0" result.IPs=[]*current.IPConfig{(*current.IPConfig)(0xc00038a360)}
2020-05-05 08:00:16.011 [INFO][17452] k8s.go 358: Populated endpoint ContainerID="6436b1421eed914751bd6c38c0065eb07397e6a82424bdf6dbcc5a1f99e0e00e" Namespace="rook" Pod="rook-ceph-osd-25-d6f885cd7-kfw86" WorkloadEndpoint="suraj--lk--cluster--pool--1--worker--2-k8s-rook--ceph--osd--25--d6f885cd7--kfw86-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"suraj--lk--cluster--pool--1--worker--2-k8s-rook--ceph--osd--25--d6f885cd7--kfw86-eth0", GenerateName:"rook-ceph-osd-25-d6f885cd7-", Namespace:"rook", SelfLink:"", UID:"5ca66d38-244e-4b92-99a7-df543d3a0cba", ResourceVersion:"13489", Generation:0, CreationTimestamp:v1.Time{Time:time.Time{wall:0x0, ext:63724262414, loc:(*time.Location)(0x299c340)}}, DeletionTimestamp:(*v1.Time)(nil), DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app":"rook-ceph-osd", "ceph-osd-id":"25", "failure-domain":"suraj-lk-cluster-pool-1-worker-2", "pod-template-hash":"d6f885cd7", "portable":"false", "projectcalico.org/namespace":"rook", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"rook-ceph-osd", "rook_cluster":"rook"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ClusterName:"", ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"suraj-lk-cluster-pool-1-worker-2", ContainerID:"", Pod:"rook-ceph-osd-25-d6f885cd7-kfw86", Endpoint:"eth0", IPNetworks:[]string{"10.2.76.16/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.rook", "ksa.rook.rook-ceph-osd"}, InterfaceName:"cali7275bc2105b", MAC:"", Ports:[]v3.EndpointPort(nil)}}
2020-05-05 08:00:16.011 [INFO][17452] k8s.go 359: Calico CNI using IPs: [10.2.76.16/32] ContainerID="6436b1421eed914751bd6c38c0065eb07397e6a82424bdf6dbcc5a1f99e0e00e" Namespace="rook" Pod="rook-ceph-osd-25-d6f885cd7-kfw86" WorkloadEndpoint="suraj--lk--cluster--pool--1--worker--2-k8s-rook--ceph--osd--25--d6f885cd7--kfw86-eth0"
2020-05-05 08:00:16.011 [INFO][17452] network_linux.go 76: Setting the host side veth name to cali7275bc2105b ContainerID="6436b1421eed914751bd6c38c0065eb07397e6a82424bdf6dbcc5a1f99e0e00e" Namespace="rook" Pod="rook-ceph-osd-25-d6f885cd7-kfw86" WorkloadEndpoint="suraj--lk--cluster--pool--1--worker--2-k8s-rook--ceph--osd--25--d6f885cd7--kfw86-eth0"
2020-05-05 08:00:16.012 [INFO][17452] network_linux.go 396: Disabling IPv4 forwarding ContainerID="6436b1421eed914751bd6c38c0065eb07397e6a82424bdf6dbcc5a1f99e0e00e" Namespace="rook" Pod="rook-ceph-osd-25-d6f885cd7-kfw86" WorkloadEndpoint="suraj--lk--cluster--pool--1--worker--2-k8s-rook--ceph--osd--25--d6f885cd7--kfw86-eth0"
2020-05-05 08:00:16.024 [INFO][17452] k8s.go 385: Added Mac, interface name, and active container ID to endpoint ContainerID="6436b1421eed914751bd6c38c0065eb07397e6a82424bdf6dbcc5a1f99e0e00e" Namespace="rook" Pod="rook-ceph-osd-25-d6f885cd7-kfw86" WorkloadEndpoint="suraj--lk--cluster--pool--1--worker--2-k8s-rook--ceph--osd--25--d6f885cd7--kfw86-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"suraj--lk--cluster--pool--1--worker--2-k8s-rook--ceph--osd--25--d6f885cd7--kfw86-eth0", GenerateName:"rook-ceph-osd-25-d6f885cd7-", Namespace:"rook", SelfLink:"", UID:"5ca66d38-244e-4b92-99a7-df543d3a0cba", ResourceVersion:"13489", Generation:0, CreationTimestamp:v1.Time{Time:time.Time{wall:0x0, ext:63724262414, loc:(*time.Location)(0x299c340)}}, DeletionTimestamp:(*v1.Time)(nil), DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app":"rook-ceph-osd", "ceph-osd-id":"25", "failure-domain":"suraj-lk-cluster-pool-1-worker-2", "pod-template-hash":"d6f885cd7", "portable":"false", "projectcalico.org/namespace":"rook", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"rook-ceph-osd", "rook_cluster":"rook"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ClusterName:"", ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"suraj-lk-cluster-pool-1-worker-2", ContainerID:"6436b1421eed914751bd6c38c0065eb07397e6a82424bdf6dbcc5a1f99e0e00e", Pod:"rook-ceph-osd-25-d6f885cd7-kfw86", Endpoint:"eth0", IPNetworks:[]string{"10.2.76.16/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.rook", "ksa.rook.rook-ceph-osd"}, InterfaceName:"cali7275bc2105b", MAC:"92:2b:47:c7:e0:5d", Ports:[]v3.EndpointPort(nil)}}
2020-05-05 08:00:16.040 [INFO][17452] k8s.go 417: Wrote updated endpoint to datastore ContainerID="6436b1421eed914751bd6c38c0065eb07397e6a82424bdf6dbcc5a1f99e0e00e" Namespace="rook" Pod="rook-ceph-osd-25-d6f885cd7-kfw86" WorkloadEndpoint="suraj--lk--cluster--pool--1--worker--2-k8s-rook--ceph--osd--25--d6f885cd7--kfw86-eth0"
I0505 08:00:16.114248 2827 topology_manager.go:233] [topologymanager] Topology Admit Handler
I0505 08:00:16.192705 2827 reconciler.go:224] operationExecutor.VerifyControllerAttachedVolume started for volume "rook-ceph-crash" (UniqueName: "kubernetes.io/host-path/4a7ffc5a-f935-4dfd-afb2-ace32a14cf7a-rook-ceph-crash") pod "rook-ceph-osd-10-559c967dc5-fmw5g" (UID: "4a7ffc5a-f935-4dfd-afb2-ace32a14cf7a")
I0505 08:00:16.192769 2827 reconciler.go:224] operationExecutor.VerifyControllerAttachedVolume started for volume "rook-config-override" (UniqueName: "kubernetes.io/configmap/4a7ffc5a-f935-4dfd-afb2-ace32a14cf7a-rook-config-override") pod "rook-ceph-osd-10-559c967dc5-fmw5g" (UID: "4a7ffc5a-f935-4dfd-afb2-ace32a14cf7a")
I0505 08:00:16.192823 2827 reconciler.go:224] operationExecutor.VerifyControllerAttachedVolume started for volume "rook-data" (UniqueName: "kubernetes.io/host-path/4a7ffc5a-f935-4dfd-afb2-ace32a14cf7a-rook-data") pod "rook-ceph-osd-10-559c967dc5-fmw5g" (UID: "4a7ffc5a-f935-4dfd-afb2-ace32a14cf7a")
I0505 08:00:16.192853 2827 reconciler.go:224] operationExecutor.VerifyControllerAttachedVolume started for volume "activate-osd" (UniqueName: "kubernetes.io/empty-dir/4a7ffc5a-f935-4dfd-afb2-ace32a14cf7a-activate-osd") pod "rook-ceph-osd-10-559c967dc5-fmw5g" (UID: "4a7ffc5a-f935-4dfd-afb2-ace32a14cf7a")
I0505 08:00:16.192891 2827 reconciler.go:224] operationExecutor.VerifyControllerAttachedVolume started for volume "devices" (UniqueName: "kubernetes.io/host-path/4a7ffc5a-f935-4dfd-afb2-ace32a14cf7a-devices") pod "rook-ceph-osd-10-559c967dc5-fmw5g" (UID: "4a7ffc5a-f935-4dfd-afb2-ace32a14cf7a")
I0505 08:00:16.192986 2827 reconciler.go:224] operationExecutor.VerifyControllerAttachedVolume started for volume "run-udev" (UniqueName: "kubernetes.io/host-path/4a7ffc5a-f935-4dfd-afb2-ace32a14cf7a-run-udev") pod "rook-ceph-osd-10-559c967dc5-fmw5g" (UID: "4a7ffc5a-f935-4dfd-afb2-ace32a14cf7a")
I0505 08:00:16.193101 2827 reconciler.go:224] operationExecutor.VerifyControllerAttachedVolume started for volume "rook-ceph-log" (UniqueName: "kubernetes.io/host-path/4a7ffc5a-f935-4dfd-afb2-ace32a14cf7a-rook-ceph-log") pod "rook-ceph-osd-10-559c967dc5-fmw5g" (UID: "4a7ffc5a-f935-4dfd-afb2-ace32a14cf7a")
I0505 08:00:16.193159 2827 reconciler.go:224] operationExecutor.VerifyControllerAttachedVolume started for volume "rook-ceph-osd-token-4dtzj" (UniqueName: "kubernetes.io/secret/4a7ffc5a-f935-4dfd-afb2-ace32a14cf7a-rook-ceph-osd-token-4dtzj") pod "rook-ceph-osd-10-559c967dc5-fmw5g" (UID: "4a7ffc5a-f935-4dfd-afb2-ace32a14cf7a")
W0505 08:00:17.031440 2827 pod_container_deletor.go:77] Container "8485bb790b571db55031716f9febb549a7901a5ac2abbad2cf85bf701aeb1651" not found in pod's containers
2020-05-05 08:00:17.052 [INFO][17893] plugin.go 129: Extracted identifiers EndpointIDs=&utils.WEPIdentifiers{Namespace:"rook", WEPName:"", WorkloadEndpointIdentifiers:names.WorkloadEndpointIdentifiers{Node:"suraj-lk-cluster-pool-1-worker-2", Orchestrator:"k8s", Endpoint:"eth0", Workload:"", Pod:"rook-ceph-osd-10-559c967dc5-fmw5g", ContainerID:"8485bb790b571db55031716f9febb549a7901a5ac2abbad2cf85bf701aeb1651"}}
I0505 08:00:17.060097 2827 kubelet_resources.go:45] allocatable: map[cpu:{{32 0} {<nil>} 32 DecimalSI} ephemeral-storage:{{108249701812 0} {<nil>} 108249701812 DecimalSI} hugepages-1Gi:{{0 0} {<nil>} 0 DecimalSI} hugepages-2Mi:{{0 0} {<nil>} 0 DecimalSI} memory:{{135041626112 0} {<nil>} 131876588Ki BinarySI} pods:{{110 0} {<nil>} 110 DecimalSI}]
I0505 08:00:17.060216 2827 kubelet_resources.go:45] allocatable: map[cpu:{{32 0} {<nil>} 32 DecimalSI} ephemeral-storage:{{108249701812 0} {<nil>} 108249701812 DecimalSI} hugepages-1Gi:{{0 0} {<nil>} 0 DecimalSI} hugepages-2Mi:{{0 0} {<nil>} 0 DecimalSI} memory:{{135041626112 0} {<nil>} 131876588Ki BinarySI} pods:{{110 0} {<nil>} 110 DecimalSI}]
I0505 08:00:17.060313 2827 kubelet_resources.go:45] allocatable: map[cpu:{{32 0} {<nil>} 32 DecimalSI} ephemeral-storage:{{108249701812 0} {<nil>} 108249701812 DecimalSI} hugepages-1Gi:{{0 0} {<nil>} 0 DecimalSI} hugepages-2Mi:{{0 0} {<nil>} 0 DecimalSI} memory:{{135041626112 0} {<nil>} 131876588Ki BinarySI} pods:{{110 0} {<nil>} 110 DecimalSI}]
I0505 08:00:17.060360 2827 kubelet_resources.go:45] allocatable: map[cpu:{{32 0} {<nil>} 32 DecimalSI} ephemeral-storage:{{108249701812 0} {<nil>} 108249701812 DecimalSI} hugepages-1Gi:{{0 0} {<nil>} 0 DecimalSI} hugepages-2Mi:{{0 0} {<nil>} 0 DecimalSI} memory:{{135041626112 0} {<nil>} 131876588Ki BinarySI} pods:{{110 0} {<nil>} 110 DecimalSI}]
2020-05-05 08:00:17.084 [INFO][17893] plugin.go 220: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {suraj--lk--cluster--pool--1--worker--2-k8s-rook--ceph--osd--10--559c967dc5--fmw5g-eth0 rook-ceph-osd-10-559c967dc5- rook 4a7ffc5a-f935-4dfd-afb2-ace32a14cf7a 13527 0 2020-05-05 08:00:16 +0000 UTC <nil> <nil> map[app:rook-ceph-osd ceph-osd-id:10 failure-domain:suraj-lk-cluster-pool-1-worker-2 pod-template-hash:559c967dc5 portable:false projectcalico.org/namespace:rook projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:rook-ceph-osd rook_cluster:rook] map[] [] [] []} {k8s suraj-lk-cluster-pool-1-worker-2 rook-ceph-osd-10-559c967dc5-fmw5g eth0 [] [] [kns.rook ksa.rook.rook-ceph-osd] cali4268edd20ff []}} ContainerID="8485bb790b571db55031716f9febb549a7901a5ac2abbad2cf85bf701aeb1651" Namespace="rook" Pod="rook-ceph-osd-10-559c967dc5-fmw5g" WorkloadEndpoint="suraj--lk--cluster--pool--1--worker--2-k8s-rook--ceph--osd--10--559c967dc5--fmw5g-"
2020-05-05 08:00:17.085 [INFO][17893] k8s.go 60: Extracted identifiers for CmdAddK8s ContainerID="8485bb790b571db55031716f9febb549a7901a5ac2abbad2cf85bf701aeb1651" Namespace="rook" Pod="rook-ceph-osd-10-559c967dc5-fmw5g" WorkloadEndpoint="suraj--lk--cluster--pool--1--worker--2-k8s-rook--ceph--osd--10--559c967dc5--fmw5g-eth0"
2020-05-05 08:00:17.091 [INFO][17893] k8s.go 775: namespace info &Namespace{ObjectMeta:{rook /api/v1/namespaces/rook 346082b0-52fc-4520-8796-5afff38e4f6c 7478 0 2020-05-05 07:45:13 +0000 UTC <nil> <nil> map[] map[] [] [] [{lokoctl Update v1 2020-05-05 07:45:13 +0000 UTC FieldsV1 FieldsV1{Raw:*[123 34 102 58 115 116 97 116 117 115 34 58 123 34 102 58 112 104 97 115 101 34 58 123 125 125 125],}}]},Spec:NamespaceSpec{Finalizers:[kubernetes],},Status:NamespaceStatus{Phase:Active,Conditions:[]NamespaceCondition{},},}
2020-05-05 08:00:17.095 [INFO][17893] k8s.go 784: pod info &Pod{ObjectMeta:{rook-ceph-osd-10-559c967dc5-fmw5g rook-ceph-osd-10-559c967dc5- rook /api/v1/namespaces/rook/pods/rook-ceph-osd-10-559c967dc5-fmw5g 4a7ffc5a-f935-4dfd-afb2-ace32a14cf7a 13527 0 2020-05-05 08:00:16 +0000 UTC <nil> <nil> map[app:rook-ceph-osd ceph-osd-id:10 failure-domain:suraj-lk-cluster-pool-1-worker-2 pod-template-hash:559c967dc5 portable:false rook_cluster:rook] map[kubernetes.io/psp:00-rook-ceph-operator seccomp.security.alpha.kubernetes.io/pod:runtime/default] [{apps/v1 ReplicaSet rook-ceph-osd-10-559c967dc5 44c99488-f7d9-4e32-9106-2e480bae0a6b 0xc00043d257 0xc00043d258}] [] []},Spec:PodSpec{Volumes:[]Volume{Volume{Name:rook-data,VolumeSource:VolumeSource{HostPath:&HostPathVolumeSource{Path:/var/lib/rook,Type:*,},EmptyDir:nil,GCEPersistentDisk:nil,AWSElasticBlockStore:nil,GitRepo:nil,Secret:nil,NFS:nil,ISCSI:nil,Glusterfs:nil,PersistentVolumeClaim:nil,RBD:nil,FlexVolume:nil,Cinder:nil,CephFS:nil,Flocker:nil,DownwardAPI:nil,FC:nil,AzureFile:nil,ConfigMap:nil,VsphereVolume:nil,Quobyte:nil,AzureDisk:nil,PhotonPersistentDisk:nil,PortworxVolume:nil,ScaleIO:nil,Projected:nil,StorageOS:nil,CSI:nil,},},Volume{Name:rook-config-override,VolumeSource:VolumeSource{HostPath:nil,EmptyDir:nil,GCEPersistentDisk:nil,AWSElasticBlockStore:nil,GitRepo:nil,Secret:nil,NFS:nil,ISCSI:nil,Glusterfs:nil,PersistentVolumeClaim:nil,RBD:nil,FlexVolume:nil,Cinder:nil,CephFS:nil,Flocker:nil,DownwardAPI:nil,FC:nil,AzureFile:nil,ConfigMap:&ConfigMapVolumeSource{LocalObjectReference:LocalObjectReference{Name:rook-config-override,},Items:[]KeyToPath{KeyToPath{Key:config,Path:ceph.conf,Mode:*292,},},DefaultMode:*420,Optional:nil,},VsphereVolume:nil,Quobyte:nil,AzureDisk:nil,PhotonPersistentDisk:nil,PortworxVolume:nil,ScaleIO:nil,Projected:nil,StorageOS:nil,CSI:nil,},},Volume{Name:rook-ceph-log,VolumeSource:VolumeSource{HostPath:&HostPathVolumeSource{Path:/var/lib/rook/rook/log,Type:*,},EmptyDir:nil,GCEPersistentDisk:nil,AWSE
lasticBlockStore:nil,GitRepo:nil,Secret:nil,NFS:nil,ISCSI:nil,Glusterfs:nil,PersistentVolumeClaim:nil,RBD:nil,FlexVolume:nil,Cinder:nil,CephFS:nil,Flocker:nil,DownwardAPI:nil,FC:nil,AzureFile:nil,ConfigMap:nil,VsphereVolume:nil,Quobyte:nil,AzureDisk:nil,PhotonPersistentDisk:nil,PortworxVolume:nil,ScaleIO:nil,Projected:nil,StorageOS:nil,CSI:nil,},},Volume{Name:rook-ceph-crash,VolumeSource:VolumeSource{HostPath:&HostPathVolumeSource{Path:/var/lib/rook/rook/crash,Type:*,},EmptyDir:nil,GCEPersistentDisk:nil,AWSElasticBlockStore:nil,GitRepo:nil,Secret:nil,NFS:nil,ISCSI:nil,Glusterfs:nil,PersistentVolumeClaim:nil,RBD:nil,FlexVolume:nil,Cinder:nil,CephFS:nil,Flocker:nil,DownwardAPI:nil,FC:nil,AzureFile:nil,ConfigMap:nil,VsphereVolume:nil,Quobyte:nil,AzureDisk:nil,PhotonPersistentDisk:nil,PortworxVolume:nil,ScaleIO:nil,Projected:nil,StorageOS:nil,CSI:nil,},},Volume{Name:devices,VolumeSource:VolumeSource{HostPath:&HostPathVolumeSource{Path:/dev,Type:*,},EmptyDir:nil,GCEPersistentDisk:nil,AWSElasticBlockStore:nil,GitRepo:nil,Secret:nil,NFS:nil,ISCSI:nil,Glusterfs:nil,PersistentVolumeClaim:nil,RBD:nil,FlexVolume:nil,Cinder:nil,CephFS:nil,Flocker:nil,DownwardAPI:nil,FC:nil,AzureFile:nil,ConfigMap:nil,VsphereVolume:nil,Quobyte:nil,AzureDisk:nil,PhotonPersistentDisk:nil,PortworxVolume:nil,ScaleIO:nil,Projected:nil,StorageOS:nil,CSI:nil,},},Volume{Name:run-udev,VolumeSource:VolumeSource{HostPath:&HostPathVolumeSource{Path:/run/udev,Type:*,},EmptyDir:nil,GCEPersistentDisk:nil,AWSElasticBlockStore:nil,GitRepo:nil,Secret:nil,NFS:nil,ISCSI:nil,Glusterfs:nil,PersistentVolumeClaim:nil,RBD:nil,FlexVolume:nil,Cinder:nil,CephFS:nil,Flocker:nil,DownwardAPI:nil,FC:nil,AzureFile:nil,ConfigMap:nil,VsphereVolume:nil,Quobyte:nil,AzureDisk:nil,PhotonPersistentDisk:nil,PortworxVolume:nil,ScaleIO:nil,Projected:nil,StorageOS:nil,CSI:nil,},},Volume{Name:activate-osd,VolumeSource:VolumeSource{HostPath:nil,EmptyDir:&EmptyDirVolumeSource{Medium:,SizeLimit:<nil>,},GCEPersistentDisk:nil,AWSElasticBlockSto
re:nil,GitRepo:nil,Secret:nil,NFS:nil,ISCSI:nil,Glusterfs:nil,PersistentVolumeClaim:nil,RBD:nil,FlexVolume:nil,Cinder:nil,CephFS:nil,Flocker:nil,DownwardAPI:nil,FC:nil,AzureFile:nil,ConfigMap:nil,VsphereVolume:nil,Quobyte:nil,AzureDisk:nil,PhotonPersistentDisk:nil,PortworxVolume:nil,ScaleIO:nil,Projected:nil,StorageOS:nil,CSI:nil,},},Volume{Name:rook-ceph-osd-token-4dtzj,VolumeSource:VolumeSource{HostPath:nil,EmptyDir:nil,GCEPersistentDisk:nil,AWSElasticBlockStore:nil,GitRepo:nil,Secret:&SecretVolumeSource{SecretName:rook-ceph-osd-token-4dtzj,Items:[]KeyToPath{},DefaultMode:*420,Optional:nil,},NFS:nil,ISCSI:nil,Glusterfs:nil,PersistentVolumeClaim:nil,RBD:nil,FlexVolume:nil,Cinder:nil,CephFS:nil,Flocker:nil,DownwardAPI:nil,FC:nil,AzureFile:nil,ConfigMap:nil,VsphereVolume:nil,Quobyte:nil,AzureDisk:nil,PhotonPersistentDisk:nil,PortworxVolume:nil,ScaleIO:nil,Projected:nil,StorageOS:nil,CSI:nil,},},},Containers:[]Container{Container{Name:osd,Image:ceph/ceph:v14.2.8,Command:[ceph-osd],Args:[--foreground --id 10 --fsid 504a40d4-a2d3-4492-a37d-27a58464958f --setuser ceph --setgroup ceph --crush-location=root=default host=suraj-lk-cluster-pool-1-worker-2 --default-log-to-file false 
--ms-learn-addr-from-peer=false],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:ROOK_NODE_NAME,Value:suraj-lk-cluster-pool-1-worker-2,ValueFrom:nil,},EnvVar{Name:ROOK_CLUSTER_ID,Value:77795d4f-fb40-4886-bc74-cb2065f0c61f,ValueFrom:nil,},EnvVar{Name:ROOK_PRIVATE_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},EnvVar{Name:ROOK_PUBLIC_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},EnvVar{Name:ROOK_CLUSTER_NAME,Value:rook,ValueFrom:nil,},EnvVar{Name:ROOK_MON_ENDPOINTS,Value:,ValueFrom:&EnvVarSource{FieldRef:nil,ResourceFieldRef:nil,ConfigMapKeyRef:&ConfigMapKeySelector{LocalObjectReference:LocalObjectReference{Name:rook-ceph-mon-endpoints,},Key:data,Optional:nil,},SecretKeyRef:nil,},},EnvVar{Name:ROOK_MON_SECRET,Value:,ValueFrom:&EnvVarSource{FieldRef:nil,ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:&SecretKeySelector{LocalObjectReference:LocalObjectReference{Name:rook-ceph-mon,},Key:mon-secret,Optional:nil,},},},EnvVar{Name:ROOK_ADMIN_SECRET,Value:,ValueFrom:&EnvVarSource{FieldRef:nil,ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:&SecretKeySelector{LocalObjectReference:LocalObjectReference{Name:rook-ceph-mon,},Key:admin-secret,Optional:nil,},},},EnvVar{Name:ROOK_CONFIG_DIR,Value:/var/lib/rook,ValueFrom:nil,},EnvVar{Name:ROOK_CEPH_CONFIG_OVERRIDE,Value:/etc/rook/config/override.conf,ValueFrom:nil,},EnvVar{Name:ROOK_FSID,Value:,ValueFrom:&EnvVarSource{FieldRef:nil,ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:&SecretKeySelector{LocalObjectReference:LocalObjectReference{Name:rook-ceph-mon,},Key:fsid,Optional:nil,},},},EnvVar{Name:NODE_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:spec.nodeName,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},Env
Var{Name:ROOK_CRUSHMAP_HOSTNAME,Value:suraj-lk-cluster-pool-1-worker-2,ValueFrom:nil,},EnvVar{Name:CEPH_VOLUME_DEBUG,Value:1,ValueFrom:nil,},EnvVar{Name:CEPH_VOLUME_SKIP_RESTORECON,Value:1,ValueFrom:nil,},EnvVar{Name:DM_DISABLE_UDEV,Value:1,ValueFrom:nil,},EnvVar{Name:ROOK_OSDS_PER_DEVICE,Value:1,ValueFrom:nil,},EnvVar{Name:TINI_SUBREAPER,Value:,ValueFrom:nil,},EnvVar{Name:CONTAINER_IMAGE,Value:ceph/ceph:v14.2.8,ValueFrom:nil,},EnvVar{Name:POD_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:metadata.name,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},EnvVar{Name:POD_NAMESPACE,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:metadata.namespace,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},EnvVar{Name:NODE_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:spec.nodeName,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},EnvVar{Name:POD_MEMORY_LIMIT,Value:,ValueFrom:&EnvVarSource{FieldRef:nil,ResourceFieldRef:&ResourceFieldSelector{ContainerName:,Resource:limits.memory,Divisor:{{0 0} {<nil>} 0 DecimalSI},},ConfigMapKeyRef:nil,SecretKeyRef:nil,},},EnvVar{Name:POD_MEMORY_REQUEST,Value:,ValueFrom:&EnvVarSource{FieldRef:nil,ResourceFieldRef:&ResourceFieldSelector{ContainerName:,Resource:requests.memory,Divisor:{{0 0} {<nil>} 0 DecimalSI},},ConfigMapKeyRef:nil,SecretKeyRef:nil,},},EnvVar{Name:POD_CPU_LIMIT,Value:,ValueFrom:&EnvVarSource{FieldRef:nil,ResourceFieldRef:&ResourceFieldSelector{ContainerName:,Resource:limits.cpu,Divisor:{{1 0} {<nil>} 1 DecimalSI},},ConfigMapKeyRef:nil,SecretKeyRef:nil,},},EnvVar{Name:POD_CPU_REQUEST,Value:,ValueFrom:&EnvVarSource{FieldRef:nil,ResourceFieldRef:&ResourceFieldSelector{ContainerName:,Resource:requests.cpu,Divisor:{{0 0} {<nil>} 0 
DecimalSI},},ConfigMapKeyRef:nil,SecretKeyRef:nil,},},EnvVar{Name:ROOK_OSD_UUID,Value:e3e1fdfb-0bcf-4a30-a8c0-3698c9cf51f4,ValueFrom:nil,},EnvVar{Name:ROOK_OSD_ID,Value:10,ValueFrom:nil,},EnvVar{Name:ROOK_OSD_STORE_TYPE,Value:bluestore,ValueFrom:nil,},EnvVar{Name:ROOK_CEPH_MON_HOST,Value:,ValueFrom:&EnvVarSource{FieldRef:nil,ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:&SecretKeySelector{LocalObjectReference:LocalObjectReference{Name:rook-ceph-config,},Key:mon_host,Optional:nil,},},},EnvVar{Name:CEPH_ARGS,Value:-m $(ROOK_CEPH_MON_HOST),ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:rook-data,ReadOnly:false,MountPath:/var/lib/rook,SubPath:,MountPropagation:nil,SubPathExpr:,},VolumeMount{Name:rook-config-override,ReadOnly:true,MountPath:/etc/ceph,SubPath:,MountPropagation:nil,SubPathExpr:,},VolumeMount{Name:rook-ceph-log,ReadOnly:false,MountPath:/var/log/ceph,SubPath:,MountPropagation:nil,SubPathExpr:,},VolumeMount{Name:rook-ceph-crash,ReadOnly:false,MountPath:/var/lib/ceph/crash,SubPath:,MountPropagation:nil,SubPathExpr:,},VolumeMount{Name:devices,ReadOnly:false,MountPath:/dev,SubPath:,MountPropagation:nil,SubPathExpr:,},VolumeMount{Name:run-udev,ReadOnly:false,MountPath:/run/udev,SubPath:,MountPropagation:nil,SubPathExpr:,},VolumeMount{Name:activate-osd,ReadOnly:false,MountPath:/var/lib/ceph/osd/ceph-10,SubPath:,MountPropagation:nil,SubPathExpr:,},VolumeMount{Name:rook-ceph-osd-token-4dtzj,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,},},LivenessProbe:&Probe{Handler:Handler{Exec:&ExecAction{Command:[env -i sh -c ceph --admin-daemon /run/ceph/ceph-osd.10.asok 
status],},HTTPGet:nil,TCPSocket:nil,},InitialDelaySeconds:10,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,},ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:nil,Privileged:*true,SELinuxOptions:nil,RunAsUser:*0,RunAsNonRoot:nil,ReadOnlyRootFilesystem:*false,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,},},RestartPolicy:Always,TerminationGracePeriodSeconds:*30,ActiveDeadlineSeconds:nil,DNSPolicy:ClusterFirst,NodeSelector:map[string]string{kubernetes.io/hostname: suraj-lk-cluster-pool-1-worker-2,},ServiceAccountName:rook-ceph-osd,DeprecatedServiceAccount:rook-ceph-osd,NodeName:suraj-lk-cluster-pool-1-worker-2,HostNetwork:false,HostPID:true,HostIPC:false,SecurityContext:&PodSecurityContext{SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,SupplementalGroups:[],FSGroup:nil,RunAsGroup:nil,Sysctls:[]Sysctl{},WindowsOptions:nil,},ImagePullSecrets:[]LocalObjectReference{},Hostname:,Subdomain:,Affinity:&Affinity{NodeAffinity:&NodeAffinity{RequiredDuringSchedulingIgnoredDuringExecution:&NodeSelector{NodeSelectorTerms:[]NodeSelectorTerm{NodeSelectorTerm{MatchExpressions:[]NodeSelectorRequirement{NodeSelectorRequirement{Key:storage.lokomotive.io,Operator:Exists,Values:[],},},MatchFields:[]NodeSelectorRequirement{},},},},PreferredDuringSchedulingIgnoredDuringExecution:[]PreferredSchedulingTerm{},},PodAffinity:nil,PodAntiAffinity:nil,},SchedulerName:default-scheduler,InitContainers:[]Container{Container{Name:activate,Image:ceph/ceph:v14.2.8,Command:[/bin/bash -c
set -ex
OSD_ID=10
OSD_UUID=e3e1fdfb-0bcf-4a30-a8c0-3698c9cf51f4
OSD_STORE_FLAG="--bluestore"
OSD_DATA_DIR=/var/lib/ceph/osd/ceph-"$OSD_ID"
CV_MODE=lvm
DEVICE=
METADATA_DEVICE="$ROOK_METADATA_DEVICE"
# activate the osd with ceph-volume
if [[ "$CV_MODE" == "lvm" ]]; then
TMP_DIR=$(mktemp -d)
# activate osd
ceph-volume "$CV_MODE" activate --no-systemd "$OSD_STORE_FLAG" "$OSD_ID" "$OSD_UUID"
# copy the tmpfs directory to a temporary directory
# this is needed because when the init container exits, the tmpfs goes away and its content with it
# this will result in the emptydir to be empty when accessed by the main osd container
cp --verbose --no-dereference "$OSD_DATA_DIR"/* "$TMP_DIR"/
# unmount the tmpfs since we don't need it anymore
umount "$OSD_DATA_DIR"
# copy back the content of the tmpfs into the original osd directory
cp --verbose --no-dereference "$TMP_DIR"/* "$OSD_DATA_DIR"
# retain ownership of files to the ceph user/group
chown --verbose --recursive ceph:ceph "$OSD_DATA_DIR"
# remove the temporary directory
rm --recursive --force "$TMP_DIR"
else
ARGS=(--device ${DEVICE} --no-systemd --no-tmpfs)
if [ -n "$METADATA_DEVICE" ]; then
ARGS+=(--block.db ${METADATA_DEVICE})
fi
# ceph-volume raw mode only supports bluestore so we don't need to pass a store flag
ceph-volume "$CV_MODE" activate "${ARGS[@]}"
fi
],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CEPH_VOLUME_DEBUG,Value:1,ValueFrom:nil,},EnvVar{Name:CEPH_VOLUME_SKIP_RESTORECON,Value:1,ValueFrom:nil,},EnvVar{Name:DM_DISABLE_UDEV,Value:1,ValueFrom:nil,},EnvVar{Name:ROOK_CEPH_MON_HOST,Value:,ValueFrom:&EnvVarSource{FieldRef:nil,ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:&SecretKeySelector{LocalObjectReference:LocalObjectReference{Name:rook-ceph-config,},Key:mon_host,Optional:nil,},},},EnvVar{Name:CEPH_ARGS,Value:-m $(ROOK_CEPH_MON_HOST),ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:activate-osd,ReadOnly:false,MountPath:/var/lib/ceph/osd/ceph-10,SubPath:,MountPropagation:nil,SubPathExpr:,},VolumeMount{Name:devices,ReadOnly:false,MountPath:/dev,SubPath:,MountPropagation:nil,SubPathExpr:,},VolumeMount{Name:rook-config-override,ReadOnly:true,MountPath:/etc/ceph,SubPath:,MountPropagation:nil,SubPathExpr:,},VolumeMount{Name:rook-ceph-osd-token-4dtzj,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:nil,Privileged:*true,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,},Container{Name:chown-container-data-dir,Image:ceph/ceph:v14.2.8,Command:[chown],Args:[--verbose --recursive ceph:ceph /var/log/ceph 
/var/lib/ceph/crash],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:rook-data,ReadOnly:false,MountPath:/var/lib/rook,SubPath:,MountPropagation:nil,SubPathExpr:,},VolumeMount{Name:rook-config-override,ReadOnly:true,MountPath:/etc/ceph,SubPath:,MountPropagation:nil,SubPathExpr:,},VolumeMount{Name:rook-ceph-log,ReadOnly:false,MountPath:/var/log/ceph,SubPath:,MountPropagation:nil,SubPathExpr:,},VolumeMount{Name:rook-ceph-crash,ReadOnly:false,MountPath:/var/lib/ceph/crash,SubPath:,MountPropagation:nil,SubPathExpr:,},VolumeMount{Name:devices,ReadOnly:false,MountPath:/dev,SubPath:,MountPropagation:nil,SubPathExpr:,},VolumeMount{Name:run-udev,ReadOnly:false,MountPath:/run/udev,SubPath:,MountPropagation:nil,SubPathExpr:,},VolumeMount{Name:activate-osd,ReadOnly:false,MountPath:/var/lib/ceph/osd/ceph-10,SubPath:,MountPropagation:nil,SubPathExpr:,},VolumeMount{Name:rook-ceph-osd-token-4dtzj,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:nil,Privileged:*true,SELinuxOptions:nil,RunAsUser:*0,RunAsNonRoot:nil,ReadOnlyRootFilesystem:*false,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,},},AutomountServiceAccountToken:nil,Tolerations:[]Toleration{Toleration{Key:storage.lokomotive.io,Operator:Equal,Value:ceph,Effect:NoSchedule,TolerationSeconds:nil,},Toleration{Key:node.kubernetes.io/not-ready,Operator:Exists,Value:,Effect:NoExecute,TolerationSeconds:*300,},Toleration{Key:node.kubernetes.io/unreachable,Operator:Exists,Value:,Effect:NoExecute,TolerationS
econds:*300,},},HostAliases:[]HostAlias{},PriorityClassName:,Priority:*0,DNSConfig:nil,ShareProcessNamespace:nil,ReadinessGates:[]PodReadinessGate{},RuntimeClassName:nil,EnableServiceLinks:*true,PreemptionPolicy:nil,Overhead:ResourceList{},TopologySpreadConstraints:[]TopologySpreadConstraint{},EphemeralContainers:[]EphemeralContainer{},},Status:PodStatus{Phase:Pending,Conditions:[]PodCondition{PodCondition{Type:Initialized,Status:False,LastProbeTime:0001-01-01 00:00:00 +0000 UTC,LastTransitionTime:2020-05-05 08:00:16 +0000 UTC,Reason:ContainersNotInitialized,Message:containers with incomplete status: [activate chown-container-data-dir],},PodCondition{Type:Ready,Status:False,LastProbeTime:0001-01-01 00:00:00 +0000 UTC,LastTransitionTime:2020-05-05 08:00:16 +0000 UTC,Reason:ContainersNotReady,Message:containers with unready status: [osd],},PodCondition{Type:ContainersReady,Status:False,LastProbeTime:0001-01-01 00:00:00 +0000 UTC,LastTransitionTime:2020-05-05 08:00:16 +0000 UTC,Reason:ContainersNotReady,Message:containers with unready status: [osd],},PodCondition{Type:PodScheduled,Status:True,LastProbeTime:0001-01-01 00:00:00 +0000 UTC,LastTransitionTime:2020-05-05 08:00:16 +0000 UTC,Reason:,Message:,},},Message:,Reason:,HostIP:10.88.81.5,PodIP:,StartTime:2020-05-05 08:00:16 +0000 
UTC,ContainerStatuses:[]ContainerStatus{ContainerStatus{Name:osd,State:ContainerState{Waiting:&ContainerStateWaiting{Reason:PodInitializing,Message:,},Running:nil,Terminated:nil,},LastTerminationState:ContainerState{Waiting:nil,Running:nil,Terminated:nil,},Ready:false,RestartCount:0,Image:ceph/ceph:v14.2.8,ImageID:,ContainerID:,Started:*false,},},QOSClass:BestEffort,InitContainerStatuses:[]ContainerStatus{ContainerStatus{Name:activate,State:ContainerState{Waiting:&ContainerStateWaiting{Reason:PodInitializing,Message:,},Running:nil,Terminated:nil,},LastTerminationState:ContainerState{Waiting:nil,Running:nil,Terminated:nil,},Ready:false,RestartCount:0,Image:ceph/ceph:v14.2.8,ImageID:,ContainerID:,Started:nil,},ContainerStatus{Name:chown-container-data-dir,State:ContainerState{Waiting:&ContainerStateWaiting{Reason:PodInitializing,Message:,},Running:nil,Terminated:nil,},LastTerminationState:ContainerState{Waiting:nil,Running:nil,Terminated:nil,},Ready:false,RestartCount:0,Image:ceph/ceph:v14.2.8,ImageID:,ContainerID:,Started:nil,},},NominatedNodeName:,PodIPs:[]PodIP{},EphemeralContainerStatuses:[]ContainerStatus{},},}
2020-05-05 08:00:17.121 [INFO][17931] ipam_plugin.go 211: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="8485bb790b571db55031716f9febb549a7901a5ac2abbad2cf85bf701aeb1651" HandleID="k8s-pod-network.8485bb790b571db55031716f9febb549a7901a5ac2abbad2cf85bf701aeb1651" Workload="suraj--lk--cluster--pool--1--worker--2-k8s-rook--ceph--osd--10--559c967dc5--fmw5g-eth0"
2020-05-05 08:00:17.144 [INFO][17931] ipam_plugin.go 223: Calico CNI IPAM handle=k8s-pod-network.8485bb790b571db55031716f9febb549a7901a5ac2abbad2cf85bf701aeb1651 ContainerID="8485bb790b571db55031716f9febb549a7901a5ac2abbad2cf85bf701aeb1651" HandleID="k8s-pod-network.8485bb790b571db55031716f9febb549a7901a5ac2abbad2cf85bf701aeb1651" Workload="suraj--lk--cluster--pool--1--worker--2-k8s-rook--ceph--osd--10--559c967dc5--fmw5g-eth0"
2020-05-05 08:00:17.144 [INFO][17931] ipam_plugin.go 233: Auto assigning IP ContainerID="8485bb790b571db55031716f9febb549a7901a5ac2abbad2cf85bf701aeb1651" HandleID="k8s-pod-network.8485bb790b571db55031716f9febb549a7901a5ac2abbad2cf85bf701aeb1651" Workload="suraj--lk--cluster--pool--1--worker--2-k8s-rook--ceph--osd--10--559c967dc5--fmw5g-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc000026d40), Attrs:map[string]string{"namespace":"rook", "node":"suraj-lk-cluster-pool-1-worker-2", "pod":"rook-ceph-osd-10-559c967dc5-fmw5g"}, Hostname:"suraj-lk-cluster-pool-1-worker-2", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0}
2020-05-05 08:00:17.145 [INFO][17931] ipam.go 87: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'suraj-lk-cluster-pool-1-worker-2'
2020-05-05 08:00:17.154 [INFO][17931] ipam.go 313: Looking up existing affinities for host handle="k8s-pod-network.8485bb790b571db55031716f9febb549a7901a5ac2abbad2cf85bf701aeb1651" host="suraj-lk-cluster-pool-1-worker-2"
2020-05-05 08:00:17.161 [INFO][17931] ipam.go 381: Trying affinity for 10.2.76.0/24 handle="k8s-pod-network.8485bb790b571db55031716f9febb549a7901a5ac2abbad2cf85bf701aeb1651" host="suraj-lk-cluster-pool-1-worker-2"
2020-05-05 08:00:17.166 [INFO][17931] ipam.go 135: Attempting to load block cidr=10.2.76.0/24 host="suraj-lk-cluster-pool-1-worker-2"
2020-05-05 08:00:17.172 [INFO][17931] ipam.go 212: Affinity is confirmed and block has been loaded cidr=10.2.76.0/24 host="suraj-lk-cluster-pool-1-worker-2"
2020-05-05 08:00:17.172 [INFO][17931] ipam.go 804: Attempting to assign 1 addresses from block block=10.2.76.0/24 handle="k8s-pod-network.8485bb790b571db55031716f9febb549a7901a5ac2abbad2cf85bf701aeb1651" host="suraj-lk-cluster-pool-1-worker-2"
2020-05-05 08:00:17.175 [INFO][17931] ipam.go 1265: Creating new handle: k8s-pod-network.8485bb790b571db55031716f9febb549a7901a5ac2abbad2cf85bf701aeb1651
2020-05-05 08:00:17.188 [INFO][17931] ipam.go 827: Writing block in order to claim IPs block=10.2.76.0/24 handle="k8s-pod-network.8485bb790b571db55031716f9febb549a7901a5ac2abbad2cf85bf701aeb1651" host="suraj-lk-cluster-pool-1-worker-2"
2020-05-05 08:00:17.199 [INFO][17931] ipam.go 840: Successfully claimed IPs: [10.2.76.17/24] block=10.2.76.0/24 handle="k8s-pod-network.8485bb790b571db55031716f9febb549a7901a5ac2abbad2cf85bf701aeb1651" host="suraj-lk-cluster-pool-1-worker-2"
2020-05-05 08:00:17.199 [INFO][17931] ipam.go 413: Block '10.2.76.0/24' provided addresses: [10.2.76.17/24] handle="k8s-pod-network.8485bb790b571db55031716f9febb549a7901a5ac2abbad2cf85bf701aeb1651" host="suraj-lk-cluster-pool-1-worker-2"
2020-05-05 08:00:17.202 [INFO][17931] ipam.go 569: Auto-assigned 1 out of 1 IPv4s: [10.2.76.17/24] handle="k8s-pod-network.8485bb790b571db55031716f9febb549a7901a5ac2abbad2cf85bf701aeb1651" host="suraj-lk-cluster-pool-1-worker-2"
2020-05-05 08:00:17.203 [INFO][17931] ipam_plugin.go 235: Calico CNI IPAM assigned addresses IPv4=[10.2.76.17/24] IPv6=[] ContainerID="8485bb790b571db55031716f9febb549a7901a5ac2abbad2cf85bf701aeb1651" HandleID="k8s-pod-network.8485bb790b571db55031716f9febb549a7901a5ac2abbad2cf85bf701aeb1651" Workload="suraj--lk--cluster--pool--1--worker--2-k8s-rook--ceph--osd--10--559c967dc5--fmw5g-eth0"
2020-05-05 08:00:17.203 [INFO][17931] ipam_plugin.go 261: IPAM Result ContainerID="8485bb790b571db55031716f9febb549a7901a5ac2abbad2cf85bf701aeb1651" HandleID="k8s-pod-network.8485bb790b571db55031716f9febb549a7901a5ac2abbad2cf85bf701aeb1651" Workload="suraj--lk--cluster--pool--1--worker--2-k8s-rook--ceph--osd--10--559c967dc5--fmw5g-eth0" result.IPs=[]*current.IPConfig{(*current.IPConfig)(0xc00042a240)}
2020-05-05 08:00:17.204 [INFO][17893] k8s.go 358: Populated endpoint ContainerID="8485bb790b571db55031716f9febb549a7901a5ac2abbad2cf85bf701aeb1651" Namespace="rook" Pod="rook-ceph-osd-10-559c967dc5-fmw5g" WorkloadEndpoint="suraj--lk--cluster--pool--1--worker--2-k8s-rook--ceph--osd--10--559c967dc5--fmw5g-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"suraj--lk--cluster--pool--1--worker--2-k8s-rook--ceph--osd--10--559c967dc5--fmw5g-eth0", GenerateName:"rook-ceph-osd-10-559c967dc5-", Namespace:"rook", SelfLink:"", UID:"4a7ffc5a-f935-4dfd-afb2-ace32a14cf7a", ResourceVersion:"13527", Generation:0, CreationTimestamp:v1.Time{Time:time.Time{wall:0x0, ext:63724262416, loc:(*time.Location)(0x299c340)}}, DeletionTimestamp:(*v1.Time)(nil), DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app":"rook-ceph-osd", "ceph-osd-id":"10", "failure-domain":"suraj-lk-cluster-pool-1-worker-2", "pod-template-hash":"559c967dc5", "portable":"false", "projectcalico.org/namespace":"rook", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"rook-ceph-osd", "rook_cluster":"rook"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ClusterName:"", ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"suraj-lk-cluster-pool-1-worker-2", ContainerID:"", Pod:"rook-ceph-osd-10-559c967dc5-fmw5g", Endpoint:"eth0", IPNetworks:[]string{"10.2.76.17/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.rook", "ksa.rook.rook-ceph-osd"}, InterfaceName:"cali4268edd20ff", MAC:"", Ports:[]v3.EndpointPort(nil)}}
2020-05-05 08:00:17.205 [INFO][17893] k8s.go 359: Calico CNI using IPs: [10.2.76.17/32] ContainerID="8485bb790b571db55031716f9febb549a7901a5ac2abbad2cf85bf701aeb1651" Namespace="rook" Pod="rook-ceph-osd-10-559c967dc5-fmw5g" WorkloadEndpoint="suraj--lk--cluster--pool--1--worker--2-k8s-rook--ceph--osd--10--559c967dc5--fmw5g-eth0"
2020-05-05 08:00:17.205 [INFO][17893] network_linux.go 76: Setting the host side veth name to cali4268edd20ff ContainerID="8485bb790b571db55031716f9febb549a7901a5ac2abbad2cf85bf701aeb1651" Namespace="rook" Pod="rook-ceph-osd-10-559c967dc5-fmw5g" WorkloadEndpoint="suraj--lk--cluster--pool--1--worker--2-k8s-rook--ceph--osd--10--559c967dc5--fmw5g-eth0"
2020-05-05 08:00:17.206 [INFO][17893] network_linux.go 396: Disabling IPv4 forwarding ContainerID="8485bb790b571db55031716f9febb549a7901a5ac2abbad2cf85bf701aeb1651" Namespace="rook" Pod="rook-ceph-osd-10-559c967dc5-fmw5g" WorkloadEndpoint="suraj--lk--cluster--pool--1--worker--2-k8s-rook--ceph--osd--10--559c967dc5--fmw5g-eth0"
2020-05-05 08:00:17.225 [INFO][17893] k8s.go 385: Added Mac, interface name, and active container ID to endpoint ContainerID="8485bb790b571db55031716f9febb549a7901a5ac2abbad2cf85bf701aeb1651" Namespace="rook" Pod="rook-ceph-osd-10-559c967dc5-fmw5g" WorkloadEndpoint="suraj--lk--cluster--pool--1--worker--2-k8s-rook--ceph--osd--10--559c967dc5--fmw5g-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"suraj--lk--cluster--pool--1--worker--2-k8s-rook--ceph--osd--10--559c967dc5--fmw5g-eth0", GenerateName:"rook-ceph-osd-10-559c967dc5-", Namespace:"rook", SelfLink:"", UID:"4a7ffc5a-f935-4dfd-afb2-ace32a14cf7a", ResourceVersion:"13527", Generation:0, CreationTimestamp:v1.Time{Time:time.Time{wall:0x0, ext:63724262416, loc:(*time.Location)(0x299c340)}}, DeletionTimestamp:(*v1.Time)(nil), DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app":"rook-ceph-osd", "ceph-osd-id":"10", "failure-domain":"suraj-lk-cluster-pool-1-worker-2", "pod-template-hash":"559c967dc5", "portable":"false", "projectcalico.org/namespace":"rook", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"rook-ceph-osd", "rook_cluster":"rook"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ClusterName:"", ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"suraj-lk-cluster-pool-1-worker-2", ContainerID:"8485bb790b571db55031716f9febb549a7901a5ac2abbad2cf85bf701aeb1651", Pod:"rook-ceph-osd-10-559c967dc5-fmw5g", Endpoint:"eth0", IPNetworks:[]string{"10.2.76.17/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.rook", "ksa.rook.rook-ceph-osd"}, InterfaceName:"cali4268edd20ff", MAC:"62:40:86:7c:3e:17", Ports:[]v3.EndpointPort(nil)}}
2020-05-05 08:00:17.236 [INFO][17893] k8s.go 417: Wrote updated endpoint to datastore ContainerID="8485bb790b571db55031716f9febb549a7901a5ac2abbad2cf85bf701aeb1651" Namespace="rook" Pod="rook-ceph-osd-10-559c967dc5-fmw5g" WorkloadEndpoint="suraj--lk--cluster--pool--1--worker--2-k8s-rook--ceph--osd--10--559c967dc5--fmw5g-eth0"
I0505 08:00:17.382214 2827 topology_manager.go:233] [topologymanager] Topology Admit Handler
I0505 08:00:17.500146 2827 reconciler.go:224] operationExecutor.VerifyControllerAttachedVolume started for volume "rook-ceph-crash" (UniqueName: "kubernetes.io/host-path/05f7eb7d-48e2-4d58-9be7-e915c83fdf77-rook-ceph-crash") pod "rook-ceph-osd-13-8656f94c65-cm6tv" (UID: "05f7eb7d-48e2-4d58-9be7-e915c83fdf77")
I0505 08:00:17.500203 2827 reconciler.go:224] operationExecutor.VerifyControllerAttachedVolume started for volume "rook-config-override" (UniqueName: "kubernetes.io/configmap/05f7eb7d-48e2-4d58-9be7-e915c83fdf77-rook-config-override") pod "rook-ceph-osd-13-8656f94c65-cm6tv" (UID: "05f7eb7d-48e2-4d58-9be7-e915c83fdf77")
I0505 08:00:17.500373 2827 reconciler.go:224] operationExecutor.VerifyControllerAttachedVolume started for volume "devices" (UniqueName: "kubernetes.io/host-path/05f7eb7d-48e2-4d58-9be7-e915c83fdf77-devices") pod "rook-ceph-osd-13-8656f94c65-cm6tv" (UID: "05f7eb7d-48e2-4d58-9be7-e915c83fdf77")
I0505 08:00:17.500433 2827 reconciler.go:224] operationExecutor.VerifyControllerAttachedVolume started for volume "rook-ceph-log" (UniqueName: "kubernetes.io/host-path/05f7eb7d-48e2-4d58-9be7-e915c83fdf77-rook-ceph-log") pod "rook-ceph-osd-13-8656f94c65-cm6tv" (UID: "05f7eb7d-48e2-4d58-9be7-e915c83fdf77")
I0505 08:00:17.500466 2827 reconciler.go:224] operationExecutor.VerifyControllerAttachedVolume started for volume "activate-osd" (UniqueName: "kubernetes.io/empty-dir/05f7eb7d-48e2-4d58-9be7-e915c83fdf77-activate-osd") pod "rook-ceph-osd-13-8656f94c65-cm6tv" (UID: "05f7eb7d-48e2-4d58-9be7-e915c83fdf77")
I0505 08:00:17.500514 2827 reconciler.go:224] operationExecutor.VerifyControllerAttachedVolume started for volume "rook-data" (UniqueName: "kubernetes.io/host-path/05f7eb7d-48e2-4d58-9be7-e915c83fdf77-rook-data") pod "rook-ceph-osd-13-8656f94c65-cm6tv" (UID: "05f7eb7d-48e2-4d58-9be7-e915c83fdf77")
I0505 08:00:17.500540 2827 reconciler.go:224] operationExecutor.VerifyControllerAttachedVolume started for volume "rook-ceph-osd-token-4dtzj" (UniqueName: "kubernetes.io/secret/05f7eb7d-48e2-4d58-9be7-e915c83fdf77-rook-ceph-osd-token-4dtzj") pod "rook-ceph-osd-13-8656f94c65-cm6tv" (UID: "05f7eb7d-48e2-4d58-9be7-e915c83fdf77")
I0505 08:00:17.500655 2827 reconciler.go:224] operationExecutor.VerifyControllerAttachedVolume started for volume "run-udev" (UniqueName: "kubernetes.io/host-path/05f7eb7d-48e2-4d58-9be7-e915c83fdf77-run-udev") pod "rook-ceph-osd-13-8656f94c65-cm6tv" (UID: "05f7eb7d-48e2-4d58-9be7-e915c83fdf77")
W0505 08:00:18.299793 2827 pod_container_deletor.go:77] Container "0bfdc3c5fd183397cca01f01e3623a72eee509ead4b47c2f75baf4160ad64863" not found in pod's containers
2020-05-05 08:00:18.319 [INFO][18335] plugin.go 129: Extracted identifiers EndpointIDs=&utils.WEPIdentifiers{Namespace:"rook", WEPName:"", WorkloadEndpointIdentifiers:names.WorkloadEndpointIdentifiers{Node:"suraj-lk-cluster-pool-1-worker-2", Orchestrator:"k8s", Endpoint:"eth0", Workload:"", Pod:"rook-ceph-osd-13-8656f94c65-cm6tv", ContainerID:"0bfdc3c5fd183397cca01f01e3623a72eee509ead4b47c2f75baf4160ad64863"}}
2020-05-05 08:00:18.351 [INFO][18335] plugin.go 220: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {suraj--lk--cluster--pool--1--worker--2-k8s-rook--ceph--osd--13--8656f94c65--cm6tv-eth0 rook-ceph-osd-13-8656f94c65- rook 05f7eb7d-48e2-4d58-9be7-e915c83fdf77 13545 0 2020-05-05 08:00:17 +0000 UTC <nil> <nil> map[app:rook-ceph-osd ceph-osd-id:13 failure-domain:suraj-lk-cluster-pool-1-worker-2 pod-template-hash:8656f94c65 portable:false projectcalico.org/namespace:rook projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:rook-ceph-osd rook_cluster:rook] map[] [] [] []} {k8s suraj-lk-cluster-pool-1-worker-2 rook-ceph-osd-13-8656f94c65-cm6tv eth0 [] [] [kns.rook ksa.rook.rook-ceph-osd] calie7d5bcc577f []}} ContainerID="0bfdc3c5fd183397cca01f01e3623a72eee509ead4b47c2f75baf4160ad64863" Namespace="rook" Pod="rook-ceph-osd-13-8656f94c65-cm6tv" WorkloadEndpoint="suraj--lk--cluster--pool--1--worker--2-k8s-rook--ceph--osd--13--8656f94c65--cm6tv-"
2020-05-05 08:00:18.351 [INFO][18335] k8s.go 60: Extracted identifiers for CmdAddK8s ContainerID="0bfdc3c5fd183397cca01f01e3623a72eee509ead4b47c2f75baf4160ad64863" Namespace="rook" Pod="rook-ceph-osd-13-8656f94c65-cm6tv" WorkloadEndpoint="suraj--lk--cluster--pool--1--worker--2-k8s-rook--ceph--osd--13--8656f94c65--cm6tv-eth0"
2020-05-05 08:00:18.356 [INFO][18335] k8s.go 775: namespace info &Namespace{ObjectMeta:{rook /api/v1/namespaces/rook 346082b0-52fc-4520-8796-5afff38e4f6c 7478 0 2020-05-05 07:45:13 +0000 UTC <nil> <nil> map[] map[] [] [] [{lokoctl Update v1 2020-05-05 07:45:13 +0000 UTC FieldsV1 FieldsV1{Raw:*[123 34 102 58 115 116 97 116 117 115 34 58 123 34 102 58 112 104 97 115 101 34 58 123 125 125 125],}}]},Spec:NamespaceSpec{Finalizers:[kubernetes],},Status:NamespaceStatus{Phase:Active,Conditions:[]NamespaceCondition{},},}
2020-05-05 08:00:18.362 [INFO][18335] k8s.go 784: pod info &Pod{ObjectMeta:{rook-ceph-osd-13-8656f94c65-cm6tv rook-ceph-osd-13-8656f94c65- rook /api/v1/namespaces/rook/pods/rook-ceph-osd-13-8656f94c65-cm6tv 05f7eb7d-48e2-4d58-9be7-e915c83fdf77 13545 0 2020-05-05 08:00:17 +0000 UTC <nil> <nil> map[app:rook-ceph-osd ceph-osd-id:13 failure-domain:suraj-lk-cluster-pool-1-worker-2 pod-template-hash:8656f94c65 portable:false rook_cluster:rook] map[kubernetes.io/psp:00-rook-ceph-operator seccomp.security.alpha.kubernetes.io/pod:runtime/default] [{apps/v1 ReplicaSet rook-ceph-osd-13-8656f94c65 93ed6061-bd1f-4c9f-8510-162d76490ba0 0xc000102567 0xc000102568}] [] []},Spec:PodSpec{Volumes:[]Volume{Volume{Name:rook-data,VolumeSource:VolumeSource{HostPath:&HostPathVolumeSource{Path:/var/lib/rook,Type:*,},EmptyDir:nil,GCEPersistentDisk:nil,AWSElasticBlockStore:nil,GitRepo:nil,Secret:nil,NFS:nil,ISCSI:nil,Glusterfs:nil,PersistentVolumeClaim:nil,RBD:nil,FlexVolume:nil,Cinder:nil,CephFS:nil,Flocker:nil,DownwardAPI:nil,FC:nil,AzureFile:nil,ConfigMap:nil,VsphereVolume:nil,Quobyte:nil,AzureDisk:nil,PhotonPersistentDisk:nil,PortworxVolume:nil,ScaleIO:nil,Projected:nil,StorageOS:nil,CSI:nil,},},Volume{Name:rook-config-override,VolumeSource:VolumeSource{HostPath:nil,EmptyDir:nil,GCEPersistentDisk:nil,AWSElasticBlockStore:nil,GitRepo:nil,Secret:nil,NFS:nil,ISCSI:nil,Glusterfs:nil,PersistentVolumeClaim:nil,RBD:nil,FlexVolume:nil,Cinder:nil,CephFS:nil,Flocker:nil,DownwardAPI:nil,FC:nil,AzureFile:nil,ConfigMap:&ConfigMapVolumeSource{LocalObjectReference:LocalObjectReference{Name:rook-config-override,},Items:[]KeyToPath{KeyToPath{Key:config,Path:ceph.conf,Mode:*292,},},DefaultMode:*420,Optional:nil,},VsphereVolume:nil,Quobyte:nil,AzureDisk:nil,PhotonPersistentDisk:nil,PortworxVolume:nil,ScaleIO:nil,Projected:nil,StorageOS:nil,CSI:nil,},},Volume{Name:rook-ceph-log,VolumeSource:VolumeSource{HostPath:&HostPathVolumeSource{Path:/var/lib/rook/rook/log,Type:*,},EmptyDir:nil,GCEPersistentDisk:nil,AWSE
lasticBlockStore:nil,GitRepo:nil,Secret:nil,NFS:nil,ISCSI:nil,Glusterfs:nil,PersistentVolumeClaim:nil,RBD:nil,FlexVolume:nil,Cinder:nil,CephFS:nil,Flocker:nil,DownwardAPI:nil,FC:nil,AzureFile:nil,ConfigMap:nil,VsphereVolume:nil,Quobyte:nil,AzureDisk:nil,PhotonPersistentDisk:nil,PortworxVolume:nil,ScaleIO:nil,Projected:nil,StorageOS:nil,CSI:nil,},},Volume{Name:rook-ceph-crash,VolumeSource:VolumeSource{HostPath:&HostPathVolumeSource{Path:/var/lib/rook/rook/crash,Type:*,},EmptyDir:nil,GCEPersistentDisk:nil,AWSElasticBlockStore:nil,GitRepo:nil,Secret:nil,NFS:nil,ISCSI:nil,Glusterfs:nil,PersistentVolumeClaim:nil,RBD:nil,FlexVolume:nil,Cinder:nil,CephFS:nil,Flocker:nil,DownwardAPI:nil,FC:nil,AzureFile:nil,ConfigMap:nil,VsphereVolume:nil,Quobyte:nil,AzureDisk:nil,PhotonPersistentDisk:nil,PortworxVolume:nil,ScaleIO:nil,Projected:nil,StorageOS:nil,CSI:nil,},},Volume{Name:devices,VolumeSource:VolumeSource{HostPath:&HostPathVolumeSource{Path:/dev,Type:*,},EmptyDir:nil,GCEPersistentDisk:nil,AWSElasticBlockStore:nil,GitRepo:nil,Secret:nil,NFS:nil,ISCSI:nil,Glusterfs:nil,PersistentVolumeClaim:nil,RBD:nil,FlexVolume:nil,Cinder:nil,CephFS:nil,Flocker:nil,DownwardAPI:nil,FC:nil,AzureFile:nil,ConfigMap:nil,VsphereVolume:nil,Quobyte:nil,AzureDisk:nil,PhotonPersistentDisk:nil,PortworxVolume:nil,ScaleIO:nil,Projected:nil,StorageOS:nil,CSI:nil,},},Volume{Name:run-udev,VolumeSource:VolumeSource{HostPath:&HostPathVolumeSource{Path:/run/udev,Type:*,},EmptyDir:nil,GCEPersistentDisk:nil,AWSElasticBlockStore:nil,GitRepo:nil,Secret:nil,NFS:nil,ISCSI:nil,Glusterfs:nil,PersistentVolumeClaim:nil,RBD:nil,FlexVolume:nil,Cinder:nil,CephFS:nil,Flocker:nil,DownwardAPI:nil,FC:nil,AzureFile:nil,ConfigMap:nil,VsphereVolume:nil,Quobyte:nil,AzureDisk:nil,PhotonPersistentDisk:nil,PortworxVolume:nil,ScaleIO:nil,Projected:nil,StorageOS:nil,CSI:nil,},},Volume{Name:activate-osd,VolumeSource:VolumeSource{HostPath:nil,EmptyDir:&EmptyDirVolumeSource{Medium:,SizeLimit:<nil>,},GCEPersistentDisk:nil,AWSElasticBlockSto
re:nil,GitRepo:nil,Secret:nil,NFS:nil,ISCSI:nil,Glusterfs:nil,PersistentVolumeClaim:nil,RBD:nil,FlexVolume:nil,Cinder:nil,CephFS:nil,Flocker:nil,DownwardAPI:nil,FC:nil,AzureFile:nil,ConfigMap:nil,VsphereVolume:nil,Quobyte:nil,AzureDisk:nil,PhotonPersistentDisk:nil,PortworxVolume:nil,ScaleIO:nil,Projected:nil,StorageOS:nil,CSI:nil,},},Volume{Name:rook-ceph-osd-token-4dtzj,VolumeSource:VolumeSource{HostPath:nil,EmptyDir:nil,GCEPersistentDisk:nil,AWSElasticBlockStore:nil,GitRepo:nil,Secret:&SecretVolumeSource{SecretName:rook-ceph-osd-token-4dtzj,Items:[]KeyToPath{},DefaultMode:*420,Optional:nil,},NFS:nil,ISCSI:nil,Glusterfs:nil,PersistentVolumeClaim:nil,RBD:nil,FlexVolume:nil,Cinder:nil,CephFS:nil,Flocker:nil,DownwardAPI:nil,FC:nil,AzureFile:nil,ConfigMap:nil,VsphereVolume:nil,Quobyte:nil,AzureDisk:nil,PhotonPersistentDisk:nil,PortworxVolume:nil,ScaleIO:nil,Projected:nil,StorageOS:nil,CSI:nil,},},},Containers:[]Container{Container{Name:osd,Image:ceph/ceph:v14.2.8,Command:[ceph-osd],Args:[--foreground --id 13 --fsid 504a40d4-a2d3-4492-a37d-27a58464958f --setuser ceph --setgroup ceph --crush-location=root=default host=suraj-lk-cluster-pool-1-worker-2 --default-log-to-file false 
--ms-learn-addr-from-peer=false],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:ROOK_NODE_NAME,Value:suraj-lk-cluster-pool-1-worker-2,ValueFrom:nil,},EnvVar{Name:ROOK_CLUSTER_ID,Value:77795d4f-fb40-4886-bc74-cb2065f0c61f,ValueFrom:nil,},EnvVar{Name:ROOK_PRIVATE_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},EnvVar{Name:ROOK_PUBLIC_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},EnvVar{Name:ROOK_CLUSTER_NAME,Value:rook,ValueFrom:nil,},EnvVar{Name:ROOK_MON_ENDPOINTS,Value:,ValueFrom:&EnvVarSource{FieldRef:nil,ResourceFieldRef:nil,ConfigMapKeyRef:&ConfigMapKeySelector{LocalObjectReference:LocalObjectReference{Name:rook-ceph-mon-endpoints,},Key:data,Optional:nil,},SecretKeyRef:nil,},},EnvVar{Name:ROOK_MON_SECRET,Value:,ValueFrom:&EnvVarSource{FieldRef:nil,ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:&SecretKeySelector{LocalObjectReference:LocalObjectReference{Name:rook-ceph-mon,},Key:mon-secret,Optional:nil,},},},EnvVar{Name:ROOK_ADMIN_SECRET,Value:,ValueFrom:&EnvVarSource{FieldRef:nil,ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:&SecretKeySelector{LocalObjectReference:LocalObjectReference{Name:rook-ceph-mon,},Key:admin-secret,Optional:nil,},},},EnvVar{Name:ROOK_CONFIG_DIR,Value:/var/lib/rook,ValueFrom:nil,},EnvVar{Name:ROOK_CEPH_CONFIG_OVERRIDE,Value:/etc/rook/config/override.conf,ValueFrom:nil,},EnvVar{Name:ROOK_FSID,Value:,ValueFrom:&EnvVarSource{FieldRef:nil,ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:&SecretKeySelector{LocalObjectReference:LocalObjectReference{Name:rook-ceph-mon,},Key:fsid,Optional:nil,},},},EnvVar{Name:NODE_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:spec.nodeName,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},Env
Var{Name:ROOK_CRUSHMAP_HOSTNAME,Value:suraj-lk-cluster-pool-1-worker-2,ValueFrom:nil,},EnvVar{Name:CEPH_VOLUME_DEBUG,Value:1,ValueFrom:nil,},EnvVar{Name:CEPH_VOLUME_SKIP_RESTORECON,Value:1,ValueFrom:nil,},EnvVar{Name:DM_DISABLE_UDEV,Value:1,ValueFrom:nil,},EnvVar{Name:ROOK_OSDS_PER_DEVICE,Value:1,ValueFrom:nil,},EnvVar{Name:TINI_SUBREAPER,Value:,ValueFrom:nil,},EnvVar{Name:CONTAINER_IMAGE,Value:ceph/ceph:v14.2.8,ValueFrom:nil,},EnvVar{Name:POD_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:metadata.name,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},EnvVar{Name:POD_NAMESPACE,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:metadata.namespace,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},EnvVar{Name:NODE_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:spec.nodeName,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},EnvVar{Name:POD_MEMORY_LIMIT,Value:,ValueFrom:&EnvVarSource{FieldRef:nil,ResourceFieldRef:&ResourceFieldSelector{ContainerName:,Resource:limits.memory,Divisor:{{0 0} {<nil>} 0 DecimalSI},},ConfigMapKeyRef:nil,SecretKeyRef:nil,},},EnvVar{Name:POD_MEMORY_REQUEST,Value:,ValueFrom:&EnvVarSource{FieldRef:nil,ResourceFieldRef:&ResourceFieldSelector{ContainerName:,Resource:requests.memory,Divisor:{{0 0} {<nil>} 0 DecimalSI},},ConfigMapKeyRef:nil,SecretKeyRef:nil,},},EnvVar{Name:POD_CPU_LIMIT,Value:,ValueFrom:&EnvVarSource{FieldRef:nil,ResourceFieldRef:&ResourceFieldSelector{ContainerName:,Resource:limits.cpu,Divisor:{{1 0} {<nil>} 1 DecimalSI},},ConfigMapKeyRef:nil,SecretKeyRef:nil,},},EnvVar{Name:POD_CPU_REQUEST,Value:,ValueFrom:&EnvVarSource{FieldRef:nil,ResourceFieldRef:&ResourceFieldSelector{ContainerName:,Resource:requests.cpu,Divisor:{{0 0} {<nil>} 0 
DecimalSI},},ConfigMapKeyRef:nil,SecretKeyRef:nil,},},EnvVar{Name:ROOK_OSD_UUID,Value:6b631e32-b309-4da6-be0b-d70088c762f1,ValueFrom:nil,},EnvVar{Name:ROOK_OSD_ID,Value:13,ValueFrom:nil,},EnvVar{Name:ROOK_OSD_STORE_TYPE,Value:bluestore,ValueFrom:nil,},EnvVar{Name:ROOK_CEPH_MON_HOST,Value:,ValueFrom:&EnvVarSource{FieldRef:nil,ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:&SecretKeySelector{LocalObjectReference:LocalObjectReference{Name:rook-ceph-config,},Key:mon_host,Optional:nil,},},},EnvVar{Name:CEPH_ARGS,Value:-m $(ROOK_CEPH_MON_HOST),ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:rook-data,ReadOnly:false,MountPath:/var/lib/rook,SubPath:,MountPropagation:nil,SubPathExpr:,},VolumeMount{Name:rook-config-override,ReadOnly:true,MountPath:/etc/ceph,SubPath:,MountPropagation:nil,SubPathExpr:,},VolumeMount{Name:rook-ceph-log,ReadOnly:false,MountPath:/var/log/ceph,SubPath:,MountPropagation:nil,SubPathExpr:,},VolumeMount{Name:rook-ceph-crash,ReadOnly:false,MountPath:/var/lib/ceph/crash,SubPath:,MountPropagation:nil,SubPathExpr:,},VolumeMount{Name:devices,ReadOnly:false,MountPath:/dev,SubPath:,MountPropagation:nil,SubPathExpr:,},VolumeMount{Name:run-udev,ReadOnly:false,MountPath:/run/udev,SubPath:,MountPropagation:nil,SubPathExpr:,},VolumeMount{Name:activate-osd,ReadOnly:false,MountPath:/var/lib/ceph/osd/ceph-13,SubPath:,MountPropagation:nil,SubPathExpr:,},VolumeMount{Name:rook-ceph-osd-token-4dtzj,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,},},LivenessProbe:&Probe{Handler:Handler{Exec:&ExecAction{Command:[env -i sh -c ceph --admin-daemon /run/ceph/ceph-osd.13.asok 
status],},HTTPGet:nil,TCPSocket:nil,},InitialDelaySeconds:10,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,},ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:nil,Privileged:*true,SELinuxOptions:nil,RunAsUser:*0,RunAsNonRoot:nil,ReadOnlyRootFilesystem:*false,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,},},RestartPolicy:Always,TerminationGracePeriodSeconds:*30,ActiveDeadlineSeconds:nil,DNSPolicy:ClusterFirst,NodeSelector:map[string]string{kubernetes.io/hostname: suraj-lk-cluster-pool-1-worker-2,},ServiceAccountName:rook-ceph-osd,DeprecatedServiceAccount:rook-ceph-osd,NodeName:suraj-lk-cluster-pool-1-worker-2,HostNetwork:false,HostPID:true,HostIPC:false,SecurityContext:&PodSecurityContext{SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,SupplementalGroups:[],FSGroup:nil,RunAsGroup:nil,Sysctls:[]Sysctl{},WindowsOptions:nil,},ImagePullSecrets:[]LocalObjectReference{},Hostname:,Subdomain:,Affinity:&Affinity{NodeAffinity:&NodeAffinity{RequiredDuringSchedulingIgnoredDuringExecution:&NodeSelector{NodeSelectorTerms:[]NodeSelectorTerm{NodeSelectorTerm{MatchExpressions:[]NodeSelectorRequirement{NodeSelectorRequirement{Key:storage.lokomotive.io,Operator:Exists,Values:[],},},MatchFields:[]NodeSelectorRequirement{},},},},PreferredDuringSchedulingIgnoredDuringExecution:[]PreferredSchedulingTerm{},},PodAffinity:nil,PodAntiAffinity:nil,},SchedulerName:default-scheduler,InitContainers:[]Container{Container{Name:activate,Image:ceph/ceph:v14.2.8,Command:[/bin/bash -c
set -ex
OSD_ID=13
OSD_UUID=6b631e32-b309-4da6-be0b-d70088c762f1
OSD_STORE_FLAG="--bluestore"
OSD_DATA_DIR=/var/lib/ceph/osd/ceph-"$OSD_ID"
CV_MODE=lvm
DEVICE=
METADATA_DEVICE="$ROOK_METADATA_DEVICE"
# activate the osd with ceph-volume
if [[ "$CV_MODE" == "lvm" ]]; then
TMP_DIR=$(mktemp -d)
# activate osd
ceph-volume "$CV_MODE" activate --no-systemd "$OSD_STORE_FLAG" "$OSD_ID" "$OSD_UUID"
# copy the tmpfs directory to a temporary directory
# this is needed because when the init container exits, the tmpfs goes away and its content with it
# this would result in the emptydir being empty when accessed by the main osd container
cp --verbose --no-dereference "$OSD_DATA_DIR"/* "$TMP_DIR"/
# unmount the tmpfs since we don't need it anymore
umount "$OSD_DATA_DIR"
# copy back the content of the tmpfs into the original osd directory
cp --verbose --no-dereference "$TMP_DIR"/* "$OSD_DATA_DIR"
# retain ownership of files to the ceph user/group
chown --verbose --recursive ceph:ceph "$OSD_DATA_DIR"
# remove the temporary directory
rm --recursive --force "$TMP_DIR"
else
ARGS=(--device ${DEVICE} --no-systemd --no-tmpfs)
if [ -n "$METADATA_DEVICE" ]; then
ARGS+=(--block.db ${METADATA_DEVICE})
fi
# ceph-volume raw mode only supports bluestore so we don't need to pass a store flag
ceph-volume "$CV_MODE" activate "${ARGS[@]}"
fi
],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CEPH_VOLUME_DEBUG,Value:1,ValueFrom:nil,},EnvVar{Name:CEPH_VOLUME_SKIP_RESTORECON,Value:1,ValueFrom:nil,},EnvVar{Name:DM_DISABLE_UDEV,Value:1,ValueFrom:nil,},EnvVar{Name:ROOK_CEPH_MON_HOST,Value:,ValueFrom:&EnvVarSource{FieldRef:nil,ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:&SecretKeySelector{LocalObjectReference:LocalObjectReference{Name:rook-ceph-config,},Key:mon_host,Optional:nil,},},},EnvVar{Name:CEPH_ARGS,Value:-m $(ROOK_CEPH_MON_HOST),ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:activate-osd,ReadOnly:false,MountPath:/var/lib/ceph/osd/ceph-13,SubPath:,MountPropagation:nil,SubPathExpr:,},VolumeMount{Name:devices,ReadOnly:false,MountPath:/dev,SubPath:,MountPropagation:nil,SubPathExpr:,},VolumeMount{Name:rook-config-override,ReadOnly:true,MountPath:/etc/ceph,SubPath:,MountPropagation:nil,SubPathExpr:,},VolumeMount{Name:rook-ceph-osd-token-4dtzj,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:nil,Privileged:*true,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,},Container{Name:chown-container-data-dir,Image:ceph/ceph:v14.2.8,Command:[chown],Args:[--verbose --recursive ceph:ceph /var/log/ceph 
/var/lib/ceph/crash],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:rook-data,ReadOnly:false,MountPath:/var/lib/rook,SubPath:,MountPropagation:nil,SubPathExpr:,},VolumeMount{Name:rook-config-override,ReadOnly:true,MountPath:/etc/ceph,SubPath:,MountPropagation:nil,SubPathExpr:,},VolumeMount{Name:rook-ceph-log,ReadOnly:false,MountPath:/var/log/ceph,SubPath:,MountPropagation:nil,SubPathExpr:,},VolumeMount{Name:rook-ceph-crash,ReadOnly:false,MountPath:/var/lib/ceph/crash,SubPath:,MountPropagation:nil,SubPathExpr:,},VolumeMount{Name:devices,ReadOnly:false,MountPath:/dev,SubPath:,MountPropagation:nil,SubPathExpr:,},VolumeMount{Name:run-udev,ReadOnly:false,MountPath:/run/udev,SubPath:,MountPropagation:nil,SubPathExpr:,},VolumeMount{Name:activate-osd,ReadOnly:false,MountPath:/var/lib/ceph/osd/ceph-13,SubPath:,MountPropagation:nil,SubPathExpr:,},VolumeMount{Name:rook-ceph-osd-token-4dtzj,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:nil,Privileged:*true,SELinuxOptions:nil,RunAsUser:*0,RunAsNonRoot:nil,ReadOnlyRootFilesystem:*false,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,},},AutomountServiceAccountToken:nil,Tolerations:[]Toleration{Toleration{Key:storage.lokomotive.io,Operator:Equal,Value:ceph,Effect:NoSchedule,TolerationSeconds:nil,},Toleration{Key:node.kubernetes.io/not-ready,Operator:Exists,Value:,Effect:NoExecute,TolerationSeconds:*300,},Toleration{Key:node.kubernetes.io/unreachable,Operator:Exists,Value:,Effect:NoExecute,TolerationS
econds:*300,},},HostAliases:[]HostAlias{},PriorityClassName:,Priority:*0,DNSConfig:nil,ShareProcessNamespace:nil,ReadinessGates:[]PodReadinessGate{},RuntimeClassName:nil,EnableServiceLinks:*true,PreemptionPolicy:nil,Overhead:ResourceList{},TopologySpreadConstraints:[]TopologySpreadConstraint{},EphemeralContainers:[]EphemeralContainer{},},Status:PodStatus{Phase:Pending,Conditions:[]PodCondition{PodCondition{Type:PodScheduled,Status:True,LastProbeTime:0001-01-01 00:00:00 +0000 UTC,LastTransitionTime:2020-05-05 08:00:17 +0000 UTC,Reason:,Message:,},},Message:,Reason:,HostIP:,PodIP:,StartTime:<nil>,ContainerStatuses:[]ContainerStatus{},QOSClass:BestEffort,InitContainerStatuses:[]ContainerStatus{},NominatedNodeName:,PodIPs:[]PodIP{},EphemeralContainerStatuses:[]ContainerStatus{},},}
2020-05-05 08:00:18.390 [INFO][18374] ipam_plugin.go 211: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="0bfdc3c5fd183397cca01f01e3623a72eee509ead4b47c2f75baf4160ad64863" HandleID="k8s-pod-network.0bfdc3c5fd183397cca01f01e3623a72eee509ead4b47c2f75baf4160ad64863" Workload="suraj--lk--cluster--pool--1--worker--2-k8s-rook--ceph--osd--13--8656f94c65--cm6tv-eth0"
2020-05-05 08:00:18.413 [INFO][18374] ipam_plugin.go 223: Calico CNI IPAM handle=k8s-pod-network.0bfdc3c5fd183397cca01f01e3623a72eee509ead4b47c2f75baf4160ad64863 ContainerID="0bfdc3c5fd183397cca01f01e3623a72eee509ead4b47c2f75baf4160ad64863" HandleID="k8s-pod-network.0bfdc3c5fd183397cca01f01e3623a72eee509ead4b47c2f75baf4160ad64863" Workload="suraj--lk--cluster--pool--1--worker--2-k8s-rook--ceph--osd--13--8656f94c65--cm6tv-eth0"
2020-05-05 08:00:18.413 [INFO][18374] ipam_plugin.go 233: Auto assigning IP ContainerID="0bfdc3c5fd183397cca01f01e3623a72eee509ead4b47c2f75baf4160ad64863" HandleID="k8s-pod-network.0bfdc3c5fd183397cca01f01e3623a72eee509ead4b47c2f75baf4160ad64863" Workload="suraj--lk--cluster--pool--1--worker--2-k8s-rook--ceph--osd--13--8656f94c65--cm6tv-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc0007a7160), Attrs:map[string]string{"namespace":"rook", "node":"suraj-lk-cluster-pool-1-worker-2", "pod":"rook-ceph-osd-13-8656f94c65-cm6tv"}, Hostname:"suraj-lk-cluster-pool-1-worker-2", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0}
2020-05-05 08:00:18.413 [INFO][18374] ipam.go 87: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'suraj-lk-cluster-pool-1-worker-2'
2020-05-05 08:00:18.421 [INFO][18374] ipam.go 313: Looking up existing affinities for host handle="k8s-pod-network.0bfdc3c5fd183397cca01f01e3623a72eee509ead4b47c2f75baf4160ad64863" host="suraj-lk-cluster-pool-1-worker-2"
2020-05-05 08:00:18.429 [INFO][18374] ipam.go 381: Trying affinity for 10.2.76.0/24 handle="k8s-pod-network.0bfdc3c5fd183397cca01f01e3623a72eee509ead4b47c2f75baf4160ad64863" host="suraj-lk-cluster-pool-1-worker-2"
2020-05-05 08:00:18.436 [INFO][18374] ipam.go 135: Attempting to load block cidr=10.2.76.0/24 host="suraj-lk-cluster-pool-1-worker-2"
2020-05-05 08:00:18.448 [INFO][18374] ipam.go 212: Affinity is confirmed and block has been loaded cidr=10.2.76.0/24 host="suraj-lk-cluster-pool-1-worker-2"
2020-05-05 08:00:18.448 [INFO][18374] ipam.go 804: Attempting to assign 1 addresses from block block=10.2.76.0/24 handle="k8s-pod-network.0bfdc3c5fd183397cca01f01e3623a72eee509ead4b47c2f75baf4160ad64863" host="suraj-lk-cluster-pool-1-worker-2"
2020-05-05 08:00:18.452 [INFO][18374] ipam.go 1265: Creating new handle: k8s-pod-network.0bfdc3c5fd183397cca01f01e3623a72eee509ead4b47c2f75baf4160ad64863
2020-05-05 08:00:18.458 [INFO][18374] ipam.go 827: Writing block in order to claim IPs block=10.2.76.0/24 handle="k8s-pod-network.0bfdc3c5fd183397cca01f01e3623a72eee509ead4b47c2f75baf4160ad64863" host="suraj-lk-cluster-pool-1-worker-2"
2020-05-05 08:00:18.470 [INFO][18374] ipam.go 840: Successfully claimed IPs: [10.2.76.18/24] block=10.2.76.0/24 handle="k8s-pod-network.0bfdc3c5fd183397cca01f01e3623a72eee509ead4b47c2f75baf4160ad64863" host="suraj-lk-cluster-pool-1-worker-2"
2020-05-05 08:00:18.470 [INFO][18374] ipam.go 413: Block '10.2.76.0/24' provided addresses: [10.2.76.18/24] handle="k8s-pod-network.0bfdc3c5fd183397cca01f01e3623a72eee509ead4b47c2f75baf4160ad64863" host="suraj-lk-cluster-pool-1-worker-2"
2020-05-05 08:00:18.473 [INFO][18374] ipam.go 569: Auto-assigned 1 out of 1 IPv4s: [10.2.76.18/24] handle="k8s-pod-network.0bfdc3c5fd183397cca01f01e3623a72eee509ead4b47c2f75baf4160ad64863" host="suraj-lk-cluster-pool-1-worker-2"
2020-05-05 08:00:18.473 [INFO][18374] ipam_plugin.go 235: Calico CNI IPAM assigned addresses IPv4=[10.2.76.18/24] IPv6=[] ContainerID="0bfdc3c5fd183397cca01f01e3623a72eee509ead4b47c2f75baf4160ad64863" HandleID="k8s-pod-network.0bfdc3c5fd183397cca01f01e3623a72eee509ead4b47c2f75baf4160ad64863" Workload="suraj--lk--cluster--pool--1--worker--2-k8s-rook--ceph--osd--13--8656f94c65--cm6tv-eth0"
2020-05-05 08:00:18.473 [INFO][18374] ipam_plugin.go 261: IPAM Result ContainerID="0bfdc3c5fd183397cca01f01e3623a72eee509ead4b47c2f75baf4160ad64863" HandleID="k8s-pod-network.0bfdc3c5fd183397cca01f01e3623a72eee509ead4b47c2f75baf4160ad64863" Workload="suraj--lk--cluster--pool--1--worker--2-k8s-rook--ceph--osd--13--8656f94c65--cm6tv-eth0" result.IPs=[]*current.IPConfig{(*current.IPConfig)(0xc000108480)}
2020-05-05 08:00:18.475 [INFO][18335] k8s.go 358: Populated endpoint ContainerID="0bfdc3c5fd183397cca01f01e3623a72eee509ead4b47c2f75baf4160ad64863" Namespace="rook" Pod="rook-ceph-osd-13-8656f94c65-cm6tv" WorkloadEndpoint="suraj--lk--cluster--pool--1--worker--2-k8s-rook--ceph--osd--13--8656f94c65--cm6tv-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"suraj--lk--cluster--pool--1--worker--2-k8s-rook--ceph--osd--13--8656f94c65--cm6tv-eth0", GenerateName:"rook-ceph-osd-13-8656f94c65-", Namespace:"rook", SelfLink:"", UID:"05f7eb7d-48e2-4d58-9be7-e915c83fdf77", ResourceVersion:"13545", Generation:0, CreationTimestamp:v1.Time{Time:time.Time{wall:0x0, ext:63724262417, loc:(*time.Location)(0x299c340)}}, DeletionTimestamp:(*v1.Time)(nil), DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app":"rook-ceph-osd", "ceph-osd-id":"13", "failure-domain":"suraj-lk-cluster-pool-1-worker-2", "pod-template-hash":"8656f94c65", "portable":"false", "projectcalico.org/namespace":"rook", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"rook-ceph-osd", "rook_cluster":"rook"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ClusterName:"", ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"suraj-lk-cluster-pool-1-worker-2", ContainerID:"", Pod:"rook-ceph-osd-13-8656f94c65-cm6tv", Endpoint:"eth0", IPNetworks:[]string{"10.2.76.18/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.rook", "ksa.rook.rook-ceph-osd"}, InterfaceName:"calie7d5bcc577f", MAC:"", Ports:[]v3.EndpointPort(nil)}}
2020-05-05 08:00:18.475 [INFO][18335] k8s.go 359: Calico CNI using IPs: [10.2.76.18/32] ContainerID="0bfdc3c5fd183397cca01f01e3623a72eee509ead4b47c2f75baf4160ad64863" Namespace="rook" Pod="rook-ceph-osd-13-8656f94c65-cm6tv" WorkloadEndpoint="suraj--lk--cluster--pool--1--worker--2-k8s-rook--ceph--osd--13--8656f94c65--cm6tv-eth0"
2020-05-05 08:00:18.475 [INFO][18335] network_linux.go 76: Setting the host side veth name to calie7d5bcc577f ContainerID="0bfdc3c5fd183397cca01f01e3623a72eee509ead4b47c2f75baf4160ad64863" Namespace="rook" Pod="rook-ceph-osd-13-8656f94c65-cm6tv" WorkloadEndpoint="suraj--lk--cluster--pool--1--worker--2-k8s-rook--ceph--osd--13--8656f94c65--cm6tv-eth0"
2020-05-05 08:00:18.477 [INFO][18335] network_linux.go 396: Disabling IPv4 forwarding ContainerID="0bfdc3c5fd183397cca01f01e3623a72eee509ead4b47c2f75baf4160ad64863" Namespace="rook" Pod="rook-ceph-osd-13-8656f94c65-cm6tv" WorkloadEndpoint="suraj--lk--cluster--pool--1--worker--2-k8s-rook--ceph--osd--13--8656f94c65--cm6tv-eth0"
2020-05-05 08:00:18.492 [INFO][18335] k8s.go 385: Added Mac, interface name, and active container ID to endpoint ContainerID="0bfdc3c5fd183397cca01f01e3623a72eee509ead4b47c2f75baf4160ad64863" Namespace="rook" Pod="rook-ceph-osd-13-8656f94c65-cm6tv" WorkloadEndpoint="suraj--lk--cluster--pool--1--worker--2-k8s-rook--ceph--osd--13--8656f94c65--cm6tv-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"suraj--lk--cluster--pool--1--worker--2-k8s-rook--ceph--osd--13--8656f94c65--cm6tv-eth0", GenerateName:"rook-ceph-osd-13-8656f94c65-", Namespace:"rook", SelfLink:"", UID:"05f7eb7d-48e2-4d58-9be7-e915c83fdf77", ResourceVersion:"13545", Generation:0, CreationTimestamp:v1.Time{Time:time.Time{wall:0x0, ext:63724262417, loc:(*time.Location)(0x299c340)}}, DeletionTimestamp:(*v1.Time)(nil), DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app":"rook-ceph-osd", "ceph-osd-id":"13", "failure-domain":"suraj-lk-cluster-pool-1-worker-2", "pod-template-hash":"8656f94c65", "portable":"false", "projectcalico.org/namespace":"rook", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"rook-ceph-osd", "rook_cluster":"rook"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ClusterName:"", ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"suraj-lk-cluster-pool-1-worker-2", ContainerID:"0bfdc3c5fd183397cca01f01e3623a72eee509ead4b47c2f75baf4160ad64863", Pod:"rook-ceph-osd-13-8656f94c65-cm6tv", Endpoint:"eth0", IPNetworks:[]string{"10.2.76.18/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.rook", "ksa.rook.rook-ceph-osd"}, InterfaceName:"calie7d5bcc577f", MAC:"e6:6a:71:b3:37:24", Ports:[]v3.EndpointPort(nil)}}
2020-05-05 08:00:18.502 [INFO][18335] k8s.go 417: Wrote updated endpoint to datastore ContainerID="0bfdc3c5fd183397cca01f01e3623a72eee509ead4b47c2f75baf4160ad64863" Namespace="rook" Pod="rook-ceph-osd-13-8656f94c65-cm6tv" WorkloadEndpoint="suraj--lk--cluster--pool--1--worker--2-k8s-rook--ceph--osd--13--8656f94c65--cm6tv-eth0"
I0505 08:00:18.608379 2827 topology_manager.go:233] [topologymanager] Topology Admit Handler
I0505 08:00:18.707525 2827 reconciler.go:224] operationExecutor.VerifyControllerAttachedVolume started for volume "rook-config-override" (UniqueName: "kubernetes.io/configmap/c6ed9db1-a2cd-4864-8270-6070bb164ad7-rook-config-override") pod "rook-ceph-osd-22-5f6d6bf7c5-zrlqj" (UID: "c6ed9db1-a2cd-4864-8270-6070bb164ad7")
I0505 08:00:18.707581 2827 reconciler.go:224] operationExecutor.VerifyControllerAttachedVolume started for volume "rook-ceph-log" (UniqueName: "kubernetes.io/host-path/c6ed9db1-a2cd-4864-8270-6070bb164ad7-rook-ceph-log") pod "rook-ceph-osd-22-5f6d6bf7c5-zrlqj" (UID: "c6ed9db1-a2cd-4864-8270-6070bb164ad7")
I0505 08:00:18.707671 2827 reconciler.go:224] operationExecutor.VerifyControllerAttachedVolume started for volume "rook-data" (UniqueName: "kubernetes.io/host-path/c6ed9db1-a2cd-4864-8270-6070bb164ad7-rook-data") pod "rook-ceph-osd-22-5f6d6bf7c5-zrlqj" (UID: "c6ed9db1-a2cd-4864-8270-6070bb164ad7")
I0505 08:00:18.707746 2827 reconciler.go:224] operationExecutor.VerifyControllerAttachedVolume started for volume "devices" (UniqueName: "kubernetes.io/host-path/c6ed9db1-a2cd-4864-8270-6070bb164ad7-devices") pod "rook-ceph-osd-22-5f6d6bf7c5-zrlqj" (UID: "c6ed9db1-a2cd-4864-8270-6070bb164ad7")
I0505 08:00:18.707787 2827 reconciler.go:224] operationExecutor.VerifyControllerAttachedVolume started for volume "run-udev" (UniqueName: "kubernetes.io/host-path/c6ed9db1-a2cd-4864-8270-6070bb164ad7-run-udev") pod "rook-ceph-osd-22-5f6d6bf7c5-zrlqj" (UID: "c6ed9db1-a2cd-4864-8270-6070bb164ad7")
I0505 08:00:18.707819 2827 reconciler.go:224] operationExecutor.VerifyControllerAttachedVolume started for volume "rook-ceph-osd-token-4dtzj" (UniqueName: "kubernetes.io/secret/c6ed9db1-a2cd-4864-8270-6070bb164ad7-rook-ceph-osd-token-4dtzj") pod "rook-ceph-osd-22-5f6d6bf7c5-zrlqj" (UID: "c6ed9db1-a2cd-4864-8270-6070bb164ad7")
I0505 08:00:18.707908 2827 reconciler.go:224] operationExecutor.VerifyControllerAttachedVolume started for volume "rook-ceph-crash" (UniqueName: "kubernetes.io/host-path/c6ed9db1-a2cd-4864-8270-6070bb164ad7-rook-ceph-crash") pod "rook-ceph-osd-22-5f6d6bf7c5-zrlqj" (UID: "c6ed9db1-a2cd-4864-8270-6070bb164ad7")
I0505 08:00:18.707973 2827 reconciler.go:224] operationExecutor.VerifyControllerAttachedVolume started for volume "activate-osd" (UniqueName: "kubernetes.io/empty-dir/c6ed9db1-a2cd-4864-8270-6070bb164ad7-activate-osd") pod "rook-ceph-osd-22-5f6d6bf7c5-zrlqj" (UID: "c6ed9db1-a2cd-4864-8270-6070bb164ad7")
W0505 08:00:19.585370 2827 pod_container_deletor.go:77] Container "05e22a4d7656b255a3aa0ba1b17473e9e9d721ce6481978fd9817323cff38275" not found in pod's containers
2020-05-05 08:00:19.603 [INFO][18999] plugin.go 129: Extracted identifiers EndpointIDs=&utils.WEPIdentifiers{Namespace:"rook", WEPName:"", WorkloadEndpointIdentifiers:names.WorkloadEndpointIdentifiers{Node:"suraj-lk-cluster-pool-1-worker-2", Orchestrator:"k8s", Endpoint:"eth0", Workload:"", Pod:"rook-ceph-osd-22-5f6d6bf7c5-zrlqj", ContainerID:"05e22a4d7656b255a3aa0ba1b17473e9e9d721ce6481978fd9817323cff38275"}}
I0505 08:00:19.613756 2827 kubelet_resources.go:45] allocatable: map[cpu:{{32 0} {<nil>} 32 DecimalSI} ephemeral-storage:{{108249701812 0} {<nil>} 108249701812 DecimalSI} hugepages-1Gi:{{0 0} {<nil>} 0 DecimalSI} hugepages-2Mi:{{0 0} {<nil>} 0 DecimalSI} memory:{{135041626112 0} {<nil>} 131876588Ki BinarySI} pods:{{110 0} {<nil>} 110 DecimalSI}]
I0505 08:00:19.613850 2827 kubelet_resources.go:45] allocatable: map[cpu:{{32 0} {<nil>} 32 DecimalSI} ephemeral-storage:{{108249701812 0} {<nil>} 108249701812 DecimalSI} hugepages-1Gi:{{0 0} {<nil>} 0 DecimalSI} hugepages-2Mi:{{0 0} {<nil>} 0 DecimalSI} memory:{{135041626112 0} {<nil>} 131876588Ki BinarySI} pods:{{110 0} {<nil>} 110 DecimalSI}]
I0505 08:00:19.613901 2827 kubelet_resources.go:45] allocatable: map[cpu:{{32 0} {<nil>} 32 DecimalSI} ephemeral-storage:{{108249701812 0} {<nil>} 108249701812 DecimalSI} hugepages-1Gi:{{0 0} {<nil>} 0 DecimalSI} hugepages-2Mi:{{0 0} {<nil>} 0 DecimalSI} memory:{{135041626112 0} {<nil>} 131876588Ki BinarySI} pods:{{110 0} {<nil>} 110 DecimalSI}]
I0505 08:00:19.613945 2827 kubelet_resources.go:45] allocatable: map[cpu:{{32 0} {<nil>} 32 DecimalSI} ephemeral-storage:{{108249701812 0} {<nil>} 108249701812 DecimalSI} hugepages-1Gi:{{0 0} {<nil>} 0 DecimalSI} hugepages-2Mi:{{0 0} {<nil>} 0 DecimalSI} memory:{{135041626112 0} {<nil>} 131876588Ki BinarySI} pods:{{110 0} {<nil>} 110 DecimalSI}]
I0505 08:00:19.629978 2827 kubelet_resources.go:45] allocatable: map[cpu:{{32 0} {<nil>} 32 DecimalSI} ephemeral-storage:{{108249701812 0} {<nil>} 108249701812 DecimalSI} hugepages-1Gi:{{0 0} {<nil>} 0 DecimalSI} hugepages-2Mi:{{0 0} {<nil>} 0 DecimalSI} memory:{{135041626112 0} {<nil>} 131876588Ki BinarySI} pods:{{110 0} {<nil>} 110 DecimalSI}]
I0505 08:00:19.630254 2827 kubelet_resources.go:45] allocatable: map[cpu:{{32 0} {<nil>} 32 DecimalSI} ephemeral-storage:{{108249701812 0} {<nil>} 108249701812 DecimalSI} hugepages-1Gi:{{0 0} {<nil>} 0 DecimalSI} hugepages-2Mi:{{0 0} {<nil>} 0 DecimalSI} memory:{{135041626112 0} {<nil>} 131876588Ki BinarySI} pods:{{110 0} {<nil>} 110 DecimalSI}]
I0505 08:00:19.630352 2827 kubelet_resources.go:45] allocatable: map[cpu:{{32 0} {<nil>} 32 DecimalSI} ephemeral-storage:{{108249701812 0} {<nil>} 108249701812 DecimalSI} hugepages-1Gi:{{0 0} {<nil>} 0 DecimalSI} hugepages-2Mi:{{0 0} {<nil>} 0 DecimalSI} memory:{{135041626112 0} {<nil>} 131876588Ki BinarySI} pods:{{110 0} {<nil>} 110 DecimalSI}]
I0505 08:00:19.630442 2827 kubelet_resources.go:45] allocatable: map[cpu:{{32 0} {<nil>} 32 DecimalSI} ephemeral-storage:{{108249701812 0} {<nil>} 108249701812 DecimalSI} hugepages-1Gi:{{0 0} {<nil>} 0 DecimalSI} hugepages-2Mi:{{0 0} {<nil>} 0 DecimalSI} memory:{{135041626112 0} {<nil>} 131876588Ki BinarySI} pods:{{110 0} {<nil>} 110 DecimalSI}]
2020-05-05 08:00:19.643 [INFO][18999] plugin.go 220: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {suraj--lk--cluster--pool--1--worker--2-k8s-rook--ceph--osd--22--5f6d6bf7c5--zrlqj-eth0 rook-ceph-osd-22-5f6d6bf7c5- rook c6ed9db1-a2cd-4864-8270-6070bb164ad7 13579 0 2020-05-05 08:00:18 +0000 UTC <nil> <nil> map[app:rook-ceph-osd ceph-osd-id:22 failure-domain:suraj-lk-cluster-pool-1-worker-2 pod-template-hash:5f6d6bf7c5 portable:false projectcalico.org/namespace:rook projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:rook-ceph-osd rook_cluster:rook] map[] [] [] []} {k8s suraj-lk-cluster-pool-1-worker-2 rook-ceph-osd-22-5f6d6bf7c5-zrlqj eth0 [] [] [kns.rook ksa.rook.rook-ceph-osd] cali57c8253867b []}} ContainerID="05e22a4d7656b255a3aa0ba1b17473e9e9d721ce6481978fd9817323cff38275" Namespace="rook" Pod="rook-ceph-osd-22-5f6d6bf7c5-zrlqj" WorkloadEndpoint="suraj--lk--cluster--pool--1--worker--2-k8s-rook--ceph--osd--22--5f6d6bf7c5--zrlqj-"
2020-05-05 08:00:19.644 [INFO][18999] k8s.go 60: Extracted identifiers for CmdAddK8s ContainerID="05e22a4d7656b255a3aa0ba1b17473e9e9d721ce6481978fd9817323cff38275" Namespace="rook" Pod="rook-ceph-osd-22-5f6d6bf7c5-zrlqj" WorkloadEndpoint="suraj--lk--cluster--pool--1--worker--2-k8s-rook--ceph--osd--22--5f6d6bf7c5--zrlqj-eth0"
2020-05-05 08:00:19.651 [INFO][18999] k8s.go 775: namespace info &Namespace{ObjectMeta:{rook /api/v1/namespaces/rook 346082b0-52fc-4520-8796-5afff38e4f6c 7478 0 2020-05-05 07:45:13 +0000 UTC <nil> <nil> map[] map[] [] [] [{lokoctl Update v1 2020-05-05 07:45:13 +0000 UTC FieldsV1 FieldsV1{Raw:*[123 34 102 58 115 116 97 116 117 115 34 58 123 34 102 58 112 104 97 115 101 34 58 123 125 125 125],}}]},Spec:NamespaceSpec{Finalizers:[kubernetes],},Status:NamespaceStatus{Phase:Active,Conditions:[]NamespaceCondition{},},}
2020-05-05 08:00:19.657 [INFO][18999] k8s.go 784: pod info &Pod{ObjectMeta:{rook-ceph-osd-22-5f6d6bf7c5-zrlqj rook-ceph-osd-22-5f6d6bf7c5- rook /api/v1/namespaces/rook/pods/rook-ceph-osd-22-5f6d6bf7c5-zrlqj c6ed9db1-a2cd-4864-8270-6070bb164ad7 13579 0 2020-05-05 08:00:18 +0000 UTC <nil> <nil> map[app:rook-ceph-osd ceph-osd-id:22 failure-domain:suraj-lk-cluster-pool-1-worker-2 pod-template-hash:5f6d6bf7c5 portable:false rook_cluster:rook] map[kubernetes.io/psp:00-rook-ceph-operator seccomp.security.alpha.kubernetes.io/pod:runtime/default] [{apps/v1 ReplicaSet rook-ceph-osd-22-5f6d6bf7c5 fa7b4688-48c3-4b61-b190-cbb314888020 0xc00055c3d7 0xc00055c3d8}] [] []},Spec:PodSpec{Volumes:[]Volume{Volume{Name:rook-data,VolumeSource:VolumeSource{HostPath:&HostPathVolumeSource{Path:/var/lib/rook,Type:*,},EmptyDir:nil,GCEPersistentDisk:nil,AWSElasticBlockStore:nil,GitRepo:nil,Secret:nil,NFS:nil,ISCSI:nil,Glusterfs:nil,PersistentVolumeClaim:nil,RBD:nil,FlexVolume:nil,Cinder:nil,CephFS:nil,Flocker:nil,DownwardAPI:nil,FC:nil,AzureFile:nil,ConfigMap:nil,VsphereVolume:nil,Quobyte:nil,AzureDisk:nil,PhotonPersistentDisk:nil,PortworxVolume:nil,ScaleIO:nil,Projected:nil,StorageOS:nil,CSI:nil,},},Volume{Name:rook-config-override,VolumeSource:VolumeSource{HostPath:nil,EmptyDir:nil,GCEPersistentDisk:nil,AWSElasticBlockStore:nil,GitRepo:nil,Secret:nil,NFS:nil,ISCSI:nil,Glusterfs:nil,PersistentVolumeClaim:nil,RBD:nil,FlexVolume:nil,Cinder:nil,CephFS:nil,Flocker:nil,DownwardAPI:nil,FC:nil,AzureFile:nil,ConfigMap:&ConfigMapVolumeSource{LocalObjectReference:LocalObjectReference{Name:rook-config-override,},Items:[]KeyToPath{KeyToPath{Key:config,Path:ceph.conf,Mode:*292,},},DefaultMode:*420,Optional:nil,},VsphereVolume:nil,Quobyte:nil,AzureDisk:nil,PhotonPersistentDisk:nil,PortworxVolume:nil,ScaleIO:nil,Projected:nil,StorageOS:nil,CSI:nil,},},Volume{Name:rook-ceph-log,VolumeSource:VolumeSource{HostPath:&HostPathVolumeSource{Path:/var/lib/rook/rook/log,Type:*,},EmptyDir:nil,GCEPersistentDisk:nil,AWSE
lasticBlockStore:nil,GitRepo:nil,Secret:nil,NFS:nil,ISCSI:nil,Glusterfs:nil,PersistentVolumeClaim:nil,RBD:nil,FlexVolume:nil,Cinder:nil,CephFS:nil,Flocker:nil,DownwardAPI:nil,FC:nil,AzureFile:nil,ConfigMap:nil,VsphereVolume:nil,Quobyte:nil,AzureDisk:nil,PhotonPersistentDisk:nil,PortworxVolume:nil,ScaleIO:nil,Projected:nil,StorageOS:nil,CSI:nil,},},Volume{Name:rook-ceph-crash,VolumeSource:VolumeSource{HostPath:&HostPathVolumeSource{Path:/var/lib/rook/rook/crash,Type:*,},EmptyDir:nil,GCEPersistentDisk:nil,AWSElasticBlockStore:nil,GitRepo:nil,Secret:nil,NFS:nil,ISCSI:nil,Glusterfs:nil,PersistentVolumeClaim:nil,RBD:nil,FlexVolume:nil,Cinder:nil,CephFS:nil,Flocker:nil,DownwardAPI:nil,FC:nil,AzureFile:nil,ConfigMap:nil,VsphereVolume:nil,Quobyte:nil,AzureDisk:nil,PhotonPersistentDisk:nil,PortworxVolume:nil,ScaleIO:nil,Projected:nil,StorageOS:nil,CSI:nil,},},Volume{Name:devices,VolumeSource:VolumeSource{HostPath:&HostPathVolumeSource{Path:/dev,Type:*,},EmptyDir:nil,GCEPersistentDisk:nil,AWSElasticBlockStore:nil,GitRepo:nil,Secret:nil,NFS:nil,ISCSI:nil,Glusterfs:nil,PersistentVolumeClaim:nil,RBD:nil,FlexVolume:nil,Cinder:nil,CephFS:nil,Flocker:nil,DownwardAPI:nil,FC:nil,AzureFile:nil,ConfigMap:nil,VsphereVolume:nil,Quobyte:nil,AzureDisk:nil,PhotonPersistentDisk:nil,PortworxVolume:nil,ScaleIO:nil,Projected:nil,StorageOS:nil,CSI:nil,},},Volume{Name:run-udev,VolumeSource:VolumeSource{HostPath:&HostPathVolumeSource{Path:/run/udev,Type:*,},EmptyDir:nil,GCEPersistentDisk:nil,AWSElasticBlockStore:nil,GitRepo:nil,Secret:nil,NFS:nil,ISCSI:nil,Glusterfs:nil,PersistentVolumeClaim:nil,RBD:nil,FlexVolume:nil,Cinder:nil,CephFS:nil,Flocker:nil,DownwardAPI:nil,FC:nil,AzureFile:nil,ConfigMap:nil,VsphereVolume:nil,Quobyte:nil,AzureDisk:nil,PhotonPersistentDisk:nil,PortworxVolume:nil,ScaleIO:nil,Projected:nil,StorageOS:nil,CSI:nil,},},Volume{Name:activate-osd,VolumeSource:VolumeSource{HostPath:nil,EmptyDir:&EmptyDirVolumeSource{Medium:,SizeLimit:<nil>,},GCEPersistentDisk:nil,AWSElasticBlockSto
re:nil,GitRepo:nil,Secret:nil,NFS:nil,ISCSI:nil,Glusterfs:nil,PersistentVolumeClaim:nil,RBD:nil,FlexVolume:nil,Cinder:nil,CephFS:nil,Flocker:nil,DownwardAPI:nil,FC:nil,AzureFile:nil,ConfigMap:nil,VsphereVolume:nil,Quobyte:nil,AzureDisk:nil,PhotonPersistentDisk:nil,PortworxVolume:nil,ScaleIO:nil,Projected:nil,StorageOS:nil,CSI:nil,},},Volume{Name:rook-ceph-osd-token-4dtzj,VolumeSource:VolumeSource{HostPath:nil,EmptyDir:nil,GCEPersistentDisk:nil,AWSElasticBlockStore:nil,GitRepo:nil,Secret:&SecretVolumeSource{SecretName:rook-ceph-osd-token-4dtzj,Items:[]KeyToPath{},DefaultMode:*420,Optional:nil,},NFS:nil,ISCSI:nil,Glusterfs:nil,PersistentVolumeClaim:nil,RBD:nil,FlexVolume:nil,Cinder:nil,CephFS:nil,Flocker:nil,DownwardAPI:nil,FC:nil,AzureFile:nil,ConfigMap:nil,VsphereVolume:nil,Quobyte:nil,AzureDisk:nil,PhotonPersistentDisk:nil,PortworxVolume:nil,ScaleIO:nil,Projected:nil,StorageOS:nil,CSI:nil,},},},Containers:[]Container{Container{Name:osd,Image:ceph/ceph:v14.2.8,Command:[ceph-osd],Args:[--foreground --id 22 --fsid 504a40d4-a2d3-4492-a37d-27a58464958f --setuser ceph --setgroup ceph --crush-location=root=default host=suraj-lk-cluster-pool-1-worker-2 --default-log-to-file false 
--ms-learn-addr-from-peer=false],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:ROOK_NODE_NAME,Value:suraj-lk-cluster-pool-1-worker-2,ValueFrom:nil,},EnvVar{Name:ROOK_CLUSTER_ID,Value:77795d4f-fb40-4886-bc74-cb2065f0c61f,ValueFrom:nil,},EnvVar{Name:ROOK_PRIVATE_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},EnvVar{Name:ROOK_PUBLIC_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},EnvVar{Name:ROOK_CLUSTER_NAME,Value:rook,ValueFrom:nil,},EnvVar{Name:ROOK_MON_ENDPOINTS,Value:,ValueFrom:&EnvVarSource{FieldRef:nil,ResourceFieldRef:nil,ConfigMapKeyRef:&ConfigMapKeySelector{LocalObjectReference:LocalObjectReference{Name:rook-ceph-mon-endpoints,},Key:data,Optional:nil,},SecretKeyRef:nil,},},EnvVar{Name:ROOK_MON_SECRET,Value:,ValueFrom:&EnvVarSource{FieldRef:nil,ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:&SecretKeySelector{LocalObjectReference:LocalObjectReference{Name:rook-ceph-mon,},Key:mon-secret,Optional:nil,},},},EnvVar{Name:ROOK_ADMIN_SECRET,Value:,ValueFrom:&EnvVarSource{FieldRef:nil,ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:&SecretKeySelector{LocalObjectReference:LocalObjectReference{Name:rook-ceph-mon,},Key:admin-secret,Optional:nil,},},},EnvVar{Name:ROOK_CONFIG_DIR,Value:/var/lib/rook,ValueFrom:nil,},EnvVar{Name:ROOK_CEPH_CONFIG_OVERRIDE,Value:/etc/rook/config/override.conf,ValueFrom:nil,},EnvVar{Name:ROOK_FSID,Value:,ValueFrom:&EnvVarSource{FieldRef:nil,ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:&SecretKeySelector{LocalObjectReference:LocalObjectReference{Name:rook-ceph-mon,},Key:fsid,Optional:nil,},},},EnvVar{Name:NODE_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:spec.nodeName,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},Env
Var{Name:ROOK_CRUSHMAP_HOSTNAME,Value:suraj-lk-cluster-pool-1-worker-2,ValueFrom:nil,},EnvVar{Name:CEPH_VOLUME_DEBUG,Value:1,ValueFrom:nil,},EnvVar{Name:CEPH_VOLUME_SKIP_RESTORECON,Value:1,ValueFrom:nil,},EnvVar{Name:DM_DISABLE_UDEV,Value:1,ValueFrom:nil,},EnvVar{Name:ROOK_OSDS_PER_DEVICE,Value:1,ValueFrom:nil,},EnvVar{Name:TINI_SUBREAPER,Value:,ValueFrom:nil,},EnvVar{Name:CONTAINER_IMAGE,Value:ceph/ceph:v14.2.8,ValueFrom:nil,},EnvVar{Name:POD_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:metadata.name,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},EnvVar{Name:POD_NAMESPACE,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:metadata.namespace,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},EnvVar{Name:NODE_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:spec.nodeName,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},EnvVar{Name:POD_MEMORY_LIMIT,Value:,ValueFrom:&EnvVarSource{FieldRef:nil,ResourceFieldRef:&ResourceFieldSelector{ContainerName:,Resource:limits.memory,Divisor:{{0 0} {<nil>} 0 DecimalSI},},ConfigMapKeyRef:nil,SecretKeyRef:nil,},},EnvVar{Name:POD_MEMORY_REQUEST,Value:,ValueFrom:&EnvVarSource{FieldRef:nil,ResourceFieldRef:&ResourceFieldSelector{ContainerName:,Resource:requests.memory,Divisor:{{0 0} {<nil>} 0 DecimalSI},},ConfigMapKeyRef:nil,SecretKeyRef:nil,},},EnvVar{Name:POD_CPU_LIMIT,Value:,ValueFrom:&EnvVarSource{FieldRef:nil,ResourceFieldRef:&ResourceFieldSelector{ContainerName:,Resource:limits.cpu,Divisor:{{1 0} {<nil>} 1 DecimalSI},},ConfigMapKeyRef:nil,SecretKeyRef:nil,},},EnvVar{Name:POD_CPU_REQUEST,Value:,ValueFrom:&EnvVarSource{FieldRef:nil,ResourceFieldRef:&ResourceFieldSelector{ContainerName:,Resource:requests.cpu,Divisor:{{0 0} {<nil>} 0 
DecimalSI},},ConfigMapKeyRef:nil,SecretKeyRef:nil,},},EnvVar{Name:ROOK_OSD_UUID,Value:7c362200-1047-4529-8e57-5da81f5be428,ValueFrom:nil,},EnvVar{Name:ROOK_OSD_ID,Value:22,ValueFrom:nil,},EnvVar{Name:ROOK_OSD_STORE_TYPE,Value:bluestore,ValueFrom:nil,},EnvVar{Name:ROOK_CEPH_MON_HOST,Value:,ValueFrom:&EnvVarSource{FieldRef:nil,ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:&SecretKeySelector{LocalObjectReference:LocalObjectReference{Name:rook-ceph-config,},Key:mon_host,Optional:nil,},},},EnvVar{Name:CEPH_ARGS,Value:-m $(ROOK_CEPH_MON_HOST),ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:rook-data,ReadOnly:false,MountPath:/var/lib/rook,SubPath:,MountPropagation:nil,SubPathExpr:,},VolumeMount{Name:rook-config-override,ReadOnly:true,MountPath:/etc/ceph,SubPath:,MountPropagation:nil,SubPathExpr:,},VolumeMount{Name:rook-ceph-log,ReadOnly:false,MountPath:/var/log/ceph,SubPath:,MountPropagation:nil,SubPathExpr:,},VolumeMount{Name:rook-ceph-crash,ReadOnly:false,MountPath:/var/lib/ceph/crash,SubPath:,MountPropagation:nil,SubPathExpr:,},VolumeMount{Name:devices,ReadOnly:false,MountPath:/dev,SubPath:,MountPropagation:nil,SubPathExpr:,},VolumeMount{Name:run-udev,ReadOnly:false,MountPath:/run/udev,SubPath:,MountPropagation:nil,SubPathExpr:,},VolumeMount{Name:activate-osd,ReadOnly:false,MountPath:/var/lib/ceph/osd/ceph-22,SubPath:,MountPropagation:nil,SubPathExpr:,},VolumeMount{Name:rook-ceph-osd-token-4dtzj,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,},},LivenessProbe:&Probe{Handler:Handler{Exec:&ExecAction{Command:[env -i sh -c ceph --admin-daemon /run/ceph/ceph-osd.22.asok 
status],},HTTPGet:nil,TCPSocket:nil,},InitialDelaySeconds:10,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,},ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:nil,Privileged:*true,SELinuxOptions:nil,RunAsUser:*0,RunAsNonRoot:nil,ReadOnlyRootFilesystem:*false,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,},},RestartPolicy:Always,TerminationGracePeriodSeconds:*30,ActiveDeadlineSeconds:nil,DNSPolicy:ClusterFirst,NodeSelector:map[string]string{kubernetes.io/hostname: suraj-lk-cluster-pool-1-worker-2,},ServiceAccountName:rook-ceph-osd,DeprecatedServiceAccount:rook-ceph-osd,NodeName:suraj-lk-cluster-pool-1-worker-2,HostNetwork:false,HostPID:true,HostIPC:false,SecurityContext:&PodSecurityContext{SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,SupplementalGroups:[],FSGroup:nil,RunAsGroup:nil,Sysctls:[]Sysctl{},WindowsOptions:nil,},ImagePullSecrets:[]LocalObjectReference{},Hostname:,Subdomain:,Affinity:&Affinity{NodeAffinity:&NodeAffinity{RequiredDuringSchedulingIgnoredDuringExecution:&NodeSelector{NodeSelectorTerms:[]NodeSelectorTerm{NodeSelectorTerm{MatchExpressions:[]NodeSelectorRequirement{NodeSelectorRequirement{Key:storage.lokomotive.io,Operator:Exists,Values:[],},},MatchFields:[]NodeSelectorRequirement{},},},},PreferredDuringSchedulingIgnoredDuringExecution:[]PreferredSchedulingTerm{},},PodAffinity:nil,PodAntiAffinity:nil,},SchedulerName:default-scheduler,InitContainers:[]Container{Container{Name:activate,Image:ceph/ceph:v14.2.8,Command:[/bin/bash -c
set -ex
OSD_ID=22
OSD_UUID=7c362200-1047-4529-8e57-5da81f5be428
OSD_STORE_FLAG="--bluestore"
OSD_DATA_DIR=/var/lib/ceph/osd/ceph-"$OSD_ID"
CV_MODE=lvm
DEVICE=
METADATA_DEVICE="$ROOK_METADATA_DEVICE"
# activate the osd with ceph-volume
if [[ "$CV_MODE" == "lvm" ]]; then
TMP_DIR=$(mktemp -d)
# activate osd
ceph-volume "$CV_MODE" activate --no-systemd "$OSD_STORE_FLAG" "$OSD_ID" "$OSD_UUID"
# copy the tmpfs directory to a temporary directory
# this is needed because when the init container exits, the tmpfs goes away and its content with it
# this will result in the emptydir to be empty when accessed by the main osd container
cp --verbose --no-dereference "$OSD_DATA_DIR"/* "$TMP_DIR"/
# unmount the tmpfs since we don't need it anymore
umount "$OSD_DATA_DIR"
# copy back the content of the tmpfs into the original osd directory
cp --verbose --no-dereference "$TMP_DIR"/* "$OSD_DATA_DIR"
# retain ownership of files to the ceph user/group
chown --verbose --recursive ceph:ceph "$OSD_DATA_DIR"
# remove the temporary directory
rm --recursive --force "$TMP_DIR"
else
ARGS=(--device ${DEVICE} --no-systemd --no-tmpfs)
if [ -n "$METADATA_DEVICE" ]; then
ARGS+=(--block.db ${METADATA_DEVICE})
fi
# ceph-volume raw mode only supports bluestore so we don't need to pass a store flag
ceph-volume "$CV_MODE" activate "${ARGS[@]}"
fi
],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CEPH_VOLUME_DEBUG,Value:1,ValueFrom:nil,},EnvVar{Name:CEPH_VOLUME_SKIP_RESTORECON,Value:1,ValueFrom:nil,},EnvVar{Name:DM_DISABLE_UDEV,Value:1,ValueFrom:nil,},EnvVar{Name:ROOK_CEPH_MON_HOST,Value:,ValueFrom:&EnvVarSource{FieldRef:nil,ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:&SecretKeySelector{LocalObjectReference:LocalObjectReference{Name:rook-ceph-config,},Key:mon_host,Optional:nil,},},},EnvVar{Name:CEPH_ARGS,Value:-m $(ROOK_CEPH_MON_HOST),ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:activate-osd,ReadOnly:false,MountPath:/var/lib/ceph/osd/ceph-22,SubPath:,MountPropagation:nil,SubPathExpr:,},VolumeMount{Name:devices,ReadOnly:false,MountPath:/dev,SubPath:,MountPropagation:nil,SubPathExpr:,},VolumeMount{Name:rook-config-override,ReadOnly:true,MountPath:/etc/ceph,SubPath:,MountPropagation:nil,SubPathExpr:,},VolumeMount{Name:rook-ceph-osd-token-4dtzj,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:nil,Privileged:*true,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,},Container{Name:chown-container-data-dir,Image:ceph/ceph:v14.2.8,Command:[chown],Args:[--verbose --recursive ceph:ceph /var/log/ceph 
/var/lib/ceph/crash],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:rook-data,ReadOnly:false,MountPath:/var/lib/rook,SubPath:,MountPropagation:nil,SubPathExpr:,},VolumeMount{Name:rook-config-override,ReadOnly:true,MountPath:/etc/ceph,SubPath:,MountPropagation:nil,SubPathExpr:,},VolumeMount{Name:rook-ceph-log,ReadOnly:false,MountPath:/var/log/ceph,SubPath:,MountPropagation:nil,SubPathExpr:,},VolumeMount{Name:rook-ceph-crash,ReadOnly:false,MountPath:/var/lib/ceph/crash,SubPath:,MountPropagation:nil,SubPathExpr:,},VolumeMount{Name:devices,ReadOnly:false,MountPath:/dev,SubPath:,MountPropagation:nil,SubPathExpr:,},VolumeMount{Name:run-udev,ReadOnly:false,MountPath:/run/udev,SubPath:,MountPropagation:nil,SubPathExpr:,},VolumeMount{Name:activate-osd,ReadOnly:false,MountPath:/var/lib/ceph/osd/ceph-22,SubPath:,MountPropagation:nil,SubPathExpr:,},VolumeMount{Name:rook-ceph-osd-token-4dtzj,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:nil,Privileged:*true,SELinuxOptions:nil,RunAsUser:*0,RunAsNonRoot:nil,ReadOnlyRootFilesystem:*false,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,},},AutomountServiceAccountToken:nil,Tolerations:[]Toleration{Toleration{Key:storage.lokomotive.io,Operator:Equal,Value:ceph,Effect:NoSchedule,TolerationSeconds:nil,},Toleration{Key:node.kubernetes.io/not-ready,Operator:Exists,Value:,Effect:NoExecute,TolerationSeconds:*300,},Toleration{Key:node.kubernetes.io/unreachable,Operator:Exists,Value:,Effect:NoExecute,TolerationS
econds:*300,},},HostAliases:[]HostAlias{},PriorityClassName:,Priority:*0,DNSConfig:nil,ShareProcessNamespace:nil,ReadinessGates:[]PodReadinessGate{},RuntimeClassName:nil,EnableServiceLinks:*true,PreemptionPolicy:nil,Overhead:ResourceList{},TopologySpreadConstraints:[]TopologySpreadConstraint{},EphemeralContainers:[]EphemeralContainer{},},Status:PodStatus{Phase:Pending,Conditions:[]PodCondition{PodCondition{Type:PodScheduled,Status:True,LastProbeTime:0001-01-01 00:00:00 +0000 UTC,LastTransitionTime:2020-05-05 08:00:18 +0000 UTC,Reason:,Message:,},},Message:,Reason:,HostIP:,PodIP:,StartTime:<nil>,ContainerStatuses:[]ContainerStatus{},QOSClass:BestEffort,InitContainerStatuses:[]ContainerStatus{},NominatedNodeName:,PodIPs:[]PodIP{},EphemeralContainerStatuses:[]ContainerStatus{},},}
2020-05-05 08:00:19.678 [INFO][19049] ipam_plugin.go 211: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="05e22a4d7656b255a3aa0ba1b17473e9e9d721ce6481978fd9817323cff38275" HandleID="k8s-pod-network.05e22a4d7656b255a3aa0ba1b17473e9e9d721ce6481978fd9817323cff38275" Workload="suraj--lk--cluster--pool--1--worker--2-k8s-rook--ceph--osd--22--5f6d6bf7c5--zrlqj-eth0"
2020-05-05 08:00:19.701 [INFO][19049] ipam_plugin.go 223: Calico CNI IPAM handle=k8s-pod-network.05e22a4d7656b255a3aa0ba1b17473e9e9d721ce6481978fd9817323cff38275 ContainerID="05e22a4d7656b255a3aa0ba1b17473e9e9d721ce6481978fd9817323cff38275" HandleID="k8s-pod-network.05e22a4d7656b255a3aa0ba1b17473e9e9d721ce6481978fd9817323cff38275" Workload="suraj--lk--cluster--pool--1--worker--2-k8s-rook--ceph--osd--22--5f6d6bf7c5--zrlqj-eth0"
2020-05-05 08:00:19.701 [INFO][19049] ipam_plugin.go 233: Auto assigning IP ContainerID="05e22a4d7656b255a3aa0ba1b17473e9e9d721ce6481978fd9817323cff38275" HandleID="k8s-pod-network.05e22a4d7656b255a3aa0ba1b17473e9e9d721ce6481978fd9817323cff38275" Workload="suraj--lk--cluster--pool--1--worker--2-k8s-rook--ceph--osd--22--5f6d6bf7c5--zrlqj-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc00001ee40), Attrs:map[string]string{"namespace":"rook", "node":"suraj-lk-cluster-pool-1-worker-2", "pod":"rook-ceph-osd-22-5f6d6bf7c5-zrlqj"}, Hostname:"suraj-lk-cluster-pool-1-worker-2", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0}
2020-05-05 08:00:19.701 [INFO][19049] ipam.go 87: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'suraj-lk-cluster-pool-1-worker-2'
2020-05-05 08:00:19.710 [INFO][19049] ipam.go 313: Looking up existing affinities for host handle="k8s-pod-network.05e22a4d7656b255a3aa0ba1b17473e9e9d721ce6481978fd9817323cff38275" host="suraj-lk-cluster-pool-1-worker-2"
2020-05-05 08:00:19.718 [INFO][19049] ipam.go 381: Trying affinity for 10.2.76.0/24 handle="k8s-pod-network.05e22a4d7656b255a3aa0ba1b17473e9e9d721ce6481978fd9817323cff38275" host="suraj-lk-cluster-pool-1-worker-2"
2020-05-05 08:00:19.722 [INFO][19049] ipam.go 135: Attempting to load block cidr=10.2.76.0/24 host="suraj-lk-cluster-pool-1-worker-2"
2020-05-05 08:00:19.728 [INFO][19049] ipam.go 212: Affinity is confirmed and block has been loaded cidr=10.2.76.0/24 host="suraj-lk-cluster-pool-1-worker-2"
2020-05-05 08:00:19.728 [INFO][19049] ipam.go 804: Attempting to assign 1 addresses from block block=10.2.76.0/24 handle="k8s-pod-network.05e22a4d7656b255a3aa0ba1b17473e9e9d721ce6481978fd9817323cff38275" host="suraj-lk-cluster-pool-1-worker-2"
2020-05-05 08:00:19.731 [INFO][19049] ipam.go 1265: Creating new handle: k8s-pod-network.05e22a4d7656b255a3aa0ba1b17473e9e9d721ce6481978fd9817323cff38275
2020-05-05 08:00:19.738 [INFO][19049] ipam.go 827: Writing block in order to claim IPs block=10.2.76.0/24 handle="k8s-pod-network.05e22a4d7656b255a3aa0ba1b17473e9e9d721ce6481978fd9817323cff38275" host="suraj-lk-cluster-pool-1-worker-2"
2020-05-05 08:00:19.748 [INFO][19049] ipam.go 840: Successfully claimed IPs: [10.2.76.19/24] block=10.2.76.0/24 handle="k8s-pod-network.05e22a4d7656b255a3aa0ba1b17473e9e9d721ce6481978fd9817323cff38275" host="suraj-lk-cluster-pool-1-worker-2"
2020-05-05 08:00:19.748 [INFO][19049] ipam.go 413: Block '10.2.76.0/24' provided addresses: [10.2.76.19/24] handle="k8s-pod-network.05e22a4d7656b255a3aa0ba1b17473e9e9d721ce6481978fd9817323cff38275" host="suraj-lk-cluster-pool-1-worker-2"
2020-05-05 08:00:19.752 [INFO][19049] ipam.go 569: Auto-assigned 1 out of 1 IPv4s: [10.2.76.19/24] handle="k8s-pod-network.05e22a4d7656b255a3aa0ba1b17473e9e9d721ce6481978fd9817323cff38275" host="suraj-lk-cluster-pool-1-worker-2"
2020-05-05 08:00:19.752 [INFO][19049] ipam_plugin.go 235: Calico CNI IPAM assigned addresses IPv4=[10.2.76.19/24] IPv6=[] ContainerID="05e22a4d7656b255a3aa0ba1b17473e9e9d721ce6481978fd9817323cff38275" HandleID="k8s-pod-network.05e22a4d7656b255a3aa0ba1b17473e9e9d721ce6481978fd9817323cff38275" Workload="suraj--lk--cluster--pool--1--worker--2-k8s-rook--ceph--osd--22--5f6d6bf7c5--zrlqj-eth0"
2020-05-05 08:00:19.752 [INFO][19049] ipam_plugin.go 261: IPAM Result ContainerID="05e22a4d7656b255a3aa0ba1b17473e9e9d721ce6481978fd9817323cff38275" HandleID="k8s-pod-network.05e22a4d7656b255a3aa0ba1b17473e9e9d721ce6481978fd9817323cff38275" Workload="suraj--lk--cluster--pool--1--worker--2-k8s-rook--ceph--osd--22--5f6d6bf7c5--zrlqj-eth0" result.IPs=[]*current.IPConfig{(*current.IPConfig)(0xc0004082a0)}
2020-05-05 08:00:19.753 [INFO][18999] k8s.go 358: Populated endpoint ContainerID="05e22a4d7656b255a3aa0ba1b17473e9e9d721ce6481978fd9817323cff38275" Namespace="rook" Pod="rook-ceph-osd-22-5f6d6bf7c5-zrlqj" WorkloadEndpoint="suraj--lk--cluster--pool--1--worker--2-k8s-rook--ceph--osd--22--5f6d6bf7c5--zrlqj-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"suraj--lk--cluster--pool--1--worker--2-k8s-rook--ceph--osd--22--5f6d6bf7c5--zrlqj-eth0", GenerateName:"rook-ceph-osd-22-5f6d6bf7c5-", Namespace:"rook", SelfLink:"", UID:"c6ed9db1-a2cd-4864-8270-6070bb164ad7", ResourceVersion:"13579", Generation:0, CreationTimestamp:v1.Time{Time:time.Time{wall:0x0, ext:63724262418, loc:(*time.Location)(0x299c340)}}, DeletionTimestamp:(*v1.Time)(nil), DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app":"rook-ceph-osd", "ceph-osd-id":"22", "failure-domain":"suraj-lk-cluster-pool-1-worker-2", "pod-template-hash":"5f6d6bf7c5", "portable":"false", "projectcalico.org/namespace":"rook", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"rook-ceph-osd", "rook_cluster":"rook"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ClusterName:"", ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"suraj-lk-cluster-pool-1-worker-2", ContainerID:"", Pod:"rook-ceph-osd-22-5f6d6bf7c5-zrlqj", Endpoint:"eth0", IPNetworks:[]string{"10.2.76.19/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.rook", "ksa.rook.rook-ceph-osd"}, InterfaceName:"cali57c8253867b", MAC:"", Ports:[]v3.EndpointPort(nil)}}
2020-05-05 08:00:19.753 [INFO][18999] k8s.go 359: Calico CNI using IPs: [10.2.76.19/32] ContainerID="05e22a4d7656b255a3aa0ba1b17473e9e9d721ce6481978fd9817323cff38275" Namespace="rook" Pod="rook-ceph-osd-22-5f6d6bf7c5-zrlqj" WorkloadEndpoint="suraj--lk--cluster--pool--1--worker--2-k8s-rook--ceph--osd--22--5f6d6bf7c5--zrlqj-eth0"
2020-05-05 08:00:19.753 [INFO][18999] network_linux.go 76: Setting the host side veth name to cali57c8253867b ContainerID="05e22a4d7656b255a3aa0ba1b17473e9e9d721ce6481978fd9817323cff38275" Namespace="rook" Pod="rook-ceph-osd-22-5f6d6bf7c5-zrlqj" WorkloadEndpoint="suraj--lk--cluster--pool--1--worker--2-k8s-rook--ceph--osd--22--5f6d6bf7c5--zrlqj-eth0"
2020-05-05 08:00:19.755 [INFO][18999] network_linux.go 396: Disabling IPv4 forwarding ContainerID="05e22a4d7656b255a3aa0ba1b17473e9e9d721ce6481978fd9817323cff38275" Namespace="rook" Pod="rook-ceph-osd-22-5f6d6bf7c5-zrlqj" WorkloadEndpoint="suraj--lk--cluster--pool--1--worker--2-k8s-rook--ceph--osd--22--5f6d6bf7c5--zrlqj-eth0"
2020-05-05 08:00:19.774 [INFO][18999] k8s.go 385: Added Mac, interface name, and active container ID to endpoint ContainerID="05e22a4d7656b255a3aa0ba1b17473e9e9d721ce6481978fd9817323cff38275" Namespace="rook" Pod="rook-ceph-osd-22-5f6d6bf7c5-zrlqj" WorkloadEndpoint="suraj--lk--cluster--pool--1--worker--2-k8s-rook--ceph--osd--22--5f6d6bf7c5--zrlqj-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"suraj--lk--cluster--pool--1--worker--2-k8s-rook--ceph--osd--22--5f6d6bf7c5--zrlqj-eth0", GenerateName:"rook-ceph-osd-22-5f6d6bf7c5-", Namespace:"rook", SelfLink:"", UID:"c6ed9db1-a2cd-4864-8270-6070bb164ad7", ResourceVersion:"13579", Generation:0, CreationTimestamp:v1.Time{Time:time.Time{wall:0x0, ext:63724262418, loc:(*time.Location)(0x299c340)}}, DeletionTimestamp:(*v1.Time)(nil), DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app":"rook-ceph-osd", "ceph-osd-id":"22", "failure-domain":"suraj-lk-cluster-pool-1-worker-2", "pod-template-hash":"5f6d6bf7c5", "portable":"false", "projectcalico.org/namespace":"rook", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"rook-ceph-osd", "rook_cluster":"rook"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ClusterName:"", ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"suraj-lk-cluster-pool-1-worker-2", ContainerID:"05e22a4d7656b255a3aa0ba1b17473e9e9d721ce6481978fd9817323cff38275", Pod:"rook-ceph-osd-22-5f6d6bf7c5-zrlqj", Endpoint:"eth0", IPNetworks:[]string{"10.2.76.19/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.rook", "ksa.rook.rook-ceph-osd"}, InterfaceName:"cali57c8253867b", MAC:"16:0e:f3:92:ed:18", Ports:[]v3.EndpointPort(nil)}}
2020-05-05 08:00:19.783 [INFO][18999] k8s.go 417: Wrote updated endpoint to datastore ContainerID="05e22a4d7656b255a3aa0ba1b17473e9e9d721ce6481978fd9817323cff38275" Namespace="rook" Pod="rook-ceph-osd-22-5f6d6bf7c5-zrlqj" WorkloadEndpoint="suraj--lk--cluster--pool--1--worker--2-k8s-rook--ceph--osd--22--5f6d6bf7c5--zrlqj-eth0"
I0505 08:00:20.122474 2827 topology_manager.go:233] [topologymanager] Topology Admit Handler
I0505 08:00:20.216131 2827 reconciler.go:224] operationExecutor.VerifyControllerAttachedVolume started for volume "rook-ceph-log" (UniqueName: "kubernetes.io/host-path/944f28aa-adf9-4869-a2fb-75edc771bc63-rook-ceph-log") pod "rook-ceph-osd-37-7d4655d857-x6thz" (UID: "944f28aa-adf9-4869-a2fb-75edc771bc63")
I0505 08:00:20.216216 2827 reconciler.go:224] operationExecutor.VerifyControllerAttachedVolume started for volume "rook-ceph-crash" (UniqueName: "kubernetes.io/host-path/944f28aa-adf9-4869-a2fb-75edc771bc63-rook-ceph-crash") pod "rook-ceph-osd-37-7d4655d857-x6thz" (UID: "944f28aa-adf9-4869-a2fb-75edc771bc63")
I0505 08:00:20.216305 2827 reconciler.go:224] operationExecutor.VerifyControllerAttachedVolume started for volume "rook-config-override" (UniqueName: "kubernetes.io/configmap/944f28aa-adf9-4869-a2fb-75edc771bc63-rook-config-override") pod "rook-ceph-osd-37-7d4655d857-x6thz" (UID: "944f28aa-adf9-4869-a2fb-75edc771bc63")
I0505 08:00:20.216355 2827 reconciler.go:224] operationExecutor.VerifyControllerAttachedVolume started for volume "run-udev" (UniqueName: "kubernetes.io/host-path/944f28aa-adf9-4869-a2fb-75edc771bc63-run-udev") pod "rook-ceph-osd-37-7d4655d857-x6thz" (UID: "944f28aa-adf9-4869-a2fb-75edc771bc63")
I0505 08:00:20.216406 2827 reconciler.go:224] operationExecutor.VerifyControllerAttachedVolume started for volume "devices" (UniqueName: "kubernetes.io/host-path/944f28aa-adf9-4869-a2fb-75edc771bc63-devices") pod "rook-ceph-osd-37-7d4655d857-x6thz" (UID: "944f28aa-adf9-4869-a2fb-75edc771bc63")
I0505 08:00:20.216488 2827 reconciler.go:224] operationExecutor.VerifyControllerAttachedVolume started for volume "activate-osd" (UniqueName: "kubernetes.io/empty-dir/944f28aa-adf9-4869-a2fb-75edc771bc63-activate-osd") pod "rook-ceph-osd-37-7d4655d857-x6thz" (UID: "944f28aa-adf9-4869-a2fb-75edc771bc63")
I0505 08:00:20.216519 2827 reconciler.go:224] operationExecutor.VerifyControllerAttachedVolume started for volume "rook-ceph-osd-token-4dtzj" (UniqueName: "kubernetes.io/secret/944f28aa-adf9-4869-a2fb-75edc771bc63-rook-ceph-osd-token-4dtzj") pod "rook-ceph-osd-37-7d4655d857-x6thz" (UID: "944f28aa-adf9-4869-a2fb-75edc771bc63")
I0505 08:00:20.216561 2827 reconciler.go:224] operationExecutor.VerifyControllerAttachedVolume started for volume "rook-data" (UniqueName: "kubernetes.io/host-path/944f28aa-adf9-4869-a2fb-75edc771bc63-rook-data") pod "rook-ceph-osd-37-7d4655d857-x6thz" (UID: "944f28aa-adf9-4869-a2fb-75edc771bc63")
I0505 08:00:20.725648 2827 kubelet_resources.go:45] allocatable: map[cpu:{{32 0} {<nil>} 32 DecimalSI} ephemeral-storage:{{108249701812 0} {<nil>} 108249701812 DecimalSI} hugepages-1Gi:{{0 0} {<nil>} 0 DecimalSI} hugepages-2Mi:{{0 0} {<nil>} 0 DecimalSI} memory:{{135041626112 0} {<nil>} 131876588Ki BinarySI} pods:{{110 0} {<nil>} 110 DecimalSI}]
I0505 08:00:20.725779 2827 kubelet_resources.go:45] allocatable: map[cpu:{{32 0} {<nil>} 32 DecimalSI} ephemeral-storage:{{108249701812 0} {<nil>} 108249701812 DecimalSI} hugepages-1Gi:{{0 0} {<nil>} 0 DecimalSI} hugepages-2Mi:{{0 0} {<nil>} 0 DecimalSI} memory:{{135041626112 0} {<nil>} 131876588Ki BinarySI} pods:{{110 0} {<nil>} 110 DecimalSI}]
I0505 08:00:20.725831 2827 kubelet_resources.go:45] allocatable: map[cpu:{{32 0} {<nil>} 32 DecimalSI} ephemeral-storage:{{108249701812 0} {<nil>} 108249701812 DecimalSI} hugepages-1Gi:{{0 0} {<nil>} 0 DecimalSI} hugepages-2Mi:{{0 0} {<nil>} 0 DecimalSI} memory:{{135041626112 0} {<nil>} 131876588Ki BinarySI} pods:{{110 0} {<nil>} 110 DecimalSI}]
I0505 08:00:20.725879 2827 kubelet_resources.go:45] allocatable: map[cpu:{{32 0} {<nil>} 32 DecimalSI} ephemeral-storage:{{108249701812 0} {<nil>} 108249701812 DecimalSI} hugepages-1Gi:{{0 0} {<nil>} 0 DecimalSI} hugepages-2Mi:{{0 0} {<nil>} 0 DecimalSI} memory:{{135041626112 0} {<nil>} 131876588Ki BinarySI} pods:{{110 0} {<nil>} 110 DecimalSI}]
W0505 08:00:21.007891 2827 pod_container_deletor.go:77] Container "e9755f7121fd15eaa9c577cffe6a5b80826b8e876dd77f2af9151bc3dfcde18f" not found in pod's containers
2020-05-05 08:00:21.031 [INFO][19821] plugin.go 129: Extracted identifiers EndpointIDs=&utils.WEPIdentifiers{Namespace:"rook", WEPName:"", WorkloadEndpointIdentifiers:names.WorkloadEndpointIdentifiers{Node:"suraj-lk-cluster-pool-1-worker-2", Orchestrator:"k8s", Endpoint:"eth0", Workload:"", Pod:"rook-ceph-osd-37-7d4655d857-x6thz", ContainerID:"e9755f7121fd15eaa9c577cffe6a5b80826b8e876dd77f2af9151bc3dfcde18f"}}
2020-05-05 08:00:21.066 [INFO][19821] plugin.go 220: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {suraj--lk--cluster--pool--1--worker--2-k8s-rook--ceph--osd--37--7d4655d857--x6thz-eth0 rook-ceph-osd-37-7d4655d857- rook 944f28aa-adf9-4869-a2fb-75edc771bc63 13617 0 2020-05-05 08:00:20 +0000 UTC <nil> <nil> map[app:rook-ceph-osd ceph-osd-id:37 failure-domain:suraj-lk-cluster-pool-1-worker-2 pod-template-hash:7d4655d857 portable:false projectcalico.org/namespace:rook projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:rook-ceph-osd rook_cluster:rook] map[] [] [] []} {k8s suraj-lk-cluster-pool-1-worker-2 rook-ceph-osd-37-7d4655d857-x6thz eth0 [] [] [kns.rook ksa.rook.rook-ceph-osd] cali9e6f92ef9e6 []}} ContainerID="e9755f7121fd15eaa9c577cffe6a5b80826b8e876dd77f2af9151bc3dfcde18f" Namespace="rook" Pod="rook-ceph-osd-37-7d4655d857-x6thz" WorkloadEndpoint="suraj--lk--cluster--pool--1--worker--2-k8s-rook--ceph--osd--37--7d4655d857--x6thz-"
2020-05-05 08:00:21.066 [INFO][19821] k8s.go 60: Extracted identifiers for CmdAddK8s ContainerID="e9755f7121fd15eaa9c577cffe6a5b80826b8e876dd77f2af9151bc3dfcde18f" Namespace="rook" Pod="rook-ceph-osd-37-7d4655d857-x6thz" WorkloadEndpoint="suraj--lk--cluster--pool--1--worker--2-k8s-rook--ceph--osd--37--7d4655d857--x6thz-eth0"
2020-05-05 08:00:21.071 [INFO][19821] k8s.go 775: namespace info &Namespace{ObjectMeta:{rook /api/v1/namespaces/rook 346082b0-52fc-4520-8796-5afff38e4f6c 7478 0 2020-05-05 07:45:13 +0000 UTC <nil> <nil> map[] map[] [] [] [{lokoctl Update v1 2020-05-05 07:45:13 +0000 UTC FieldsV1 FieldsV1{Raw:*[123 34 102 58 115 116 97 116 117 115 34 58 123 34 102 58 112 104 97 115 101 34 58 123 125 125 125],}}]},Spec:NamespaceSpec{Finalizers:[kubernetes],},Status:NamespaceStatus{Phase:Active,Conditions:[]NamespaceCondition{},},}
2020-05-05 08:00:21.083 [INFO][19821] k8s.go 784: pod info &Pod{ObjectMeta:{rook-ceph-osd-37-7d4655d857-x6thz rook-ceph-osd-37-7d4655d857- rook /api/v1/namespaces/rook/pods/rook-ceph-osd-37-7d4655d857-x6thz 944f28aa-adf9-4869-a2fb-75edc771bc63 13617 0 2020-05-05 08:00:20 +0000 UTC <nil> <nil> map[app:rook-ceph-osd ceph-osd-id:37 failure-domain:suraj-lk-cluster-pool-1-worker-2 pod-template-hash:7d4655d857 portable:false rook_cluster:rook] map[kubernetes.io/psp:00-rook-ceph-operator seccomp.security.alpha.kubernetes.io/pod:runtime/default] [{apps/v1 ReplicaSet rook-ceph-osd-37-7d4655d857 cfd1826a-0d26-44a9-9fd0-4be00690eaea 0xc000469157 0xc000469158}] [] []},Spec:PodSpec{Volumes:[]Volume{Volume{Name:rook-data,VolumeSource:VolumeSource{HostPath:&HostPathVolumeSource{Path:/var/lib/rook,Type:*,},EmptyDir:nil,GCEPersistentDisk:nil,AWSElasticBlockStore:nil,GitRepo:nil,Secret:nil,NFS:nil,ISCSI:nil,Glusterfs:nil,PersistentVolumeClaim:nil,RBD:nil,FlexVolume:nil,Cinder:nil,CephFS:nil,Flocker:nil,DownwardAPI:nil,FC:nil,AzureFile:nil,ConfigMap:nil,VsphereVolume:nil,Quobyte:nil,AzureDisk:nil,PhotonPersistentDisk:nil,PortworxVolume:nil,ScaleIO:nil,Projected:nil,StorageOS:nil,CSI:nil,},},Volume{Name:rook-config-override,VolumeSource:VolumeSource{HostPath:nil,EmptyDir:nil,GCEPersistentDisk:nil,AWSElasticBlockStore:nil,GitRepo:nil,Secret:nil,NFS:nil,ISCSI:nil,Glusterfs:nil,PersistentVolumeClaim:nil,RBD:nil,FlexVolume:nil,Cinder:nil,CephFS:nil,Flocker:nil,DownwardAPI:nil,FC:nil,AzureFile:nil,ConfigMap:&ConfigMapVolumeSource{LocalObjectReference:LocalObjectReference{Name:rook-config-override,},Items:[]KeyToPath{KeyToPath{Key:config,Path:ceph.conf,Mode:*292,},},DefaultMode:*420,Optional:nil,},VsphereVolume:nil,Quobyte:nil,AzureDisk:nil,PhotonPersistentDisk:nil,PortworxVolume:nil,ScaleIO:nil,Projected:nil,StorageOS:nil,CSI:nil,},},Volume{Name:rook-ceph-log,VolumeSource:VolumeSource{HostPath:&HostPathVolumeSource{Path:/var/lib/rook/rook/log,Type:*,},EmptyDir:nil,GCEPersistentDisk:nil,AWSE
lasticBlockStore:nil,GitRepo:nil,Secret:nil,NFS:nil,ISCSI:nil,Glusterfs:nil,PersistentVolumeClaim:nil,RBD:nil,FlexVolume:nil,Cinder:nil,CephFS:nil,Flocker:nil,DownwardAPI:nil,FC:nil,AzureFile:nil,ConfigMap:nil,VsphereVolume:nil,Quobyte:nil,AzureDisk:nil,PhotonPersistentDisk:nil,PortworxVolume:nil,ScaleIO:nil,Projected:nil,StorageOS:nil,CSI:nil,},},Volume{Name:rook-ceph-crash,VolumeSource:VolumeSource{HostPath:&HostPathVolumeSource{Path:/var/lib/rook/rook/crash,Type:*,},EmptyDir:nil,GCEPersistentDisk:nil,AWSElasticBlockStore:nil,GitRepo:nil,Secret:nil,NFS:nil,ISCSI:nil,Glusterfs:nil,PersistentVolumeClaim:nil,RBD:nil,FlexVolume:nil,Cinder:nil,CephFS:nil,Flocker:nil,DownwardAPI:nil,FC:nil,AzureFile:nil,ConfigMap:nil,VsphereVolume:nil,Quobyte:nil,AzureDisk:nil,PhotonPersistentDisk:nil,PortworxVolume:nil,ScaleIO:nil,Projected:nil,StorageOS:nil,CSI:nil,},},Volume{Name:devices,VolumeSource:VolumeSource{HostPath:&HostPathVolumeSource{Path:/dev,Type:*,},EmptyDir:nil,GCEPersistentDisk:nil,AWSElasticBlockStore:nil,GitRepo:nil,Secret:nil,NFS:nil,ISCSI:nil,Glusterfs:nil,PersistentVolumeClaim:nil,RBD:nil,FlexVolume:nil,Cinder:nil,CephFS:nil,Flocker:nil,DownwardAPI:nil,FC:nil,AzureFile:nil,ConfigMap:nil,VsphereVolume:nil,Quobyte:nil,AzureDisk:nil,PhotonPersistentDisk:nil,PortworxVolume:nil,ScaleIO:nil,Projected:nil,StorageOS:nil,CSI:nil,},},Volume{Name:run-udev,VolumeSource:VolumeSource{HostPath:&HostPathVolumeSource{Path:/run/udev,Type:*,},EmptyDir:nil,GCEPersistentDisk:nil,AWSElasticBlockStore:nil,GitRepo:nil,Secret:nil,NFS:nil,ISCSI:nil,Glusterfs:nil,PersistentVolumeClaim:nil,RBD:nil,FlexVolume:nil,Cinder:nil,CephFS:nil,Flocker:nil,DownwardAPI:nil,FC:nil,AzureFile:nil,ConfigMap:nil,VsphereVolume:nil,Quobyte:nil,AzureDisk:nil,PhotonPersistentDisk:nil,PortworxVolume:nil,ScaleIO:nil,Projected:nil,StorageOS:nil,CSI:nil,},},Volume{Name:activate-osd,VolumeSource:VolumeSource{HostPath:nil,EmptyDir:&EmptyDirVolumeSource{Medium:,SizeLimit:<nil>,},GCEPersistentDisk:nil,AWSElasticBlockSto
re:nil,GitRepo:nil,Secret:nil,NFS:nil,ISCSI:nil,Glusterfs:nil,PersistentVolumeClaim:nil,RBD:nil,FlexVolume:nil,Cinder:nil,CephFS:nil,Flocker:nil,DownwardAPI:nil,FC:nil,AzureFile:nil,ConfigMap:nil,VsphereVolume:nil,Quobyte:nil,AzureDisk:nil,PhotonPersistentDisk:nil,PortworxVolume:nil,ScaleIO:nil,Projected:nil,StorageOS:nil,CSI:nil,},},Volume{Name:rook-ceph-osd-token-4dtzj,VolumeSource:VolumeSource{HostPath:nil,EmptyDir:nil,GCEPersistentDisk:nil,AWSElasticBlockStore:nil,GitRepo:nil,Secret:&SecretVolumeSource{SecretName:rook-ceph-osd-token-4dtzj,Items:[]KeyToPath{},DefaultMode:*420,Optional:nil,},NFS:nil,ISCSI:nil,Glusterfs:nil,PersistentVolumeClaim:nil,RBD:nil,FlexVolume:nil,Cinder:nil,CephFS:nil,Flocker:nil,DownwardAPI:nil,FC:nil,AzureFile:nil,ConfigMap:nil,VsphereVolume:nil,Quobyte:nil,AzureDisk:nil,PhotonPersistentDisk:nil,PortworxVolume:nil,ScaleIO:nil,Projected:nil,StorageOS:nil,CSI:nil,},},},Containers:[]Container{Container{Name:osd,Image:ceph/ceph:v14.2.8,Command:[ceph-osd],Args:[--foreground --id 37 --fsid 504a40d4-a2d3-4492-a37d-27a58464958f --setuser ceph --setgroup ceph --crush-location=root=default host=suraj-lk-cluster-pool-1-worker-2 --default-log-to-file false 
--ms-learn-addr-from-peer=false],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:ROOK_NODE_NAME,Value:suraj-lk-cluster-pool-1-worker-2,ValueFrom:nil,},EnvVar{Name:ROOK_CLUSTER_ID,Value:77795d4f-fb40-4886-bc74-cb2065f0c61f,ValueFrom:nil,},EnvVar{Name:ROOK_PRIVATE_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},EnvVar{Name:ROOK_PUBLIC_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},EnvVar{Name:ROOK_CLUSTER_NAME,Value:rook,ValueFrom:nil,},EnvVar{Name:ROOK_MON_ENDPOINTS,Value:,ValueFrom:&EnvVarSource{FieldRef:nil,ResourceFieldRef:nil,ConfigMapKeyRef:&ConfigMapKeySelector{LocalObjectReference:LocalObjectReference{Name:rook-ceph-mon-endpoints,},Key:data,Optional:nil,},SecretKeyRef:nil,},},EnvVar{Name:ROOK_MON_SECRET,Value:,ValueFrom:&EnvVarSource{FieldRef:nil,ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:&SecretKeySelector{LocalObjectReference:LocalObjectReference{Name:rook-ceph-mon,},Key:mon-secret,Optional:nil,},},},EnvVar{Name:ROOK_ADMIN_SECRET,Value:,ValueFrom:&EnvVarSource{FieldRef:nil,ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:&SecretKeySelector{LocalObjectReference:LocalObjectReference{Name:rook-ceph-mon,},Key:admin-secret,Optional:nil,},},},EnvVar{Name:ROOK_CONFIG_DIR,Value:/var/lib/rook,ValueFrom:nil,},EnvVar{Name:ROOK_CEPH_CONFIG_OVERRIDE,Value:/etc/rook/config/override.conf,ValueFrom:nil,},EnvVar{Name:ROOK_FSID,Value:,ValueFrom:&EnvVarSource{FieldRef:nil,ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:&SecretKeySelector{LocalObjectReference:LocalObjectReference{Name:rook-ceph-mon,},Key:fsid,Optional:nil,},},},EnvVar{Name:NODE_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:spec.nodeName,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},Env
Var{Name:ROOK_CRUSHMAP_HOSTNAME,Value:suraj-lk-cluster-pool-1-worker-2,ValueFrom:nil,},EnvVar{Name:CEPH_VOLUME_DEBUG,Value:1,ValueFrom:nil,},EnvVar{Name:CEPH_VOLUME_SKIP_RESTORECON,Value:1,ValueFrom:nil,},EnvVar{Name:DM_DISABLE_UDEV,Value:1,ValueFrom:nil,},EnvVar{Name:ROOK_OSDS_PER_DEVICE,Value:1,ValueFrom:nil,},EnvVar{Name:TINI_SUBREAPER,Value:,ValueFrom:nil,},EnvVar{Name:CONTAINER_IMAGE,Value:ceph/ceph:v14.2.8,ValueFrom:nil,},EnvVar{Name:POD_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:metadata.name,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},EnvVar{Name:POD_NAMESPACE,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:metadata.namespace,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},EnvVar{Name:NODE_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:spec.nodeName,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},EnvVar{Name:POD_MEMORY_LIMIT,Value:,ValueFrom:&EnvVarSource{FieldRef:nil,ResourceFieldRef:&ResourceFieldSelector{ContainerName:,Resource:limits.memory,Divisor:{{0 0} {<nil>} 0 DecimalSI},},ConfigMapKeyRef:nil,SecretKeyRef:nil,},},EnvVar{Name:POD_MEMORY_REQUEST,Value:,ValueFrom:&EnvVarSource{FieldRef:nil,ResourceFieldRef:&ResourceFieldSelector{ContainerName:,Resource:requests.memory,Divisor:{{0 0} {<nil>} 0 DecimalSI},},ConfigMapKeyRef:nil,SecretKeyRef:nil,},},EnvVar{Name:POD_CPU_LIMIT,Value:,ValueFrom:&EnvVarSource{FieldRef:nil,ResourceFieldRef:&ResourceFieldSelector{ContainerName:,Resource:limits.cpu,Divisor:{{1 0} {<nil>} 1 DecimalSI},},ConfigMapKeyRef:nil,SecretKeyRef:nil,},},EnvVar{Name:POD_CPU_REQUEST,Value:,ValueFrom:&EnvVarSource{FieldRef:nil,ResourceFieldRef:&ResourceFieldSelector{ContainerName:,Resource:requests.cpu,Divisor:{{0 0} {<nil>} 0 
DecimalSI},},ConfigMapKeyRef:nil,SecretKeyRef:nil,},},EnvVar{Name:ROOK_OSD_UUID,Value:82f4a0d4-3fae-4f27-8846-49cc490acb7d,ValueFrom:nil,},EnvVar{Name:ROOK_OSD_ID,Value:37,ValueFrom:nil,},EnvVar{Name:ROOK_OSD_STORE_TYPE,Value:bluestore,ValueFrom:nil,},EnvVar{Name:ROOK_CEPH_MON_HOST,Value:,ValueFrom:&EnvVarSource{FieldRef:nil,ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:&SecretKeySelector{LocalObjectReference:LocalObjectReference{Name:rook-ceph-config,},Key:mon_host,Optional:nil,},},},EnvVar{Name:CEPH_ARGS,Value:-m $(ROOK_CEPH_MON_HOST),ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:rook-data,ReadOnly:false,MountPath:/var/lib/rook,SubPath:,MountPropagation:nil,SubPathExpr:,},VolumeMount{Name:rook-config-override,ReadOnly:true,MountPath:/etc/ceph,SubPath:,MountPropagation:nil,SubPathExpr:,},VolumeMount{Name:rook-ceph-log,ReadOnly:false,MountPath:/var/log/ceph,SubPath:,MountPropagation:nil,SubPathExpr:,},VolumeMount{Name:rook-ceph-crash,ReadOnly:false,MountPath:/var/lib/ceph/crash,SubPath:,MountPropagation:nil,SubPathExpr:,},VolumeMount{Name:devices,ReadOnly:false,MountPath:/dev,SubPath:,MountPropagation:nil,SubPathExpr:,},VolumeMount{Name:run-udev,ReadOnly:false,MountPath:/run/udev,SubPath:,MountPropagation:nil,SubPathExpr:,},VolumeMount{Name:activate-osd,ReadOnly:false,MountPath:/var/lib/ceph/osd/ceph-37,SubPath:,MountPropagation:nil,SubPathExpr:,},VolumeMount{Name:rook-ceph-osd-token-4dtzj,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,},},LivenessProbe:&Probe{Handler:Handler{Exec:&ExecAction{Command:[env -i sh -c ceph --admin-daemon /run/ceph/ceph-osd.37.asok 
status],},HTTPGet:nil,TCPSocket:nil,},InitialDelaySeconds:10,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,},ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:nil,Privileged:*true,SELinuxOptions:nil,RunAsUser:*0,RunAsNonRoot:nil,ReadOnlyRootFilesystem:*false,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,},},RestartPolicy:Always,TerminationGracePeriodSeconds:*30,ActiveDeadlineSeconds:nil,DNSPolicy:ClusterFirst,NodeSelector:map[string]string{kubernetes.io/hostname: suraj-lk-cluster-pool-1-worker-2,},ServiceAccountName:rook-ceph-osd,DeprecatedServiceAccount:rook-ceph-osd,NodeName:suraj-lk-cluster-pool-1-worker-2,HostNetwork:false,HostPID:true,HostIPC:false,SecurityContext:&PodSecurityContext{SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,SupplementalGroups:[],FSGroup:nil,RunAsGroup:nil,Sysctls:[]Sysctl{},WindowsOptions:nil,},ImagePullSecrets:[]LocalObjectReference{},Hostname:,Subdomain:,Affinity:&Affinity{NodeAffinity:&NodeAffinity{RequiredDuringSchedulingIgnoredDuringExecution:&NodeSelector{NodeSelectorTerms:[]NodeSelectorTerm{NodeSelectorTerm{MatchExpressions:[]NodeSelectorRequirement{NodeSelectorRequirement{Key:storage.lokomotive.io,Operator:Exists,Values:[],},},MatchFields:[]NodeSelectorRequirement{},},},},PreferredDuringSchedulingIgnoredDuringExecution:[]PreferredSchedulingTerm{},},PodAffinity:nil,PodAntiAffinity:nil,},SchedulerName:default-scheduler,InitContainers:[]Container{Container{Name:activate,Image:ceph/ceph:v14.2.8,Command:[/bin/bash -c
set -ex
OSD_ID=37
OSD_UUID=82f4a0d4-3fae-4f27-8846-49cc490acb7d
OSD_STORE_FLAG="--bluestore"
OSD_DATA_DIR=/var/lib/ceph/osd/ceph-"$OSD_ID"
CV_MODE=lvm
DEVICE=
METADATA_DEVICE="$ROOK_METADATA_DEVICE"
# activate the osd with ceph-volume
if [[ "$CV_MODE" == "lvm" ]]; then
TMP_DIR=$(mktemp -d)
# activate osd
ceph-volume "$CV_MODE" activate --no-systemd "$OSD_STORE_FLAG" "$OSD_ID" "$OSD_UUID"
# copy the tmpfs directory to a temporary directory
# this is needed because when the init container exits, the tmpfs goes away and its content with it
# this will result in the emptydir to be empty when accessed by the main osd container
cp --verbose --no-dereference "$OSD_DATA_DIR"/* "$TMP_DIR"/
# unmount the tmpfs since we don't need it anymore
umount "$OSD_DATA_DIR"
# copy back the content of the tmpfs into the original osd directory
cp --verbose --no-dereference "$TMP_DIR"/* "$OSD_DATA_DIR"
# retain ownership of files to the ceph user/group
chown --verbose --recursive ceph:ceph "$OSD_DATA_DIR"
# remove the temporary directory
rm --recursive --force "$TMP_DIR"
else
ARGS=(--device ${DEVICE} --no-systemd --no-tmpfs)
if [ -n "$METADATA_DEVICE" ]; then
ARGS+=(--block.db ${METADATA_DEVICE})
fi
# ceph-volume raw mode only supports bluestore so we don't need to pass a store flag
ceph-volume "$CV_MODE" activate "${ARGS[@]}"
fi
],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CEPH_VOLUME_DEBUG,Value:1,ValueFrom:nil,},EnvVar{Name:CEPH_VOLUME_SKIP_RESTORECON,Value:1,ValueFrom:nil,},EnvVar{Name:DM_DISABLE_UDEV,Value:1,ValueFrom:nil,},EnvVar{Name:ROOK_CEPH_MON_HOST,Value:,ValueFrom:&EnvVarSource{FieldRef:nil,ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:&SecretKeySelector{LocalObjectReference:LocalObjectReference{Name:rook-ceph-config,},Key:mon_host,Optional:nil,},},},EnvVar{Name:CEPH_ARGS,Value:-m $(ROOK_CEPH_MON_HOST),ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:activate-osd,ReadOnly:false,MountPath:/var/lib/ceph/osd/ceph-37,SubPath:,MountPropagation:nil,SubPathExpr:,},VolumeMount{Name:devices,ReadOnly:false,MountPath:/dev,SubPath:,MountPropagation:nil,SubPathExpr:,},VolumeMount{Name:rook-config-override,ReadOnly:true,MountPath:/etc/ceph,SubPath:,MountPropagation:nil,SubPathExpr:,},VolumeMount{Name:rook-ceph-osd-token-4dtzj,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:nil,Privileged:*true,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,},Container{Name:chown-container-data-dir,Image:ceph/ceph:v14.2.8,Command:[chown],Args:[--verbose --recursive ceph:ceph /var/log/ceph 
/var/lib/ceph/crash],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:rook-data,ReadOnly:false,MountPath:/var/lib/rook,SubPath:,MountPropagation:nil,SubPathExpr:,},VolumeMount{Name:rook-config-override,ReadOnly:true,MountPath:/etc/ceph,SubPath:,MountPropagation:nil,SubPathExpr:,},VolumeMount{Name:rook-ceph-log,ReadOnly:false,MountPath:/var/log/ceph,SubPath:,MountPropagation:nil,SubPathExpr:,},VolumeMount{Name:rook-ceph-crash,ReadOnly:false,MountPath:/var/lib/ceph/crash,SubPath:,MountPropagation:nil,SubPathExpr:,},VolumeMount{Name:devices,ReadOnly:false,MountPath:/dev,SubPath:,MountPropagation:nil,SubPathExpr:,},VolumeMount{Name:run-udev,ReadOnly:false,MountPath:/run/udev,SubPath:,MountPropagation:nil,SubPathExpr:,},VolumeMount{Name:activate-osd,ReadOnly:false,MountPath:/var/lib/ceph/osd/ceph-37,SubPath:,MountPropagation:nil,SubPathExpr:,},VolumeMount{Name:rook-ceph-osd-token-4dtzj,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:nil,Privileged:*true,SELinuxOptions:nil,RunAsUser:*0,RunAsNonRoot:nil,ReadOnlyRootFilesystem:*false,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,},},AutomountServiceAccountToken:nil,Tolerations:[]Toleration{Toleration{Key:storage.lokomotive.io,Operator:Equal,Value:ceph,Effect:NoSchedule,TolerationSeconds:nil,},Toleration{Key:node.kubernetes.io/not-ready,Operator:Exists,Value:,Effect:NoExecute,TolerationSeconds:*300,},Toleration{Key:node.kubernetes.io/unreachable,Operator:Exists,Value:,Effect:NoExecute,TolerationS
econds:*300,},},HostAliases:[]HostAlias{},PriorityClassName:,Priority:*0,DNSConfig:nil,ShareProcessNamespace:nil,ReadinessGates:[]PodReadinessGate{},RuntimeClassName:nil,EnableServiceLinks:*true,PreemptionPolicy:nil,Overhead:ResourceList{},TopologySpreadConstraints:[]TopologySpreadConstraint{},EphemeralContainers:[]EphemeralContainer{},},Status:PodStatus{Phase:Pending,Conditions:[]PodCondition{PodCondition{Type:PodScheduled,Status:True,LastProbeTime:0001-01-01 00:00:00 +0000 UTC,LastTransitionTime:2020-05-05 08:00:20 +0000 UTC,Reason:,Message:,},},Message:,Reason:,HostIP:,PodIP:,StartTime:<nil>,ContainerStatuses:[]ContainerStatus{},QOSClass:BestEffort,InitContainerStatuses:[]ContainerStatus{},NominatedNodeName:,PodIPs:[]PodIP{},EphemeralContainerStatuses:[]ContainerStatus{},},}
2020-05-05 08:00:21.101 [INFO][19896] ipam_plugin.go 211: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="e9755f7121fd15eaa9c577cffe6a5b80826b8e876dd77f2af9151bc3dfcde18f" HandleID="k8s-pod-network.e9755f7121fd15eaa9c577cffe6a5b80826b8e876dd77f2af9151bc3dfcde18f" Workload="suraj--lk--cluster--pool--1--worker--2-k8s-rook--ceph--osd--37--7d4655d857--x6thz-eth0"
2020-05-05 08:00:21.125 [INFO][19896] ipam_plugin.go 223: Calico CNI IPAM handle=k8s-pod-network.e9755f7121fd15eaa9c577cffe6a5b80826b8e876dd77f2af9151bc3dfcde18f ContainerID="e9755f7121fd15eaa9c577cffe6a5b80826b8e876dd77f2af9151bc3dfcde18f" HandleID="k8s-pod-network.e9755f7121fd15eaa9c577cffe6a5b80826b8e876dd77f2af9151bc3dfcde18f" Workload="suraj--lk--cluster--pool--1--worker--2-k8s-rook--ceph--osd--37--7d4655d857--x6thz-eth0"
2020-05-05 08:00:21.125 [INFO][19896] ipam_plugin.go 233: Auto assigning IP ContainerID="e9755f7121fd15eaa9c577cffe6a5b80826b8e876dd77f2af9151bc3dfcde18f" HandleID="k8s-pod-network.e9755f7121fd15eaa9c577cffe6a5b80826b8e876dd77f2af9151bc3dfcde18f" Workload="suraj--lk--cluster--pool--1--worker--2-k8s-rook--ceph--osd--37--7d4655d857--x6thz-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc0002ab0c0), Attrs:map[string]string{"namespace":"rook", "node":"suraj-lk-cluster-pool-1-worker-2", "pod":"rook-ceph-osd-37-7d4655d857-x6thz"}, Hostname:"suraj-lk-cluster-pool-1-worker-2", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0}
2020-05-05 08:00:21.125 [INFO][19896] ipam.go 87: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'suraj-lk-cluster-pool-1-worker-2'
2020-05-05 08:00:21.133 [INFO][19896] ipam.go 313: Looking up existing affinities for host handle="k8s-pod-network.e9755f7121fd15eaa9c577cffe6a5b80826b8e876dd77f2af9151bc3dfcde18f" host="suraj-lk-cluster-pool-1-worker-2"
2020-05-05 08:00:21.141 [INFO][19896] ipam.go 381: Trying affinity for 10.2.76.0/24 handle="k8s-pod-network.e9755f7121fd15eaa9c577cffe6a5b80826b8e876dd77f2af9151bc3dfcde18f" host="suraj-lk-cluster-pool-1-worker-2"
2020-05-05 08:00:21.145 [INFO][19896] ipam.go 135: Attempting to load block cidr=10.2.76.0/24 host="suraj-lk-cluster-pool-1-worker-2"
2020-05-05 08:00:21.151 [INFO][19896] ipam.go 212: Affinity is confirmed and block has been loaded cidr=10.2.76.0/24 host="suraj-lk-cluster-pool-1-worker-2"
2020-05-05 08:00:21.151 [INFO][19896] ipam.go 804: Attempting to assign 1 addresses from block block=10.2.76.0/24 handle="k8s-pod-network.e9755f7121fd15eaa9c577cffe6a5b80826b8e876dd77f2af9151bc3dfcde18f" host="suraj-lk-cluster-pool-1-worker-2"
2020-05-05 08:00:21.154 [INFO][19896] ipam.go 1265: Creating new handle: k8s-pod-network.e9755f7121fd15eaa9c577cffe6a5b80826b8e876dd77f2af9151bc3dfcde18f
2020-05-05 08:00:21.160 [INFO][19896] ipam.go 827: Writing block in order to claim IPs block=10.2.76.0/24 handle="k8s-pod-network.e9755f7121fd15eaa9c577cffe6a5b80826b8e876dd77f2af9151bc3dfcde18f" host="suraj-lk-cluster-pool-1-worker-2"
2020-05-05 08:00:21.172 [INFO][19896] ipam.go 840: Successfully claimed IPs: [10.2.76.20/24] block=10.2.76.0/24 handle="k8s-pod-network.e9755f7121fd15eaa9c577cffe6a5b80826b8e876dd77f2af9151bc3dfcde18f" host="suraj-lk-cluster-pool-1-worker-2"
2020-05-05 08:00:21.172 [INFO][19896] ipam.go 413: Block '10.2.76.0/24' provided addresses: [10.2.76.20/24] handle="k8s-pod-network.e9755f7121fd15eaa9c577cffe6a5b80826b8e876dd77f2af9151bc3dfcde18f" host="suraj-lk-cluster-pool-1-worker-2"
2020-05-05 08:00:21.176 [INFO][19896] ipam.go 569: Auto-assigned 1 out of 1 IPv4s: [10.2.76.20/24] handle="k8s-pod-network.e9755f7121fd15eaa9c577cffe6a5b80826b8e876dd77f2af9151bc3dfcde18f" host="suraj-lk-cluster-pool-1-worker-2"
2020-05-05 08:00:21.176 [INFO][19896] ipam_plugin.go 235: Calico CNI IPAM assigned addresses IPv4=[10.2.76.20/24] IPv6=[] ContainerID="e9755f7121fd15eaa9c577cffe6a5b80826b8e876dd77f2af9151bc3dfcde18f" HandleID="k8s-pod-network.e9755f7121fd15eaa9c577cffe6a5b80826b8e876dd77f2af9151bc3dfcde18f" Workload="suraj--lk--cluster--pool--1--worker--2-k8s-rook--ceph--osd--37--7d4655d857--x6thz-eth0"
2020-05-05 08:00:21.176 [INFO][19896] ipam_plugin.go 261: IPAM Result ContainerID="e9755f7121fd15eaa9c577cffe6a5b80826b8e876dd77f2af9151bc3dfcde18f" HandleID="k8s-pod-network.e9755f7121fd15eaa9c577cffe6a5b80826b8e876dd77f2af9151bc3dfcde18f" Workload="suraj--lk--cluster--pool--1--worker--2-k8s-rook--ceph--osd--37--7d4655d857--x6thz-eth0" result.IPs=[]*current.IPConfig{(*current.IPConfig)(0xc00002fce0)}
2020-05-05 08:00:21.177 [INFO][19821] k8s.go 358: Populated endpoint ContainerID="e9755f7121fd15eaa9c577cffe6a5b80826b8e876dd77f2af9151bc3dfcde18f" Namespace="rook" Pod="rook-ceph-osd-37-7d4655d857-x6thz" WorkloadEndpoint="suraj--lk--cluster--pool--1--worker--2-k8s-rook--ceph--osd--37--7d4655d857--x6thz-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"suraj--lk--cluster--pool--1--worker--2-k8s-rook--ceph--osd--37--7d4655d857--x6thz-eth0", GenerateName:"rook-ceph-osd-37-7d4655d857-", Namespace:"rook", SelfLink:"", UID:"944f28aa-adf9-4869-a2fb-75edc771bc63", ResourceVersion:"13617", Generation:0, CreationTimestamp:v1.Time{Time:time.Time{wall:0x0, ext:63724262420, loc:(*time.Location)(0x299c340)}}, DeletionTimestamp:(*v1.Time)(nil), DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app":"rook-ceph-osd", "ceph-osd-id":"37", "failure-domain":"suraj-lk-cluster-pool-1-worker-2", "pod-template-hash":"7d4655d857", "portable":"false", "projectcalico.org/namespace":"rook", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"rook-ceph-osd", "rook_cluster":"rook"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ClusterName:"", ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"suraj-lk-cluster-pool-1-worker-2", ContainerID:"", Pod:"rook-ceph-osd-37-7d4655d857-x6thz", Endpoint:"eth0", IPNetworks:[]string{"10.2.76.20/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.rook", "ksa.rook.rook-ceph-osd"}, InterfaceName:"cali9e6f92ef9e6", MAC:"", Ports:[]v3.EndpointPort(nil)}}
2020-05-05 08:00:21.177 [INFO][19821] k8s.go 359: Calico CNI using IPs: [10.2.76.20/32] ContainerID="e9755f7121fd15eaa9c577cffe6a5b80826b8e876dd77f2af9151bc3dfcde18f" Namespace="rook" Pod="rook-ceph-osd-37-7d4655d857-x6thz" WorkloadEndpoint="suraj--lk--cluster--pool--1--worker--2-k8s-rook--ceph--osd--37--7d4655d857--x6thz-eth0"
2020-05-05 08:00:21.177 [INFO][19821] network_linux.go 76: Setting the host side veth name to cali9e6f92ef9e6 ContainerID="e9755f7121fd15eaa9c577cffe6a5b80826b8e876dd77f2af9151bc3dfcde18f" Namespace="rook" Pod="rook-ceph-osd-37-7d4655d857-x6thz" WorkloadEndpoint="suraj--lk--cluster--pool--1--worker--2-k8s-rook--ceph--osd--37--7d4655d857--x6thz-eth0"
2020-05-05 08:00:21.178 [INFO][19821] network_linux.go 396: Disabling IPv4 forwarding ContainerID="e9755f7121fd15eaa9c577cffe6a5b80826b8e876dd77f2af9151bc3dfcde18f" Namespace="rook" Pod="rook-ceph-osd-37-7d4655d857-x6thz" WorkloadEndpoint="suraj--lk--cluster--pool--1--worker--2-k8s-rook--ceph--osd--37--7d4655d857--x6thz-eth0"
2020-05-05 08:00:21.196 [INFO][19821] k8s.go 385: Added Mac, interface name, and active container ID to endpoint ContainerID="e9755f7121fd15eaa9c577cffe6a5b80826b8e876dd77f2af9151bc3dfcde18f" Namespace="rook" Pod="rook-ceph-osd-37-7d4655d857-x6thz" WorkloadEndpoint="suraj--lk--cluster--pool--1--worker--2-k8s-rook--ceph--osd--37--7d4655d857--x6thz-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"suraj--lk--cluster--pool--1--worker--2-k8s-rook--ceph--osd--37--7d4655d857--x6thz-eth0", GenerateName:"rook-ceph-osd-37-7d4655d857-", Namespace:"rook", SelfLink:"", UID:"944f28aa-adf9-4869-a2fb-75edc771bc63", ResourceVersion:"13617", Generation:0, CreationTimestamp:v1.Time{Time:time.Time{wall:0x0, ext:63724262420, loc:(*time.Location)(0x299c340)}}, DeletionTimestamp:(*v1.Time)(nil), DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app":"rook-ceph-osd", "ceph-osd-id":"37", "failure-domain":"suraj-lk-cluster-pool-1-worker-2", "pod-template-hash":"7d4655d857", "portable":"false", "projectcalico.org/namespace":"rook", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"rook-ceph-osd", "rook_cluster":"rook"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ClusterName:"", ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"suraj-lk-cluster-pool-1-worker-2", ContainerID:"e9755f7121fd15eaa9c577cffe6a5b80826b8e876dd77f2af9151bc3dfcde18f", Pod:"rook-ceph-osd-37-7d4655d857-x6thz", Endpoint:"eth0", IPNetworks:[]string{"10.2.76.20/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.rook", "ksa.rook.rook-ceph-osd"}, InterfaceName:"cali9e6f92ef9e6", MAC:"76:2a:0d:1f:3a:46", Ports:[]v3.EndpointPort(nil)}}
2020-05-05 08:00:21.204 [INFO][19821] k8s.go 417: Wrote updated endpoint to datastore ContainerID="e9755f7121fd15eaa9c577cffe6a5b80826b8e876dd77f2af9151bc3dfcde18f" Namespace="rook" Pod="rook-ceph-osd-37-7d4655d857-x6thz" WorkloadEndpoint="suraj--lk--cluster--pool--1--worker--2-k8s-rook--ceph--osd--37--7d4655d857--x6thz-eth0"
I0505 08:00:21.317996 2827 topology_manager.go:233] [topologymanager] Topology Admit Handler
I0505 08:00:21.425302 2827 reconciler.go:224] operationExecutor.VerifyControllerAttachedVolume started for volume "rook-ceph-log" (UniqueName: "kubernetes.io/host-path/6c429d79-809c-4e31-8c27-f993f0efa510-rook-ceph-log") pod "rook-ceph-osd-4-ddc58c59c-wwfvd" (UID: "6c429d79-809c-4e31-8c27-f993f0efa510")
I0505 08:00:21.425364 2827 reconciler.go:224] operationExecutor.VerifyControllerAttachedVolume started for volume "activate-osd" (UniqueName: "kubernetes.io/empty-dir/6c429d79-809c-4e31-8c27-f993f0efa510-activate-osd") pod "rook-ceph-osd-4-ddc58c59c-wwfvd" (UID: "6c429d79-809c-4e31-8c27-f993f0efa510")
I0505 08:00:21.425490 2827 reconciler.go:224] operationExecutor.VerifyControllerAttachedVolume started for volume "rook-data" (UniqueName: "kubernetes.io/host-path/6c429d79-809c-4e31-8c27-f993f0efa510-rook-data") pod "rook-ceph-osd-4-ddc58c59c-wwfvd" (UID: "6c429d79-809c-4e31-8c27-f993f0efa510")
I0505 08:00:21.425564 2827 reconciler.go:224] operationExecutor.VerifyControllerAttachedVolume started for volume "run-udev" (UniqueName: "kubernetes.io/host-path/6c429d79-809c-4e31-8c27-f993f0efa510-run-udev") pod "rook-ceph-osd-4-ddc58c59c-wwfvd" (UID: "6c429d79-809c-4e31-8c27-f993f0efa510")
I0505 08:00:21.425629 2827 reconciler.go:224] operationExecutor.VerifyControllerAttachedVolume started for volume "rook-ceph-osd-token-4dtzj" (UniqueName: "kubernetes.io/secret/6c429d79-809c-4e31-8c27-f993f0efa510-rook-ceph-osd-token-4dtzj") pod "rook-ceph-osd-4-ddc58c59c-wwfvd" (UID: "6c429d79-809c-4e31-8c27-f993f0efa510")
I0505 08:00:21.425730 2827 reconciler.go:224] operationExecutor.VerifyControllerAttachedVolume started for volume "rook-config-override" (UniqueName: "kubernetes.io/configmap/6c429d79-809c-4e31-8c27-f993f0efa510-rook-config-override") pod "rook-ceph-osd-4-ddc58c59c-wwfvd" (UID: "6c429d79-809c-4e31-8c27-f993f0efa510")
I0505 08:00:21.425892 2827 reconciler.go:224] operationExecutor.VerifyControllerAttachedVolume started for volume "devices" (UniqueName: "kubernetes.io/host-path/6c429d79-809c-4e31-8c27-f993f0efa510-devices") pod "rook-ceph-osd-4-ddc58c59c-wwfvd" (UID: "6c429d79-809c-4e31-8c27-f993f0efa510")
I0505 08:00:21.426021 2827 reconciler.go:224] operationExecutor.VerifyControllerAttachedVolume started for volume "rook-ceph-crash" (UniqueName: "kubernetes.io/host-path/6c429d79-809c-4e31-8c27-f993f0efa510-rook-ceph-crash") pod "rook-ceph-osd-4-ddc58c59c-wwfvd" (UID: "6c429d79-809c-4e31-8c27-f993f0efa510")
W0505 08:00:22.247137 2827 pod_container_deletor.go:77] Container "aabaf682a50befd5f7244ae4b8e74a305c98d4ae910ed61a756daf27718c568f" not found in pod's containers
I0505 08:00:22.266077 2827 kubelet_resources.go:45] allocatable: map[cpu:{{32 0} {<nil>} 32 DecimalSI} ephemeral-storage:{{108249701812 0} {<nil>} 108249701812 DecimalSI} hugepages-1Gi:{{0 0} {<nil>} 0 DecimalSI} hugepages-2Mi:{{0 0} {<nil>} 0 DecimalSI} memory:{{135041626112 0} {<nil>} 131876588Ki BinarySI} pods:{{110 0} {<nil>} 110 DecimalSI}]
I0505 08:00:22.266210 2827 kubelet_resources.go:45] allocatable: map[cpu:{{32 0} {<nil>} 32 DecimalSI} ephemeral-storage:{{108249701812 0} {<nil>} 108249701812 DecimalSI} hugepages-1Gi:{{0 0} {<nil>} 0 DecimalSI} hugepages-2Mi:{{0 0} {<nil>} 0 DecimalSI} memory:{{135041626112 0} {<nil>} 131876588Ki BinarySI} pods:{{110 0} {<nil>} 110 DecimalSI}]
I0505 08:00:22.266303 2827 kubelet_resources.go:45] allocatable: map[cpu:{{32 0} {<nil>} 32 DecimalSI} ephemeral-storage:{{108249701812 0} {<nil>} 108249701812 DecimalSI} hugepages-1Gi:{{0 0} {<nil>} 0 DecimalSI} hugepages-2Mi:{{0 0} {<nil>} 0 DecimalSI} memory:{{135041626112 0} {<nil>} 131876588Ki BinarySI} pods:{{110 0} {<nil>} 110 DecimalSI}]
I0505 08:00:22.266354 2827 kubelet_resources.go:45] allocatable: map[cpu:{{32 0} {<nil>} 32 DecimalSI} ephemeral-storage:{{108249701812 0} {<nil>} 108249701812 DecimalSI} hugepages-1Gi:{{0 0} {<nil>} 0 DecimalSI} hugepages-2Mi:{{0 0} {<nil>} 0 DecimalSI} memory:{{135041626112 0} {<nil>} 131876588Ki BinarySI} pods:{{110 0} {<nil>} 110 DecimalSI}]
2020-05-05 08:00:22.268 [INFO][20376] plugin.go 129: Extracted identifiers EndpointIDs=&utils.WEPIdentifiers{Namespace:"rook", WEPName:"", WorkloadEndpointIdentifiers:names.WorkloadEndpointIdentifiers{Node:"suraj-lk-cluster-pool-1-worker-2", Orchestrator:"k8s", Endpoint:"eth0", Workload:"", Pod:"rook-ceph-osd-4-ddc58c59c-wwfvd", ContainerID:"aabaf682a50befd5f7244ae4b8e74a305c98d4ae910ed61a756daf27718c568f"}}
2020-05-05 08:00:22.300 [INFO][20376] plugin.go 220: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {suraj--lk--cluster--pool--1--worker--2-k8s-rook--ceph--osd--4--ddc58c59c--wwfvd-eth0 rook-ceph-osd-4-ddc58c59c- rook 6c429d79-809c-4e31-8c27-f993f0efa510 13648 0 2020-05-05 08:00:21 +0000 UTC <nil> <nil> map[app:rook-ceph-osd ceph-osd-id:4 failure-domain:suraj-lk-cluster-pool-1-worker-2 pod-template-hash:ddc58c59c portable:false projectcalico.org/namespace:rook projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:rook-ceph-osd rook_cluster:rook] map[] [] [] []} {k8s suraj-lk-cluster-pool-1-worker-2 rook-ceph-osd-4-ddc58c59c-wwfvd eth0 [] [] [kns.rook ksa.rook.rook-ceph-osd] cali30c8a1a79bc []}} ContainerID="aabaf682a50befd5f7244ae4b8e74a305c98d4ae910ed61a756daf27718c568f" Namespace="rook" Pod="rook-ceph-osd-4-ddc58c59c-wwfvd" WorkloadEndpoint="suraj--lk--cluster--pool--1--worker--2-k8s-rook--ceph--osd--4--ddc58c59c--wwfvd-"
2020-05-05 08:00:22.300 [INFO][20376] k8s.go 60: Extracted identifiers for CmdAddK8s ContainerID="aabaf682a50befd5f7244ae4b8e74a305c98d4ae910ed61a756daf27718c568f" Namespace="rook" Pod="rook-ceph-osd-4-ddc58c59c-wwfvd" WorkloadEndpoint="suraj--lk--cluster--pool--1--worker--2-k8s-rook--ceph--osd--4--ddc58c59c--wwfvd-eth0"
2020-05-05 08:00:22.307 [INFO][20376] k8s.go 775: namespace info &Namespace{ObjectMeta:{rook /api/v1/namespaces/rook 346082b0-52fc-4520-8796-5afff38e4f6c 7478 0 2020-05-05 07:45:13 +0000 UTC <nil> <nil> map[] map[] [] [] [{lokoctl Update v1 2020-05-05 07:45:13 +0000 UTC FieldsV1 FieldsV1{Raw:*[123 34 102 58 115 116 97 116 117 115 34 58 123 34 102 58 112 104 97 115 101 34 58 123 125 125 125],}}]},Spec:NamespaceSpec{Finalizers:[kubernetes],},Status:NamespaceStatus{Phase:Active,Conditions:[]NamespaceCondition{},},}
2020-05-05 08:00:22.311 [INFO][20376] k8s.go 784: pod info &Pod{ObjectMeta:{rook-ceph-osd-4-ddc58c59c-wwfvd rook-ceph-osd-4-ddc58c59c- rook /api/v1/namespaces/rook/pods/rook-ceph-osd-4-ddc58c59c-wwfvd 6c429d79-809c-4e31-8c27-f993f0efa510 13648 0 2020-05-05 08:00:21 +0000 UTC <nil> <nil> map[app:rook-ceph-osd ceph-osd-id:4 failure-domain:suraj-lk-cluster-pool-1-worker-2 pod-template-hash:ddc58c59c portable:false rook_cluster:rook] map[kubernetes.io/psp:00-rook-ceph-operator seccomp.security.alpha.kubernetes.io/pod:runtime/default] [{apps/v1 ReplicaSet rook-ceph-osd-4-ddc58c59c 238f7d8f-8571-47bf-b900-5a91a14e07a3 0xc00042e197 0xc00042e198}] [] []},Spec:PodSpec{Volumes:[]Volume{Volume{Name:rook-data,VolumeSource:VolumeSource{HostPath:&HostPathVolumeSource{Path:/var/lib/rook,Type:*,},EmptyDir:nil,GCEPersistentDisk:nil,AWSElasticBlockStore:nil,GitRepo:nil,Secret:nil,NFS:nil,ISCSI:nil,Glusterfs:nil,PersistentVolumeClaim:nil,RBD:nil,FlexVolume:nil,Cinder:nil,CephFS:nil,Flocker:nil,DownwardAPI:nil,FC:nil,AzureFile:nil,ConfigMap:nil,VsphereVolume:nil,Quobyte:nil,AzureDisk:nil,PhotonPersistentDisk:nil,PortworxVolume:nil,ScaleIO:nil,Projected:nil,StorageOS:nil,CSI:nil,},},Volume{Name:rook-config-override,VolumeSource:VolumeSource{HostPath:nil,EmptyDir:nil,GCEPersistentDisk:nil,AWSElasticBlockStore:nil,GitRepo:nil,Secret:nil,NFS:nil,ISCSI:nil,Glusterfs:nil,PersistentVolumeClaim:nil,RBD:nil,FlexVolume:nil,Cinder:nil,CephFS:nil,Flocker:nil,DownwardAPI:nil,FC:nil,AzureFile:nil,ConfigMap:&ConfigMapVolumeSource{LocalObjectReference:LocalObjectReference{Name:rook-config-override,},Items:[]KeyToPath{KeyToPath{Key:config,Path:ceph.conf,Mode:*292,},},DefaultMode:*420,Optional:nil,},VsphereVolume:nil,Quobyte:nil,AzureDisk:nil,PhotonPersistentDisk:nil,PortworxVolume:nil,ScaleIO:nil,Projected:nil,StorageOS:nil,CSI:nil,},},Volume{Name:rook-ceph-log,VolumeSource:VolumeSource{HostPath:&HostPathVolumeSource{Path:/var/lib/rook/rook/log,Type:*,},EmptyDir:nil,GCEPersistentDisk:nil,AWSElasticBloc
kStore:nil,GitRepo:nil,Secret:nil,NFS:nil,ISCSI:nil,Glusterfs:nil,PersistentVolumeClaim:nil,RBD:nil,FlexVolume:nil,Cinder:nil,CephFS:nil,Flocker:nil,DownwardAPI:nil,FC:nil,AzureFile:nil,ConfigMap:nil,VsphereVolume:nil,Quobyte:nil,AzureDisk:nil,PhotonPersistentDisk:nil,PortworxVolume:nil,ScaleIO:nil,Projected:nil,StorageOS:nil,CSI:nil,},},Volume{Name:rook-ceph-crash,VolumeSource:VolumeSource{HostPath:&HostPathVolumeSource{Path:/var/lib/rook/rook/crash,Type:*,},EmptyDir:nil,GCEPersistentDisk:nil,AWSElasticBlockStore:nil,GitRepo:nil,Secret:nil,NFS:nil,ISCSI:nil,Glusterfs:nil,PersistentVolumeClaim:nil,RBD:nil,FlexVolume:nil,Cinder:nil,CephFS:nil,Flocker:nil,DownwardAPI:nil,FC:nil,AzureFile:nil,ConfigMap:nil,VsphereVolume:nil,Quobyte:nil,AzureDisk:nil,PhotonPersistentDisk:nil,PortworxVolume:nil,ScaleIO:nil,Projected:nil,StorageOS:nil,CSI:nil,},},Volume{Name:devices,VolumeSource:VolumeSource{HostPath:&HostPathVolumeSource{Path:/dev,Type:*,},EmptyDir:nil,GCEPersistentDisk:nil,AWSElasticBlockStore:nil,GitRepo:nil,Secret:nil,NFS:nil,ISCSI:nil,Glusterfs:nil,PersistentVolumeClaim:nil,RBD:nil,FlexVolume:nil,Cinder:nil,CephFS:nil,Flocker:nil,DownwardAPI:nil,FC:nil,AzureFile:nil,ConfigMap:nil,VsphereVolume:nil,Quobyte:nil,AzureDisk:nil,PhotonPersistentDisk:nil,PortworxVolume:nil,ScaleIO:nil,Projected:nil,StorageOS:nil,CSI:nil,},},Volume{Name:run-udev,VolumeSource:VolumeSource{HostPath:&HostPathVolumeSource{Path:/run/udev,Type:*,},EmptyDir:nil,GCEPersistentDisk:nil,AWSElasticBlockStore:nil,GitRepo:nil,Secret:nil,NFS:nil,ISCSI:nil,Glusterfs:nil,PersistentVolumeClaim:nil,RBD:nil,FlexVolume:nil,Cinder:nil,CephFS:nil,Flocker:nil,DownwardAPI:nil,FC:nil,AzureFile:nil,ConfigMap:nil,VsphereVolume:nil,Quobyte:nil,AzureDisk:nil,PhotonPersistentDisk:nil,PortworxVolume:nil,ScaleIO:nil,Projected:nil,StorageOS:nil,CSI:nil,},},Volume{Name:activate-osd,VolumeSource:VolumeSource{HostPath:nil,EmptyDir:&EmptyDirVolumeSource{Medium:,SizeLimit:<nil>,},GCEPersistentDisk:nil,AWSElasticBlockStore:nil,Git
Repo:nil,Secret:nil,NFS:nil,ISCSI:nil,Glusterfs:nil,PersistentVolumeClaim:nil,RBD:nil,FlexVolume:nil,Cinder:nil,CephFS:nil,Flocker:nil,DownwardAPI:nil,FC:nil,AzureFile:nil,ConfigMap:nil,VsphereVolume:nil,Quobyte:nil,AzureDisk:nil,PhotonPersistentDisk:nil,PortworxVolume:nil,ScaleIO:nil,Projected:nil,StorageOS:nil,CSI:nil,},},Volume{Name:rook-ceph-osd-token-4dtzj,VolumeSource:VolumeSource{HostPath:nil,EmptyDir:nil,GCEPersistentDisk:nil,AWSElasticBlockStore:nil,GitRepo:nil,Secret:&SecretVolumeSource{SecretName:rook-ceph-osd-token-4dtzj,Items:[]KeyToPath{},DefaultMode:*420,Optional:nil,},NFS:nil,ISCSI:nil,Glusterfs:nil,PersistentVolumeClaim:nil,RBD:nil,FlexVolume:nil,Cinder:nil,CephFS:nil,Flocker:nil,DownwardAPI:nil,FC:nil,AzureFile:nil,ConfigMap:nil,VsphereVolume:nil,Quobyte:nil,AzureDisk:nil,PhotonPersistentDisk:nil,PortworxVolume:nil,ScaleIO:nil,Projected:nil,StorageOS:nil,CSI:nil,},},},Containers:[]Container{Container{Name:osd,Image:ceph/ceph:v14.2.8,Command:[ceph-osd],Args:[--foreground --id 4 --fsid 504a40d4-a2d3-4492-a37d-27a58464958f --setuser ceph --setgroup ceph --crush-location=root=default host=suraj-lk-cluster-pool-1-worker-2 --default-log-to-file false 
--ms-learn-addr-from-peer=false],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:ROOK_NODE_NAME,Value:suraj-lk-cluster-pool-1-worker-2,ValueFrom:nil,},EnvVar{Name:ROOK_CLUSTER_ID,Value:77795d4f-fb40-4886-bc74-cb2065f0c61f,ValueFrom:nil,},EnvVar{Name:ROOK_PRIVATE_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},EnvVar{Name:ROOK_PUBLIC_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},EnvVar{Name:ROOK_CLUSTER_NAME,Value:rook,ValueFrom:nil,},EnvVar{Name:ROOK_MON_ENDPOINTS,Value:,ValueFrom:&EnvVarSource{FieldRef:nil,ResourceFieldRef:nil,ConfigMapKeyRef:&ConfigMapKeySelector{LocalObjectReference:LocalObjectReference{Name:rook-ceph-mon-endpoints,},Key:data,Optional:nil,},SecretKeyRef:nil,},},EnvVar{Name:ROOK_MON_SECRET,Value:,ValueFrom:&EnvVarSource{FieldRef:nil,ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:&SecretKeySelector{LocalObjectReference:LocalObjectReference{Name:rook-ceph-mon,},Key:mon-secret,Optional:nil,},},},EnvVar{Name:ROOK_ADMIN_SECRET,Value:,ValueFrom:&EnvVarSource{FieldRef:nil,ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:&SecretKeySelector{LocalObjectReference:LocalObjectReference{Name:rook-ceph-mon,},Key:admin-secret,Optional:nil,},},},EnvVar{Name:ROOK_CONFIG_DIR,Value:/var/lib/rook,ValueFrom:nil,},EnvVar{Name:ROOK_CEPH_CONFIG_OVERRIDE,Value:/etc/rook/config/override.conf,ValueFrom:nil,},EnvVar{Name:ROOK_FSID,Value:,ValueFrom:&EnvVarSource{FieldRef:nil,ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:&SecretKeySelector{LocalObjectReference:LocalObjectReference{Name:rook-ceph-mon,},Key:fsid,Optional:nil,},},},EnvVar{Name:NODE_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:spec.nodeName,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},Env
Var{Name:ROOK_CRUSHMAP_HOSTNAME,Value:suraj-lk-cluster-pool-1-worker-2,ValueFrom:nil,},EnvVar{Name:CEPH_VOLUME_DEBUG,Value:1,ValueFrom:nil,},EnvVar{Name:CEPH_VOLUME_SKIP_RESTORECON,Value:1,ValueFrom:nil,},EnvVar{Name:DM_DISABLE_UDEV,Value:1,ValueFrom:nil,},EnvVar{Name:ROOK_OSDS_PER_DEVICE,Value:1,ValueFrom:nil,},EnvVar{Name:TINI_SUBREAPER,Value:,ValueFrom:nil,},EnvVar{Name:CONTAINER_IMAGE,Value:ceph/ceph:v14.2.8,ValueFrom:nil,},EnvVar{Name:POD_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:metadata.name,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},EnvVar{Name:POD_NAMESPACE,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:metadata.namespace,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},EnvVar{Name:NODE_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:spec.nodeName,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},EnvVar{Name:POD_MEMORY_LIMIT,Value:,ValueFrom:&EnvVarSource{FieldRef:nil,ResourceFieldRef:&ResourceFieldSelector{ContainerName:,Resource:limits.memory,Divisor:{{0 0} {<nil>} 0 DecimalSI},},ConfigMapKeyRef:nil,SecretKeyRef:nil,},},EnvVar{Name:POD_MEMORY_REQUEST,Value:,ValueFrom:&EnvVarSource{FieldRef:nil,ResourceFieldRef:&ResourceFieldSelector{ContainerName:,Resource:requests.memory,Divisor:{{0 0} {<nil>} 0 DecimalSI},},ConfigMapKeyRef:nil,SecretKeyRef:nil,},},EnvVar{Name:POD_CPU_LIMIT,Value:,ValueFrom:&EnvVarSource{FieldRef:nil,ResourceFieldRef:&ResourceFieldSelector{ContainerName:,Resource:limits.cpu,Divisor:{{1 0} {<nil>} 1 DecimalSI},},ConfigMapKeyRef:nil,SecretKeyRef:nil,},},EnvVar{Name:POD_CPU_REQUEST,Value:,ValueFrom:&EnvVarSource{FieldRef:nil,ResourceFieldRef:&ResourceFieldSelector{ContainerName:,Resource:requests.cpu,Divisor:{{0 0} {<nil>} 0 
DecimalSI},},ConfigMapKeyRef:nil,SecretKeyRef:nil,},},EnvVar{Name:ROOK_OSD_UUID,Value:c541a775-f184-43a4-955b-19352a088ee0,ValueFrom:nil,},EnvVar{Name:ROOK_OSD_ID,Value:4,ValueFrom:nil,},EnvVar{Name:ROOK_OSD_STORE_TYPE,Value:bluestore,ValueFrom:nil,},EnvVar{Name:ROOK_CEPH_MON_HOST,Value:,ValueFrom:&EnvVarSource{FieldRef:nil,ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:&SecretKeySelector{LocalObjectReference:LocalObjectReference{Name:rook-ceph-config,},Key:mon_host,Optional:nil,},},},EnvVar{Name:CEPH_ARGS,Value:-m $(ROOK_CEPH_MON_HOST),ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:rook-data,ReadOnly:false,MountPath:/var/lib/rook,SubPath:,MountPropagation:nil,SubPathExpr:,},VolumeMount{Name:rook-config-override,ReadOnly:true,MountPath:/etc/ceph,SubPath:,MountPropagation:nil,SubPathExpr:,},VolumeMount{Name:rook-ceph-log,ReadOnly:false,MountPath:/var/log/ceph,SubPath:,MountPropagation:nil,SubPathExpr:,},VolumeMount{Name:rook-ceph-crash,ReadOnly:false,MountPath:/var/lib/ceph/crash,SubPath:,MountPropagation:nil,SubPathExpr:,},VolumeMount{Name:devices,ReadOnly:false,MountPath:/dev,SubPath:,MountPropagation:nil,SubPathExpr:,},VolumeMount{Name:run-udev,ReadOnly:false,MountPath:/run/udev,SubPath:,MountPropagation:nil,SubPathExpr:,},VolumeMount{Name:activate-osd,ReadOnly:false,MountPath:/var/lib/ceph/osd/ceph-4,SubPath:,MountPropagation:nil,SubPathExpr:,},VolumeMount{Name:rook-ceph-osd-token-4dtzj,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,},},LivenessProbe:&Probe{Handler:Handler{Exec:&ExecAction{Command:[env -i sh -c ceph --admin-daemon /run/ceph/ceph-osd.4.asok 
status],},HTTPGet:nil,TCPSocket:nil,},InitialDelaySeconds:10,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,},ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:nil,Privileged:*true,SELinuxOptions:nil,RunAsUser:*0,RunAsNonRoot:nil,ReadOnlyRootFilesystem:*false,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,},},RestartPolicy:Always,TerminationGracePeriodSeconds:*30,ActiveDeadlineSeconds:nil,DNSPolicy:ClusterFirst,NodeSelector:map[string]string{kubernetes.io/hostname: suraj-lk-cluster-pool-1-worker-2,},ServiceAccountName:rook-ceph-osd,DeprecatedServiceAccount:rook-ceph-osd,NodeName:suraj-lk-cluster-pool-1-worker-2,HostNetwork:false,HostPID:true,HostIPC:false,SecurityContext:&PodSecurityContext{SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,SupplementalGroups:[],FSGroup:nil,RunAsGroup:nil,Sysctls:[]Sysctl{},WindowsOptions:nil,},ImagePullSecrets:[]LocalObjectReference{},Hostname:,Subdomain:,Affinity:&Affinity{NodeAffinity:&NodeAffinity{RequiredDuringSchedulingIgnoredDuringExecution:&NodeSelector{NodeSelectorTerms:[]NodeSelectorTerm{NodeSelectorTerm{MatchExpressions:[]NodeSelectorRequirement{NodeSelectorRequirement{Key:storage.lokomotive.io,Operator:Exists,Values:[],},},MatchFields:[]NodeSelectorRequirement{},},},},PreferredDuringSchedulingIgnoredDuringExecution:[]PreferredSchedulingTerm{},},PodAffinity:nil,PodAntiAffinity:nil,},SchedulerName:default-scheduler,InitContainers:[]Container{Container{Name:activate,Image:ceph/ceph:v14.2.8,Command:[/bin/bash -c
set -ex
OSD_ID=4
OSD_UUID=c541a775-f184-43a4-955b-19352a088ee0
OSD_STORE_FLAG="--bluestore"
OSD_DATA_DIR=/var/lib/ceph/osd/ceph-"$OSD_ID"
CV_MODE=lvm
DEVICE=
METADATA_DEVICE="$ROOK_METADATA_DEVICE"
# activate the osd with ceph-volume
if [[ "$CV_MODE" == "lvm" ]]; then
TMP_DIR=$(mktemp -d)
# activate osd
ceph-volume "$CV_MODE" activate --no-systemd "$OSD_STORE_FLAG" "$OSD_ID" "$OSD_UUID"
# copy the tmpfs directory to a temporary directory
# this is needed because when the init container exits, the tmpfs goes away and its content with it
# this will result in the emptydir to be empty when accessed by the main osd container
cp --verbose --no-dereference "$OSD_DATA_DIR"/* "$TMP_DIR"/
# unmount the tmpfs since we don't need it anymore
umount "$OSD_DATA_DIR"
# copy back the content of the tmpfs into the original osd directory
cp --verbose --no-dereference "$TMP_DIR"/* "$OSD_DATA_DIR"
# retain ownership of files to the ceph user/group
chown --verbose --recursive ceph:ceph "$OSD_DATA_DIR"
# remove the temporary directory
rm --recursive --force "$TMP_DIR"
else
ARGS=(--device ${DEVICE} --no-systemd --no-tmpfs)
if [ -n "$METADATA_DEVICE" ]; then
ARGS+=(--block.db ${METADATA_DEVICE})
fi
# ceph-volume raw mode only supports bluestore so we don't need to pass a store flag
ceph-volume "$CV_MODE" activate "${ARGS[@]}"
fi
],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CEPH_VOLUME_DEBUG,Value:1,ValueFrom:nil,},EnvVar{Name:CEPH_VOLUME_SKIP_RESTORECON,Value:1,ValueFrom:nil,},EnvVar{Name:DM_DISABLE_UDEV,Value:1,ValueFrom:nil,},EnvVar{Name:ROOK_CEPH_MON_HOST,Value:,ValueFrom:&EnvVarSource{FieldRef:nil,ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:&SecretKeySelector{LocalObjectReference:LocalObjectReference{Name:rook-ceph-config,},Key:mon_host,Optional:nil,},},},EnvVar{Name:CEPH_ARGS,Value:-m $(ROOK_CEPH_MON_HOST),ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:activate-osd,ReadOnly:false,MountPath:/var/lib/ceph/osd/ceph-4,SubPath:,MountPropagation:nil,SubPathExpr:,},VolumeMount{Name:devices,ReadOnly:false,MountPath:/dev,SubPath:,MountPropagation:nil,SubPathExpr:,},VolumeMount{Name:rook-config-override,ReadOnly:true,MountPath:/etc/ceph,SubPath:,MountPropagation:nil,SubPathExpr:,},VolumeMount{Name:rook-ceph-osd-token-4dtzj,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:nil,Privileged:*true,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,},Container{Name:chown-container-data-dir,Image:ceph/ceph:v14.2.8,Command:[chown],Args:[--verbose --recursive ceph:ceph /var/log/ceph 
/var/lib/ceph/crash],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:rook-data,ReadOnly:false,MountPath:/var/lib/rook,SubPath:,MountPropagation:nil,SubPathExpr:,},VolumeMount{Name:rook-config-override,ReadOnly:true,MountPath:/etc/ceph,SubPath:,MountPropagation:nil,SubPathExpr:,},VolumeMount{Name:rook-ceph-log,ReadOnly:false,MountPath:/var/log/ceph,SubPath:,MountPropagation:nil,SubPathExpr:,},VolumeMount{Name:rook-ceph-crash,ReadOnly:false,MountPath:/var/lib/ceph/crash,SubPath:,MountPropagation:nil,SubPathExpr:,},VolumeMount{Name:devices,ReadOnly:false,MountPath:/dev,SubPath:,MountPropagation:nil,SubPathExpr:,},VolumeMount{Name:run-udev,ReadOnly:false,MountPath:/run/udev,SubPath:,MountPropagation:nil,SubPathExpr:,},VolumeMount{Name:activate-osd,ReadOnly:false,MountPath:/var/lib/ceph/osd/ceph-4,SubPath:,MountPropagation:nil,SubPathExpr:,},VolumeMount{Name:rook-ceph-osd-token-4dtzj,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:nil,Privileged:*true,SELinuxOptions:nil,RunAsUser:*0,RunAsNonRoot:nil,ReadOnlyRootFilesystem:*false,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,},},AutomountServiceAccountToken:nil,Tolerations:[]Toleration{Toleration{Key:storage.lokomotive.io,Operator:Equal,Value:ceph,Effect:NoSchedule,TolerationSeconds:nil,},Toleration{Key:node.kubernetes.io/not-ready,Operator:Exists,Value:,Effect:NoExecute,TolerationSeconds:*300,},Toleration{Key:node.kubernetes.io/unreachable,Operator:Exists,Value:,Effect:NoExecute,TolerationSe
conds:*300,},},HostAliases:[]HostAlias{},PriorityClassName:,Priority:*0,DNSConfig:nil,ShareProcessNamespace:nil,ReadinessGates:[]PodReadinessGate{},RuntimeClassName:nil,EnableServiceLinks:*true,PreemptionPolicy:nil,Overhead:ResourceList{},TopologySpreadConstraints:[]TopologySpreadConstraint{},EphemeralContainers:[]EphemeralContainer{},},Status:PodStatus{Phase:Pending,Conditions:[]PodCondition{PodCondition{Type:PodScheduled,Status:True,LastProbeTime:0001-01-01 00:00:00 +0000 UTC,LastTransitionTime:2020-05-05 08:00:21 +0000 UTC,Reason:,Message:,},},Message:,Reason:,HostIP:,PodIP:,StartTime:<nil>,ContainerStatuses:[]ContainerStatus{},QOSClass:BestEffort,InitContainerStatuses:[]ContainerStatus{},NominatedNodeName:,PodIPs:[]PodIP{},EphemeralContainerStatuses:[]ContainerStatus{},},}
2020-05-05 08:00:22.335 [INFO][20427] ipam_plugin.go 211: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="aabaf682a50befd5f7244ae4b8e74a305c98d4ae910ed61a756daf27718c568f" HandleID="k8s-pod-network.aabaf682a50befd5f7244ae4b8e74a305c98d4ae910ed61a756daf27718c568f" Workload="suraj--lk--cluster--pool--1--worker--2-k8s-rook--ceph--osd--4--ddc58c59c--wwfvd-eth0"
2020-05-05 08:00:22.357 [INFO][20427] ipam_plugin.go 223: Calico CNI IPAM handle=k8s-pod-network.aabaf682a50befd5f7244ae4b8e74a305c98d4ae910ed61a756daf27718c568f ContainerID="aabaf682a50befd5f7244ae4b8e74a305c98d4ae910ed61a756daf27718c568f" HandleID="k8s-pod-network.aabaf682a50befd5f7244ae4b8e74a305c98d4ae910ed61a756daf27718c568f" Workload="suraj--lk--cluster--pool--1--worker--2-k8s-rook--ceph--osd--4--ddc58c59c--wwfvd-eth0"
2020-05-05 08:00:22.358 [INFO][20427] ipam_plugin.go 233: Auto assigning IP ContainerID="aabaf682a50befd5f7244ae4b8e74a305c98d4ae910ed61a756daf27718c568f" HandleID="k8s-pod-network.aabaf682a50befd5f7244ae4b8e74a305c98d4ae910ed61a756daf27718c568f" Workload="suraj--lk--cluster--pool--1--worker--2-k8s-rook--ceph--osd--4--ddc58c59c--wwfvd-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc0002d4d40), Attrs:map[string]string{"namespace":"rook", "node":"suraj-lk-cluster-pool-1-worker-2", "pod":"rook-ceph-osd-4-ddc58c59c-wwfvd"}, Hostname:"suraj-lk-cluster-pool-1-worker-2", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0}
2020-05-05 08:00:22.358 [INFO][20427] ipam.go 87: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'suraj-lk-cluster-pool-1-worker-2'
2020-05-05 08:00:22.367 [INFO][20427] ipam.go 313: Looking up existing affinities for host handle="k8s-pod-network.aabaf682a50befd5f7244ae4b8e74a305c98d4ae910ed61a756daf27718c568f" host="suraj-lk-cluster-pool-1-worker-2"
2020-05-05 08:00:22.374 [INFO][20427] ipam.go 381: Trying affinity for 10.2.76.0/24 handle="k8s-pod-network.aabaf682a50befd5f7244ae4b8e74a305c98d4ae910ed61a756daf27718c568f" host="suraj-lk-cluster-pool-1-worker-2"
2020-05-05 08:00:22.379 [INFO][20427] ipam.go 135: Attempting to load block cidr=10.2.76.0/24 host="suraj-lk-cluster-pool-1-worker-2"
2020-05-05 08:00:22.385 [INFO][20427] ipam.go 212: Affinity is confirmed and block has been loaded cidr=10.2.76.0/24 host="suraj-lk-cluster-pool-1-worker-2"
2020-05-05 08:00:22.385 [INFO][20427] ipam.go 804: Attempting to assign 1 addresses from block block=10.2.76.0/24 handle="k8s-pod-network.aabaf682a50befd5f7244ae4b8e74a305c98d4ae910ed61a756daf27718c568f" host="suraj-lk-cluster-pool-1-worker-2"
2020-05-05 08:00:22.388 [INFO][20427] ipam.go 1265: Creating new handle: k8s-pod-network.aabaf682a50befd5f7244ae4b8e74a305c98d4ae910ed61a756daf27718c568f
2020-05-05 08:00:22.396 [INFO][20427] ipam.go 827: Writing block in order to claim IPs block=10.2.76.0/24 handle="k8s-pod-network.aabaf682a50befd5f7244ae4b8e74a305c98d4ae910ed61a756daf27718c568f" host="suraj-lk-cluster-pool-1-worker-2"
2020-05-05 08:00:22.409 [INFO][20427] ipam.go 840: Successfully claimed IPs: [10.2.76.21/24] block=10.2.76.0/24 handle="k8s-pod-network.aabaf682a50befd5f7244ae4b8e74a305c98d4ae910ed61a756daf27718c568f" host="suraj-lk-cluster-pool-1-worker-2"
2020-05-05 08:00:22.410 [INFO][20427] ipam.go 413: Block '10.2.76.0/24' provided addresses: [10.2.76.21/24] handle="k8s-pod-network.aabaf682a50befd5f7244ae4b8e74a305c98d4ae910ed61a756daf27718c568f" host="suraj-lk-cluster-pool-1-worker-2"
2020-05-05 08:00:22.413 [INFO][20427] ipam.go 569: Auto-assigned 1 out of 1 IPv4s: [10.2.76.21/24] handle="k8s-pod-network.aabaf682a50befd5f7244ae4b8e74a305c98d4ae910ed61a756daf27718c568f" host="suraj-lk-cluster-pool-1-worker-2"
2020-05-05 08:00:22.413 [INFO][20427] ipam_plugin.go 235: Calico CNI IPAM assigned addresses IPv4=[10.2.76.21/24] IPv6=[] ContainerID="aabaf682a50befd5f7244ae4b8e74a305c98d4ae910ed61a756daf27718c568f" HandleID="k8s-pod-network.aabaf682a50befd5f7244ae4b8e74a305c98d4ae910ed61a756daf27718c568f" Workload="suraj--lk--cluster--pool--1--worker--2-k8s-rook--ceph--osd--4--ddc58c59c--wwfvd-eth0"
2020-05-05 08:00:22.413 [INFO][20427] ipam_plugin.go 261: IPAM Result ContainerID="aabaf682a50befd5f7244ae4b8e74a305c98d4ae910ed61a756daf27718c568f" HandleID="k8s-pod-network.aabaf682a50befd5f7244ae4b8e74a305c98d4ae910ed61a756daf27718c568f" Workload="suraj--lk--cluster--pool--1--worker--2-k8s-rook--ceph--osd--4--ddc58c59c--wwfvd-eth0" result.IPs=[]*current.IPConfig{(*current.IPConfig)(0xc0002c6120)}
2020-05-05 08:00:22.414 [INFO][20376] k8s.go 358: Populated endpoint ContainerID="aabaf682a50befd5f7244ae4b8e74a305c98d4ae910ed61a756daf27718c568f" Namespace="rook" Pod="rook-ceph-osd-4-ddc58c59c-wwfvd" WorkloadEndpoint="suraj--lk--cluster--pool--1--worker--2-k8s-rook--ceph--osd--4--ddc58c59c--wwfvd-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"suraj--lk--cluster--pool--1--worker--2-k8s-rook--ceph--osd--4--ddc58c59c--wwfvd-eth0", GenerateName:"rook-ceph-osd-4-ddc58c59c-", Namespace:"rook", SelfLink:"", UID:"6c429d79-809c-4e31-8c27-f993f0efa510", ResourceVersion:"13648", Generation:0, CreationTimestamp:v1.Time{Time:time.Time{wall:0x0, ext:63724262421, loc:(*time.Location)(0x299c340)}}, DeletionTimestamp:(*v1.Time)(nil), DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app":"rook-ceph-osd", "ceph-osd-id":"4", "failure-domain":"suraj-lk-cluster-pool-1-worker-2", "pod-template-hash":"ddc58c59c", "portable":"false", "projectcalico.org/namespace":"rook", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"rook-ceph-osd", "rook_cluster":"rook"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ClusterName:"", ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"suraj-lk-cluster-pool-1-worker-2", ContainerID:"", Pod:"rook-ceph-osd-4-ddc58c59c-wwfvd", Endpoint:"eth0", IPNetworks:[]string{"10.2.76.21/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.rook", "ksa.rook.rook-ceph-osd"}, InterfaceName:"cali30c8a1a79bc", MAC:"", Ports:[]v3.EndpointPort(nil)}}
2020-05-05 08:00:22.415 [INFO][20376] k8s.go 359: Calico CNI using IPs: [10.2.76.21/32] ContainerID="aabaf682a50befd5f7244ae4b8e74a305c98d4ae910ed61a756daf27718c568f" Namespace="rook" Pod="rook-ceph-osd-4-ddc58c59c-wwfvd" WorkloadEndpoint="suraj--lk--cluster--pool--1--worker--2-k8s-rook--ceph--osd--4--ddc58c59c--wwfvd-eth0"
2020-05-05 08:00:22.415 [INFO][20376] network_linux.go 76: Setting the host side veth name to cali30c8a1a79bc ContainerID="aabaf682a50befd5f7244ae4b8e74a305c98d4ae910ed61a756daf27718c568f" Namespace="rook" Pod="rook-ceph-osd-4-ddc58c59c-wwfvd" WorkloadEndpoint="suraj--lk--cluster--pool--1--worker--2-k8s-rook--ceph--osd--4--ddc58c59c--wwfvd-eth0"
2020-05-05 08:00:22.416 [INFO][20376] network_linux.go 396: Disabling IPv4 forwarding ContainerID="aabaf682a50befd5f7244ae4b8e74a305c98d4ae910ed61a756daf27718c568f" Namespace="rook" Pod="rook-ceph-osd-4-ddc58c59c-wwfvd" WorkloadEndpoint="suraj--lk--cluster--pool--1--worker--2-k8s-rook--ceph--osd--4--ddc58c59c--wwfvd-eth0"
2020-05-05 08:00:22.433 [INFO][20376] k8s.go 385: Added Mac, interface name, and active container ID to endpoint ContainerID="aabaf682a50befd5f7244ae4b8e74a305c98d4ae910ed61a756daf27718c568f" Namespace="rook" Pod="rook-ceph-osd-4-ddc58c59c-wwfvd" WorkloadEndpoint="suraj--lk--cluster--pool--1--worker--2-k8s-rook--ceph--osd--4--ddc58c59c--wwfvd-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"suraj--lk--cluster--pool--1--worker--2-k8s-rook--ceph--osd--4--ddc58c59c--wwfvd-eth0", GenerateName:"rook-ceph-osd-4-ddc58c59c-", Namespace:"rook", SelfLink:"", UID:"6c429d79-809c-4e31-8c27-f993f0efa510", ResourceVersion:"13648", Generation:0, CreationTimestamp:v1.Time{Time:time.Time{wall:0x0, ext:63724262421, loc:(*time.Location)(0x299c340)}}, DeletionTimestamp:(*v1.Time)(nil), DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app":"rook-ceph-osd", "ceph-osd-id":"4", "failure-domain":"suraj-lk-cluster-pool-1-worker-2", "pod-template-hash":"ddc58c59c", "portable":"false", "projectcalico.org/namespace":"rook", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"rook-ceph-osd", "rook_cluster":"rook"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ClusterName:"", ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"suraj-lk-cluster-pool-1-worker-2", ContainerID:"aabaf682a50befd5f7244ae4b8e74a305c98d4ae910ed61a756daf27718c568f", Pod:"rook-ceph-osd-4-ddc58c59c-wwfvd", Endpoint:"eth0", IPNetworks:[]string{"10.2.76.21/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.rook", "ksa.rook.rook-ceph-osd"}, InterfaceName:"cali30c8a1a79bc", MAC:"ca:74:26:d3:0a:9b", Ports:[]v3.EndpointPort(nil)}}
2020-05-05 08:00:22.442 [INFO][20376] k8s.go 417: Wrote updated endpoint to datastore ContainerID="aabaf682a50befd5f7244ae4b8e74a305c98d4ae910ed61a756daf27718c568f" Namespace="rook" Pod="rook-ceph-osd-4-ddc58c59c-wwfvd" WorkloadEndpoint="suraj--lk--cluster--pool--1--worker--2-k8s-rook--ceph--osd--4--ddc58c59c--wwfvd-eth0"
I0505 08:00:23.336612 2827 kubelet_resources.go:45] allocatable: map[cpu:{{32 0} {<nil>} 32 DecimalSI} ephemeral-storage:{{108249701812 0} {<nil>} 108249701812 DecimalSI} hugepages-1Gi:{{0 0} {<nil>} 0 DecimalSI} hugepages-2Mi:{{0 0} {<nil>} 0 DecimalSI} memory:{{135041626112 0} {<nil>} 131876588Ki BinarySI} pods:{{110 0} {<nil>} 110 DecimalSI}]
I0505 08:00:23.336855 2827 kubelet_resources.go:45] allocatable: map[cpu:{{32 0} {<nil>} 32 DecimalSI} ephemeral-storage:{{108249701812 0} {<nil>} 108249701812 DecimalSI} hugepages-1Gi:{{0 0} {<nil>} 0 DecimalSI} hugepages-2Mi:{{0 0} {<nil>} 0 DecimalSI} memory:{{135041626112 0} {<nil>} 131876588Ki BinarySI} pods:{{110 0} {<nil>} 110 DecimalSI}]
I0505 08:00:23.336976 2827 kubelet_resources.go:45] allocatable: map[cpu:{{32 0} {<nil>} 32 DecimalSI} ephemeral-storage:{{108249701812 0} {<nil>} 108249701812 DecimalSI} hugepages-1Gi:{{0 0} {<nil>} 0 DecimalSI} hugepages-2Mi:{{0 0} {<nil>} 0 DecimalSI} memory:{{135041626112 0} {<nil>} 131876588Ki BinarySI} pods:{{110 0} {<nil>} 110 DecimalSI}]
I0505 08:00:23.337074 2827 kubelet_resources.go:45] allocatable: map[cpu:{{32 0} {<nil>} 32 DecimalSI} ephemeral-storage:{{108249701812 0} {<nil>} 108249701812 DecimalSI} hugepages-1Gi:{{0 0} {<nil>} 0 DecimalSI} hugepages-2Mi:{{0 0} {<nil>} 0 DecimalSI} memory:{{135041626112 0} {<nil>} 131876588Ki BinarySI} pods:{{110 0} {<nil>} 110 DecimalSI}]
I0505 08:00:24.420327 2827 kubelet_resources.go:45] allocatable: map[cpu:{{32 0} {<nil>} 32 DecimalSI} ephemeral-storage:{{108249701812 0} {<nil>} 108249701812 DecimalSI} hugepages-1Gi:{{0 0} {<nil>} 0 DecimalSI} hugepages-2Mi:{{0 0} {<nil>} 0 DecimalSI} memory:{{135041626112 0} {<nil>} 131876588Ki BinarySI} pods:{{110 0} {<nil>} 110 DecimalSI}]
I0505 08:00:24.420431 2827 kubelet_resources.go:45] allocatable: map[cpu:{{32 0} {<nil>} 32 DecimalSI} ephemeral-storage:{{108249701812 0} {<nil>} 108249701812 DecimalSI} hugepages-1Gi:{{0 0} {<nil>} 0 DecimalSI} hugepages-2Mi:{{0 0} {<nil>} 0 DecimalSI} memory:{{135041626112 0} {<nil>} 131876588Ki BinarySI} pods:{{110 0} {<nil>} 110 DecimalSI}]
I0505 08:00:24.420480 2827 kubelet_resources.go:45] allocatable: map[cpu:{{32 0} {<nil>} 32 DecimalSI} ephemeral-storage:{{108249701812 0} {<nil>} 108249701812 DecimalSI} hugepages-1Gi:{{0 0} {<nil>} 0 DecimalSI} hugepages-2Mi:{{0 0} {<nil>} 0 DecimalSI} memory:{{135041626112 0} {<nil>} 131876588Ki BinarySI} pods:{{110 0} {<nil>} 110 DecimalSI}]
I0505 08:00:24.420527 2827 kubelet_resources.go:45] allocatable: map[cpu:{{32 0} {<nil>} 32 DecimalSI} ephemeral-storage:{{108249701812 0} {<nil>} 108249701812 DecimalSI} hugepages-1Gi:{{0 0} {<nil>} 0 DecimalSI} hugepages-2Mi:{{0 0} {<nil>} 0 DecimalSI} memory:{{135041626112 0} {<nil>} 131876588Ki BinarySI} pods:{{110 0} {<nil>} 110 DecimalSI}]
I0505 08:00:25.491812 2827 kubelet_resources.go:45] allocatable: map[cpu:{{32 0} {<nil>} 32 DecimalSI} ephemeral-storage:{{108249701812 0} {<nil>} 108249701812 DecimalSI} hugepages-1Gi:{{0 0} {<nil>} 0 DecimalSI} hugepages-2Mi:{{0 0} {<nil>} 0 DecimalSI} memory:{{135041626112 0} {<nil>} 131876588Ki BinarySI} pods:{{110 0} {<nil>} 110 DecimalSI}]
I0505 08:00:25.491986 2827 kubelet_resources.go:45] allocatable: map[cpu:{{32 0} {<nil>} 32 DecimalSI} ephemeral-storage:{{108249701812 0} {<nil>} 108249701812 DecimalSI} hugepages-1Gi:{{0 0} {<nil>} 0 DecimalSI} hugepages-2Mi:{{0 0} {<nil>} 0 DecimalSI} memory:{{135041626112 0} {<nil>} 131876588Ki BinarySI} pods:{{110 0} {<nil>} 110 DecimalSI}]
I0505 08:00:25.492091 2827 kubelet_resources.go:45] allocatable: map[cpu:{{32 0} {<nil>} 32 DecimalSI} ephemeral-storage:{{108249701812 0} {<nil>} 108249701812 DecimalSI} hugepages-1Gi:{{0 0} {<nil>} 0 DecimalSI} hugepages-2Mi:{{0 0} {<nil>} 0 DecimalSI} memory:{{135041626112 0} {<nil>} 131876588Ki BinarySI} pods:{{110 0} {<nil>} 110 DecimalSI}]
I0505 08:00:25.492148 2827 kubelet_resources.go:45] allocatable: map[cpu:{{32 0} {<nil>} 32 DecimalSI} ephemeral-storage:{{108249701812 0} {<nil>} 108249701812 DecimalSI} hugepages-1Gi:{{0 0} {<nil>} 0 DecimalSI} hugepages-2Mi:{{0 0} {<nil>} 0 DecimalSI} memory:{{135041626112 0} {<nil>} 131876588Ki BinarySI} pods:{{110 0} {<nil>} 110 DecimalSI}]
I0505 08:00:26.535435 2827 kubelet_resources.go:45] allocatable: map[cpu:{{32 0} {<nil>} 32 DecimalSI} ephemeral-storage:{{108249701812 0} {<nil>} 108249701812 DecimalSI} hugepages-1Gi:{{0 0} {<nil>} 0 DecimalSI} hugepages-2Mi:{{0 0} {<nil>} 0 DecimalSI} memory:{{135041626112 0} {<nil>} 131876588Ki BinarySI} pods:{{110 0} {<nil>} 110 DecimalSI}]
I0505 08:00:26.535613 2827 kubelet_resources.go:45] allocatable: map[cpu:{{32 0} {<nil>} 32 DecimalSI} ephemeral-storage:{{108249701812 0} {<nil>} 108249701812 DecimalSI} hugepages-1Gi:{{0 0} {<nil>} 0 DecimalSI} hugepages-2Mi:{{0 0} {<nil>} 0 DecimalSI} memory:{{135041626112 0} {<nil>} 131876588Ki BinarySI} pods:{{110 0} {<nil>} 110 DecimalSI}]
I0505 08:00:26.535763 2827 kubelet_resources.go:45] allocatable: map[cpu:{{32 0} {<nil>} 32 DecimalSI} ephemeral-storage:{{108249701812 0} {<nil>} 108249701812 DecimalSI} hugepages-1Gi:{{0 0} {<nil>} 0 DecimalSI} hugepages-2Mi:{{0 0} {<nil>} 0 DecimalSI} memory:{{135041626112 0} {<nil>} 131876588Ki BinarySI} pods:{{110 0} {<nil>} 110 DecimalSI}]
I0505 08:00:26.535866 2827 kubelet_resources.go:45] allocatable: map[cpu:{{32 0} {<nil>} 32 DecimalSI} ephemeral-storage:{{108249701812 0} {<nil>} 108249701812 DecimalSI} hugepages-1Gi:{{0 0} {<nil>} 0 DecimalSI} hugepages-2Mi:{{0 0} {<nil>} 0 DecimalSI} memory:{{135041626112 0} {<nil>} 131876588Ki BinarySI} pods:{{110 0} {<nil>} 110 DecimalSI}]
I0505 08:00:27.623574 2827 kubelet_resources.go:45] allocatable: map[cpu:{{32 0} {<nil>} 32 DecimalSI} ephemeral-storage:{{108249701812 0} {<nil>} 108249701812 DecimalSI} hugepages-1Gi:{{0 0} {<nil>} 0 DecimalSI} hugepages-2Mi:{{0 0} {<nil>} 0 DecimalSI} memory:{{135041626112 0} {<nil>} 131876588Ki BinarySI} pods:{{110 0} {<nil>} 110 DecimalSI}]
I0505 08:00:27.623684 2827 kubelet_resources.go:45] allocatable: map[cpu:{{32 0} {<nil>} 32 DecimalSI} ephemeral-storage:{{108249701812 0} {<nil>} 108249701812 DecimalSI} hugepages-1Gi:{{0 0} {<nil>} 0 DecimalSI} hugepages-2Mi:{{0 0} {<nil>} 0 DecimalSI} memory:{{135041626112 0} {<nil>} 131876588Ki BinarySI} pods:{{110 0} {<nil>} 110 DecimalSI}]
I0505 08:00:27.624226 2827 kubelet_resources.go:45] allocatable: map[cpu:{{32 0} {<nil>} 32 DecimalSI} ephemeral-storage:{{108249701812 0} {<nil>} 108249701812 DecimalSI} hugepages-1Gi:{{0 0} {<nil>} 0 DecimalSI} hugepages-2Mi:{{0 0} {<nil>} 0 DecimalSI} memory:{{135041626112 0} {<nil>} 131876588Ki BinarySI} pods:{{110 0} {<nil>} 110 DecimalSI}]
I0505 08:00:27.624284 2827 kubelet_resources.go:45] allocatable: map[cpu:{{32 0} {<nil>} 32 DecimalSI} ephemeral-storage:{{108249701812 0} {<nil>} 108249701812 DecimalSI} hugepages-1Gi:{{0 0} {<nil>} 0 DecimalSI} hugepages-2Mi:{{0 0} {<nil>} 0 DecimalSI} memory:{{135041626112 0} {<nil>} 131876588Ki BinarySI} pods:{{110 0} {<nil>} 110 DecimalSI}]
I0505 08:00:28.674770 2827 kubelet_resources.go:45] allocatable: map[cpu:{{32 0} {<nil>} 32 DecimalSI} ephemeral-storage:{{108249701812 0} {<nil>} 108249701812 DecimalSI} hugepages-1Gi:{{0 0} {<nil>} 0 DecimalSI} hugepages-2Mi:{{0 0} {<nil>} 0 DecimalSI} memory:{{135041626112 0} {<nil>} 131876588Ki BinarySI} pods:{{110 0} {<nil>} 110 DecimalSI}]
I0505 08:00:28.674865 2827 kubelet_resources.go:45] allocatable: map[cpu:{{32 0} {<nil>} 32 DecimalSI} ephemeral-storage:{{108249701812 0} {<nil>} 108249701812 DecimalSI} hugepages-1Gi:{{0 0} {<nil>} 0 DecimalSI} hugepages-2Mi:{{0 0} {<nil>} 0 DecimalSI} memory:{{135041626112 0} {<nil>} 131876588Ki BinarySI} pods:{{110 0} {<nil>} 110 DecimalSI}]
I0505 08:00:28.674912 2827 kubelet_resources.go:45] allocatable: map[cpu:{{32 0} {<nil>} 32 DecimalSI} ephemeral-storage:{{108249701812 0} {<nil>} 108249701812 DecimalSI} hugepages-1Gi:{{0 0} {<nil>} 0 DecimalSI} hugepages-2Mi:{{0 0} {<nil>} 0 DecimalSI} memory:{{135041626112 0} {<nil>} 131876588Ki BinarySI} pods:{{110 0} {<nil>} 110 DecimalSI}]
I0505 08:00:28.674955 2827 kubelet_resources.go:45] allocatable: map[cpu:{{32 0} {<nil>} 32 DecimalSI} ephemeral-storage:{{108249701812 0} {<nil>} 108249701812 DecimalSI} hugepages-1Gi:{{0 0} {<nil>} 0 DecimalSI} hugepages-2Mi:{{0 0} {<nil>} 0 DecimalSI} memory:{{135041626112 0} {<nil>} 131876588Ki BinarySI} pods:{{110 0} {<nil>} 110 DecimalSI}]
fatal error: unexpected signal during runtime execution
[signal SIGSEGV: segmentation violation code=0x80 addr=0x0 pc=0x40e1ce]
goroutine 31272 [running]:
runtime.throw(0x43b04c8, 0x2a)
/usr/local/go/src/runtime/panic.go:774 +0x72 fp=0xc000893a10 sp=0xc0008939e0 pc=0x432eb2
runtime.sigpanic()
/usr/local/go/src/runtime/signal_unix.go:378 +0x47c fp=0xc000893a40 sp=0xc000893a10 pc=0x44858c
runtime.lock(0x4a656c69460412d1)
/usr/local/go/src/runtime/lock_futex.go:55 +0x4e fp=0xc000893a88 sp=0xc000893a40 pc=0x40e1ce
runtime.sellock(0xc000893ce8, 0x3, 0x3, 0xc000893c4a, 0x3, 0x3)
/usr/local/go/src/runtime/select.go:51 +0x76 fp=0xc000893ab8 sp=0xc000893a88 pc=0x4438d6
runtime.selectgo(0xc000893ce8, 0xc000893c44, 0x3, 0x0, 0x0)
/usr/local/go/src/runtime/select.go:315 +0xcd5 fp=0xc000893be0 sp=0xc000893ab8 pc=0x444765
k8s.io/kubernetes/vendor/github.com/google/cadvisor/manager.(*containerData).housekeepingTick(0xc0002e8480, 0xc000c27560, 0x5f5e100, 0xc000b80500)
/workspace/anago-v1.18.2-beta.0.14+a78cd082e8c913/src/k8s.io/kubernetes/_output/dockerized/go/src/k8s.io/kubernetes/vendor/github.com/google/cadvisor/manager/container.go:513 +0x10e fp=0xc000893d70 sp=0xc000893be0 pc=0x1b1294e
k8s.io/kubernetes/vendor/github.com/google/cadvisor/manager.(*containerData).housekeeping(0xc0002e8480)
/workspace/anago-v1.18.2-beta.0.14+a78cd082e8c913/src/k8s.io/kubernetes/_output/dockerized/go/src/k8s.io/kubernetes/vendor/github.com/google/cadvisor/manager/container.go:471 +0x235 fp=0xc000893fd8 sp=0xc000893d70 pc=0x1b12105
runtime.goexit()
/usr/local/go/src/runtime/asm_amd64.s:1357 +0x1 fp=0xc000893fe0 sp=0xc000893fd8 pc=0x462c41
created by k8s.io/kubernetes/vendor/github.com/google/cadvisor/manager.(*containerData).Start
/workspace/anago-v1.18.2-beta.0.14+a78cd082e8c913/src/k8s.io/kubernetes/_output/dockerized/go/src/k8s.io/kubernetes/vendor/github.com/google/cadvisor/manager/container.go:108 +0x3f
goroutine 1 [select, 55 minutes]:
k8s.io/kubernetes/cmd/kubelet/app.run(0xc000295800, 0xc0000efe00, 0x7fe5b97fc038, 0xc0001a2100, 0xc0000fa780, 0x1, 0x1)
/workspace/anago-v1.18.2-beta.0.14+a78cd082e8c913/src/k8s.io/kubernetes/_output/dockerized/go/src/k8s.io/kubernetes/cmd/kubelet/app/server.go:806 +0x985
k8s.io/kubernetes/cmd/kubelet/app.Run(0xc000295800, 0xc0000efe00, 0x7fe5b97fc038, 0xc0001a2100, 0xc0000fa780, 0x0, 0xc000677920)
/workspace/anago-v1.18.2-beta.0.14+a78cd082e8c913/src/k8s.io/kubernetes/_output/dockerized/go/src/k8s.io/kubernetes/cmd/kubelet/app/server.go:421 +0x12a
k8s.io/kubernetes/cmd/kubelet/app.NewKubeletCommand.func1(0xc000754780, 0xc00004e130, 0x10, 0x11)
/workspace/anago-v1.18.2-beta.0.14+a78cd082e8c913/src/k8s.io/kubernetes/_output/dockerized/go/src/k8s.io/kubernetes/cmd/kubelet/app/server.go:273 +0x5b1
k8s.io/kubernetes/vendor/github.com/spf13/cobra.(*Command).execute(0xc000754780, 0xc00004e130, 0x10, 0x11, 0xc000754780, 0xc00004e130)
/workspace/anago-v1.18.2-beta.0.14+a78cd082e8c913/src/k8s.io/kubernetes/_output/dockerized/go/src/k8s.io/kubernetes/vendor/github.com/spf13/cobra/command.go:830 +0x2aa
k8s.io/kubernetes/vendor/github.com/spf13/cobra.(*Command).ExecuteC(0xc000754780, 0x160c0ff7e96eb022, 0x701b4a0, 0xc000000180)
/workspace/anago-v1.18.2-beta.0.14+a78cd082e8c913/src/k8s.io/kubernetes/_output/dockerized/go/src/k8s.io/kubernetes/vendor/github.com/spf13/cobra/command.go:914 +0x2fb
k8s.io/kubernetes/vendor/github.com/spf13/cobra.(*Command).Execute(...)
/workspace/anago-v1.18.2-beta.0.14+a78cd082e8c913/src/k8s.io/kubernetes/_output/dockerized/go/src/k8s.io/kubernetes/vendor/github.com/spf13/cobra/command.go:864
main.main()
_output/dockerized/go/src/k8s.io/kubernetes/cmd/kubelet/kubelet.go:41 +0xcd
goroutine 6 [chan receive]:
k8s.io/kubernetes/vendor/k8s.io/klog.(*loggingT).flushDaemon(0x701bae0)
/workspace/anago-v1.18.2-beta.0.14+a78cd082e8c913/src/k8s.io/kubernetes/_output/dockerized/go/src/k8s.io/kubernetes/vendor/k8s.io/klog/klog.go:1010 +0x8b
created by k8s.io/kubernetes/vendor/k8s.io/klog.init.0
/workspace/anago-v1.18.2-beta.0.14+a78cd082e8c913/src/k8s.io/kubernetes/_output/dockerized/go/src/k8s.io/kubernetes/vendor/k8s.io/klog/klog.go:411 +0xd6
goroutine 132 [sleep]:
runtime.goparkunlock(...)
/usr/local/go/src/runtime/proc.go:310
time.Sleep(0x3b9aca00)
/usr/local/go/src/runtime/time.go:105 +0x157
k8s.io/kubernetes/pkg/scheduler/framework/v1alpha1.(*metricsRecorder).run(0xc00073aea0)
/workspace/anago-v1.18.2-beta.0.14+a78cd082e8c913/src/k8s.io/kubernetes/_output/dockerized/go/src/k8s.io/kubernetes/pkg/scheduler/framework/v1alpha1/metrics_recorder.go:87 +0x3f
created by k8s.io/kubernetes/pkg/scheduler/framework/v1alpha1.newMetricsRecorder
/workspace/anago-v1.18.2-beta.0.14+a78cd082e8c913/src/k8s.io/kubernetes/_output/dockerized/go/src/k8s.io/kubernetes/pkg/scheduler/framework/v1alpha1/metrics_recorder.go:59 +0x104
goroutine 149 [select]:
k8s.io/kubernetes/vendor/go.opencensus.io/stats/view.(*worker).start(0xc0004c6af0)
/workspace/anago-v1.18.2-beta.0.14+a78cd082e8c913/src/k8s.io/kubernetes/_output/dockerized/go/src/k8s.io/kubernetes/vendor/go.opencensus.io/stats/view/worker.go:154 +0x100
created by k8s.io/kubernetes/vendor/go.opencensus.io/stats/view.init.0
/workspace/anago-v1.18.2-beta.0.14+a78cd082e8c913/src/k8s.io/kubernetes/_output/dockerized/go/src/k8s.io/kubernetes/vendor/go.opencensus.io/stats/view/worker.go:32 +0x57
goroutine 218 [IO wait, 55 minutes]:
internal/poll.runtime_pollWait(0x7fe5b97fba88, 0x72, 0x0)
/usr/local/go/src/runtime/netpoll.go:184 +0x55
internal/poll.(*pollDesc).wait(0xc000ce4118, 0x72, 0x0, 0x0, 0x4311d43)
/usr/local/go/src/internal/poll/fd_poll_runtime.go:87 +0x45
internal/poll.(*pollDesc).waitRead(...)
/usr/local/go/src/internal/poll/fd_poll_runtime.go:92
internal/poll.(*FD).Accept(0xc000ce4100, 0x0, 0x0, 0x0, 0x0, 0x0, 0x0, 0x0)
/usr/local/go/src/internal/poll/fd_unix.go:384 +0x1f8
net.(*netFD).accept(0xc000ce4100, 0xc0000cbd30, 0xc000162000, 0x7fe5b98ced98)
/usr/local/go/src/net/fd_unix.go:238 +0x42
net.(*TCPListener).accept(0xc0004fe8e0, 0xc0000cbd60, 0x4102a8, 0x30)
/usr/local/go/src/net/tcpsock_posix.go:139 +0x32
net.(*TCPListener).Accept(0xc0004fe8e0, 0x3f398c0, 0xc000c421e0, 0x3a1c160, 0x6f10fc0)
/usr/local/go/src/net/tcpsock.go:261 +0x47
net/http.(*Server).Serve(0xc000b2e460, 0x4b1d8c0, 0xc0004fe8e0, 0x0, 0x0)
/usr/local/go/src/net/http/server.go:2896 +0x280
k8s.io/kubernetes/pkg/kubelet/server/streaming.(*server).Start(0xc0009a6630, 0x43af01, 0x44ad760, 0xc00049ffb8)
/workspace/anago-v1.18.2-beta.0.14+a78cd082e8c913/src/k8s.io/kubernetes/_output/dockerized/go/src/k8s.io/kubernetes/pkg/kubelet/server/streaming/server.go:248 +0x147
k8s.io/kubernetes/pkg/kubelet/dockershim.(*dockerService).Start.func1(0xc000baa8f0)
/workspace/anago-v1.18.2-beta.0.14+a78cd082e8c913/src/k8s.io/kubernetes/_output/dockerized/go/src/k8s.io/kubernetes/pkg/kubelet/dockershim/docker_service.go:411 +0x3d
created by k8s.io/kubernetes/pkg/kubelet/dockershim.(*dockerService).Start
/workspace/anago-v1.18.2-beta.0.14+a78cd082e8c913/src/k8s.io/kubernetes/_output/dockerized/go/src/k8s.io/kubernetes/pkg/kubelet/dockershim/docker_service.go:410 +0x86
goroutine 158 [syscall, 55 minutes]:
os/signal.signal_recv(0x462c46)
/usr/local/go/src/runtime/sigqueue.go:147 +0x9c
os/signal.loop()
/usr/local/go/src/os/signal/signal_unix.go:23 +0x22
created by os/signal.init.0
/usr/local/go/src/os/signal/signal_unix.go:29 +0x41
goroutine 159 [select]:
k8s.io/kubernetes/vendor/k8s.io/apimachinery/pkg/util/wait.BackoffUntil(0x44acd88, 0x4aadcc0, 0xc00066a3c0, 0xc000480001, 0xc0000fa0c0)
/workspace/anago-v1.18.2-beta.0.14+a78cd082e8c913/src/k8s.io/kubernetes/_output/dockerized/go/src/k8s.io/kubernetes/vendor/k8s.io/apimachinery/pkg/util/wait/wait.go:167 +0x13f
k8s.io/kubernetes/vendor/k8s.io/apimachinery/pkg/util/wait.JitterUntil(0x44acd88, 0x12a05f200, 0x0, 0x1, 0xc0000fa0c0)
/workspace/anago-v1.18.2-beta.0.14+a78cd082e8c913/src/k8s.io/kubernetes/_output/dockerized/go/src/k8s.io/kubernetes/vendor/k8s.io/apimachinery/pkg/util/wait/wait.go:133 +0xe2
k8s.io/kubernetes/vendor/k8s.io/apimachinery/pkg/util/wait.Until(...)
/workspace/anago-v1.18.2-beta.0.14+a78cd082e8c913/src/k8s.io/kubernetes/_output/dockerized/go/src/k8s.io/kubernetes/vendor/k8s.io/apimachinery/pkg/util/wait/wait.go:90
k8s.io/kubernetes/vendor/k8s.io/apimachinery/pkg/util/wait.Forever(0x44acd88, 0x12a05f200)
/workspace/anago-v1.18.2-beta.0.14+a78cd082e8c913/src/k8s.io/kubernetes/_output/dockerized/go/src/k8s.io/kubernetes/vendor/k8s.io/apimachinery/pkg/util/wait/wait.go:81 +0x4f
created by k8s.io/kubernetes/vendor/k8s.io/component-base/logs.InitLogs
/workspace/anago-v1.18.2-beta.0.14+a78cd082e8c913/src/k8s.io/kubernetes/_output/dockerized/go/src/k8s.io/kubernetes/vendor/k8s.io/component-base/logs/logs.go:58 +0x8a
goroutine 161 [chan receive, 55 minutes]:
k8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/server.SetupSignalHandler.func1(0xc0000fa780)
/workspace/anago-v1.18.2-beta.0.14+a78cd082e8c913/src/k8s.io/kubernetes/_output/dockerized/go/src/k8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/server/signal.go:38 +0x36
created by k8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/server.SetupSignalHandler
/workspace/anago-v1.18.2-beta.0.14+a78cd082e8c913/src/k8s.io/kubernetes/_output/dockerized/go/src/k8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/server/signal.go:37 +0xdc
goroutine 194 [chan receive]:
k8s.io/kubernetes/vendor/k8s.io/client-go/util/workqueue.(*Type).updateUnfinishedWorkLoop(0xc0000e36e0)
/workspace/anago-v1.18.2-beta.0.14+a78cd082e8c913/src/k8s.io/kubernetes/_output/dockerized/go/src/k8s.io/kubernetes/vendor/k8s.io/client-go/util/workqueue/queue.go:198 +0xe0
created by k8s.io/kubernetes/vendor/k8s.io/client-go/util/workqueue.newQueue
/workspace/anago-v1.18.2-beta.0.14+a78cd082e8c913/src/k8s.io/kubernetes/_output/dockerized/go/src/k8s.io/kubernetes/vendor/k8s.io/client-go/util/workqueue/queue.go:58 +0x132
goroutine 195 [select]:
k8s.io/kubernetes/vendor/k8s.io/client-go/util/workqueue.(*delayingType).waitingLoop(0xc0000e3800)
/workspace/anago-v1.18.2-beta.0.14+a78cd082e8c913/src/k8s.io/kubernetes/_output/dockerized/go/src/k8s.io/kubernetes/vendor/k8s.io/client-go/util/workqueue/delaying_queue.go:222 +0x405
created by k8s.io/kubernetes/vendor/k8s.io/client-go/util/workqueue.NewDelayingQueueWithCustomClock
/workspace/anago-v1.18.2-beta.0.14+a78cd082e8c913/src/k8s.io/kubernetes/_output/dockerized/go/src/k8s.io/kubernetes/vendor/k8s.io/client-go/util/workqueue/delaying_queue.go:58 +0x1ae
goroutine 207 [syscall, 55 minutes]:
syscall.Syscall6(0xe8, 0x1c, 0xc000f0fbec, 0x7, 0xffffffffffffffff, 0x0, 0x0, 0x0, 0x0, 0x0)
/usr/local/go/src/syscall/asm_linux_amd64.s:44 +0x5
k8s.io/kubernetes/vendor/golang.org/x/sys/unix.EpollWait(0x1c, 0xc000f0fbec, 0x7, 0x7, 0xffffffffffffffff, 0x0, 0x0, 0x0)
/workspace/anago-v1.18.2-beta.0.14+a78cd082e8c913/src/k8s.io/kubernetes/_output/dockerized/go/src/k8s.io/kubernetes/vendor/golang.org/x/sys/unix/zsyscall_linux_amd64.go:1760 +0x72
k8s.io/kubernetes/vendor/github.com/fsnotify/fsnotify.(*fdPoller).wait(0xc000a4cce0, 0x0, 0x0, 0x0)
/workspace/anago-v1.18.2-beta.0.14+a78cd082e8c913/src/k8s.io/kubernetes/_output/dockerized/go/src/k8s.io/kubernetes/vendor/github.com/fsnotify/fsnotify/inotify_poller.go:86 +0x91
k8s.io/kubernetes/vendor/github.com/fsnotify/fsnotify.(*Watcher).readEvents(0xc0002b3b30)
/workspace/anago-v1.18.2-beta.0.14+a78cd082e8c913/src/k8s.io/kubernetes/_output/dockerized/go/src/k8s.io/kubernetes/vendor/github.com/fsnotify/fsnotify/inotify.go:192 +0x1f8
created by k8s.io/kubernetes/vendor/github.com/fsnotify/fsnotify.NewWatcher
/workspace/anago-v1.18.2-beta.0.14+a78cd082e8c913/src/k8s.io/kubernetes/_output/dockerized/go/src/k8s.io/kubernetes/vendor/github.com/fsnotify/fsnotify/inotify.go:59 +0x1a5
goroutine 198 [IO wait]:
internal/poll.runtime_pollWait(0x7fe5b97fbe98, 0x72, 0xffffffffffffffff)
/usr/local/go/src/runtime/netpoll.go:184 +0x55
internal/poll.(*pollDesc).wait(0xc0008cec98, 0x72, 0x1000, 0x1000, 0xffffffffffffffff)
/usr/local/go/src/internal/poll/fd_poll_runtime.go:87 +0x45
internal/poll.(*pollDesc).waitRead(...)
/usr/local/go/src/internal/poll/fd_poll_runtime.go:92
internal/poll.(*FD).Read(0xc0008cec80, 0xc0008f8000, 0x1000, 0x1000, 0x0, 0x0, 0x0)
/usr/local/go/src/internal/poll/fd_unix.go:169 +0x1cf
net.(*netFD).Read(0xc0008cec80, 0xc0008f8000, 0x1000, 0x1000, 0x434cdc, 0xc0000c9b20, 0x45f740)
/usr/local/go/src/net/fd_unix.go:202 +0x4f
net.(*conn).Read(0xc00017b280, 0xc0008f8000, 0x1000, 0x1000, 0x0, 0x0, 0x0)
/usr/local/go/src/net/net.go:184 +0x68
net/http.(*persistConn).Read(0xc0003897a0, 0xc0008f8000, 0x1000, 0x1000, 0xc0001d62a0, 0xc0000c9c20, 0x408475)
/usr/local/go/src/net/http/transport.go:1758 +0x75
bufio.(*Reader).fill(0xc0008f6000)
/usr/local/go/src/bufio/bufio.go:100 +0x103
bufio.(*Reader).Peek(0xc0008f6000, 0x1, 0x0, 0x0, 0x1, 0xc0024df100, 0x0)
/usr/local/go/src/bufio/bufio.go:138 +0x4f
net/http.(*persistConn).readLoop(0xc0003897a0)
/usr/local/go/src/net/http/transport.go:1911 +0x1d6
created by net/http.(*Transport).dialConn
/usr/local/go/src/net/http/transport.go:1580 +0xb0d
goroutine 199 [select]:
net/http.(*persistConn).writeLoop(0xc0003897a0)
/usr/local/go/src/net/http/transport.go:2210 +0x123
created by net/http.(*Transport).dialConn
/usr/local/go/src/net/http/transport.go:1581 +0xb32
goroutine 144 [chan receive]:
k8s.io/kubernetes/vendor/k8s.io/apimachinery/pkg/watch.(*Broadcaster).loop(0xc0007ce200)
/workspace/anago-v1.18.2-beta.0.14+a78cd082e8c913/src/k8s.io/kubernetes/_output/dockerized/go/src/k8s.io/kubernetes/vendor/k8s.io/apimachinery/pkg/watch/mux.go:207 +0x66
created by k8s.io/kubernetes/vendor/k8s.io/apimachinery/pkg/watch.NewBroadcaster
/workspace/anago-v1.18.2-beta.0.14+a78cd082e8c913/src/k8s.io/kubernetes/_output/dockerized/go/src/k8s.io/kubernetes/vendor/k8s.io/apimachinery/pkg/watch/mux.go:75 +0xcc
goroutine 145 [chan receive]:
k8s.io/kubernetes/vendor/k8s.io/client-go/tools/record.(*eventBroadcasterImpl).StartEventWatcher.func1(0x4ac0c80, 0xc0006bf0b0, 0xc0002a7330)
/workspace/anago-v1.18.2-beta.0.14+a78cd082e8c913/src/k8s.io/kubernetes/_output/dockerized/go/src/k8s.io/kubernetes/vendor/k8s.io/client-go/tools/record/event.go:288 +0xc5
created by k8s.io/kubernetes/vendor/k8s.io/client-go/tools/record.(*eventBroadcasterImpl).StartEventWatcher
/workspace/anago-v1.18.2-beta.0.14+a78cd082e8c913/src/k8s.io/kubernetes/_output/dockerized/go/src/k8s.io/kubernetes/vendor/k8s.io/client-go/tools/record/event.go:286 +0x6e
goroutine 210 [chan receive]:
k8s.io/kubernetes/vendor/k8s.io/client-go/tools/record.(*eventBroadcasterImpl).StartEventWatcher.func1(0x4ac0c80, 0xc0006bf260, 0xc0006bf230)
/workspace/anago-v1.18.2-beta.0.14+a78cd082e8c913/src/k8s.io/kubernetes/_output/dockerized/go/src/k8s.io/kubernetes/vendor/k8s.io/client-go/tools/record/event.go:288 +0xc5
created by k8s.io/kubernetes/vendor/k8s.io/client-go/tools/record.(*eventBroadcasterImpl).StartEventWatcher
/workspace/anago-v1.18.2-beta.0.14+a78cd082e8c913/src/k8s.io/kubernetes/_output/dockerized/go/src/k8s.io/kubernetes/vendor/k8s.io/client-go/tools/record/event.go:286 +0x6e
goroutine 224 [IO wait]:
internal/poll.runtime_pollWait(0x7fe5b97fb8e8, 0x72, 0xffffffffffffffff)
/usr/local/go/src/runtime/netpoll.go:184 +0x55
internal/poll.(*pollDesc).wait(0xc0003f2398, 0x72, 0x8000, 0x8000, 0xffffffffffffffff)
/usr/local/go/src/internal/poll/fd_poll_runtime.go:87 +0x45
internal/poll.(*pollDesc).waitRead(...)
/usr/local/go/src/internal/poll/fd_poll_runtime.go:92
internal/poll.(*FD).Read(0xc0003f2380, 0xc0009e0000, 0x8000, 0x8000, 0x0, 0x0, 0x0)
/usr/local/go/src/internal/poll/fd_unix.go:169 +0x1cf
net.(*netFD).Read(0xc0003f2380, 0xc0009e0000, 0x8000, 0x8000, 0x0, 0x800010601, 0x0)
/usr/local/go/src/net/fd_unix.go:202 +0x4f
net.(*conn).Read(0xc0009de000, 0xc0009e0000, 0x8000, 0x8000, 0x0, 0x0, 0x0)
/usr/local/go/src/net/net.go:184 +0x68
bufio.(*Reader).Read(0xc0008de060, 0xc000b2e038, 0x9, 0x9, 0xc001472000, 0x7fe5b98cf460, 0x0)
/usr/local/go/src/bufio/bufio.go:226 +0x26a
io.ReadAtLeast(0x4aa8080, 0xc0008de060, 0xc000b2e038, 0x9, 0x9, 0x9, 0xa63185, 0xc00203fe8c, 0xc000cecde8)
/usr/local/go/src/io/io.go:310 +0x87
io.ReadFull(...)
/usr/local/go/src/io/io.go:329
k8s.io/kubernetes/vendor/golang.org/x/net/http2.readFrameHeader(0xc000b2e038, 0x9, 0x9, 0x4aa8080, 0xc0008de060, 0x0, 0xbfa4652c00000000, 0x305931a7136, 0x701b4a0)
/workspace/anago-v1.18.2-beta.0.14+a78cd082e8c913/src/k8s.io/kubernetes/_output/dockerized/go/src/k8s.io/kubernetes/vendor/golang.org/x/net/http2/frame.go:237 +0x87
k8s.io/kubernetes/vendor/golang.org/x/net/http2.(*Framer).ReadFrame(0xc000b2e000, 0xc00203fe80, 0xc00203fe80, 0x0, 0x0)
/workspace/anago-v1.18.2-beta.0.14+a78cd082e8c913/src/k8s.io/kubernetes/_output/dockerized/go/src/k8s.io/kubernetes/vendor/golang.org/x/net/http2/frame.go:492 +0xa1
k8s.io/kubernetes/vendor/google.golang.org/grpc/internal/transport.(*http2Client).reader(0xc000050c40)
/workspace/anago-v1.18.2-beta.0.14+a78cd082e8c913/src/k8s.io/kubernetes/_output/dockerized/go/src/k8s.io/kubernetes/vendor/google.golang.org/grpc/internal/transport/http2_client.go:1264 +0x183
created by k8s.io/kubernetes/vendor/google.golang.org/grpc/internal/transport.newHTTP2Client
/workspace/anago-v1.18.2-beta.0.14+a78cd082e8c913/src/k8s.io/kubernetes/_output/dockerized/go/src/k8s.io/kubernetes/vendor/google.golang.org/grpc/internal/transport/http2_client.go:300 +0xd23
goroutine 21908 [select, 13 minutes]:
k8s.io/kubernetes/vendor/k8s.io/client-go/tools/cache.(*Reflector).ListAndWatch.func2(0xc001a50820, 0xc000ad1920, 0xc0008c03c0, 0xc0018d74a0)
/workspace/anago-v1.18.2-beta.0.14+a78cd082e8c913/src/k8s.io/kubernetes/_output/dockerized/go/src/k8s.io/kubernetes/vendor/k8s.io/client-go/tools/cache/reflector.go:328 +0x178
created by k8s.io/kubernetes/vendor/k8s.io/client-go/tools/cache.(*Reflector).ListAndWatch
/workspace/anago-v1.18.2-beta.0.14+a78cd082e8c913/src/k8s.io/kubernetes/_output/dockerized/go/src/k8s.io/kubernetes/vendor/k8s.io/client-go/tools/cache/reflector.go:322 +0x28f
goroutine 220 [IO wait, 55 minutes]:
internal/poll.runtime_pollWait(0x7fe5b97fbc28, 0x72, 0x0)
/usr/local/go/src/runtime/netpoll.go:184 +0x55
internal/poll.(*pollDesc).wait(0xc000aef398, 0x72, 0x0, 0x0, 0x4311d43)
/usr/local/go/src/internal/poll/fd_poll_runtime.go:87 +0x45
internal/poll.(*pollDesc).waitRead(...)
/usr/local/go/src/internal/poll/fd_poll_runtime.go:92
internal/poll.(*FD).Accept(0xc000aef380, 0x0, 0x0, 0x0, 0x0, 0x0, 0x0, 0x0)
/usr/local/go/src/internal/poll/fd_unix.go:384 +0x1f8
net.(*netFD).accept(0xc000aef380, 0xc000a46008, 0x0, 0x0)
/usr/local/go/src/net/fd_unix.go:238 +0x42
net.(*UnixListener).accept(0xc000be6720, 0xc000cf1e00, 0xc000cf1e08, 0x18)
/usr/local/go/src/net/unixsock_posix.go:162 +0x32
net.(*UnixListener).Accept(0xc000be6720, 0x44abe70, 0xc000af4c00, 0x4b7d700, 0xc000a46008)
/usr/local/go/src/net/unixsock.go:260 +0x47
k8s.io/kubernetes/vendor/google.golang.org/grpc.(*Server).Serve(0xc000af4c00, 0x4b1d900, 0xc000be6720, 0x0, 0x0)
/workspace/anago-v1.18.2-beta.0.14+a78cd082e8c913/src/k8s.io/kubernetes/_output/dockerized/go/src/k8s.io/kubernetes/vendor/google.golang.org/grpc/server.go:597 +0x22e
k8s.io/kubernetes/pkg/kubelet/dockershim/remote.(*DockerServer).Start.func1(0xc000be66c0, 0x4b1d900, 0xc000be6720)
/workspace/anago-v1.18.2-beta.0.14+a78cd082e8c913/src/k8s.io/kubernetes/_output/dockerized/go/src/k8s.io/kubernetes/pkg/kubelet/dockershim/remote/docker_server.go:72 +0x47
created by k8s.io/kubernetes/pkg/kubelet/dockershim/remote.(*DockerServer).Start
/workspace/anago-v1.18.2-beta.0.14+a78cd082e8c913/src/k8s.io/kubernetes/_output/dockerized/go/src/k8s.io/kubernetes/pkg/kubelet/dockershim/remote/docker_server.go:71 +0x343
goroutine 219 [select]:
k8s.io/kubernetes/vendor/k8s.io/apimachinery/pkg/util/wait.BackoffUntil(0xc000bde840, 0x4aadcc0, 0xc000c42000, 0xc000480001, 0xc0000fa0c0)
/workspace/anago-v1.18.2-beta.0.14+a78cd082e8c913/src/k8s.io/kubernetes/_output/dockerized/go/src/k8s.io/kubernetes/vendor/k8s.io/apimachinery/pkg/util/wait/wait.go:167 +0x13f
k8s.io/kubernetes/vendor/k8s.io/apimachinery/pkg/util/wait.JitterUntil(0xc000bde840, 0x45d964b800, 0x0, 0xc00049cf01, 0xc0000fa0c0)
/workspace/anago-v1.18.2-beta.0.14+a78cd082e8c913/src/k8s.io/kubernetes/_output/dockerized/go/src/k8s.io/kubernetes/vendor/k8s.io/apimachinery/pkg/util/wait/wait.go:133 +0xe2
k8s.io/kubernetes/vendor/k8s.io/apimachinery/pkg/util/wait.Until(0xc000bde840, 0x45d964b800, 0xc0000fa0c0)
/workspace/anago-v1.18.2-beta.0.14+a78cd082e8c913/src/k8s.io/kubernetes/_output/dockerized/go/src/k8s.io/kubernetes/vendor/k8s.io/apimachinery/pkg/util/wait/wait.go:90 +0x4d
created by k8s.io/kubernetes/pkg/kubelet/dockershim/cm.(*containerManager).Start
/workspace/anago-v1.18.2-beta.0.14+a78cd082e8c913/src/k8s.io/kubernetes/_output/dockerized/go/src/k8s.io/kubernetes/pkg/kubelet/dockershim/cm/container_manager_linux.go:83 +0xa0
goroutine 200 [syscall, 55 minutes]:
syscall.Syscall6(0xe8, 0x12, 0xc000d31bec, 0x7, 0xffffffffffffffff, 0x0, 0x0, 0x0, 0x0, 0x0)
/usr/local/go/src/syscall/asm_linux_amd64.s:44 +0x5
k8s.io/kubernetes/vendor/golang.org/x/sys/unix.EpollWait(0x12, 0xc000d31bec, 0x7, 0x7, 0xffffffffffffffff, 0x0, 0x0, 0x0)
/workspace/anago-v1.18.2-beta.0.14+a78cd082e8c913/src/k8s.io/kubernetes/_output/dockerized/go/src/k8s.io/kubernetes/vendor/golang.org/x/sys/unix/zsyscall_linux_amd64.go:1760 +0x72
k8s.io/kubernetes/vendor/github.com/fsnotify/fsnotify.(*fdPoller).wait(0xc000a4c020, 0x0, 0x0, 0x0)
/workspace/anago-v1.18.2-beta.0.14+a78cd082e8c913/src/k8s.io/kubernetes/_output/dockerized/go/src/k8s.io/kubernetes/vendor/github.com/fsnotify/fsnotify/inotify_poller.go:86 +0x91
k8s.io/kubernetes/vendor/github.com/fsnotify/fsnotify.(*Watcher).readEvents(0xc0002b2050)
/workspace/anago-v1.18.2-beta.0.14+a78cd082e8c913/src/k8s.io/kubernetes/_output/dockerized/go/src/k8s.io/kubernetes/vendor/github.com/fsnotify/fsnotify/inotify.go:192 +0x1f8
created by k8s.io/kubernetes/vendor/github.com/fsnotify/fsnotify.NewWatcher
/workspace/anago-v1.18.2-beta.0.14+a78cd082e8c913/src/k8s.io/kubernetes/_output/dockerized/go/src/k8s.io/kubernetes/vendor/github.com/fsnotify/fsnotify/inotify.go:59 +0x1a5
goroutine 181 [select]:
k8s.io/kubernetes/vendor/k8s.io/apimachinery/pkg/util/wait.BackoffUntil(0xc0006a6560, 0x4aadcc0, 0xc000be6120, 0xc000159301, 0xc0000fa0c0)
/workspace/anago-v1.18.2-beta.0.14+a78cd082e8c913/src/k8s.io/kubernetes/_output/dockerized/go/src/k8s.io/kubernetes/vendor/k8s.io/apimachinery/pkg/util/wait/wait.go:167 +0x13f
k8s.io/kubernetes/vendor/k8s.io/apimachinery/pkg/util/wait.JitterUntil(0xc0006a6560, 0x12a05f200, 0x0, 0x1, 0xc0000fa0c0)
/workspace/anago-v1.18.2-beta.0.14+a78cd082e8c913/src/k8s.io/kubernetes/_output/dockerized/go/src/k8s.io/kubernetes/vendor/k8s.io/apimachinery/pkg/util/wait/wait.go:133 +0xe2
k8s.io/kubernetes/vendor/k8s.io/apimachinery/pkg/util/wait.Until(...)
/workspace/anago-v1.18.2-beta.0.14+a78cd082e8c913/src/k8s.io/kubernetes/_output/dockerized/go/src/k8s.io/kubernetes/vendor/k8s.io/apimachinery/pkg/util/wait/wait.go:90
k8s.io/kubernetes/vendor/k8s.io/apimachinery/pkg/util/wait.Forever(0xc0006a6560, 0x12a05f200)
/workspace/anago-v1.18.2-beta.0.14+a78cd082e8c913/src/k8s.io/kubernetes/_output/dockerized/go/src/k8s.io/kubernetes/vendor/k8s.io/apimachinery/pkg/util/wait/wait.go:81 +0x4f
created by k8s.io/kubernetes/pkg/kubelet/dockershim/network/cni.(*cniNetworkPlugin).Init
/workspace/anago-v1.18.2-beta.0.14+a78cd082e8c913/src/k8s.io/kubernetes/_output/dockerized/go/src/k8s.io/kubernetes/pkg/kubelet/dockershim/network/cni/cni.go:229 +0x112
goroutine 183 [select]:
k8s.io/kubernetes/vendor/google.golang.org/grpc/internal/transport.(*controlBuffer).get(0xc000aac050, 0x1, 0x0, 0x0, 0x0, 0x0)
/workspace/anago-v1.18.2-beta.0.14+a78cd082e8c913/src/k8s.io/kubernetes/_output/dockerized/go/src/k8s.io/kubernetes/vendor/google.golang.org/grpc/internal/transport/controlbuf.go:395 +0x122
k8s.io/kubernetes/vendor/google.golang.org/grpc/internal/transport.(*loopyWriter).run(0xc0008ec240, 0x0, 0x0)
/workspace/anago-v1.18.2-beta.0.14+a78cd082e8c913/src/k8s.io/kubernetes/_output/dockerized/go/src/k8s.io/kubernetes/vendor/google.golang.org/grpc/internal/transport/controlbuf.go:513 +0x1e3
k8s.io/kubernetes/vendor/google.golang.org/grpc/internal/transport.newHTTP2Server.func2(0xc000ac2000)
/workspace/anago-v1.18.2-beta.0.14+a78cd082e8c913/src/k8s.io/kubernetes/_output/dockerized/go/src/k8s.io/kubernetes/vendor/google.golang.org/grpc/internal/transport/http2_server.go:296 +0xcb
created by k8s.io/kubernetes/vendor/google.golang.org/grpc/internal/transport.newHTTP2Server
/workspace/anago-v1.18.2-beta.0.14+a78cd082e8c913/src/k8s.io/kubernetes/_output/dockerized/go/src/k8s.io/kubernetes/vendor/google.golang.org/grpc/internal/transport/http2_server.go:293 +0x105d
goroutine 22195 [select]:
k8s.io/kubernetes/vendor/github.com/google/cadvisor/container/common.(*realFsHandler).trackUsage(0xc000958090)
/workspace/anago-v1.18.2-beta.0.14+a78cd082e8c913/src/k8s.io/kubernetes/_output/dockerized/go/src/k8s.io/kubernetes/vendor/github.com/google/cadvisor/container/common/fsHandler.go:112 +0xf9
created by k8s.io/kubernetes/vendor/github.com/google/cadvisor/container/common.(*realFsHandler).Start
/workspace/anago-v1.18.2-beta.0.14+a78cd082e8c913/src/k8s.io/kubernetes/_output/dockerized/go/src/k8s.io/kubernetes/vendor/github.com/google/cadvisor/container/common/fsHandler.go:139 +0x3f
goroutine 227 [select, 55 minutes]:
k8s.io/kubernetes/vendor/google.golang.org/grpc.(*ccBalancerWrapper).watcher(0xc000cb4400)
/workspace/anago-v1.18.2-beta.0.14+a78cd082e8c913/src/k8s.io/kubernetes/_output/dockerized/go/src/k8s.io/kubernetes/vendor/google.golang.org/grpc/balancer_conn_wrappers.go:69 +0xc2
created by k8s.io/kubernetes/vendor/google.golang.org/grpc.newCCBalancerWrapper
/workspace/anago-v1.18.2-beta.0.14+a78cd082e8c913/src/k8s.io/kubernetes/_output/dockerized/go/src/k8s.io/kubernetes/vendor/google.golang.org/grpc/balancer_conn_wrappers.go:60 +0x16d
goroutine 228 [chan receive, 55 minutes]:
k8s.io/kubernetes/vendor/google.golang.org/grpc.(*addrConn).resetTransport(0xc0001fcdc0)
/workspace/anago-v1.18.2-beta.0.14+a78cd082e8c913/src/k8s.io/kubernetes/_output/dockerized/go/src/k8s.io/kubernetes/vendor/google.golang.org/grpc/clientconn.go:1164 +0x6ea
created by k8s.io/kubernetes/vendor/google.golang.org/grpc.(*addrConn).connect
/workspace/anago-v1.18.2-beta.0.14+a78cd082e8c913/src/k8s.io/kubernetes/_output/dockerized/go/src/k8s.io/kubernetes/vendor/google.golang.org/grpc/clientconn.go:800 +0x128
goroutine 229 [select, 55 minutes]:
k8s.io/kubernetes/vendor/google.golang.org/grpc.(*ccBalancerWrapper).watcher(0xc000cb4580)
/workspace/anago-v1.18.2-beta.0.14+a78cd082e8c913/src/k8s.io/kubernetes/_output/dockerized/go/src/k8s.io/kubernetes/vendor/google.golang.org/grpc/balancer_conn_wrappers.go:69 +0xc2
created by k8s.io/kubernetes/vendor/google.golang.org/grpc.newCCBalancerWrapper
/workspace/anago-v1.18.2-beta.0.14+a78cd082e8c913/src/k8s.io/kubernetes/_output/dockerized/go/src/k8s.io/kubernetes/vendor/google.golang.org/grpc/balancer_conn_wrappers.go:60 +0x16d
goroutine 230 [chan receive, 55 minutes]:
k8s.io/kubernetes/vendor/google.golang.org/grpc.(*addrConn).resetTransport(0xc0001fd080)
/workspace/anago-v1.18.2-beta.0.14+a78cd082e8c913/src/k8s.io/kubernetes/_output/dockerized/go/src/k8s.io/kubernetes/vendor/google.golang.org/grpc/clientconn.go:1164 +0x6ea
created by k8s.io/kubernetes/vendor/google.golang.org/grpc.(*addrConn).connect
/workspace/anago-v1.18.2-beta.0.14+a78cd082e8c913/src/k8s.io/kubernetes/_output/dockerized/go/src/k8s.io/kubernetes/vendor/google.golang.org/grpc/clientconn.go:800 +0x128
goroutine 231 [chan receive]:
k8s.io/kubernetes/pkg/util/config.(*Mux).listen(0xc0006bf7a0, 0x430d4e2, 0x4, 0xc0001d73e0)
/workspace/anago-v1.18.2-beta.0.14+a78cd082e8c913/src/k8s.io/kubernetes/_output/dockerized/go/src/k8s.io/kubernetes/pkg/util/config/config.go:82 +0x89
k8s.io/kubernetes/pkg/util/config.(*Mux).Channel.func1()
/workspace/anago-v1.18.2-beta.0.14+a78cd082e8c913/src/k8s.io/kubernetes/_output/dockerized/go/src/k8s.io/kubernetes/pkg/util/config/config.go:77 +0x45
k8s.io/kubernetes/vendor/k8s.io/apimachinery/pkg/util/wait.BackoffUntil.func1(0xc0006bf860)
/workspace/anago-v1.18.2-beta.0.14+a78cd082e8c913/src/k8s.io/kubernetes/_output/dockerized/go/src/k8s.io/kubernetes/vendor/k8s.io/apimachinery/pkg/util/wait/wait.go:155 +0x5e
k8s.io/kubernetes/vendor/k8s.io/apimachinery/pkg/util/wait.BackoffUntil(0xc0006bf860, 0x4aadcc0, 0xc000856000, 0xc00093e001, 0xc0000fa0c0)
/workspace/anago-v1.18.2-beta.0.14+a78cd082e8c913/src/k8s.io/kubernetes/_output/dockerized/go/src/k8s.io/kubernetes/vendor/k8s.io/apimachinery/pkg/util/wait/wait.go:156 +0xa3
k8s.io/kubernetes/vendor/k8s.io/apimachinery/pkg/util/wait.JitterUntil(0xc0006bf860, 0x0, 0x0, 0xc00049cf01, 0xc0000fa0c0)
/workspace/anago-v1.18.2-beta.0.14+a78cd082e8c913/src/k8s.io/kubernetes/_output/dockerized/go/src/k8s.io/kubernetes/vendor/k8s.io/apimachinery/pkg/util/wait/wait.go:133 +0xe2
k8s.io/kubernetes/vendor/k8s.io/apimachinery/pkg/util/wait.Until(0xc0006bf860, 0x0, 0xc0000fa0c0)
/workspace/anago-v1.18.2-beta.0.14+a78cd082e8c913/src/k8s.io/kubernetes/_output/dockerized/go/src/k8s.io/kubernetes/vendor/k8s.io/apimachinery/pkg/util/wait/wait.go:90 +0x4d
created by k8s.io/kubernetes/pkg/util/config.(*Mux).Channel
/workspace/anago-v1.18.2-beta.0.14+a78cd082e8c913/src/k8s.io/kubernetes/_output/dockerized/go/src/k8s.io/kubernetes/pkg/util/config/config.go:77 +0x1ea
goroutine 232 [select]:
k8s.io/kubernetes/pkg/kubelet/config.(*sourceFile).run.func1(0xc0004c6690, 0xc0004c66e0)
/workspace/anago-v1.18.2-beta.0.14+a78cd082e8c913/src/k8s.io/kubernetes/_output/dockerized/go/src/k8s.io/kubernetes/pkg/kubelet/config/file.go:101 +0xf6
created by k8s.io/kubernetes/pkg/kubelet/config.(*sourceFile).run
/workspace/anago-v1.18.2-beta.0.14+a78cd082e8c913/src/k8s.io/kubernetes/_output/dockerized/go/src/k8s.io/kubernetes/pkg/kubelet/config/file.go:95 +0x5b
goroutine 233 [select, 55 minutes]:
k8s.io/kubernetes/pkg/kubelet/config.(*sourceFile).doWatch(0xc0004c6690, 0x0, 0x0)
/workspace/anago-v1.18.2-beta.0.14+a78cd082e8c913/src/k8s.io/kubernetes/_output/dockerized/go/src/k8s.io/kubernetes/pkg/kubelet/config/file_linux.go:91 +0x3f8
k8s.io/kubernetes/pkg/kubelet/config.(*sourceFile).startWatch.func1()
/workspace/anago-v1.18.2-beta.0.14+a78cd082e8c913/src/k8s.io/kubernetes/_output/dockerized/go/src/k8s.io/kubernetes/pkg/kubelet/config/file_linux.go:59 +0xb0
k8s.io/kubernetes/vendor/k8s.io/apimachinery/pkg/util/wait.BackoffUntil.func1(0xc0006bf9e0)
/workspace/anago-v1.18.2-beta.0.14+a78cd082e8c913/src/k8s.io/kubernetes/_output/dockerized/go/src/k8s.io/kubernetes/vendor/k8s.io/apimachinery/pkg/util/wait/wait.go:155 +0x5e
k8s.io/kubernetes/vendor/k8s.io/apimachinery/pkg/util/wait.BackoffUntil(0xc0006bf9e0, 0x4aadcc0, 0xc000792030, 0xc00017a201, 0xc0000fa0c0)
/workspace/anago-v1.18.2-beta.0.14+a78cd082e8c913/src/k8s.io/kubernetes/_output/dockerized/go/src/k8s.io/kubernetes/vendor/k8s.io/apimachinery/pkg/util/wait/wait.go:156 +0xa3
k8s.io/kubernetes/vendor/k8s.io/apimachinery/pkg/util/wait.JitterUntil(0xc0006bf9e0, 0x3b9aca00, 0x0, 0x1, 0xc0000fa0c0)
/workspace/anago-v1.18.2-beta.0.14+a78cd082e8c913/src/k8s.io/kubernetes/_output/dockerized/go/src/k8s.io/kubernetes/vendor/k8s.io/apimachinery/pkg/util/wait/wait.go:133 +0xe2
k8s.io/kubernetes/vendor/k8s.io/apimachinery/pkg/util/wait.Until(...)
/workspace/anago-v1.18.2-beta.0.14+a78cd082e8c913/src/k8s.io/kubernetes/_output/dockerized/go/src/k8s.io/kubernetes/vendor/k8s.io/apimachinery/pkg/util/wait/wait.go:90
k8s.io/kubernetes/vendor/k8s.io/apimachinery/pkg/util/wait.Forever(0xc0006bf9e0, 0x3b9aca00)
/workspace/anago-v1.18.2-beta.0.14+a78cd082e8c913/src/k8s.io/kubernetes/_output/dockerized/go/src/k8s.io/kubernetes/vendor/k8s.io/apimachinery/pkg/util/wait/wait.go:81 +0x4f
created by k8s.io/kubernetes/pkg/kubelet/config.(*sourceFile).startWatch
/workspace/anago-v1.18.2-beta.0.14+a78cd082e8c913/src/k8s.io/kubernetes/_output/dockerized/go/src/k8s.io/kubernetes/pkg/kubelet/config/file_linux.go:54 +0x113
goroutine 234 [chan receive]:
k8s.io/kubernetes/pkg/util/config.(*Mux).listen(0xc0006bf7a0, 0x430c871, 0x3, 0xc0001d7440)
/workspace/anago-v1.18.2-beta.0.14+a78cd082e8c913/src/k8s.io/kubernetes/_output/dockerized/go/src/k8s.io/kubernetes/pkg/util/config/config.go:82 +0x89
k8s.io/kubernetes/pkg/util/config.(*Mux).Channel.func1()
/workspace/anago-v1.18.2-beta.0.14+a78cd082e8c913/src/k8s.io/kubernetes/_output/dockerized/go/src/k8s.io/kubernetes/pkg/util/config/config.go:77 +0x45
k8s.io/kubernetes/vendor/k8s.io/apimachinery/pkg/util/wait.BackoffUntil.func1(0xc0006bfa10)
/workspace/anago-v1.18.2-beta.0.14+a78cd082e8c913/src/k8s.io/kubernetes/_output/dockerized/go/src/k8s.io/kubernetes/vendor/k8s.io/apimachinery/pkg/util/wait/wait.go:155 +0x5e
k8s.io/kubernetes/vendor/k8s.io/apimachinery/pkg/util/wait.BackoffUntil(0xc0006bfa10, 0x4aadcc0, 0xc000846000, 0xc000159201, 0xc0000fa0c0)
/workspace/anago-v1.18.2-beta.0.14+a78cd082e8c913/src/k8s.io/kubernetes/_output/dockerized/go/src/k8s.io/kubernetes/vendor/k8s.io/apimachinery/pkg/util/wait/wait.go:156 +0xa3
k8s.io/kubernetes/vendor/k8s.io/apimachinery/pkg/util/wait.JitterUntil(0xc0006bfa10, 0x0, 0x0, 0xc0000b5f01, 0xc0000fa0c0)
/workspace/anago-v1.18.2-beta.0.14+a78cd082e8c913/src/k8s.io/kubernetes/_output/dockerized/go/src/k8s.io/kubernetes/vendor/k8s.io/apimachinery/pkg/util/wait/wait.go:133 +0xe2
k8s.io/kubernetes/vendor/k8s.io/apimachinery/pkg/util/wait.Until(0xc0006bfa10, 0x0, 0xc0000fa0c0)
/workspace/anago-v1.18.2-beta.0.14+a78cd082e8c913/src/k8s.io/kubernetes/_output/dockerized/go/src/k8s.io/kubernetes/vendor/k8s.io/apimachinery/pkg/util/wait/wait.go:90 +0x4d
created by k8s.io/kubernetes/pkg/util/config.(*Mux).Channel
/workspace/anago-v1.18.2-beta.0.14+a78cd082e8c913/src/k8s.io/kubernetes/_output/dockerized/go/src/k8s.io/kubernetes/pkg/util/config/config.go:77 +0x1ea
goroutine 235 [select]:
k8s.io/kubernetes/vendor/k8s.io/client-go/tools/cache.(*Reflector).watchHandler(0xc00089b5f0, 0xbfa46514f559e302, 0x2f008a85a2a, 0x701b4a0, 0x4ac0c40, 0xc000e19d00, 0xc000b45b98, 0xc000c26ba0, 0xc0000fa0c0, 0x0, ...)
/workspace/anago-v1.18.2-beta.0.14+a78cd082e8c913/src/k8s.io/kubernetes/_output/dockerized/go/src/k8s.io/kubernetes/vendor/k8s.io/client-go/tools/cache/reflector.go:431 +0x1ab
k8s.io/kubernetes/vendor/k8s.io/client-go/tools/cache.(*Reflector).ListAndWatch(0xc00089b5f0, 0xc0000fa0c0, 0x0, 0x0)
/workspace/anago-v1.18.2-beta.0.14+a78cd082e8c913/src/k8s.io/kubernetes/_output/dockerized/go/src/k8s.io/kubernetes/vendor/k8s.io/client-go/tools/cache/reflector.go:395 +0xaeb
k8s.io/kubernetes/vendor/k8s.io/client-go/tools/cache.(*Reflector).Run.func1()
/workspace/anago-v1.18.2-beta.0.14+a78cd082e8c913/src/k8s.io/kubernetes/_output/dockerized/go/src/k8s.io/kubernetes/vendor/k8s.io/client-go/tools/cache/reflector.go:177 +0x33
k8s.io/kubernetes/vendor/k8s.io/apimachinery/pkg/util/wait.BackoffUntil.func1(0xc000c2df48)
/workspace/anago-v1.18.2-beta.0.14+a78cd082e8c913/src/k8s.io/kubernetes/_output/dockerized/go/src/k8s.io/kubernetes/vendor/k8s.io/apimachinery/pkg/util/wait/wait.go:155 +0x5e
k8s.io/kubernetes/vendor/k8s.io/apimachinery/pkg/util/wait.BackoffUntil(0xc000b45f48, 0x4aadca0, 0xc0004c6780, 0x1, 0xc0000fa0c0)
/workspace/anago-v1.18.2-beta.0.14+a78cd082e8c913/src/k8s.io/kubernetes/_output/dockerized/go/src/k8s.io/kubernetes/vendor/k8s.io/apimachinery/pkg/util/wait/wait.go:156 +0xa3
k8s.io/kubernetes/vendor/k8s.io/client-go/tools/cache.(*Reflector).Run(0xc00089b5f0, 0xc0000fa0c0)
/workspace/anago-v1.18.2-beta.0.14+a78cd082e8c913/src/k8s.io/kubernetes/_output/dockerized/go/src/k8s.io/kubernetes/vendor/k8s.io/client-go/tools/cache/reflector.go:176 +0x17e
created by k8s.io/kubernetes/pkg/kubelet/config.newSourceApiserverFromLW
/workspace/anago-v1.18.2-beta.0.14+a78cd082e8c913/src/k8s.io/kubernetes/_output/dockerized/go/src/k8s.io/kubernetes/pkg/kubelet/config/apiserver.go:47 +0x1df
goroutine 374 [chan receive, 55 minutes]:
k8s.io/kubernetes/vendor/k8s.io/client-go/tools/cache.(*sharedProcessor).run(0xc0007d1650, 0xc000dfc900)
/workspace/anago-v1.18.2-beta.0.14+a78cd082e8c913/src/k8s.io/kubernetes/_output/dockerized/go/src/k8s.io/kubernetes/vendor/k8s.io/