flanneld not functioning with base kubernetes install | centos | virtualbox

1/4/2018

Thanks for the read.

I'd like to preface this with the fact that I am weak on dev-ops! :)

First off, my environment: a 3-node cluster, each node a CentOS 7 VM running in VirtualBox 5.1.3. I have an additional network adapter enabled, running an internal network visible only to the VMs.

I have configured the VMs to use the additional adapter and am able to ping all nodes via its network.

flannel network config (stored in etcd):

{ "Network": "172.30.0.0/24", "Backend": { "Type": "vxlan" } }
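For context, flanneld reads this JSON from etcd, by default under the key `/coreos.com/network/config` when using the etcd v2 API; the key path below is that default, so adjust it if flanneld was started with a different --etcd-prefix. A minimal sketch of validating and pushing it:

```shell
# Flannel network config from above; the etcd key used in the comment below is
# flanneld's default (--etcd-prefix), so adjust if yours differs.
FLANNEL_CONFIG='{ "Network": "172.30.0.0/24", "Backend": { "Type": "vxlan" } }'

# Sanity-check the JSON locally before pushing it anywhere.
echo "$FLANNEL_CONFIG" | python3 -m json.tool > /dev/null && echo "config JSON OK"

# Push it into etcd (etcd v2 syntax; commented out since it needs a live cluster):
# etcdctl set /coreos.com/network/config "$FLANNEL_CONFIG"
```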

nmcli output:

enp0s3: connected to enp0s3
    "Intel 82540EM Gigabit Ethernet Controller (PRO/1000 MT Desktop Adapter)"
    ethernet (e1000), 08:00:27:DD:CB:CA, hw, mtu 1500
    ip4 default
    inet4 10.0.2.15/24
    inet6 fe80::49e4:5aa5:65c8:6e48/64

enp0s8: connected to enp0s8
    "Intel 82540EM Gigabit Ethernet Controller (PRO/1000 MT Desktop Adapter)"
    ethernet (e1000), 08:00:27:40:11:AE, hw, mtu 1500
    inet4 192.168.0.10/24
    inet6 fe80::90cc:f3b4:7e73:cf3f/64

I am working through the installation/configuration instructions in the following documentation.

I wanted to run through this full installation to grasp the overall topology of the product.

Installation seems to go fine and I am able to deploy pods to the cluster. The issue is that flanneld does not seem to be working. I have it explicitly bound to the internal network adapter (enp0s8) in its config; however, nmcli shows:

flannel.1: disconnected
    "flannel.1"
    vxlan, 0A:91:01:BC:7D:7D, sw, mtu 1450
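For reference, this is how I bound it to the adapter: on CentOS the packaged flanneld reads /etc/sysconfig/flanneld, and the interface is selected with --iface. Treat the values below as an example of my setup, not a prescription; verify the variable names against your own install.

```
# /etc/sysconfig/flanneld -- variable names from the CentOS flannel package
FLANNEL_ETCD_ENDPOINTS="http://192.168.0.10:2379"
FLANNEL_ETCD_PREFIX="/coreos.com/network"
FLANNEL_OPTIONS="--iface=enp0s8"
```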

Any insight on this issue would be appreciated. I have done due diligence before reaching out to the community!
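A few diagnostics that may help narrow it down (the subnet.env path and VXLAN port 8472 are flannel defaults; adjust for your install):

```shell
# Inspect the VXLAN device flanneld should have configured.
echo "== flannel.1 link details =="
ip -d link show flannel.1 2>/dev/null || echo "flannel.1 not present"

# flanneld writes its subnet lease here once it has registered with etcd.
echo "== flannel subnet lease =="
cat /run/flannel/subnet.env 2>/dev/null || echo "no subnet.env yet"

# flannel's VXLAN backend uses UDP port 8472 by default.
echo "== listeners on 8472/udp =="
ss -lun 2>/dev/null | grep 8472 || echo "nothing on 8472/udp"
```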

Thanks in advance.

Braden

centos
flannel
kubernetes
linux

0 Answers