I am trying to run an Ansible playbook to deploy a Kubernetes cluster with Kubespray on Ubuntu 16.04. I have one base machine with Ansible installed and the Kubespray git repository cloned, and the cluster consists of one master and two worker nodes.
My hosts file (updated) looks like the following:
[all]
MILDEVKUB020 ansible_ssh_host=MILDEVKUB020 ip=192.168.16.173 ansible_user=uName ansible_ssh_pass=pwd
MILDEVKUB030 ansible_ssh_host=MILDEVKUB030 ip=192.168.16.176 ansible_user=uName ansible_ssh_pass=pwd
MILDEVKUB040 ansible_ssh_host=MILDEVKUB040 ip=192.168.16.177 ansible_user=uName ansible_ssh_pass=pwd
[kube-master]
MILDEVKUB020
[etcd]
MILDEVKUB020
[kube-node]
MILDEVKUB020
MILDEVKUB030
MILDEVKUB040
[k8s-cluster:children]
kube-master
kube-node
The hosts.ini file is located at inventory/sample, and I am running the following Ansible command:
sudo ansible-playbook -i inventory/sample/hosts.ini cluster.yml --user=uName --extra-vars "ansible_sudo_pass=pwd"
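Before running the full playbook, it can help to confirm that Ansible can actually reach every host. A minimal connectivity check with the ping module, assuming the same inventory (password-based SSH auth also requires sshpass on the base machine):

# verify SSH connectivity to all inventory hosts
ansible all -i inventory/sample/hosts.ini -m ping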
I am using the cluster.yml playbook from the following link:
https://github.com/kubernetes-sigs/kubespray/blob/master/cluster.yml
And my /etc/hosts file contains these entries:
127.0.0.1 MILDEVDCR01.Milletech.us MILDEVDCR01
192.168.16.173 MILDEVKUB020.Milletech.us MILDEVKUB020
192.168.16.176 MILDEVKUB030.Milletech.us MILDEVKUB030
192.168.16.177 MILDEVKUB040.Milletech.us MILDEVKUB040
Updated error
TASK [adduser : User | Create User Group]
Thursday 04 April 2019 11:34:55 -0400 (0:00:00.508) 0:00:33.383 ********
fatal: [MILDEVKUB040]: FAILED! => {"changed": false, "msg": "groupadd: Permission denied.\ngroupadd: cannot lock /etc/group; try again later.\n", "name": "kube-cert"}
fatal: [MILDEVKUB020]: FAILED! => {"changed": false, "msg": "groupadd: Permission denied.\ngroupadd: cannot lock /etc/group; try again later.\n", "name": "kube-cert"}
fatal: [MILDEVKUB030]: FAILED! => {"changed": false, "msg": "groupadd: Permission denied.\ngroupadd: cannot lock /etc/group; try again later.\n", "name": "kube-cert"}
I am getting this error even though I am able to connect to all the machines from the base machine over SSH. Can anyone help me trace what is going wrong when I run this command to deploy the Kubernetes cluster?
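For reference, privilege escalation can be checked from the base machine with an ad-hoc whoami, as a sketch against the same inventory; each host should report root if become is working:

# should print 'root' for every host when privilege escalation succeeds
ansible all -i inventory/sample/hosts.ini -m command -a "whoami" --become --ask-become-pass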
If you removed the passphrase, the SSH connection should be fine now. Have you updated the SSH keys on the remote hosts after your changes?
After a lot of research, I found that the parameters --ask-pass --become --ask-become-pass need to be passed when running the playbook. I tried the following command:
sudo ansible-playbook -i inventory/sample/hosts.ini cluster.yml --user=docker --ask-pass --become --ask-become-pass
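Rather than passing the become flags on every run, the same effect can be configured in the inventory itself. A minimal sketch using the standard Ansible variables, added to hosts.ini:

[all:vars]
# escalate to root for all hosts (equivalent to --become)
ansible_become=true
ansible_become_user=root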
When the deployment continued, it hit another problem: the inventory hostnames must be all lowercase. So I changed the hostnames to lowercase in /etc/hostname and /etc/hosts, and used lowercase names in the inventory file as well. Now it is working successfully.
The /etc/hosts file now contains the following:
127.0.0.1 MILDEVDCR01.Milletech.us mildevdcr01
192.168.16.173 MILDEVKUB020.Milletech.us mildevkub020
192.168.16.176 MILDEVKUB030.Milletech.us mildevkub030
192.168.16.177 MILDEVKUB040.Milletech.us mildevkub040
And /etc/hostname contains:
mildevdcr01
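For reference, one way to switch a node's hostname to lowercase persistently is systemd's hostnamectl, which is available on Ubuntu 16.04; mildevkub020 below is just the example for the first node, so run the matching name on each machine:

# set the lowercase hostname on this node
sudo hostnamectl set-hostname mildevkub020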
And the hosts.ini file looks like the following:
[all]
mildevkub020 ansible_ssh_host=mildevkub020 ip=192.168.16.173 ansible_user=uName ansible_ssh_pass=pwd
mildevkub030 ansible_ssh_host=mildevkub030 ip=192.168.16.176 ansible_user=uName ansible_ssh_pass=pwd
mildevkub040 ansible_ssh_host=mildevkub040 ip=192.168.16.177 ansible_user=uName ansible_ssh_pass=pwd
[kube-master]
mildevkub020
[etcd]
mildevkub020
[kube-node]
mildevkub020
mildevkub030
mildevkub040
[k8s-cluster:children]
kube-master
kube-node
With this setup, the Kubernetes cluster is deployed successfully on the destination host machines.
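Before rerunning the playbook, the parsed inventory can also be double-checked with ansible-inventory (available in Ansible 2.4 and later):

# print the parsed inventory, including groups, as JSON
ansible-inventory -i inventory/sample/hosts.ini --list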
If you are using a user/password combination to log in, the user Ansible executes as must be present in the sudoers file so it can switch to root or another privileged user.
Check the sudoers file and try to manually run sudo su root on the target server.
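For example, a sudoers entry granting the Ansible user passwordless sudo could look like the following sketch; uName is a placeholder, and the file should be edited with visudo:

# create with: sudo visudo -f /etc/sudoers.d/uName
uName ALL=(ALL) NOPASSWD:ALL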
You may need to specify the SSH user or password:
ansible_ssh_user=<USERNAME>
ansible_ssh_pass=<PASSWORD>
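For example, these variables can be set per host directly on the inventory line; the hostname and credentials below are placeholders:

mildevkub020 ansible_ssh_user=uName ansible_ssh_pass=pwd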
If that does not help, share the SSH command that is working.