Jenkins jobs now fail after upgrading the Jenkins Kubernetes plugin from 1.14.2 to 1.26.0. The plugin update changes the build dir path (re-clarifying question)

6/9/2020

Jenkins version is 2.222.4. We upgraded the Jenkins Kubernetes plugin from 1.14.2 to 1.26.0. Before the plugin upgrade, the Jenkins slave would mount /home/jenkins as read/write so the build could use the .gradle files in there.

After the plugin upgrade, /home/jenkins is now read-only, and the directory /home/jenkins/agent is mounted read/write instead. The build job therefore no longer has read/write access to the files it needs in /home/jenkins.

I did a df -h on our slave jnlp pod pre-upgrade (k8s plugin v1.14.2) and saw the following:

Filesystem      Size    Used  Available  Use%  Mounted on
overlay         119.9G  5.6G  109.1G     5%    /
/dev/nvme0n1p2  119.9G  5.6G  109.1G     5%    /home/jenkins

and can see it's mounted as read/write:

cat /proc/mounts | grep -i jenkins
/dev/nvme0n1p2 /home/jenkins ext4 rw,relatime,data=ordered 0 0

After the plugin upgrade, if I run df -h I don't even see /home/jenkins mounted, only:

/dev/nvme0n1p2 120G 5.6G 110G 5% /etc/hosts

and if I cat /proc/mounts, post-upgrade I only see this:

jenkins@buildpod:~$ cat /proc/mounts | grep -i jenkins
/dev/nvme0n1p2 /home/jenkins/agent ext4 rw,relatime,data=ordered 0 0
/dev/nvme0n1p2 /home/jenkins/.jenkins ext4 rw,relatime,data=ordered 0 0

I'm also seeing this in the Jenkins job log, but not sure if it is relevant:

[WARNING] HOME is set to / in the jnlp container. You may encounter troubles when using tools or ssh client. This usually happens if the uid doesn't have any entry in /etc/passwd. Please add a user to your Dockerfile or set the HOME environment variable to a valid directory in the pod template definition.
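(For reference, the warning's own suggestion of setting HOME in the pod template can be expressed in the kubernetes plugin's Groovy DSL roughly as below. This is only a sketch: the HOME value /home/jenkins/agent is my assumption of a writable directory post-upgrade, and the echo step is just a placeholder.)

podTemplate(
    envVars: [
        // assumption: point HOME at the writable agent dir; adjust to your layout
        envVar(key: 'HOME', value: '/home/jenkins/agent')
    ]
) {
    node(POD_LABEL) {
        // placeholder step to confirm the environment inside the jnlp container
        sh 'echo $HOME'
    }
}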

Any ideas or workarounds would be most welcome, as we're badly stuck on this issue. Brian

-- b Od
docker
jenkins
jenkins-plugins
kubernetes

1 Answer

6/12/2020

My colleague just figured this out. He found it goes back to a change the plugin developers made sometime in August 2019, when release 1.18.0 of the plugin changed the default workspace. It was spotted and supposed to be fixed here: github.com/jenkinsci/kubernetes-plugin/pull/713, but it persists in our case. The workaround is to hardcode workingDir: '/home/jenkins' under the container entry in each job's Jenkinsfile.
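A minimal sketch of what that workaround might look like in the plugin's Groovy DSL is below; the container name, image, and build command are placeholders for whatever your job actually uses, only the workingDir line is the part the answer relies on:

podTemplate(containers: [
    containerTemplate(
        name: 'gradle',                 // placeholder container name
        image: 'gradle:6.5-jdk11',      // placeholder image
        command: 'sleep',
        args: 'infinity',
        workingDir: '/home/jenkins'     // restore the pre-1.18.0 default working dir
    )
]) {
    node(POD_LABEL) {
        container('gradle') {
            sh './gradlew build'        // placeholder build step
        }
    }
}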

-- b Od
Source: StackOverflow