Yesterday evening our Jenkins Dev Build job started failing whenever it tried to transfer artifacts to the Dev node, reporting "No space left on device":
./dev-deployment/scripts/dev-node-deploy.sh mkdir: cannot create directory `/apps/jboss/dev-deployment/backup/28November2016_15:07:22': No space left on device
The failure message made it clear that the Dev node had run out of disk space, which was causing the build job to fail. So we quickly logged into the machine and started diagnosing with the following command:
$ df -h
This command lists each filesystem along with its total size, used space, available space, and mount point:
Filesystem                  Size  Used Avail Use% Mounted on
/dev/mapper/VG00-root       2.0G  510M  1.4G  28% /
tmpfs                        16G   12K   16G   1% /dev/shm
/dev/sda1                   190M   86M   94M  48% /boot
/dev/mapper/VG00-home       4.8G   11M  4.6G   1% /home
/dev/mapper/VG00-opt         20G  313M   19G   2% /opt
/dev/mapper/VG00-tmp        4.8G  110M  4.5G   3% /tmp
/dev/mapper/VG00-usr        9.8G  1.6G  7.7G  18% /usr
/dev/mapper/VG00-var        4.8G  857M  3.8G  19% /var
/dev/mapper/VG_APPS-apps     36G   36G   17M 100% /apps
/dev/mapper/nb_vg-lv_openv  9.8G  3.3G  6.0G  36% /usr/openv
The output showed that /dev/mapper/VG_APPS-apps, a 36 GB filesystem mounted at /apps, was completely full, and that was what was making our Jenkins job fail.
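As an aside, on a box with many mounts it can be handy to surface only the nearly-full filesystems instead of scanning the whole table by eye; a small sketch (the 90% threshold is an arbitrary choice):

# Print the header row plus any filesystem at 90% use or above
# ($5 is the Use% column; +0 coerces "100%" to the number 100):
$ df -h | awk 'NR == 1 || $5+0 >= 90'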
This alone was not enough to tell us exactly where the space had gone, so we dug further with the following command:
$ du -h /apps/ --max-depth=2
which produced the following output:
300K  /apps/jboss/.activator
4.9G  /apps/jboss/jboss-eap-6.4.0
171M  /apps/jboss/.sbt
16K   /apps/jboss/.fontconfig
8.0K  /apps/jboss/.vim
12K   /apps/jboss/.oracle_jre
120M  /apps/jboss/tools
353M  /apps/jboss/jdk1.8.0_71
26G   /apps/jboss/dev-deployment
48K   /apps/jboss/.jmc
8.0K  /apps/jboss/.ssh
298M  /apps/jboss/.ivy2
8.4G  /apps/jboss
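As a side note, when such a listing is long, piping the same command through sort makes the heavy directories jump out immediately (sort -h understands the human-readable suffixes that du -h emits):

# Largest directories first, top ten only:
$ du -h /apps/ --max-depth=2 | sort -hr | head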
Either way, the output made it clear that something inside /apps/jboss/dev-deployment was occupying almost all of the space on the mount. Digging one level deeper:
$ cd /apps/jboss/dev-deployment/
$ du -h --max-depth=1
revealed that it was the log folder that accounted for nearly all of the directory's size:
32K   ./data
8.0K  ./lib
600K  ./configuration
577M  ./tmp
26G   ./log
297M  ./deployments
945M  .
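The obvious next step was deleting the old logs. A sketch of the kind of cleanup involved, assuming files older than a week are safe to drop (the seven-day retention window here is illustrative, not what we necessarily used):

# Delete log files last modified more than 7 days ago:
$ find /apps/jboss/dev-deployment/log -name '*.log' -mtime +7 -delete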
We deleted the unwanted logs from the log directory and restarted the Jenkins Dev Build job, assuming it would now pass, but it failed again for the same reason.
We ran the same commands again, this time more carefully:
$ df -h /apps/
The result showed that the disk was still full:
Filesystem                Size  Used Avail Use% Mounted on
/dev/mapper/VG_APPS-apps   36G   36G   17M 100% /apps
whereas checking the disk space used by the files themselves
$ du -h /apps/ --max-depth=2
showed only 8.4G consumed:
300K  /apps/jboss/.activator
4.9G  /apps/jboss/jboss-eap-6.4.0
171M  /apps/jboss/.sbt
16K   /apps/jboss/.fontconfig
8.0K  /apps/jboss/.vim
12K   /apps/jboss/.oracle_jre
120M  /apps/jboss/tools
353M  /apps/jboss/jdk1.8.0_71
200M  /apps/jboss/dev-deployment
48K   /apps/jboss/.jmc
8.0K  /apps/jboss/.ssh
298M  /apps/jboss/.ivy2
8.4G  /apps/jboss
This discrepancy had us worried about where the space had gone, until we came across an answer on Stackoverflow.com explaining that a deleted file may still be held open by some process. The gap makes sense once you know how the two tools work: df asks the filesystem how many blocks are allocated, while du walks the directory tree and sums what it finds there. A file that has been deleted but is still open is counted by df yet invisible to du, and its blocks are not freed until the last process holding it closes it.
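One way to confirm this, and to find the culprit process, is lsof, which can list open files whose on-disk link count has dropped to zero, i.e. files that have been deleted but are still held open. We did not run this at the time, but it is the check we would reach for now:

# Open-but-deleted files show a link count (NLINK) of 0 and are
# flagged "(deleted)" in the NAME column; filter for our mount:
$ lsof +L1 | grep '/apps'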
As the answer suggested, we restarted the Dev node, re-ran the Jenkins Dev Build job, and were finally able to turn our build "GREEN".
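Had a full reboot not been an option, the space can usually be reclaimed without one: restart just the process that is holding the deleted file, or truncate the file in place through its /proc file-descriptor entry. The PID and FD below would come from the lsof output above; the numbers are placeholders:

# Truncate a deleted-but-open file to zero bytes via /proc
# (1234 = PID of the holding process, 66 = its open fd; both hypothetical):
$ : > /proc/1234/fd/66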