
Need "ip" utlity check in Kubespray Project #11679

Open
a7lan opened this issue Oct 31, 2024 · 3 comments
Labels
kind/bug Categorizes issue or PR as related to a bug. triage/accepted Indicates an issue or PR is ready to be actively worked on.

Comments

a7lan commented Oct 31, 2024

What happened?

Kubespray fails during deployment if the ip utility from the iproute2 package is missing on any node. The [kubespray-defaults : Create fallback_ips_base] task then fails with a fatal error, which stops the deployment.

What did you expect to happen?

I expected Kubespray to first check for the presence of the iproute2 package before executing any tasks that depend on it. If the package is missing, Kubespray should automatically install it to prevent errors during the deployment process.
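For illustration, a pre-flight check along these lines would cover it (a minimal sketch in my own words, not existing Kubespray code; the task names are made up):

- name: Check that the "ip" utility is available
  # Probe for the binary without failing the play if it is absent
  ansible.builtin.command: ip -V
  register: ip_check
  changed_when: false
  failed_when: false

- name: Install iproute2 when the "ip" utility is missing
  # Package name assumed for Debian/Ubuntu-family hosts
  ansible.builtin.package:
    name: iproute2
    state: present
  become: true
  when: ip_check.rc != 0

Alternatively, Kubespray could simply fail fast with a clear message asking the operator to install iproute2, instead of installing it automatically.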

How can we reproduce it (as minimally and precisely as possible)?

Set up an inventory for a Kubespray deployment where some nodes lack the iproute2 package.

For example, configure 3 control-plane nodes with iproute2 installed and 3 worker nodes without it.

Run the Kubespray playbook against this inventory.
Observe that the deployment fails with a fatal error in the [kubespray-defaults : Create fallback_ips_base] task because the ip utility is missing on the worker nodes.
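Before running the playbook, an ad-hoc command along these lines can confirm which hosts lack the utility (the kube_node group name is just an example from a typical Kubespray inventory):

$ ansible kube_node -i inventory/dev/inventory.ini -m shell -a "command -v ip || echo 'ip missing'"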

OS

DISTRIB_ID=Ubuntu
DISTRIB_RELEASE=22.04
DISTRIB_CODENAME=jammy
DISTRIB_DESCRIPTION="Ubuntu 22.04.3 LTS"

Version of Ansible

ansible [core 2.16.12]
config file = /home/aslan/kubespray-venv/kubespray/ansible.cfg
configured module search path = ['/home/aslan/kubespray-venv/kubespray/library']
ansible python module location = /home/aslan/kubespray-venv/venv/lib/python3.12/site-packages/ansible
ansible collection location = /home/aslan/.ansible/collections:/usr/share/ansible/collections
executable location = /home/aslan/kubespray-venv/venv/bin/ansible
python version = 3.12.3 (main, Sep 11 2024, 14:17:37) [GCC 13.2.0] (/home/aslan/kubespray-venv/venv/bin/python3.12)
jinja version = 3.1.4
libyaml = True

Version of Python

Python 3.12.3

Version of Kubespray (commit)

f9ebd45

Network plugin used

calico

Full inventory with variables

https://gist.github.com/a7lan/0da098dc33cee26eac893a694e50afa9

Command used to invoke ansible

ansible-playbook playbooks/upgrade_cluster.yml -i inventory/dev/inventory.ini -b -e kube_version=v1.29.9 --limit worker01

Output of ansible run

https://gist.github.com/a7lan/69ff01801613e4a1b9b7a2f3c14fed5f

Anything else we need to know

No response

a7lan added the kind/bug label Oct 31, 2024
VannTen (Contributor) commented Oct 31, 2024

I don't see why you think this is related to iproute2?

Have you run the facts.yml playbook before using --limit? Kubespray relies on the fact cache for this.
See #11598 and #11587
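(For example, something like the following before the limited run; the exact playbook path can differ between Kubespray versions:)

$ ansible-playbook playbooks/facts.yml -i inventory/dev/inventory.ini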

a7lan (Author) commented Oct 31, 2024

I installed the iproute2 package on the nodes that initially lacked it, and after that the Kubespray playbook ran successfully.
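On these Ubuntu 22.04 nodes that just meant installing the package on each affected host, e.g.:

$ sudo apt-get install -y iproute2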

Regarding fact caching, here are two sample outputs from running ansible -m setup with filter=ansible_default_ipv4 on nodes with and without the iproute2 package. worker01, which does not have iproute2, returns empty ansible_facts, while worker02, which has it, returns the expected network information:

Without iproute2:

$ ansible worker01 -m setup -a "filter=ansible_default_ipv4" -i inventory/dev/inventory.ini
worker01 | SUCCESS => {
    "ansible_facts": {},
    "changed": false
}

With iproute2:

$ ansible worker02 -m setup -a "filter=ansible_default_ipv4" -i inventory/dev/inventory.ini
worker02 | SUCCESS => {
    "ansible_facts": {
        "ansible_default_ipv4": {
            "address": "172.20.98.95",
            "alias": "ens18",
            "broadcast": "172.20.99.255",
            "gateway": "172.20.98.1",
            "interface": "ens18",
            "macaddress": "a6:03:57:11:8f:49",
            "mtu": 1500,
            "netmask": "255.255.254.0",
            "network": "172.20.98.0",
            "prefix": "23",
            "type": "ether"
        }
    },
    "changed": false
}

VannTen (Contributor) commented Oct 31, 2024 via email

k8s-ci-robot added the triage/accepted label Oct 31, 2024